At my office I was given a new requirement: unit testing for our Spark application written in Scala. We are able to run and test the application on a Windows machine.
My manager suggested using a Hadoop mini cluster.
Can you please elaborate on how we can set up a Hadoop mini cluster in local mode?
We need Spark, HDFS, and Hive.
Hive is mandatory because our application code reads and writes its data through Hive.
Is there any possible solution? Please let me know your comments.
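For context, here is one common approach I have seen, as a sketch only: start a local-mode SparkSession with Hive support inside a ScalaTest fixture, pointed at a throwaway warehouse directory. This assumes ScalaTest plus the `spark-sql` and `spark-hive` test dependencies are on the classpath; the suite and table names below are hypothetical.

```scala
import java.nio.file.Files
import org.apache.spark.sql.SparkSession
import org.scalatest.BeforeAndAfterAll
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical test suite: runs Spark in local mode with Hive support,
// using a temporary warehouse directory so each test run is isolated.
class HiveQuerySuite extends AnyFunSuite with BeforeAndAfterAll {

  private var spark: SparkSession = _

  override def beforeAll(): Unit = {
    val warehouse = Files.createTempDirectory("spark-warehouse").toString
    spark = SparkSession.builder()
      .master("local[2]")                         // local mode, 2 threads
      .appName("hive-unit-test")
      .config("spark.sql.warehouse.dir", warehouse)
      .enableHiveSupport()                        // embedded Derby metastore
      .getOrCreate()
  }

  override def afterAll(): Unit = {
    if (spark != null) spark.stop()
  }

  test("writes and reads back a Hive table") {
    import spark.implicits._
    Seq((1, "a"), (2, "b")).toDF("id", "value")
      .write.mode("overwrite").saveAsTable("test_table")
    assert(spark.table("test_table").count() == 2)
  }
}
```

If you also need a real HDFS in tests, the `hadoop-minicluster` artifact provides `org.apache.hadoop.hdfs.MiniDFSCluster`, which you can start in `beforeAll` and point Spark at via its URI. Note that on Windows, Hadoop typically also needs `winutils.exe` and `HADOOP_HOME` configured.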