Error running spark-shell: not found: value sc

Hi All,

When I open spark-shell and run the following:

scala> val data = 1 to 100
scala> val distData = sc.parallelize(data)

I get the following error:

:19: error: not found: value sc

I tried importing the Apache Spark libraries, but that did not solve the issue.
I have the latest versions of Java, Maven, and Scala installed on my system (Windows 10), and all the classpaths are set appropriately.
Any help would be appreciated.

When you ran spark-shell, did it display a message saying that the Spark context is available?
Check the snapshot below.

This is what I get when I load spark-shell and type sc:

You just have to scroll up and see where it is failing.
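
If the startup log shows that the Spark context failed to initialize, one workaround is to create it by hand inside the REPL. This is a sketch using the standard Spark API; the app name "manual-sc" and the local[*] master are illustrative choices, not something from this thread:

```scala
// Create a SparkContext manually when spark-shell failed to provide `sc`.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("manual-sc")   // illustrative name
  .setMaster("local[*]")     // run locally on all available cores
val sc = new SparkContext(conf)

// The original example from the question should then work:
val data = 1 to 100
val distData = sc.parallelize(data)
```

This only papers over the symptom, though; the underlying startup error higher up in the log is still worth fixing.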

I think I know what the error is. I don’t have Hadoop installed on my PC.

Can you please provide a link from where I can download Hadoop for Windows 10? Also, what classpath do I need to set after installing Hadoop?

Any guide would be of great help.
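
For running Spark locally on Windows, a full Hadoop installation is usually not needed; Spark's Hadoop libraries mainly look for winutils.exe under %HADOOP_HOME%\bin. One commonly used workaround, sketched here under the assumption that you have placed winutils.exe in C:\hadoop\bin (that path is an example, not from this thread), is to point the hadoop.home.dir system property at that directory before the context starts:

```scala
// Assumption: winutils.exe lives in C:\hadoop\bin.
// Setting hadoop.home.dir tells Spark's Hadoop shim where to find it.
System.setProperty("hadoop.home.dir", "C:\\hadoop")
```

Setting the HADOOP_HOME environment variable to the same directory (and adding its bin folder to PATH) achieves the same thing at the system level.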

I have the same problem. You should go to the Spark bin directory (/usr/lib/spark/bin) and run spark-shell from there.
