Scala error: object apache is not a member of package org

scala> import org.apache.spark.SparkConf
:11: error: object apache is not a member of package org
import org.apache.spark.SparkConf

I am getting the above error ^

@j_thomas This happens whenever spark-shell is not launched successfully, or the Spark core library is not included in a self-contained application. Can you check whether spark-shell has been launched successfully?

Scala is launched successfully. As you can see, I am typing that at the scala prompt.
At the $ prompt I typed scala, and I got the scala prompt.

@j_thomas Getting the scala prompt doesn't mean your spark-shell has been launched successfully. Even if you get some exceptions/errors while launching spark-shell, you will still see the scala prompt. So please check whether there are any errors in the logs on the console.

Scala launched successfully, but I am having a problem creating val sc.
See below for a part of the error:
scala> val sc = new SparkContext(conf)
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
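This particular exception means spark-shell did start correctly: it already creates a SparkContext for you, bound to the name sc, so constructing a second one fails. A minimal spark-shell transcript sketching the two usual ways forward (the app name "demo" is illustrative):

```
// spark-shell already provides a SparkContext named sc -- just use it:
scala> sc.parallelize(1 to 5).sum()

// or stop the pre-created context before building your own:
scala> sc.stop()
scala> val conf = new org.apache.spark.SparkConf().setAppName("demo")
scala> val sc2 = new org.apache.spark.SparkContext(conf)
```

Setting spark.driver.allowMultipleContexts = true, as the message suggests, is generally discouraged; reusing or stopping the existing context is the cleaner fix.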

Add this line to build.sbt:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.3"
(1.6.3 is an old version; you can use the latest version, 2.1.1.)

Then run the sbt package command from the Scala project directory;
it will download all the required Spark jars.
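Putting that dependency line in context, here is a minimal build.sbt sketch (the project name is illustrative). The key point is that the _2.10 suffix in spark-core_2.10 must match the project's scalaVersion, since Spark artifacts are cross-built per Scala version:

```scala
// minimal build.sbt sketch -- name and versions are illustrative
name := "spark-demo"
version := "0.1"
scalaVersion := "2.10.6"

// the _2.10 suffix must match scalaVersion above
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.3"

// equivalently, %% appends the matching Scala suffix automatically:
// libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.3"
```

Using %% instead of % with an explicit suffix avoids mismatches when you later bump scalaVersion.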

Hi there, I am new to Scala. I created a project through the IntelliJ IDE
and wrote a small piece of code to read a JSON file. Unfortunately, when I run it, it gives "object apache is not a member of package org" for import org.apache.spark.sql.SparkSession.
I googled and tried a few options like adding these lines in sbt, but it never worked. Can someone please help here?
My build.sbt looks like:

name := "Assignment-1"
version := "0.1"
scalaVersion := "2.13.1"
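The scalaVersion here is likely the problem: Spark 2.x only publishes artifacts for Scala 2.11 and 2.12 (Scala 2.13 support only arrived in Spark 3.2), so a Spark dependency cannot resolve against 2.13.1 and the import fails. A build.sbt sketch that should resolve, with an illustrative Spark version and using spark-sql since SparkSession lives there:

```scala
// build.sbt sketch -- scalaVersion must be one the chosen Spark version supports
name := "Assignment-1"
version := "0.1"
scalaVersion := "2.12.10"

// %% appends the matching _2.12 suffix; 2.4.5 is an illustrative Spark 2.x version
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"
```

After changing build.sbt, reload/reimport the project in IntelliJ so the new dependency is picked up.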

This does not work in the lab. In the lab, sbt console gives the error
"object apache is not a member of package org"
for import org.apache.spark.SparkContext.