Spark-shell or pyspark not working

Hi,

Spark is not working when launched with spark-shell or spark-shell --master local; I am getting the errors below. Please help fix this.

[awsphani@gw01 ~]$ spark-shell
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
17/01/06 17:59:50 INFO SecurityManager: Changing view acls to: awsphani
17/01/06 17:59:50 INFO SecurityManager: Changing modify acls to: awsphani
17/01/06 17:59:50 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(awsphani); users with modify permissions: Set(awsphani)
17/01/06 17:59:51 INFO HttpServer: Starting HTTP Server
17/01/06 17:59:51 INFO Server: jetty-8.y.z-SNAPSHOT
17/01/06 17:59:51 INFO AbstractConnector: Started SocketConnector@0.0.0.0:55978
17/01/06 17:59:51 INFO Utils: Successfully started service 'HTTP class server' on port 55978.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.2
      /_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
17/01/06 17:59:53 INFO SparkContext: Running Spark version 1.6.2
17/01/06 17:59:53 INFO SecurityManager: Changing view acls to: awsphani
17/01/06 17:59:53 INFO SecurityManager: Changing modify acls to: awsphani
17/01/06 17:59:53 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(awsphani); users with modify permissions: Set(awsphani)
17/01/06 17:59:53 INFO Utils: Successfully started service 'sparkDriver' on port 36099.
17/01/06 17:59:54 INFO Slf4jLogger: Slf4jLogger started
17/01/06 17:59:54 INFO Remoting: Starting remoting
17/01/06 17:59:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.16.1.100:40890]
17/01/06 17:59:54 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 40890.
17/01/06 17:59:54 INFO SparkEnv: Registering MapOutputTracker
17/01/06 17:59:54 INFO SparkEnv: Registering BlockManagerMaster
17/01/06 17:59:54 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-e451d9ad-7de6-4a57-aea8-368f34b5a438
17/01/06 17:59:54 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
17/01/06 17:59:54 INFO SparkEnv: Registering OutputCommitCoordinator
17/01/06 17:59:54 INFO Server: jetty-8.y.z-SNAPSHOT
17/01/06 17:59:54 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at ...
17/01/06 17:59:55 INFO DiskBlockManager: Shutdown hook called
17/01/06 17:59:55 INFO ShutdownHookManager: Shutdown hook called
17/01/06 17:59:55 INFO ShutdownHookManager: Deleting directory /tmp/spark-72a7c40d-0811-4006-9a03-598243aafd8e
17/01/06 17:59:55 INFO ShutdownHookManager: Deleting directory /tmp/spark-87112ceb-0503-49bf-85cf-d1cdebffcc6e
17/01/06 17:59:55 INFO ShutdownHookManager: Deleting directory /tmp/spark-87112ceb-0503-49bf-85cf-d1cdebffcc6e/userFiles-044def0d-fd73-4da5-a232-c4703c84c15b
[awsphani@gw01 ~]$

Thanks

There are no issues; I just started pyspark myself. Please use this command.

pyspark --conf "spark.ui.port=10403" --master local

local is the default, so there is no need to specify it.
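For context, the BindException in the log comes from Spark's driver UI, which binds port 4040 by default; every extra driver running on the gateway needs its own spark.ui.port. A quick way to see whether a port is already taken before passing it to spark.ui.port (a sketch, assuming bash and that the check runs on the gateway host itself):

```shell
# Probe a local TCP port via bash's /dev/tcp redirection: a successful
# connect means some process (e.g. another Spark driver) already holds it.
port=4040
if (exec 3<>"/dev/tcp/127.0.0.1/$port") 2>/dev/null; then
  echo "port $port in use"
else
  echo "port $port free"
fi
```

The subshell around exec means the probe connection is closed again as soon as the check finishes.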

Thanks Raghu, but spark-shell (Scala) is still not working:

[awsphani@gw01 ~]$ spark-shell
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00007a4159000000, 716177408, 0) failed; error='Cannot allocate memory' (errno=12)

There is insufficient memory for the Java Runtime Environment to continue.

Native memory allocation (mmap) failed to map 716177408 bytes for committing reserved memory.

An error report file with more information is saved as:

/home/awsphani/hs_err_pid14570.log

Can you please check?

Thanks

Use the same conf option; that will work. You can use any 5-digit number as the port.

spark-shell --conf "spark.ui.port=10928" --master local
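Rather than guessing, a free high port can be picked at random before launching (a sketch; shuf is assumed to be available on the gateway, and the 10000-65000 range is arbitrary):

```shell
# Pick a random high port for spark.ui.port so that two drivers started
# on the same gateway are unlikely to collide on the Spark UI port.
port=$(shuf -i 10000-65000 -n 1)
cmd="spark-shell --conf \"spark.ui.port=$port\" --master local"
echo "$cmd"
```

Running the echoed command then starts the shell with a UI port that is very unlikely to clash.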

Thanks Raghu, this works now.

Try now. It is back to normal.
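For the earlier 'Cannot allocate memory' failure: that one was the gateway host itself running out of free memory, not a port clash, so it cleared once memory was freed. If it recurs, the driver's heap request can be lowered so the JVM asks the OS for less memory up front (a sketch; 512m is an illustrative value, and --driver-memory is a standard spark-shell option):

```shell
# Request a smaller driver heap alongside the custom UI port.
spark-shell --driver-memory 512m --conf "spark.ui.port=10928" --master local
```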