Spark is throwing an error in the Big Data labs

Hi Durga,
I am trying to execute the pyspark command, and the following error is displayed:
17/01/31 10:31:47 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.

I am facing the same issue with the spark-shell command too.

Additional Error Details

Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Python 2.7.5 (default, Sep 15 2016, 22:37:39)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
17/01/31 10:31:46 INFO SparkContext: Running Spark version 1.6.2
17/01/31 10:31:46 INFO SecurityManager: Changing view acls to: padmabanare
17/01/31 10:31:46 INFO SecurityManager: Changing modify acls to: padmabanare
17/01/31 10:31:46 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(padmabanare); users with modify permissions: Set(padmabanare)
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/01/31 10:31:47 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
17/01/31 10:31:47 INFO SparkContext: Successfully stopped SparkContext
Traceback (most recent call last):
File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/shell.py", line 43, in <module>
sc = SparkContext(pyFiles=add_files)
File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/context.py", line 115, in __init__
conf, jsc, profiler_cls)
File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/context.py", line 172, in _do_init
self._jsc = jsc or self._initialize_context(self._conf._jconf)
File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/context.py", line 235, in _initialize_context
return self._jvm.JavaSparkContext(jconf)
File "/usr/hdp/2.5.0.0-1245/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
File "/usr/hdp/2.5.0.0-1245/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)

You can try the suggestions in this thread - Getting Error While Running PySpark
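This BindException usually means the driver cannot bind to the address the machine's hostname resolves to (for example, a stale /etc/hosts entry). A minimal sketch of the usual checks and a workaround, assuming a lab gateway node; the loopback address and retry count below are illustrative values, not verified settings for this cluster:

```shell
# See what the hostname resolves to; "Cannot assign requested address"
# often means this IP is not actually configured on the machine.
hostname -f
hostname -i || true   # may itself fail if the hostname is not resolvable

# Pick a Spark version explicitly (also silences the
# "Multiple versions of Spark are installed" warning).
export SPARK_MAJOR_VERSION=2

# Force the driver to bind to a known-good local address.
export SPARK_LOCAL_IP=127.0.0.1

# Then relaunch, optionally allowing more bind retries:
# pyspark --conf spark.port.maxRetries=32
```

If the environment variables fix it, the underlying cause is almost certainly the hostname-to-IP mapping, and correcting /etc/hosts is the durable fix.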