java.lang.OutOfMemoryError: unable to create new native thread

pyspark

#1

Hi Team,

I am using the ITVersity lab, but since this morning I have been unable to start pyspark. It is giving me the error below:

18/12/05 14:15:25 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
java.lang.OutOfMemoryError: unable to create new native thread

I was able to start pyspark before, but suddenly started getting the above error. Could you please help me fix the issue?

Thanks,
Shweta


#2

@Shweta_Tanwar

Launch pyspark using the command below.

pyspark --conf spark.ui.port=11890
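
spark.ui.port sets the port the Spark UI binds to; the usual reason to set it explicitly on a shared gateway is to point the UI at an unused port when many users launch shells at the same time. Once the shell comes up, a quick sanity check is to run a tiny job (a minimal sketch; sc is the SparkContext the pyspark shell creates by default):

# A small parallelize/count round trip confirms the shell can reach executors.
sc.parallelize(range(10)).count()   # expected output: 10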