Spark-shell - sqlContext - error: ambiguous reference to overloaded definition

apache-spark

#1

Please disregard the question below; I was pressing the Enter key instead of Tab (for auto-completion). That was the problem.


Hi, after launching spark-shell, I get "error: ambiguous reference to overloaded definition" when trying to access the sqlContext APIs.

scala> sqlContext.load
<console>:26: error: ambiguous reference to overloaded definition,
both method load in class SQLContext of type (source: String, schema: org.apache.spark.sql.types.StructType, options: Map[String,String])org.apache.spark.sql.DataFrame
and method load in class SQLContext of type (source: String, schema: org.apache.spark.sql.types.StructType, options: java.util.Map[String,String])org.apache.spark.sql.DataFrame
match expected type ?
sqlContext.load
^
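For anyone who hits the same message when actually calling load: the compiler cannot choose between the scala Map and java.util.Map overloads when it sees the bare method name with no arguments, so passing concrete arguments resolves it. A minimal sketch for spark-shell, assuming a JSON file at the hypothetical path /tmp/people.json:

import org.apache.spark.sql.types.{StructField, StringType, StructType}

// Hypothetical schema and path, for illustration only.
val schema = StructType(Seq(StructField("name", StringType)))

// A scala.collection.immutable.Map argument selects the Scala overload of load.
val df = sqlContext.load("json", schema, Map("path" -> "/tmp/people.json"))

Note that load is deprecated as of Spark 1.4 in favor of sqlContext.read, shown next.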

scala> sqlContext.read
res2: org.apache.spark.sql.DataFrameReader = org.apache.spark.sql.DataFrameReader@fd01c56

scala> sqlContext.read.
asInstanceOf format isInstanceOf jdbc json load option options orc parquet schema table text toString
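Those completions come from DataFrameReader (Spark 1.4+), which is the recommended replacement for the deprecated load overloads. A builder-style sketch, again with a hypothetical path:

// Builder-style equivalent of the load call above.
val df2 = sqlContext.read
  .format("json")
  .option("path", "/tmp/people.json")
  .load()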

scala> sqlContext.read.json
<console>:26: error: ambiguous reference to overloaded definition,
both method json in class DataFrameReader of type (jsonRDD: org.apache.spark.rdd.RDD[String])org.apache.spark.sql.DataFrame
and method json in class DataFrameReader of type (jsonRDD: org.apache.spark.api.java.JavaRDD[String])org.apache.spark.sql.DataFrame
match expected type ?
sqlContext.read.json
^
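Same cause as before: the bare name json gives the compiler nothing to pick between the RDD[String] and JavaRDD[String] overloads. Calling it with a concrete argument compiles fine. A sketch, assuming the same hypothetical path:

// A String argument selects the path-based overload unambiguously.
val people = sqlContext.read.json("/tmp/people.json")

// An RDD[String] argument selects the RDD overload instead.
val jsonRdd = sc.parallelize(Seq("""{"name": "Alice"}"""))
val fromRdd = sqlContext.read.json(jsonRdd)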

