org.apache.spark.sql.AnalysisException: Table not found: in SPARK

Hi Team @itversity,

I have registered the temp table and it shows as registered, but it is giving this error:
org.apache.spark.sql.AnalysisException: Table not found: in SPARK
Below is my code:
case class Orders(order_id: Int, order_date: String, order_customer_id: Int, order_status: String);
val ord = sc.textFile("/user/jonuchauhan/textfile/orders").map(x=> Orders(x.split(",")(0).toInt,x.split(",")(1),x.split(",")(2).toInt,x.split(",")(3)));
val ords = ord.toDS();
ords.registerTempTable("kkk");

Br,
Jonu chauhan

val ords = ord.toDF();
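To expand on that suggestion: in Spark 1.6, `registerTempTable` is a `DataFrame` method, and the registration and the later `sql` call must go through the same SQLContext. Here is a minimal sketch of the whole sequence as it might look in a Spark 1.6 shell (assuming the shell's built-in `sc` and `sqlContext`; the path and case class are taken from the original post):

```scala
// Assumes the Spark 1.6 shell, where sc and sqlContext are predefined.
import sqlContext.implicits._

case class Orders(order_id: Int, order_date: String,
                  order_customer_id: Int, order_status: String)

val ord = sc.textFile("/user/jonuchauhan/textfile/orders").
  map(_.split(",")).
  map(a => Orders(a(0).toInt, a(1), a(2).toInt, a(3)))

val ords = ord.toDF()          // DataFrame, not Dataset
ords.registerTempTable("kkk")  // registers against sqlContext

// Query through the SAME sqlContext the table was registered on
sqlContext.sql("select * from kkk").show(5)
```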

Hi ,
I have converted it to a DataFrame as well, but it is still not working!
@itversity

Are you getting any error? Try to print and check the result after every command.

Hi ,

org.apache.spark.sql.AnalysisException: Table not found: kkk;

Were you able to print the results of ord and ords successfully?

yes
scala> case class Orders(order_id: Int, order_date: String, order_customer_id: Int, order_status: String);
defined class Orders
scala> val ord = sc.textFile("/user/jonuchauhan/textfile/orders").map(x=> Orders(x.split(",")(0).toInt,x.split(",")(1),x.split(",")(2).toInt,x.split(",")(3)));
ord: org.apache.spark.rdd.RDD[Orders] = MapPartitionsRDD[8] at map at :44
scala> val k = ords.toDF();
k.registerTempTable("oll");
val p = sq.sql("select * from oll");
org.apache.spark.sql.AnalysisException: Table not found: oll;
@itversity @perraju

What’s your Spark version?

Apparently there is no logic or syntax issue in your code; it works fine on Spark 1.6.0.

My observation:
You need to run the imports below even before creating a DF. This fixed the "Table not found" exception in Spark SQL:
import org.apache.spark.sql.SQLContext
val SqlContext = new SQLContext(sc)
import SqlContext.implicits._

I ran into a similar issue with the "Table not found" exception today (in the Lab).
In my case, I ran the above import commands after creating the DFs, and the Spark SQL queries then failed with the "Table not found" exception, even though the DFs were registered as temp tables.
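In other words, the implicits have to come from the same SQLContext before the DF is created, and the query has to go through that same context. A minimal sketch of the ordering that worked (the names `Rec` and `recs` are just illustrative):

```scala
// Assumes a Spark 1.6 shell where sc is available.
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)
import sqlContext.implicits._   // import implicits from the SAME context first

case class Rec(id: Int, name: String)   // hypothetical example data
val df = sc.parallelize(Seq(Rec(1, "a"), Rec(2, "b"))).toDF()
df.registerTempTable("recs")

// Querying through a different SQLContext than the one the table was
// registered on is a common cause of "Table not found".
sqlContext.sql("select * from recs").show()
```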

@jonu_chauhan

Method 1:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
val df_2 = sc.parallelize(Seq((1L, 3.0, "a"), (2L, -1.0, "b"), (3L, 0.0, "c"))).toDF("x", "y", "z")
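To complete Method 1, you could then register and query `df_2` through the same `sqlContext` it was created from (a sketch for the Spark 1.6 shell; the table name `t2` is just illustrative):

```scala
// Register the DataFrame and query it via the same sqlContext
df_2.registerTempTable("t2")
sqlContext.sql("select x, y from t2 where y >= 0.0").show()
```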
Use the below link

Check each step to see whether you are doing it correctly.