Unable to run query on temp table using sqlContext.sql

I created a DataFrame from a text file and then ran the registerTempTable command.
The DataFrame was created successfully, and no error message was displayed after the registerTempTable call either.

But when I try to run a select query on the temp table using sqlContext.sql("select * from limit 10"),
I get an exception: Table Not Found. I cross-checked that I am using the right temp table name, so I am not sure why the exception is being thrown. Can someone please help me understand what I am missing?
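For reference, the flow being described is roughly the following (a minimal sketch for Spark 1.x in spark-shell; the path, case class, and table name are placeholders, not the originals):

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

// Placeholder record type; the real schema comes from the source file
case class Rec(id: Int, name: String)

// Build a DataFrame from a text file and register it as a temp table
val df = sc.textFile("/some/input/path")
  .map(_.split(','))
  .map(r => Rec(r(0).toInt, r(1)))
  .toDF()
df.registerTempTable("myTable")

// Query the temp table through the same SQLContext that owns it
sqlContext.sql("select * from myTable limit 10").show()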

You are missing the table name in the statement below:
sqlContext.sql("select * from limit 10")
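If the table name was present in your original query, a quick sanity check (assuming Spark 1.x) is to list the temp tables that the SQLContext you are querying actually knows about:

// Lists the tables registered on this SQLContext; if yours is missing here,
// it was registered on a different context
sqlContext.tableNames().foreach(println)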

I had the table name in the original query on the console; it didn't get copied here for some reason.
So a missing table name can't be the issue.
Can you advise of any other issue that you think might be causing the exception?

I am facing the same issue with another temp table as well. Please let me know which step I am missing.

case class Products(
product_id: Int,
product_category_id: Int,
product_name: String,
product_description: String,
product_price: Float,
product_image: String)

val productsDF = sc.textFile("/user/nehaluthra09/cloudera/problem2/products").map(rec => {
  val r = rec.split('|')
  Products(r(0).toInt, r(1).toInt, r(2), r(3), r(4).toFloat, r(5))}).toDF.filter($"product_price" < 100)

scala> productsDF.count()
res1: Long = 831

scala> val sqlContext = new SQLContext(sc)
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@1ae73d9e

scala> productsDF.registerTempTable("ProductsData")

scala> sqlContext.sql("select * from ProductsData limit 10")
org.apache.spark.sql.AnalysisException: Table not found: ProductsData;
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:305)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:314)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:309)

Can someone please help me with this?

Here is the working code. You need to create the SQLContext and import sqlContext.implicits._ first, before calling toDF and registering the temp table. A temp table is registered on the SQLContext that the DataFrame belongs to; in your transcript, toDF used the spark-shell's built-in context, and the new SQLContext you created afterwards could not see ProductsData.

// Create the SQLContext and import its implicits before building the DataFrame,
// so that toDF, the temp table, and the query all use the same context
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

case class Products(
product_id: Int,
product_category_id: Int,
product_name: String,
product_description: String,
product_price: Float,
product_image: String)

val productsDF = sc.textFile("/user/nehaluthra09/cloudera/problem2/products").map(rec => {
val r = rec.split('|')
Products(r(0).toInt, r(1).toInt, r(2), r(3), r(4).toFloat, r(5))}).toDF.filter($"product_price" < 100)

// Register the temp table on the same SQLContext you will query through
productsDF.registerTempTable("ProductsData")

sqlContext.sql("select * from ProductsData limit 10")

It worked. Thank you so very much!