Spark SQL - Read SequenceFile

Hi,

I am trying to read data from a SequenceFile, create a DataFrame from it, and then register it as a temp table. However, the temp table does not seem to be created. I have imported all the necessary packages.

Code:

val seqData = sc.sequenceFile("/user/vimaldoss18/problem5/sequence/",
  classOf[org.apache.hadoop.io.Text], classOf[org.apache.hadoop.io.Text])

val segDF = seqData.map(rec => {
  val r = rec._2.toString.split("\t")
  Orders(r(0).toInt, r(1), r(2).toInt, r(3))
}).toDF()

segDF.registerTempTable("order_seq")

sqlContext.sql("select * from order_seq") throws the error below:

org.apache.spark.sql.AnalysisException: Table not found: order_seq;
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:305)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:314)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:309)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:56)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:281)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
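
For reference, the code above assumes a case class and an implicits import along these lines are already defined in the shell (the field names here are just illustrative):

// Assumed setup, not shown in the snippet above; field names are illustrative only.
case class Orders(orderId: Int, orderDate: String, customerId: Int, status: String)

// Needed so an RDD of Orders can be converted with .toDF()
import sqlContext.implicits._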

@vimal.doss18 Please refer to the thread below.

Let us know if you still have any issues. Thanks!

@ashok_singamaneni Thanks, Ashok. As you mentioned, I had been switching between SQLContext and HiveContext. It worked for me after restarting my spark-shell.
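
For anyone hitting the same error, here is a minimal sketch of the working flow, assuming the Orders case class above is in scope. The key point is to create one context and use that same instance both to register the temp table and to query it (names and path are illustrative):

import org.apache.spark.sql.hive.HiveContext

// Create a single context and stick with it.
val sqlContext = new HiveContext(sc)   // or: new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

val seqData = sc.sequenceFile("/user/vimaldoss18/problem5/sequence/",
  classOf[org.apache.hadoop.io.Text], classOf[org.apache.hadoop.io.Text])

val segDF = seqData.map(rec => {
  val r = rec._2.toString.split("\t")
  Orders(r(0).toInt, r(1), r(2).toInt, r(3))
}).toDF()

// A temp table is scoped to the context instance that registered it, so mixing
// SQLContext and HiveContext leads to "Table not found: order_seq".
segDF.registerTempTable("order_seq")
sqlContext.sql("select * from order_seq").show()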