Getting an Error While Joining Data Using Native SQL in Scala in BigData Labs

While practising in BigData Labs, I am facing an error while trying to join datasets using native SQL in Scala.

It is throwing an error when trying to create the schema:

scala> import sqlContext.createSchemaRDD
:62: error: value createSchemaRDD is not a member of org.apache.spark.sql.SQLContext
import sqlContext.createSchemaRDD
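
From what I have read, createSchemaRDD was removed in Spark 1.3, when SchemaRDD was replaced by DataFrame, and the equivalent import since then is the implicits object on SQLContext. A minimal sketch, assuming the lab shell runs Spark 1.3 to 1.6 with sqlContext predefined:

scala> import sqlContext.implicits._   // replaces the pre-1.3 import sqlContext.createSchemaRDD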

Can anyone please assist me with this?

Below are the complete steps I am following to create a temp table in Scala Spark:

scala> val dataRDD=sc.textFile("/user/balarsandeep/sandeep/sqoop/sqoop_import/orders")

17/02/19 17:29:13 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 336.8 KB, free 1432.3 KB)
17/02/19 17:29:13 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 28.4 KB, free 1460.7 KB)
17/02/19 17:29:13 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on localhost:51144 (size: 28.4 KB, free: 511.0 MB)
17/02/19 17:29:13 INFO SparkContext: Created broadcast 3 from textFile at :40
dataRDD: org.apache.spark.rdd.RDD[String] = /user/balarsandeep/sandeep/sqoop/sqoop_import/orders MapPartitionsRDD[17] at textFile at :40

scala> val dataRDDMap=dataRDD.map(a=>(a.split(",")))
dataRDDMap: org.apache.spark.rdd.RDD[Array[String]] = MapPartitionsRDD[18] at map at :42

scala> case class Orders(o_id:Int,o_date:String,o_c_id:Int,o_status:String)
defined class Orders

scala> val orders=dataRDDMap.map(a=>Orders(a(0).toInt,a(1),a(2).toInt,a(3)))
orders: org.apache.spark.rdd.RDD[Orders] = MapPartitionsRDD[19] at map at :46
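
With the implicits import in scope, an RDD of case-class instances can be converted to a DataFrame with toDF(); a minimal sketch, assuming Spark 1.3 or later:

scala> val ordersDF = orders.toDF()   // RDD[Orders] -> DataFrame; column names come from the case class fields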

scala> import sqlc.createSchemaRDD
:42: error: value createSchemaRDD is not a member of org.apache.spark.sql.SQLContext
import sqlc.createSchemaRDD
^
scala> orders.registerTempTable("Orders_Scala")
:49: error: value registerTempTable is not a member of org.apache.spark.rdd.RDD[Orders]
orders.registerTempTable("Orders_Scala")
^
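
If I understand the errors correctly, the second one follows from the first: registerTempTable is defined on DataFrame, not on RDD[Orders], so the RDD has to be converted before it can be registered. Below is a sketch of the corrected flow, assuming Spark 1.3 to 1.6; the Order_Items_Scala table and its column names are hypothetical, only there to illustrate the join I am trying to do:

import sqlContext.implicits._

case class Orders(o_id: Int, o_date: String, o_c_id: Int, o_status: String)

val ordersDF = sc.textFile("/user/balarsandeep/sandeep/sqoop/sqoop_import/orders")
  .map(_.split(","))
  .map(a => Orders(a(0).toInt, a(1), a(2).toInt, a(3)))
  .toDF()                                  // RDD[Orders] -> DataFrame

ordersDF.registerTempTable("Orders_Scala") // now visible to native SQL

// once a second dataset (e.g. order_items) is registered the same way,
// the temp tables can be joined with plain SQL
sqlContext.sql("""
  SELECT o.o_id, o.o_status
  FROM Orders_Scala o
  JOIN Order_Items_Scala oi ON o.o_id = oi.oi_order_id
""").show()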