Error while creating ORC file


I am getting an error when I try to create an ORC file.
Error: assertion failed: The ORC data source can only be used with HiveContext.
Can someone please help me with this error?

Here are the steps I am running:
scala> val prodcsvDF = prodcsv.map(rec => {
| val r = rec.split(",")
| Prodcsv(r(0).toInt, r(1), r(2), r(3).toInt, r(4).toFloat)
| }).toDF
prodcsvDF: org.apache.spark.sql.DataFrame = [productid: int, productcode: string, name: string, quantity: int, price: float]

+---------+-----------+---------+--------+-------+
|productid|productcode|     name|quantity|  price|
+---------+-----------+---------+--------+-------+
|     1001|        PEN|  Pen Red|    5000|   1.23|
|     1002|        PEN| Pen Blue|    8000|   1.25|
|     1003|        PEN|Pen Black|    2000|   1.25|
|     1004|        PEC|Pencil 2B|   10000|   0.48|
|     1005|        PEC|Pencil 2H|    8000|   0.49|
|     1006|        PEC|Pencil HB|       0|9999.99|
+---------+-----------+---------+--------+-------+
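For reference, the `Prodcsv` case class is not shown in the session, but from the schema printed above (`productid: int, productcode: string, name: string, quantity: int, price: float`) it presumably looks like this sketch:

```scala
// Assumed definition, reconstructed from the DataFrame schema above
case class Prodcsv(
  productid: Int,
  productcode: String,
  name: String,
  quantity: Int,
  price: Float
)
```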

scala> prodcsvDF.write.orc("/user/yogeshdewan83/scenario81out")
java.lang.AssertionError: assertion failed: The ORC data source can only be used with HiveContext.
at scala.Predef$.assert(Predef.scala:179)
at org.apache.spark.sql.hive.orc.DefaultSource.createRelation(OrcRelation.scala:58)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:242)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(
at java.lang.reflect.Method.invoke(
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


Update: the program worked today, so please ignore the error above. I started a new session, and in the Itversity lab environment sqlContext points to a HiveContext by default, so I am able to write the ORC file.
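For anyone who hits the same assertion in an environment where sqlContext is a plain SQLContext: in Spark 1.x the ORC data source requires a HiveContext, which can be created explicitly. A minimal sketch, assuming `sc` is the shell's existing SparkContext and the output path is writable (this is environment-dependent and not runnable outside a Spark shell):

```scala
import org.apache.spark.sql.hive.HiveContext

// Create a HiveContext on top of the existing SparkContext
val hiveContext = new HiveContext(sc)
import hiveContext.implicits._  // enables .toDF on RDDs of case classes

// DataFrames created through this context can be written as ORC
// (prodcsvDF stands for a DataFrame built via hiveContext, as in the steps above)
prodcsvDF.write.orc("/user/yogeshdewan83/scenario81out")
```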