Spark SQL error

Hi, I am trying to execute a Spark SQL statement with HiveContext.
I need to run it in batch mode, not in interactive mode.

Here is my code:
from pyspark import SparkConf, SparkContext, SQLContext
from pyspark.sql import *
from pyspark.sql import HiveContext
conf = SparkConf().setAppName("sparksqlexample")
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)
sql1 = """CREATE TABLE SPARK_SQL_TBL(id int, name string)"""
sqlContext.sql(sql1)

Command to execute:
$SPARK_HOME/bin/spark-submit --master local /root/spark-pgms/sample.py

I am getting the error below:
File "/root/spark-pgms/sample.py", line 8, in <module>
    sqlContext.sql(sql1)
File "/root/spark/python/lib/pyspark.zip/pyspark/sql/context.py", line 516, in sql
File "/root/spark/python/lib/pyspark.zip/pyspark/sql/context.py", line 619, in _ssql_ctx
Exception: ("You must build Spark with Hive. Export 'SPARK_HIVE=true' and run build/sbt assembly", Py4JJavaError(u'An error occurred while calling None.org.apache.spark.sql.hive.HiveContext.\n', JavaObject id=o18))

Where should I set SPARK_HIVE=true, and what is sbt?
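
Based on the error text, my guess is that it wants me to run something like the following from inside a Spark source checkout (I am not sure this is right, since I am not sure my installation even includes the build scripts):

export SPARK_HIVE=true
build/sbt assembly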

Please let me know how to resolve this error.