Spark SQL error

Hi, I am trying to execute Spark SQL with HiveContext.
I need to execute it in batch mode, not in interactive mode.

Here is my code:
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

conf = SparkConf().setAppName("SparkSQLBatch")
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)  # requires a Spark build with Hive support
sql1 = """CREATE TABLE SPARK_SQL_TBL(id int, name string)"""
sqlContext.sql(sql1)

Command to execute:
$SPARK_HOME/bin/spark-submit --master local /root/spark-pgms/

I am getting the below error:
File "/root/spark-pgms/", line 8, in
File "/root/spark/python/lib/", line 516, in sql
File "/root/spark/python/lib/", line 619, in _ssql_ctx
Exception: ("You must build Spark with Hive. Export 'SPARK_HIVE=true' and run build/sbt assembly", Py4JJavaError(u'An error occurred while calling\n', JavaObject id=o18))
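For reference, the commands the error message itself points to would look roughly like this. This is a sketch only: it assumes you are in the root of a Spark source checkout, and the exact build invocation can vary between Spark versions.

```shell
# Run from the Spark source directory.
# Enable Hive support for the build, as the exception suggests:
export SPARK_HIVE=true
# sbt is Spark's Scala build tool; this rebuilds the Spark assembly jar:
build/sbt assembly
```

If you downloaded a prebuilt Spark binary instead of building from source, choosing a package that already includes Hive support avoids this step.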

Where should I set SPARK_HIVE=true, and what is sbt?

Please let me know how to resolve this error.