Not able to use HiveContext in spark-shell


Hello Durga sir,

I am trying to follow the video below, which introduces Spark and runs queries through the Spark SQL context. After creating a symbolic link to hive-site.xml in the Spark conf directory, the query fails when I run it in spark-shell, and spark-shell no longer starts with a HiveContext. It gives the following logs:
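For reference, this is roughly how I created the link (paths below are assumptions based on a typical CDH-style layout and may differ on your machine):

```shell
# Link Hive's config into Spark's conf directory so spark-shell
# can locate the Hive metastore settings.
# Paths are assumptions for a CDH-style install; adjust as needed.
ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml
```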


os.arch=amd64
os.version=2.6.32-573.el6.x86_64
derby.system.home=null
Database Class Loader started - derby.database.classpath=''
17/06/29 01:13:22 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0
17/06/29 01:13:23 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
17/06/29 01:13:29 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
SQL context available as sqlContext.
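For completeness, this is the kind of thing I am trying to run once the shell is up (a sketch against the Spark 1.x API; the query is just an example):

```scala
// In spark-shell (Spark 1.x), sc is the pre-built SparkContext.
// Creating a HiveContext explicitly routes SQL through the Hive metastore.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// Example query to confirm the metastore connection works.
hiveContext.sql("SHOW DATABASES").collect().foreach(println)
```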

It will be tough to troubleshoot issues on your virtual machine.

Most likely some of the services are down, especially MySQL or Derby, where the Hive metastore is stored.
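One way to check is to verify on the VM that the metastore backend is actually running (service names below are assumptions for a CDH-style setup and may differ on your distribution):

```shell
# Check whether MySQL, a common Hive metastore backend, is running.
# Service names assume a CDH-style VM; adjust for your environment.
sudo service mysqld status

# Check the Hive metastore service itself.
sudo service hive-metastore status
```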