Unable to use existing Hive in Spark

Hi, I am trying to access Hive tables from Spark code. I have installed Hive 0.14 and put hive-site.xml in Spark's conf folder. I am using the simple code below:

val hqlContext = new org.apache.spark.sql.hive.HiveContext(spark)
hqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")

but from the console log it seems that Spark is not recognizing the underlying Hive metastore and is instead falling back to the default Derby metastore. Below is my console log:

17/03/04 04:21:09 INFO ObjectStore: ObjectStore, initialize called
17/03/04 04:21:10 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
17/03/04 04:21:10 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
17/03/04 04:21:16 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
17/03/04 04:21:19 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/03/04 04:21:19 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/03/04 04:21:24 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/03/04 04:21:24 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/03/04 04:21:25 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
17/03/04 04:21:25 INFO ObjectStore: Initialized ObjectStore
17/03/04 04:21:25 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/03/04 04:21:26 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/03/04 04:21:26 INFO HiveMetaStore: Added admin role in metastore
17/03/04 04:21:26 INFO HiveMetaStore: Added public role in metastore
17/03/04 04:21:27 INFO HiveMetaStore: No user is added in admin role, since config is empty
17/03/04 04:21:27 INFO HiveMetaStore: 0: get_all_databases
17/03/04 04:21:27 INFO audit: ugi=cloudera ip=unknown-ip-addr cmd=get_all_databases
17/03/04 04:21:27 INFO HiveMetaStore: 0: get_functions: db=default pat=*
17/03/04 04:21:27 INFO audit: ugi=cloudera ip=unknown-ip-addr cmd=get_functions: db=default pat=*
17/03/04 04:21:27 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at org.scala.spark.WordCount$delayedInit$body.apply(WordCount.scala:22)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at org.scala.spark.WordCount$.main(WordCount.scala:7)
at org.scala.spark.WordCount.main(WordCount.scala)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
... 20 more
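
For reference, here is a stripped-down sketch of the relevant part of my WordCount.scala (simplified, and assuming "spark" in the snippet above is the SparkContext):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object WordCount extends App {
  // hive-site.xml sits in the Spark conf folder; the HiveContext is
  // built directly from the SparkContext (Spark 1.x style)
  val spark = new SparkContext(new SparkConf().setAppName("WordCount"))
  val hqlContext = new HiveContext(spark)
  hqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
}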

You need to check the permissions of /tmp/hive.
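
For example, a quick sketch with the Hadoop FileSystem API (the /tmp/hive path and the 777 bits are just what the error message asks for; adapt as needed):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.fs.permission.FsPermission

val conf = new Configuration()
val fs = FileSystem.get(conf)       // resolves whatever fs.defaultFS points at
val scratch = new Path("/tmp/hive") // the dir named in the error

println(s"Filesystem: ${fs.getUri}")
if (fs.exists(scratch)) {
  // the error shows rwx------; Hive wants this dir writable (777 is the usual fix)
  println(s"Permissions: ${fs.getFileStatus(scratch).getPermission}")
  fs.setPermission(scratch, new FsPermission("777"))
} else {
  FileSystem.mkdirs(fs, scratch, new FsPermission("777"))
}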

Hi Sir!!!

Yes, I thought the same, but I have already given this directory 777 permissions and I am still facing the issue. Below is a screenshot of the same.

Thanks,
Kapil(9582546615)

In this case it is referring to /tmp/hive on the local filesystem, I guess. Hive resolves the scratch dir against fs.defaultFS, so if Spark is running with a local default filesystem, the 777 you set on the HDFS /tmp/hive does not apply and you need to chmod the local directory as well.
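
A quick way to confirm is to print the default filesystem and, if it is local, widen the local directory (a sketch, using the same /tmp/hive path from the error):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.fs.permission.FsPermission

val conf = new Configuration()
// if this prints file:///, the permission check is hitting the local
// /tmp/hive, and the 777 set on HDFS does not apply
println(FileSystem.get(conf).getUri)

val localFs = FileSystem.getLocal(conf)
val scratch = new Path("/tmp/hive")
if (localFs.exists(scratch))
  localFs.setPermission(scratch, new FsPermission("777"))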