Not able to see Hive databases in Spark using HiveContext


#1

Hello,
I have used
val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc.sql("show databases").show

This gives:
+-------+
| result|
+-------+
|default|
+-------+

When I launch the hive> shell, I can see that there are 4 databases in Hive.
Can anyone please tell me what mistake I have made?




#2

I have solved this problem myself. We need to copy hive-site.xml into /usr/lib/spark/conf, or create a soft link to it.
To create a soft link:
sudo ln -s /usr/lib/hive/conf/hive-site.xml /usr/lib/spark/conf/hive-site.xml
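
After creating the link, restart spark-shell and re-run the original query; it should now list all the Hive databases. A minimal check, assuming a Spark 1.6.x spark-shell where sc is already defined and hive-site.xml has been linked as above:

// Run inside spark-shell after linking hive-site.xml into Spark's conf directory
val hc = new org.apache.spark.sql.hive.HiveContext(sc)
// With hive-site.xml picked up, this should list all Hive databases, not just "default"
hc.sql("show databases").show()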


#3

Great to know, and thank you for providing the solution. Which VM image are you using?


With our labs, one does not have to troubleshoot these issues, as Spark, HDFS, Hive, and other technologies are well integrated.



#4

I am using Cloudera Quickstart.