Symbolic link issue in Hive conf

Hi,

I tried to symlink hive-site.xml from the Hive conf directory into the Spark conf directory, and got the error below:

[rajeshwaranb@gw01 conf]$ ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml
ln: failed to create symbolic link ‘/etc/spark/conf/hive-site.xml’: File exists

It looks like the file already exists in the Spark conf folder, but it is not linked to the Hive conf.

/etc/spark/conf
-rw-r--r-- 1 spark spark 733 May 27 14:16 hive-site.xml
-rw-r--r-- 1 spark spark 621 Jun 9 08:29 log4j.properties

How do I link this XML file? I need to run HiveContext. Please advise.

If the file already exists in the Spark conf folder, then there is no need to create the soft link. You need either a copy of the file or a soft link. The advantage of a soft link is that any change to the original file in the Hadoop conf directory is automatically reflected in the Spark conf folder.
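If you do want the link rather than the existing copy, a minimal sketch (assuming you have sudo rights on the gateway node, which may not be the case on a shared cluster):

# move the existing copy aside, then create the link
sudo mv /etc/spark/conf/hive-site.xml /etc/spark/conf/hive-site.xml.bak
sudo ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml

ln -sf would overwrite the existing file in one step, but keeping a backup is safer on a shared gateway.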

Thanks, but I’m not able to run any SQL/Hive context queries. When I run the command below, I get this error:

scala> sqlContext.sql("select * from departments");

org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table departments. org.apache.hadoop.security.AccessControlException: Permission denied: user=rajeshwaranb, access=EXECUTE, inode="/user/vishaljoneja/sqoop_import/departments":vishaljoneja:hdfs:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenode…

@Rajarajeshwaran - Please use your own database to fetch the data.

If the departments table in the default database was created by another user, then you won’t have access to fetch it.
e.g.
scala> sqlContext.sql("select * from gnanaprakasam.departments");

Here gnanaprakasam is the database name.
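You can also see why the original query failed straight from the stack trace: the backing HDFS directory is owned by vishaljoneja with mode drwx------, so no other user can traverse it. A generic check (the listing may itself be denied if the parent directories are locked down too):

hdfs dfs -ls /user/vishaljoneja/sqoop_import

Importing the data into your own HDFS home directory with Sqoop, or querying a table in your own database as above, avoids the permission problem.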

Thank you. It’s working now.