Permission denied issue with SQL context

Hi team,

I am facing the issue below while trying to access a table through sqlContext:

sqlContext.sql("select * from departments").collect().foreach(println)

Apparently, the above code is trying to access the HDFS location "/user/vishaljoneja/sqoop_import/departments", and I don't understand why.

scala> sqlContext.sql("select * from departments").collect().foreach(println)
17/01/17 17:01:58 INFO ParseDriver: Parsing command: select * from departments
17/01/17 17:01:59 INFO ParseDriver: Parse Completed
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table departments. org.apache.hadoop.security.AccessControlException: Permission denied: user=shubhaprasadsamal, access=EXECUTE, inode="/user/vishaljoneja/sqoop_import/departments":vishaljoneja:hdfs:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)

Please suggest. Any suggestion would be highly appreciated, as one of my friends is also facing the same issue.

Thanks,
Shitansu

The stack trace tells you what is wrong: the Hive metastore entry for the table departments points at the HDFS location /user/vishaljoneja/sqoop_import/departments, which is owned by user vishaljoneja with permissions drwx------. Your user (shubhaprasadsamal) therefore has no EXECUTE (traverse) permission on that directory, and HDFS rejects the read. Either ask vishaljoneja to open up the directory permissions, or create your own database and Hive tables over data you own, and query those with Spark SQL:

sqlContext.sql("select * from database_name.departments").collect().foreach(println)
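As a sketch of that second option, assuming sqlContext is a HiveContext and using hypothetical names (shitansu_db, a two-column departments schema, and an HDFS path owned by your user; adjust all of these to your setup):

```scala
// Sketch only: database name, column schema, and HDFS path below are
// placeholders, not taken from the thread. All statements are standard
// HiveQL run through the HiveContext.

// Create a database you own, and a table inside it.
sqlContext.sql("CREATE DATABASE IF NOT EXISTS shitansu_db")
sqlContext.sql(
  """CREATE TABLE IF NOT EXISTS shitansu_db.departments (
    |  department_id INT,
    |  department_name STRING
    |) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','""".stripMargin)

// Load data from a directory your own user wrote (e.g. your own sqoop
// import target), so no cross-user HDFS permissions are involved.
sqlContext.sql(
  "LOAD DATA INPATH '/user/shubhaprasadsamal/sqoop_import/departments' " +
  "OVERWRITE INTO TABLE shitansu_db.departments")

// Now the fully-qualified query works against data you can read.
sqlContext.sql("select * from shitansu_db.departments").collect().foreach(println)
```

Because the table now lives under your own HDFS home directory, the NameNode permission check in the stack trace above no longer applies.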