Hive Access Error

I’m not able to access any Hive table in HiveContext. When I try to access Hive, I get a “Permission denied” error:

joinAggData = sqlContext.sql("select o.order_date, round(sum(oi.order_item_subtotal), 2), count(distinct o.order_id) from orders o join order_items oi
... on o.order_id = oi.order_item_order_id group by o.order_date order by o.order_date")
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/sql/context.py", line 580, in sql
return DataFrame(self._ssql_ctx.sql(sqlQuery), self)
File "/usr/hdp/2.5.0.0-1245/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in __call__
File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/sql/utils.py", line 45, in deco
return f(*a, **kw)
File "/usr/hdp/2.5.0.0-1245/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o38.sql.
: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table orders. org.apache.hadoop.security.AccessControlException: Permission denied: user=saraswatpankaj, access=EXECUTE, inode="/user/pradhanamit77/retail_db/orders":pradhanamit77:hdfs:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)

@srpankaj12 - You have to prefix the table name with the database name. Right now the query is referring to the default database, and that table in the default database would have been created by another user.

e.g.
gnanaprakasam.orders

Or run sqlContext.sql("use DATABASE_NAME") and then run the query.
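For example, here is a minimal sketch of both approaches in the pyspark shell, assuming your own database is named your_db (a hypothetical name) and actually contains the orders and order_items tables:

# Option 1: qualify the table names with the database name
joinAggData = sqlContext.sql(
    "select o.order_date, round(sum(oi.order_item_subtotal), 2), "
    "count(distinct o.order_id) "
    "from your_db.orders o join your_db.order_items oi "
    "on o.order_id = oi.order_item_order_id "
    "group by o.order_date order by o.order_date")

# Option 2: switch the current database once, then run the original query unchanged
sqlContext.sql("use your_db")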


I tried prefixing the table name with the database name and a dot, like default.orders, and I’m hitting the same error again.

I also tried accessing Hive through the CLI, but no luck there either:
hive (default)>
> describe default.products;
FAILED: SemanticException Unable to fetch table products. org.apache.hadoop.security.AccessControlException: Permission denied: user=saraswatpankaj, access=EXECUTE, inode="/user/parulshine92/retail_db/products":parulshine92:hdfs:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)

@srpankaj12 - You have to use your own database. The table you are referring to was created by another user in the default database.
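For example, here is a minimal sketch of setting up your own database over your own copy of the data, assuming you have the retail_db files in your HDFS home directory under /user/YOUR_USER/retail_db (a hypothetical path), that the files are comma-delimited text, and that orders has the usual retail_db column layout:

# your_db and /user/YOUR_USER are hypothetical names; use your own database and HDFS path
sqlContext.sql("create database if not exists your_db")
sqlContext.sql("use your_db")
sqlContext.sql(
    "create external table if not exists orders ("
    "order_id int, order_date string, order_customer_id int, order_status string) "
    "row format delimited fields terminated by ',' "
    "location '/user/YOUR_USER/retail_db/orders'")
sqlContext.sql("describe orders").show()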