Hive: creating external table: Permission denied

Hi Guys,

While creating a partition on an external table, I am getting the error below. Please help.
I think there is some authorization error: I am logged in as the user mohangowda, but the error points to a different user name.
Table created:
create external table logdata(col1 string, col2 string) partitioned by (month string) location '/user/mohangowda/';

Add partition command:
ALTER TABLE logdata ADD PARTITION (month='jan2016') location '/user/mohangowda/jan2016';
Error I am getting:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=gnanaprakasam (don't know where this user name is coming from), access=EXECUTE, inode="/user/mohangowda/jan2016":mohangowda:hdfs:drwx------
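The permission string at the end of the error is the key: `drwx------` means only the owner (mohangowda) can traverse the directory, so any other user the Hive service runs queries as gets `access=EXECUTE` denied. A minimal sketch of checking and widening the permissions, assuming you own the directory (paths are the ones from this thread):

```shell
# Show the permissions on the partition directory itself
# (-d lists the directory entry, not its contents)
hadoop fs -ls -d /user/mohangowda/jan2016

# Grant read+execute to group and others (755 = rwxr-xr-x) so that
# other users can traverse into the directory
hadoop fs -chmod -R 755 /user/mohangowda/jan2016
```

These commands need access to the HDFS cluster the error refers to; they are shown here only as a sketch of the fix.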

Thanks

@mohanp01 - What user/group access does this show?
hadoop fs -ls /user/mohangowda/jan2016

@itversity @Vinay - Could you please look into this.

For the ls command it shows the user as mohangowda and the group as hdfs.
And nice to see a reply from the user gnanaprakasam himself.

Oh, and now the user name in my error has changed to sekarelumalai:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=sekarelumalai, access=EXECUTE, inode="/user/mohangowda/jan2016":mohangowda:hdfs:drwx------

It seems to be a bug that specifying the location for a partition on an external table fails when the permissions are restricted. I am able to create a managed table with partitions.
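For comparison, here is a sketch of the managed-table route that works: without a LOCATION clause, Hive creates the partition directory itself under its own warehouse path, so no extra HDFS permissions are involved (the table name `logdata_managed` is illustrative, not from the thread):

```shell
hive -e "
CREATE TABLE logdata_managed (col1 STRING, col2 STRING)
PARTITIONED BY (month STRING);

-- No LOCATION clause: Hive creates the partition directory under
-- its warehouse path, where the service user already has access.
ALTER TABLE logdata_managed ADD PARTITION (month='jan2016');
"
```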

@itversity

I am still facing issues with adding a partition to an already created table. Can you please help with this one?
Alter command:

alter table logdata add partition(month='jan2016') location '/user/mohangowda/jan2016';

Error I am getting:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=kondareddyb(again a different user name), access=EXECUTE, inode="/user/mohangowda/jan2016":mohangowda:hdfs:drwx------

Hello @mohanp.sit, is your problem resolved?
I am also facing the same issue, and every time I run this ALTER query I get a different username.

Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=pratyush04, access=EXECUTE, inode="/user/abinashparida/avro/order_month=2014-01":abinashparida:hdfs:drwx------

@itversity: Sir, can you please look into this.

Thank you.

It seems that it is not possible to alter an external table when the permissions are restricted. It is better to use a managed table.

You can check whether a bug has already been reported, or you can raise a JIRA ticket with the Apache Hive project.

Hello Sir, I am performing this query on a managed table only, based on your tutorial video for partitioning in Hive.

Can you share the complete script?

Hello Sir,
please find below:

hive> CREATE TABLE ORDERS(
order_id int,
order_date bigint,
order_customer_id int,
order_status string
)
PARTITIONED BY (order_month string)
STORED AS AVRO
LOCATION '/user/abinashparida/avro'
TBLPROPERTIES('avro.schema.url' = '/user/abinashparida/avro_schema/orders.avsc')
;
OK
Time taken: 0.442 seconds
hive> describe orders;
OK
order_id int
order_date bigint
order_customer_id int
order_status string
order_month string

Partition Information

col_name data_type comment

order_month string
Time taken: 0.064 seconds, Fetched: 10 row(s)

hive> alter table orders add partition (order_month = '2014-01');
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=sekarelumalai, access=EXECUTE, inode="/user/abinashparida/avro/order_month=2014-01":abinashparida:hdfs:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)

When you run Hive queries, they run as the hive user. The hive user does not have complete access to the location you specified, in this case /user/abinashparida/avro.

Just remove the LOCATION clause and it should work fine.
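Applying that advice, a sketch of the same ORDERS script with the LOCATION clause dropped (the `avro.schema.url` path is kept from the original script); Hive will then place the table's data under its own warehouse directory, where it has access:

```shell
hive -e "
CREATE TABLE ORDERS(
  order_id INT,
  order_date BIGINT,
  order_customer_id INT,
  order_status STRING
)
PARTITIONED BY (order_month STRING)
STORED AS AVRO
-- No LOCATION clause: data goes under the Hive warehouse directory
TBLPROPERTIES('avro.schema.url' = '/user/abinashparida/avro_schema/orders.avsc');

ALTER TABLE ORDERS ADD PARTITION (order_month='2014-01');
"
```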


Sir,
What is the Hive default directory in the big data labs? When I do hadoop fs -ls /user/hive/warehouse, it says no such directory found.
Also, where can I locate hive-site.xml?

/apps/hive/warehouse is the Hive default directory in Hortonworks distributions.
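A quick way to verify both answers on an HDP node (assuming the usual Hortonworks layout, where the client config lives under /etc/hive/conf):

```shell
# Confirm the warehouse location on a Hortonworks (HDP) cluster
hadoop fs -ls /apps/hive/warehouse

# The configured value can also be read from hive-site.xml,
# typically found under /etc/hive/conf on HDP
grep -A 1 'hive.metastore.warehouse.dir' /etc/hive/conf/hive-site.xml
```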

Got it, thank you Sir.