Sqoop hive import not working

Hi,
I’m running `sqoop import` with the `--hive-import` flag. For one table, order_items, it works, but for other tables like products, departments, and orders it fails.
I’m not sure why it is checking another user’s space.

Command:
sqoop import --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity --table orders --hive-import

Exception
FAILED: IllegalStateException Unexpected Exception thrown: Unable to fetch table orders. org.apache.hadoop.security.AccessControlException: Permission denied: user=yogesh1505, access=EXECUTE, inode="/user/amanpreetbhatia/retail_db/orders":amanpreetbhatia:hdfs:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)
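The trace shows the denied inode is owned by another user (amanpreetbhatia) with `drwx------` permissions, i.e. no access for anyone else. A quick way to confirm this, as a sketch (run as your own user on the cluster; the path is taken from the trace above):

```shell
# Inspect ownership and permissions on the path the NameNode rejected.
# Expect an AccessControlException or a drwx------ listing owned by amanpreetbhatia.
hdfs dfs -ls /user/amanpreetbhatia/retail_db
```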

By default, it will try to import all the tables into the Hive `default` database.

I would recommend using `--create-hive-table` and `--hive-database <Your_Database_Name>`.
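As a sketch, the original command with those two flags added would look like the following. The connection string and credentials are copied from the command above; `yogesh1505_db` is a hypothetical database name you would need to create in Hive first:

```shell
# Import into a personal Hive database instead of 'default'.
# --create-hive-table makes the job fail fast if the table already exists.
sqoop import \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username retail_dba \
  --password itversity \
  --table orders \
  --hive-import \
  --create-hive-table \
  --hive-database yogesh1505_db
```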

Agreed, it imports into the default Hive DB. But why does one table, order_items, work while the others do not?

@yogeshnchaudhari - Probably order_items did NOT already exist in the default database, so it was created during your sqoop import. The other tables were likely created by other users in the default database, with their HDFS locations under those users’ home directories.


Thanks for the reply.

But as I am not using the `--create-hive-table` parameter, shouldn’t it overwrite the existing table?

Do not use the default database. Create your own database and pass it via `--hive-database` for Sqoop Hive imports.
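A minimal sketch of that setup, assuming you have Hive CLI access; `yogesh1505_db` is a hypothetical database name, substitute your own:

```shell
# One-time setup: create a personal Hive database so imports never
# collide with tables other users created in 'default'.
hive -e "CREATE DATABASE IF NOT EXISTS yogesh1505_db"

# Verify it exists before pointing sqoop at it with --hive-database.
hive -e "SHOW DATABASES LIKE 'yogesh1505_db'"
```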
