Error while executing sqoop import-all-tables to import tables to Hive



Hi, I am trying to execute sqoop import-all-tables on big data Labs with the following Arguments -

sqoop import-all-tables \
  --num-mappers 1 \
  --connect "jdbc:mysql://" \
  --username=retail_dba \
  --password=itversity \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --compress \
  --compression-codec \
  --outdir java_file

I am getting the below error -

Logging initialized using configuration in jar:file:/usr/hdp/!/
FAILED: IllegalStateException Unexpected Exception thrown: Unable to fetch table categories. Permission denied: user=nandanasgn, access=EXECUTE, inode="/user/narendrareddy/catego
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(
    at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$
    at org.apache.hadoop.ipc.RPC$
    at org.apache.hadoop.ipc.Server$Handler$
    at org.apache.hadoop.ipc.Server$Handler$
    at Method)
    at
    at
    at org.apache.hadoop.ipc.Server$

I understand that import-all-tables can be executed only against the default Hive database, which is why I am not specifying --hive-home or --hive-database.
I am not able to figure out what I am missing in the command.

Please correct my understanding and help me with this exception.
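The stack trace shows an HDFS permission check failing while Hive fetches table metadata: on a shared lab cluster the default Hive database contains every user's tables, so fetching `categories` can hit a directory owned by another user. A rough way to confirm this from the shell (the warehouse path below is the usual HDP default and is an assumption, not taken from the lab):

```shell
# Hedged sketch: paths are illustrative assumptions for an HDP-style cluster.

# List the shared warehouse directory to see who owns the table locations
# in the default database (assumed path: /apps/hive/warehouse):
hdfs dfs -ls /apps/hive/warehouse

# The traverse check in the stack trace needs EXECUTE on each parent
# directory; inspect the permissions on the other user's home directory:
hdfs dfs -ls -d /user/narendrareddy
```

If the listing shows the conflicting table location owned by another user with no world-execute bit, importing into your own Hive database (rather than the shared default) avoids the collision.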


You should be able to use the --hive-import and --hive-database options as specified here to import all tables into Hive.

However, I just tried the following command, and it only imported the first table:

sqoop import-all-tables --connect jdbc:mysql:// --username retail_dba --password itversity \
  --m 1 --hive-database dbv --hive-import --hive-overwrite --create-hive-table \
  --as-textfile --compression-codec=snappy --outdir java_out


Hi Vinod,

Thanks for the information on the --hive-database option. I tried the command below and was able to import all the tables from the MySQL database into the Hive database.

sqoop import-all-tables --connect "jdbc:mysql://" --username retail_dba --password itversity -m 1 --hive-database zzzzz_sqoop_import_all_nn --hive-import --hive-overwrite --create-hive-table --as-textfile --compression-codec=snappy --outdir java_out
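To compare the source and target table lists after the import, something along these lines works; the MySQL database name `retail_db` is an assumption (only the username `retail_dba` appears above), and the Hive database name is the one from the command:

```shell
# Hedged sketch: the MySQL database name is assumed, not confirmed above.

# Source side: list the tables sqoop should have picked up:
mysql -u retail_dba -p -e 'SHOW TABLES IN retail_db;'

# Target side: confirm every table landed in the new Hive database:
hive -e 'SHOW TABLES IN zzzzz_sqoop_import_all_nn;'
```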

Verification on the MySQL console -

Verification on the Hive console -

Thank you once again for resolving this one.