Error while executing sqoop import-all-tables to import tables to Hive

hive
sqoop

#1

Hi, I am trying to execute sqoop import-all-tables on the Big Data Labs with the following arguments:

sqoop import-all-tables \
  --num-mappers 1 \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username=retail_dba \
  --password=itversity \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --outdir java_file

I am getting the error below:

Logging initialized using configuration in jar:file:/usr/hdp/2.5.0.0-1245/hive/lib/hive-common-1.2.1000.2.5.0.0-1245.jar!/hive-log4j.properties
FAILED: IllegalStateException Unxpected Exception thrown: Unable to fetch table categories. org.apache.hadoop.security.AccessControlException: Permission denied: user=nandanasgn, access=EXECUTE, inode="/user/narendrareddy/categories":narendrareddy:hdfs:drwx------
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
    at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)
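
For reference, the inode in the message is owner-only (drwx------, owned by narendrareddy), so my user cannot even stat that path. This is reproducible with a plain HDFS command (a sketch, assuming the path from the error above):

# -d lists the directory entry itself rather than its contents; from my account this
# fails with the same AccessControlException, since drwx------ grants access to the owner only
hdfs dfs -ls -d /user/narendrareddy/categories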

I understand that import-all-tables can only be executed against the default Hive db, which is why I am not specifying --hive-home or --hive-database.
I am not able to figure out what I am missing in the command.

Please correct my understanding and help me with this exception.


#2

You should be able to use the --hive-import and --hive-database options, as described here, to import all tables into Hive:

https://issues.apache.org/jira/browse/SQOOP-912
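
Note that the target Hive database generally has to exist before running the import; as far as I know, Sqoop will not create it for you. A minimal sketch, using the dbv database name from the command below:

# create the target database once, up front
hive -e "CREATE DATABASE IF NOT EXISTS dbv;"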

However, I just tried the following command, and it only imported the first table:

sqoop import-all-tables --connect jdbc:mysql://nn01.itversity.com:3306/retail_db --username retail_dba --password itversity -m 1 --hive-database dbv --hive-import --hive-overwrite --create-hive-table --as-textfile --compression-codec=snappy --outdir java_out


#3

Hi Vinod,

Thanks for the information on the --hive-database option. I tried the command below and was able to import all the tables from the MySQL db into a Hive db.

sqoop import-all-tables --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity -m 1 --hive-database zzzzz_sqoop_import_all_nn --hive-import --hive-overwrite --create-hive-table --as-textfile --compression-codec=snappy --outdir java_out

Verification on the MySQL console: [screenshot]

Verification on the Hive console: [screenshot]
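
The same checks can be run from the command line, roughly (host and credentials taken from the connect string above):

# list the source tables in MySQL (prompts for the password)
mysql -h nn01.itversity.com -u retail_dba -p -e "USE retail_db; SHOW TABLES;"
# list the imported tables in the Hive database
hive -e "USE zzzzz_sqoop_import_all_nn; SHOW TABLES;"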

Thank you once again for resolving this one.


#4