Unable to run Sqoop

I am unable to run a Sqoop command in Big Data Labs.

Here is the command:

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table categories \
  --where "category_id between 1 and 22" \
  --hive-import \
  -m 1

Here is the error from the console:

17/01/29 22:01:58 INFO mapreduce.ImportJobBase: Transferred 403 bytes in 20.0793 seconds (20.0704 bytes/sec)
17/01/29 22:01:58 INFO mapreduce.ImportJobBase: Retrieved 22 records.
17/01/29 22:01:58 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners
17/01/29 22:01:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM categories AS t LIMIT 1
17/01/29 22:01:58 INFO hive.HiveImport: Loading uploaded data into Hive
Logging initialized using configuration in jar:file:/usr/hdp/2.5.0.0-1245/hive/lib/hive-common-1.2.1000.2.5.0.0-1245.jar!/hive-log4j.properties
FAILED: IllegalStateException Unxpected Exception thrown: Unable to fetch table categories. org.apache.hadoop.security.AccessControlException: Permission denied: user=tarunkumard, access=EXECUTE, inode="/user/subramanyamsibbala/categories":subramanyamsibbala:hdfs:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)

@Tarun_Das Either you need
a) --create-hive-table if your Hive database does not have a categories table,
or
b) --hive-overwrite if the categories table already exists in the Hive "default" database,
or you have to supply the name of the Hive database that has the categories table by using --hive-database.
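For example, a corrected import targeting a specific Hive database might look like the sketch below (using --hive-database default is an assumption; substitute whichever database holds, or should hold, your categories table):

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table categories \
  --where "category_id between 1 and 22" \
  --hive-import \
  --create-hive-table \
  --hive-database default \
  -m 1

Use --create-hive-table or --hive-overwrite, not both, depending on whether the table already exists.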

Where can I see the Hive managed table that was created?
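For reference, a Hive managed table normally lives under the Hive warehouse directory in HDFS; a quick way to check is below (the default /user/hive/warehouse location and the default database are assumptions here; your cluster may configure them differently):

hdfs dfs -ls /user/hive/warehouse/categories

Alternatively, running DESCRIBE FORMATTED categories; in the Hive shell prints the table's Location.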

There is a contradiction between your requirement and the given command.

If you are trying to execute this Sqoop import in Big Data Labs, it will obviously fail because the command points to the Cloudera QuickStart VM (quickstart:3306). Please change the host to nn01.itversity.com.
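For example, only the host in the --connect value needs to change (that MySQL listens on port 3306 on that host is an assumption; check the labs documentation for the actual database host and port):

--connect jdbc:mysql://nn01.itversity.com:3306/retail_db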


That helps, thank you. That might be the issue.