Where can I see Hive managed tables created in Big Data Labs?

In the Cloudera VM we see Hive managed tables created at /user/hive/warehouse/. My question is: where are Hive managed tables created in Big Data Labs?

@Tarun_Das, I believe it is the /apps/hive/warehouse directory.

I am not able to see this:

[tarunkumard@gw01 ~]$ cd /apps/hive/warehouse
-bash: cd: /apps/hive/warehouse: No such file or directory

That is an HDFS path, not a local one, so cd will not find it. Do a:
hadoop fs -ls /apps/hive/warehouse
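
Managed tables for a non-default database live under a <dbname>.db subdirectory there (standard Hive warehouse layout); for example, assuming a database named after your lab user:

hadoop fs -ls /apps/hive/warehouse/tarunkumard.db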

Thank you, I was able to figure that out anyway, but my actual issue is that when I run the sqoop command I get "table already exists":

sqoop import --connect jdbc:mysql://nn01.itversity.com/retail_db --username=retail_dba --password=itversity --table=categories --where "category_id between 1 and 22" --hive-import --m 1

so I planned to delete the categories table using:

hdfs dfs -rm -R hdfs://nn01.itversity.com:8020/user/tarunkumard/categories
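
Note: this removes sqoop's staging output on HDFS, not the table definition in the Hive metastore. A quick check that the directory is gone before re-running, using the same path as above:

hdfs dfs -ls /user/tarunkumard/categories
# expected: ls: `/user/tarunkumard/categories': No such file or directory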

I ran the above sqoop command again; the MapReduce job ran successfully, but at the end I get the message below:

Logging initialized using configuration in jar:file:/usr/hdp/2.5.0.0-1245/hive/lib/hive-common-1.2.1000.2.5.0.0-1245.jar!/hive-log4j.properties
FAILED: IllegalStateException Unxpected Exception thrown: Unable to fetch table categories. org.apache.hadoop.security.AccessControlException: Permission denied: user=tarunkumard, access=EXECUTE, inode="/user/subramanyamsibbala/categories":subramanyamsibbala:hdfs:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)

Hi,
What exactly are you trying to do?
Importing the table to HDFS, or to Hive?
If to HDFS, then you are missing --target-dir (the target directory).

If you are importing to Hive, then is it an existing table, or do you want to create a new table?
For an existing table you are missing (see the example after this list):
--hive-table (table name in Hive)
--hive-database (database name in Hive)

For a new table you are missing:
--create-hive-table
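
As an example, a minimal sketch of an import into an existing Hive table, reusing the connection details from this thread (the database name tarunkumard is an assumption; substitute your own):

sqoop import \
  --connect jdbc:mysql://nn01.itversity.com/retail_db \
  --username retail_dba \
  --password itversity \
  --table categories \
  --where "category_id between 1 and 22" \
  --hive-import \
  --hive-database tarunkumard \
  --hive-table categories \
  -m 1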

Thank you, yes, I am trying to push data to an existing Hive table.

I got the same permission issue; this time I used:

sqoop import --connect jdbc:mysql://nn01.itversity.com/retail_db --username=retail_dba --password=itversity --table=categories --where "category_id between 1 and 22" --hive-home /apps/hive/warehouse/categories_managed --hive-import --hive-table categories --m 1

--hive-home is not for giving the directory name.
I would recommend creating a database for yourself and loading into it.
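
For example, something like this from the gateway shell (the database name tarunkumard is an assumption; use your own lab username):

hive -e "CREATE DATABASE IF NOT EXISTS tarunkumard;"

and then pass --hive-database tarunkumard in the sqoop import, as in the earlier example.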

Please check this video, which covers it in detail.

Hello sir,

I created a Hive DB and set it in the CLI, then executed:

sqoop import-all-tables --connect jdbc:mysql://nn01.itversity.com/retail_db --username=retail_dba --password=itversity --compression-codec=snappy --as-parquetfile --warehouse-dir=/apps/hive/warehouse/tarunkumard.db --hive-import --m 1

but I keep getting "file not found":

Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /user/subramanyamsibbala/avsc/categories.avsc
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)

There are multiple issues here. I am not sure why you are getting "file does not exist"; I did not get that error.

Here are the issues:

  • --compression-codec cannot be just snappy; you have to give the fully qualified codec class name, e.g. org.apache.hadoop.io.compress.SnappyCodec
  • --hive-import with Parquet and Avro is not working (a sketch with both fixes applied follows this list)
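
A minimal sketch applying both fixes: the fully qualified codec class, and --as-parquetfile dropped since --hive-import with Parquet was reported not to work here. The database name tarunkumard is an assumption, and --compress is added on the assumption that the codec flag alone does not enable compression:

sqoop import-all-tables \
  --connect jdbc:mysql://nn01.itversity.com/retail_db \
  --username retail_dba \
  --password itversity \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --hive-import \
  --hive-database tarunkumard \
  -m 1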