Sqoop import of all tables - cannot see data in the destination

sqoop import-all-tables --connect "jdbc:mysql://nn01.itversity.com" --username retail_dba --P
--target-dir user/seesnehabits/retail_db

I tried this command in the Big Data Labs console after logging in to gw01 with my username and password. Although I did not get an error, I do not see the retail_db data in the destination folder. Do I need to make changes to the syntax of the command? Please suggest.

I hope you are checking the same path user/seesnehabits/retail_db and not /user/seesnehabits/retail_db.
I have observed that the two are treated as different paths in such scenarios…

Try running hadoop fs -ls user/seesnehabits/retail_db

I don't see data in either path. From my understanding, they both refer to the same path :slight_smile:

Umm… I think I've got it… you need to specify the source database:
jdbc:mysql://nn01.itversity.com:portNo/retail_db

Mention the portNo where MySQL is running.
If you are unaware of the port, try 3306 (the default).

Try running the command again after changing this.
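Following that suggestion, the connect string would look something like this — a sketch only, assuming MySQL is on the default port 3306 and the source database is named retail_db:

```shell
# Connect string with the port and source database added (3306 is an assumption)
sqoop import-all-tables \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username retail_dba \
  --P
```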

I think you must give /user/seesnehabits/retail_db instead of user/seesnehabits/retail_db in the --target-dir.

I think you will need the database name in the JDBC URL.

Also, not sure if it's a typo, but you are missing a "/" in

--target-dir user/seesnehabits/retail_db

Without the leading slash, it will create a directory "user/seesnehabits/retail_db" under your HDFS home directory, i.e. "/user/seesnehabits/user/seesnehabits/retail_db".
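To make the relative-path behaviour concrete: HDFS resolves a path without a leading slash against your home directory, /user/&lt;username&gt;. So these two commands list the same location:

```shell
# A relative path resolves against the HDFS home directory (/user/seesnehabits here),
# so both of these point at the doubled path mentioned above:
hadoop fs -ls user/seesnehabits/retail_db
hadoop fs -ls /user/seesnehabits/user/seesnehabits/retail_db
```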

I think for import-all-tables we must use --warehouse-dir instead of --target-dir. Please try.
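Putting the thread's fixes together, the command might look like the sketch below: database name in the JDBC URL, an absolute HDFS path, and --warehouse-dir (under which import-all-tables creates one sub-directory per table). The port 3306 is an assumption.

```shell
# Sketch of the combined fix: db name in the URL, absolute path, --warehouse-dir.
# Port 3306 is assumed (MySQL default).
sqoop import-all-tables \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username retail_dba \
  --P \
  --warehouse-dir /user/seesnehabits/retail_db
```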

Yes, I realized I did not provide the db name and the --warehouse-dir option. Thanks anyway.