Not all tables are imported into Hive when using sqoop import-all-tables

sqoop import-all-tables --m 1 --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity --hive-import --hive-overwrite --create-hive-table --hive-database aa_scoop_import --compress

Only the categories table is imported into the database aa_scoop_import.
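To confirm which tables the job should pick up, sqoop list-tables can be run against the same database first (a minimal sketch, assuming the same connection details as above; -P prompts for the password instead of passing it on the command line):

sqoop list-tables --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba -P

If the usual retail_db sample schema is in place, this should list six tables (categories, customers, departments, order_items, orders and products), so the import-all-tables run above ought to create six Hive tables rather than one.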

[abhishakeagarwal@gw01 abhi]$ sqoop import-all-tables --m 1 --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity --hive-import --hive-overwrite --create-hive-table --hive-database aa_scoop_import --compress
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/06/20 04:16:01 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
17/06/20 04:16:01 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/06/20 04:16:01 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/06/20 04:16:01 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/06/20 04:16:01 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/06/20 04:16:02 INFO tool.CodeGenTool: Beginning code generation
17/06/20 04:16:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM categories AS t LIMIT 1
17/06/20 04:16:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM categories AS t LIMIT 1
17/06/20 04:16:02 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-abhishakeagarwal/compile/d213a2d1ec60a1aa79c9cbb665908752/categories.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/06/20 04:16:03 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-abhishakeagarwal/compile/d213a2d1ec60a1aa79c9cbb665908752/categories.jar
17/06/20 04:16:03 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/06/20 04:16:03 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/06/20 04:16:03 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/06/20 04:16:03 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/06/20 04:16:04 INFO mapreduce.ImportJobBase: Beginning import of categories
17/06/20 04:16:05 INFO impl.TimelineClientImpl: Timeline service address: http://rm01.itversity.com:8188/ws/v1/timeline/
17/06/20 04:16:05 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
17/06/20 04:16:05 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
17/06/20 04:16:14 INFO db.DBInputFormat: Using read commited transaction isolation
17/06/20 04:16:14 INFO mapreduce.JobSubmitter: number of splits:1
17/06/20 04:16:15 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1495691438758_26093
17/06/20 04:16:15 INFO impl.YarnClientImpl: Submitted application application_1495691438758_26093
17/06/20 04:16:15 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:8088/proxy/application_1495691438758_26093/
17/06/20 04:16:15 INFO mapreduce.Job: Running job: job_1495691438758_26093
17/06/20 04:16:21 INFO mapreduce.Job: Job job_1495691438758_26093 running in uber mode : false
17/06/20 04:16:21 INFO mapreduce.Job: map 0% reduce 0%
17/06/20 04:16:26 INFO mapreduce.Job: map 100% reduce 0%
17/06/20 04:16:27 INFO mapreduce.Job: Job job_1495691438758_26093 completed successfully
17/06/20 04:16:28 INFO mapreduce.Job: Counters: 30
File System Counters
    FILE: Number of bytes read=0
    FILE: Number of bytes written=160783
    FILE: Number of read operations=0
    FILE: Number of large read operations=0
    FILE: Number of write operations=0
    HDFS: Number of bytes read=87
    HDFS: Number of bytes written=576
    HDFS: Number of read operations=4
    HDFS: Number of large read operations=0
    HDFS: Number of write operations=2
Job Counters
    Launched map tasks=1
    Other local map tasks=1
    Total time spent by all maps in occupied slots (ms)=6268
    Total time spent by all reduces in occupied slots (ms)=0
    Total time spent by all map tasks (ms)=3134
    Total vcore-milliseconds taken by all map tasks=3134
    Total megabyte-milliseconds taken by all map tasks=4813824
Map-Reduce Framework
    Map input records=58
    Map output records=58
    Input split bytes=87
    Spilled Records=0
    Failed Shuffles=0
    Merged Map outputs=0
    GC time elapsed (ms)=48
    CPU time spent (ms)=1170
    Physical memory (bytes) snapshot=241618944
    Virtual memory (bytes) snapshot=3272200192
    Total committed heap usage (bytes)=197656576
File Input Format Counters
    Bytes Read=0
File Output Format Counters
    Bytes Written=576
17/06/20 04:16:28 INFO mapreduce.ImportJobBase: Transferred 576 bytes in 23.0111 seconds (25.0314 bytes/sec)
17/06/20 04:16:28 INFO mapreduce.ImportJobBase: Retrieved 58 records.
17/06/20 04:16:28 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners
17/06/20 04:16:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM categories AS t LIMIT 1
17/06/20 04:16:28 INFO hive.HiveImport: Loading uploaded data into Hive

Logging initialized using configuration in jar:file:/usr/hdp/2.5.0.0-1245/hive/lib/hive-common-1.2.1000.2.5.0.0-1245.jar!/hive-log4j.properties
OK
Time taken: 1.565 seconds
Loading data to table aa_scoop_import.categories
Table aa_scoop_import.categories stats: [numFiles=1, numRows=0, totalSize=576, rawDataSize=0]
OK
Time taken: 0.493 seconds
Note: /tmp/sqoop-abhishakeagarwal/compile/d213a2d1ec60a1aa79c9cbb665908752/customers.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
[abhishakeagarwal@gw01 abhi]$ ls -l
total 44
-rw-r--r-- 1 abhishakeagarwal students 14122 Jun 20 04:16 categories.java
-rw-r--r-- 1 abhishakeagarwal students 27624 Jun 20 04:16 customers.java
[abhishakeagarwal@gw01 abhi]$
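The .java files above are only the ORM classes Sqoop generates per table; whether any data actually landed can be checked directly in the Hive warehouse directory (a sketch, assuming the default HDP warehouse path /apps/hive/warehouse, which may differ on your cluster):

hdfs dfs -ls /apps/hive/warehouse/aa_scoop_import.db

Seeing only a categories subdirectory there matches the Hive session below: the import stopped after the first table.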

hive (default)> use aa_scoop_import;
OK
Time taken: 0.021 seconds
hive (aa_scoop_import)> show tables;
OK
categories
Time taken: 0.028 seconds, Fetched: 1 row(s)
hive (aa_scoop_import)>

I think you don't need to give --create-hive-table explicitly; Sqoop will create all the tables by itself. Please remove that option and try again.
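For example, roughly the same command with that option dropped (a sketch, same connection details assumed; --hive-import already issues CREATE TABLE IF NOT EXISTS for each table, whereas --create-hive-table makes the job fail as soon as a target table already exists, such as a categories table left over from an earlier run):

sqoop import-all-tables --m 1 --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba -P --hive-import --hive-overwrite --hive-database aa_scoop_import --compress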