Getting error when running import-all-tables

Hello,

I was trying to import all tables into a target directory instead of the warehouse directory. This was to practice for the certification exam, since I got a permission issue when I tried to import the data into Hive. Can someone please tell me what I am doing wrong?

[sharmapurnima901@gw02 ~]$ sqoop import-all-tables \
--connect "jdbc:mysql://ms.itversity.com/retail_db" \
--username retail_user \
--password itversity \
--target-dir /user/sharmapurnima901/cloudera/retail_stage.db \
--compress \
--compression-codec snappy \
--as-avrodatafile \
-m 1
19/11/09 21:01:11 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.5.0-292
19/11/09 21:01:11 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Error parsing arguments for import-all-tables:
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Unrecognized argument: --target-dir
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Unrecognized argument: /user/sharmapurnima901/cloudera/retail_stage.db
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Unrecognized argument: --compress
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Unrecognized argument: --compression-codec
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Unrecognized argument: snappy
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Unrecognized argument: --as-avrodatafile
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Unrecognized argument: -m
19/11/09 21:01:11 ERROR tool.BaseSqoopTool: Unrecognized argument: 1

--target-dir is not a valid option with import-all-tables, so you can't use it.
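For reference, a minimal sketch of the supported alternative. With import-all-tables, Sqoop takes --warehouse-dir as the parent directory and writes each table into its own subdirectory underneath it. The HDFS path below is a placeholder, not something from your setup:

```shell
# Sketch: --warehouse-dir is the parent-directory option accepted by
# import-all-tables; each table lands in <warehouse-dir>/<table_name>.
# -P prompts for the password instead of putting it on the command line.
sqoop import-all-tables \
  --connect "jdbc:mysql://ms.itversity.com/retail_db" \
  --username retail_user \
  -P \
  --warehouse-dir /user/sharmapurnima901/sqoop_import/retail_db \
  -m 1
```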

OK. The option does work with text files, though; it is only the Avro format that fails. I was trying one of the exercises that asks for creating a Hive table from one of the tables, and I got permission issues creating the table in Hive. That's why I used a target directory. I was wondering why it would work with text files.

I tried using the warehouse directory, but it still doesn't work. It gives an error about conversion to the Avro format. Below is my code:

sqoop import-all-tables \
--connect "jdbc:mysql://ms.itversity.com/retail_db" \
--username retail_user \
--password itversity \
--warehouse-dir /user/hive/warehouse/sharmapurnima901 \
--compress \
--compression-codec snappy \
--as-avrodatafile \
-m 1

Below is part of the error:

19/11/10 11:08:33 INFO mapreduce.Job: Task Id : attempt_1565300265360_34382_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
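The failing call to ReflectData.addLogicalTypeConversion is typically a sign of an Avro version conflict: that method was added in newer Avro releases, while an older Avro jar shipped with the cluster is being picked up first by the MapReduce task. A workaround often suggested for this kind of clash is passing the Hadoop generic option -Dmapreduce.job.user.classpath.first=true, which must come immediately after the tool name, before any tool-specific options. This is a sketch assuming that classpath conflict is the cause; whether it resolves things depends on which jars your lab environment actually ships:

```shell
# Sketch, assuming the Avro error is a classpath version conflict.
# The -D generic option must appear right after "import-all-tables",
# before --connect and the other tool options, or Sqoop will reject it.
sqoop import-all-tables \
  -Dmapreduce.job.user.classpath.first=true \
  --connect "jdbc:mysql://ms.itversity.com/retail_db" \
  --username retail_user \
  -P \
  --warehouse-dir /user/hive/warehouse/sharmapurnima901 \
  --compress \
  --compression-codec snappy \
  --as-avrodatafile \
  -m 1
```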