Unable to override an existing Sqoop job in the CDH QuickStart VM

Hello,
I have created a Sqoop job as follows:

[cloudera@quickstart sqoop]$ sqoop job --create orders_import -- import --connect "jdbc:mysql://${hostname}:3306/retail_db" --table orders --target-dir /user/cloudera/cca175/review/sqoop/orders --username root --password cloudera -m 5

I triggered the above Sqoop job and it completed successfully. But when I tried to override its saved arguments at execution time, to import the data in Avro format into a new target directory, it threw an exception.

[cloudera@quickstart sqoop]$ sqoop job --exec orders_import -- --delete-target-dir --target-dir /user/cloudera/cca175/review/sqoop/orders_as_avro --as-avrodatafile

Warning: /usr/lib/sqoop/…/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/01/23 14:36:33 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.8.0
Enter password:
17/01/23 14:36:38 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/01/23 14:36:38 INFO tool.CodeGenTool: Beginning code generation
17/01/23 14:36:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
17/01/23 14:36:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
17/01/23 14:36:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/da508ace8c9dcaa1eae5a2005fb413db/orders.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/01/23 14:36:41 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/da508ace8c9dcaa1eae5a2005fb413db/orders.jar
17/01/23 14:36:41 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/01/23 14:36:41 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/01/23 14:36:41 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/01/23 14:36:41 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/01/23 14:36:41 INFO mapreduce.ImportJobBase: Beginning import of orders
17/01/23 14:36:41 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/01/23 14:36:42 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/01/23 14:36:42 INFO client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/127.0.0.1:8032
17/01/23 14:36:43 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://quickstart.cloudera:8020/user/cloudera/cca175/review/sqoop/orders already exists
17/01/23 14:36:43 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://quickstart.cloudera:8020/user/cloudera/cca175/review/sqoop/orders already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:270)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:213)
at org.apache.sqoop.tool.JobTool.run(JobTool.java:268)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

[cloudera@quickstart sqoop]$

Note that the exception refers to the original target directory (/user/cloudera/cca175/review/sqoop/orders) rather than the new one, which suggests the options supplied after -- were not applied. Is there any limitation on overriding parameters when executing a saved Sqoop job?
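
For comparison, the same import with the override options applied directly in a one-off sqoop import (same connection details and paths as above, shown only as a sanity check, not the saved job) would look like this:

[cloudera@quickstart sqoop]$ sqoop import \
    --connect "jdbc:mysql://${hostname}:3306/retail_db" \
    --username root --password cloudera \
    --table orders \
    --delete-target-dir \
    --target-dir /user/cloudera/cca175/review/sqoop/orders_as_avro \
    --as-avrodatafile \
    -m 5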

Can you paste the command you are running?

The sqoop job commands I have used are in the initial post:

[cloudera@quickstart sqoop]$ sqoop job --create ordersimport -- import --connect "jdbc:mysql://${hostname}:3306/retail_db" --table orders --target-dir /user/cloudera/cca175/review/sqoop/orders --username root --password cloudera -m 5

[cloudera@quickstart sqoop]$ sqoop job --exec ordersimport -- --delete-target-dir --target-dir /user/cloudera/cca175/review/sqoop/orders_as_avro --as-avrodatafile

You have to use --delete-target-dir.
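
If the exec-time override still does not take effect, one workaround (a sketch reusing the paths and connection details from above; the job name orders_import_avro is just an example, and deleting the old orders directory is only needed if you intend to re-run the original job) is to save a new job with the Avro options baked into its definition:

[cloudera@quickstart sqoop]$ hdfs dfs -rm -r /user/cloudera/cca175/review/sqoop/orders

[cloudera@quickstart sqoop]$ sqoop job --create orders_import_avro -- import \
    --connect "jdbc:mysql://${hostname}:3306/retail_db" \
    --username root --password cloudera \
    --table orders \
    --delete-target-dir \
    --target-dir /user/cloudera/cca175/review/sqoop/orders_as_avro \
    --as-avrodatafile \
    -m 5

[cloudera@quickstart sqoop]$ sqoop job --exec orders_import_avro

With --delete-target-dir stored in the job itself, each run removes the target directory before importing, so the FileAlreadyExistsException should not come back on re-runs.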