Unable to execute sqoop import from mysql to hive for avrodatafile

Hi. I am unable to run the following command in the big-data labs when importing from MySQL into the Hive warehouse as an avrodatafile:
sqoop import --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username=retail_dba --password=itversity --table departments --as-avrodatafile --warehouse-dir=/apps/hive/warehouse/imp_avro.db -m 2

It fails with the following error for all the map tasks:
INFO mapreduce.Job: Task Id : attempt_1480307771710_5877_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V

If anyone knows the issue, please help.

sqoop import -Dmapreduce.job.user.classpath.first=true --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username=retail_dba --password=itversity --table departments --as-avrodatafile --warehouse-dir=/apps/hive/warehouse/imp_avro.db -m 2

This should work.
For additional info, check this link
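To add some context on why the flag helps: the failure on `ReflectData.addLogicalTypeConversion` typically means the task JVM picked up an older Avro jar from the cluster classpath (that method only exists in newer Avro releases than the one Hadoop bundles), while Sqoop was compiled against a newer Avro. Setting `mapreduce.job.user.classpath.first=true` tells YARN to put the user (Sqoop) jars ahead of the cluster jars. A rough sketch for checking which Avro versions are in play; the paths below are assumptions based on a typical HDP layout and may differ on your cluster:

```shell
# Hedged sketch: adjust these paths to your distribution's layout.

# Avro jar bundled with Sqoop (the newer one Sqoop expects):
ls /usr/hdp/current/sqoop-client/lib/avro-*.jar

# Avro jars on the Hadoop/MapReduce classpath (often older):
find /usr/hdp/current/hadoop-client \
     /usr/hdp/current/hadoop-mapreduce-client \
     -name 'avro-*.jar' 2>/dev/null

# If the Hadoop copy is older than the Sqoop copy, prefer the user jars:
#   sqoop import -Dmapreduce.job.user.classpath.first=true ...
```

If the two versions match, the classpath flag is unlikely to be the fix and the jars themselves may need updating.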


Thank you @mohanp.sit for your quick reply.
That was helpful.

Sqoop export is not working. I am using the script below; please correct me if I am doing anything wrong here.

[ssibbala78@gw01 ~]$ sqoop export --connect jdbc:mysql://nn01.itversity.com/retail_export --username retail_dba --password itversty --table orders_sss --export-dir /user/ssibbaa78/sqoopnew/orders_txt --input-lines-terminated-by '\n' --input-fields-terminated-by ','

Error message:
17/01/21 00:23:51 INFO tool.CodeGenTool: Beginning code generation
17/01/21 00:23:51 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'retail_dba'@'gw01.itversity.com' (using password: YES)
java.sql.SQLException: Access denied for user 'retail_dba'@'gw01.itversity.com' (using password: YES)


@ssibbala78 Please use port 3306 in the connection string.
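For reference, here is a sketch of the export command with the port added. Note the password in the failing command was spelled `itversty`, while the import earlier in the thread used `itversity` — that mismatch could equally explain the "Access denied" error, so it is worth double-checking; the spelling below is an assumption:

```shell
# Hedged sketch: port 3306 added; password spelling assumed to match the
# earlier import command (the failing post used "itversty").
sqoop export \
  --connect jdbc:mysql://nn01.itversity.com:3306/retail_export \
  --username retail_dba \
  --password itversity \
  --table orders_sss \
  --export-dir /user/ssibbaa78/sqoopnew/orders_txt \
  --input-fields-terminated-by ',' \
  --input-lines-terminated-by '\n'
```

Also make sure the target table `orders_sss` already exists in `retail_export`, since `sqoop export` does not create it.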