Sqoop Import in Avro or sequence file format

Hello Team,

I imported the data into HDFS successfully using Sqoop's --as-avrodatafile option with the code below:

sqoop import -Dmapreduce.job.user.classpath.first=true -m 1 --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity --table departments --target-dir /user/shubhaprasadsamal/training/sqoop_import/department_avro --as-avrodatafile

But the record count of the part-m-00000.avro file does not match the count in the database. Please find the results below:

Data count from HDFS location:

[shubhaprasadsamal@gw01 ~]$ hadoop fs -cat /user/shubhaprasadsamal/training/sqoop_import/department_avro/part*|wc -l

Data count from database:

[shubhaprasadsamal@gw01 ~]$ sqoop eval --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity \
--query "select count(1) from departments"
Warning: /usr/hdp/ does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/01/10 13:39:08 INFO sqoop.Sqoop: Running Sqoop version:
17/01/10 13:39:08 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/01/10 13:39:08 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.


Is this the expected result? Is the line count of the file smaller because of the garbled-looking (binary) data in the HDFS location?

I am getting the same result with the sequence file format as well.

Please suggest. Thanks,


Hi shubhaprasad,

I faced the same issue when I tried to use the wc command to fetch the count from a part file of an Avro import. From my research, wc is short for "word count" on UNIX-like operating systems, and wc -l prints the number of lines in a file. In Avro or sequence (binary) file formats, records are not necessarily laid out one per line, hence the mismatch.
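The effect is easy to reproduce outside of Hadoop. A minimal Python sketch (the integer "records" and big-endian packing are just illustrative assumptions, not the actual Avro encoding): wc -l effectively counts newline (0x0a) bytes, so on a binary blob it reports a number unrelated to the record count — and a stray 0x0a byte inside a value even gets counted as a "line".

```python
import struct

# Pack 6 integer "records" into a binary blob, the way a binary
# file format stores records back to back without newline separators.
# (Illustrative only -- this is not the real Avro encoding.)
records = [10, 2, 35, 66, 1000, 7]
blob = b"".join(struct.pack(">i", r) for r in records)

# wc -l effectively counts newline (0x0a) bytes in the file.
# The value 10 happens to encode as byte 0x0a, so one "line" is found.
newline_count = blob.count(b"\n")

print(len(records), newline_count)  # 6 records, but wc -l would see 1 line
```

So the line count and the record count of a binary file are unrelated quantities, which is why the HDFS count looked wrong.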

This command should (I have yet to try it) work perfectly for text file formats, however.


Yes, Santy.
I have tried it for a text file and the command works fine, i.e. the record counts from MySQL and the file match.

But don’t know about avro/sequence.


Yes, wc -l gives the number of lines in a file. With the text input format, Hadoop stores each record on a separate line, and hence it works.

But in more advanced formats, records are stored along with metadata. Records need not be separated by a newline character, so you should not use the wc command. The way to validate is to copy the Avro file to the local file system and use avro-tools to convert it to JSON, then review it.
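A sketch of that validation flow, assuming the paths from the original import and an avro-tools jar in the current directory (the jar filename/version is an assumption — adjust to whatever your distribution ships):

```shell
# Copy the Avro part file from HDFS to the local file system
hadoop fs -get /user/shubhaprasadsamal/training/sqoop_import/department_avro/part-m-00000.avro .

# avro-tools tojson emits one JSON document per record, one per line,
# so piping it to wc -l now gives a meaningful record count
java -jar avro-tools-1.8.1.jar tojson part-m-00000.avro | wc -l
```

The count printed here should match the sqoop eval count from the database.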
