Append in hive not working


#1

The query I am using:

sqoop import --connect "jdbc:mysql://nn01.itversity.com:3306/retail_import" --username retail_dba --password itversity --table departments_2 --target-dir /apps/hive/warehouse/sqoop_ajinkya89.db/departments --append --where "department_id > 7" --split-by department_id --outdir java_files

result:
2 Fitness
3 Footwear
4 Apparel
5 Golf
6 Outdoors
7 Fan Shop
NULL NULL
NULL NULL
NULL NULL
NULL NULL
NULL NULL
NULL NULL
NULL NULL
NULL NULL

The import reports success, but every value shows up as NULL in Hive.

Am I doing something wrong?


#2

@Ajinkya_Saraf13 Use field delimiters (--fields-terminated-by) and re-run the sqoop command.


#3

I tried these two commands, but the values are still being imported as NULLs:

sqoop import --connect "jdbc:mysql://nn01.itversity.com:3306/retail_import" --username retail_dba --password itversity --table departments --append --where "department_id > 7" --target-dir=/apps/hive/warehouse/ajinkyasqoop16.db/departments --split-by department_id --fields-terminated-by \001 --lines-terminated-by \n

sqoop import --connect "jdbc:mysql://nn01.itversity.com:3306/retail_import" --username retail_dba --password itversity --table departments --append --where "department_id > 7" --target-dir=/apps/hive/warehouse/ajinkyasqoop16.db/departments --split-by department_id --fields-terminated-by \u0001 --lines-terminated-by \n

This is what I was referring to. When delimiters are in use, this is how the table is described in Hive:

Storage Desc Params:
field.delim	\u0001
line.delim	\n
serialization.format	\u0001
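The `\u0001` shown by Hive is not a literal five-character string: it is the SOH control character (byte 0x01), which is Hive's default field delimiter. `\u0001` (Java/Hive notation) and `\001` (octal notation) name the same byte, which you can confirm from the shell:

```shell
# Emit the SOH control character and dump it as hex:
# printf interprets the octal escape \001 as the single byte 0x01.
printf '\001' | od -An -tx1
```

If the data files on HDFS were written with some other delimiter (for example a plain comma), Hive cannot split the rows on 0x01 and every column comes back NULL, which matches the symptom above.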


#4

@Ajinkya_Saraf13 Hi

Please use --fields-terminated-by '\001'. The '\001' will internally be converted to '\u0001'.
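Putting the advice together, the full corrected command might look like the sketch below (assuming the same connection details and paths used earlier in this thread). The key change is the single quotes around '\001': without them, the shell strips the backslash before Sqoop ever sees the argument, so Sqoop never converts it to the \u0001 byte that Hive's table definition expects.

```shell
# Sketch of the corrected import, assuming the connection details above.
# Quoting '\001' preserves the backslash so Sqoop receives the literal
# escape sequence and translates it to the 0x01 delimiter byte.
sqoop import \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_import" \
  --username retail_dba --password itversity \
  --table departments \
  --append \
  --where "department_id > 7" \
  --target-dir /apps/hive/warehouse/ajinkyasqoop16.db/departments \
  --split-by department_id \
  --fields-terminated-by '\001' \
  --lines-terminated-by '\n'
```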


#5

Thanks a lot. It worked!

