Fields-terminated-by not working in import commands

[rkathiravan@gw01 ~]$ sqoop-import --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
--username retail_dba \
--password itversity \
--table departments \
--target-dir "/user/rkathiravan/rkathiravan_sqoop_import/departments" \
-m 2 \
-- outdir "/user/rkathiravan/javafiles" \
--fields-terminated-by '|' \
--lines-terminated-by '\n'
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/02/28 18:37:54 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
17/02/28 18:37:54 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/02/28 18:37:54 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/02/28 18:37:54 INFO tool.CodeGenTool: Beginning code generation
17/02/28 18:37:55 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM departments AS t LIMIT 1
17/02/28 18:37:55 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM departments AS t LIMIT 1
17/02/28 18:37:55 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-rkathiravan/compile/8020db44b5bc399572d2ee67fc710dc0/departments.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/02/28 18:37:56 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-rkathiravan/compile/8020db44b5bc399572d2ee67fc710dc0/departments.jar
17/02/28 18:37:56 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/02/28 18:37:56 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/02/28 18:37:56 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/02/28 18:37:56 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/02/28 18:37:56 INFO mapreduce.ImportJobBase: Beginning import of departments
17/02/28 18:37:57 INFO impl.TimelineClientImpl: Timeline service address: http://rm01.itversity.com:8188/ws/v1/timeline/
17/02/28 18:37:57 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
17/02/28 18:37:57 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
17/02/28 18:38:04 INFO db.DBInputFormat: Using read commited transaction isolation
17/02/28 18:38:04 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(department_id), MAX(department_id) FROM departments
17/02/28 18:38:04 INFO db.IntegerSplitter: Split size: 49; Num splits: 2 from: 2 to: 100
17/02/28 18:38:04 INFO mapreduce.JobSubmitter: number of splits:2
17/02/28 18:38:05 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1485099329996_7493
17/02/28 18:38:05 INFO impl.YarnClientImpl: Submitted application application_1485099329996_7493
17/02/28 18:38:05 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:8088/proxy/application_1485099329996_7493/
17/02/28 18:38:05 INFO mapreduce.Job: Running job: job_1485099329996_7493
17/02/28 18:38:11 INFO mapreduce.Job: Job job_1485099329996_7493 running in uber mode : false
17/02/28 18:38:11 INFO mapreduce.Job: map 0% reduce 0%
17/02/28 18:38:16 INFO mapreduce.Job: map 50% reduce 0%
17/02/28 18:38:17 INFO mapreduce.Job: map 100% reduce 0%
17/02/28 18:38:18 INFO mapreduce.Job: Job job_1485099329996_7493 completed successfully
17/02/28 18:38:18 INFO mapreduce.Job: Counters: 30
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=320152
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=245
HDFS: Number of bytes written=84
HDFS: Number of read operations=8
HDFS: Number of large read operations=0
HDFS: Number of write operations=4
Job Counters
Launched map tasks=2
Other local map tasks=2
Total time spent by all maps in occupied slots (ms)=11926
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=5963
Total vcore-milliseconds taken by all map tasks=5963
Total megabyte-milliseconds taken by all map tasks=9159168
Map-Reduce Framework
Map input records=7
Map output records=7
Input split bytes=245
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=89
CPU time spent (ms)=1980
Physical memory (bytes) snapshot=462106624
Virtual memory (bytes) snapshot=6521618432
Total committed heap usage (bytes)=403701760
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=84
17/02/28 18:38:18 INFO mapreduce.ImportJobBase: Transferred 84 bytes in 21.7021 seconds (3.8706 bytes/sec)
17/02/28 18:38:18 INFO mapreduce.ImportJobBase: Retrieved 7 records.
[rkathiravan@gw01 ~]$ hadoop fs -cat /user/rkathiravan/rkathiravan_sqoop_import/departments/part*
2,Development
3,Footwear
4,Apparel
5,Golf
6,Outdoors
7,Fan Shop
100,Hadoop Training

Here --fields-terminated-by didn't work. Why?

@Revathi_K - I have recreated the issue. In some places you have put the \ immediately after the option with no space before it; there has to be one space, e.g. --table departments \ (not --table departments\).

After correcting the above issues, I reproduced your error. To resolve the problem I removed the extra space before --outdir. With that space, Sqoop sees a bare "--" (its separator for extra arguments) and stops parsing its own options there, so the options after it were ignored and the data was stored with the default comma delimiter.
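A minimal illustration of the parsing difference (same option as in the command above; the annotations are mine):

-- outdir "/user/rkathiravan/javafiles"    # bare "--" followed by "outdir": Sqoop stops reading its own options here
--outdir "/user/rkathiravan/javafiles"     # the --outdir option; the delimiter options after it are still parsed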

sqoop-import --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
--username retail_dba \
--password itversity \
--table departments \
--target-dir "/user/gnanaprakasam/gnanaprakasam_sqoop_import/departments" \
-m 2 \
--outdir "/user/gnanaprakasam/javafiles" \
--fields-terminated-by '|' \
--lines-terminated-by '\n'
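
After re-running with the corrected command, the output should be pipe-delimited. A quick check (the expected output below is based on the same departments data shown earlier):

hadoop fs -cat /user/gnanaprakasam/gnanaprakasam_sqoop_import/departments/part*
2|Development
3|Footwear
4|Apparel
5|Golf
6|Outdoors
7|Fan Shop
100|Hadoop Training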


Thanks. I had left a space in --outdir.

It is working now.

One more doubt:

I gave my --outdir as /user/rkathiravan/javafiles, but I could not see any Java files in it. The Java files are stored where I run my import.
Can't I point it to a Hadoop location? Should I give only a gateway (local) location?

@Revathi_K - Here, even though we give a Hadoop location, Sqoop by default writes the generated Java files to the local filesystem where we run the sqoop command.

It looks like we can't give a Hadoop location to store the Java files.
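
A small sketch of a workaround, assuming the generated class is named after the table (/home/rkathiravan/javafiles is an illustrative local path; --outdir, like --bindir, is interpreted as a local-filesystem path):

sqoop-import ... --outdir /home/rkathiravan/javafiles ...     # local path on the gateway node
ls /home/rkathiravan/javafiles/departments.java               # generated source lands here, locally
hadoop fs -mkdir -p /user/rkathiravan/javafiles               # then copy it to HDFS yourself if needed
hadoop fs -put /home/rkathiravan/javafiles/departments.java /user/rkathiravan/javafiles/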

Yep, I thought the same.
Thanks.