The data in the file uses a strange character as the separator.
Command used to view the data:
[hamidhassan@gw02 ~]$ hadoop fs -tail /apps/hive/warehouse/hamidhassan_sqoop_import.db/daily_revenue/000000_0
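Hive's default text-format field delimiter is the non-printing byte `\001` (Ctrl-A), which often shows up as a "weird char" in terminals. One way to confirm which byte is actually in the file is to pipe a sample through `od -c`, which prints non-printing characters visibly. The sketch below simulates this with a made-up sample row via `printf`; on the cluster you would pipe `hadoop fs -tail <path>` into `od -c` instead.

```shell
# Simulated row: a date field and a revenue field joined by the \001 byte.
# On the cluster: hadoop fs -tail /apps/hive/warehouse/.../000000_0 | od -c | head
printf '20130725\0011000.00\n' | od -c | head -2
```

If the delimiter is `\001`, the `od -c` output shows it as `001` between the two field values.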
Log file error:
18/04/02 09:19:09 INFO mapreduce.Job: Job job_1520592249193_39135 failed with state FAILED due to: Task failed task_1520592249193_
Job failed as tasks failed. failedMaps:1 failedReduces:0
18/04/02 09:19:09 INFO mapreduce.Job: Counters: 13
Failed map tasks=1
Killed map tasks=3
Launched map tasks=4
Data-local map tasks=3
18/04/02 09:19:09 ERROR mapreduce.ExportJobBase: Export job failed!
18/04/02 09:19:09 ERROR tool.ExportTool: Error during export: Export job failed!
The URL to track the job: http://rm01.itversity.com:8088/proxy/application_1520592249193_39
Any idea what the problem is, or how to define this special delimiter in the export script?
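Sqoop export supports telling the parser what delimiter the input files use via `--input-fields-terminated-by`. Since the table was written by Hive with its default `\001` separator, a sketch of the export command might look like the following (the JDBC URL, credentials, and target table name here are placeholders; substitute your own):

```shell
# Hypothetical connection details; replace with your own database, user, and table.
sqoop export \
  --connect jdbc:mysql://ms.itversity.com:3306/retail_export \
  --username retail_user \
  --password-file /user/hamidhassan/.mysql-password \
  --table daily_revenue \
  --export-dir /apps/hive/warehouse/hamidhassan_sqoop_import.db/daily_revenue \
  --input-fields-terminated-by '\001'
```

Without `--input-fields-terminated-by`, Sqoop assumes comma-separated input, fails to split each line into the expected number of columns, and the map tasks die with the kind of `Export job failed!` error shown in the log above. Checking the failed task's log via the tracking URL should show the exact parse error.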