Error: GC overhead limit exceeded




I am trying to load data into a Hive partitioned, bucketed table on the itversity lab, but the job fails with the error below. Please help.

2018-05-13 14:45:50,760 Stage-1 map = 100%, reduce = 50%, Cumulative CPU 116.27 sec
2018-05-13 14:45:53,838 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 52.39 sec
MapReduce Total cumulative CPU time: 52 seconds 390 msec
Ended Job = job_1525279861629_7258 with errors
Error during job, obtaining debugging information…
Examining task ID: task_1525279861629_7258_m_000001 (and more) from job job_1525279861629_7258
Examining task ID: task_1525279861629_7258_r_000000 (and more) from job job_1525279861629_7258

Task with the most failures(4):

Task ID:


Diagnostic Messages for this Task:
Error: GC overhead limit exceeded
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143



Can you share the command that you are using?
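In the meantime, "GC overhead limit exceeded" in a reduce task usually means the reducer JVM ran out of heap while writing the partitioned, bucketed output. Raising the container and JVM heap sizes for the session before running the insert often resolves it. A hedged sketch of the session settings (the table and column names are hypothetical placeholders; the memory values are examples and must stay within your cluster's YARN container limits):

```
-- Enable dynamic partitioning and bucketing enforcement for the load
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;
SET hive.enforce.bucketing = true;

-- Example memory bump: 4 GB containers with the JVM heap (-Xmx)
-- kept at roughly 80% of the container size
SET mapreduce.map.memory.mb = 4096;
SET mapreduce.map.java.opts = -Xmx3276m;
SET mapreduce.reduce.memory.mb = 4096;
SET mapreduce.reduce.java.opts = -Xmx3276m;

-- Hypothetical insert into a partitioned, bucketed table
INSERT OVERWRITE TABLE orders_part PARTITION (order_month)
SELECT order_id, order_date, order_status, order_month
FROM orders_stage;
```

If the error persists even with larger containers, it can also help to reduce the amount of data each reducer handles, for example by lowering `hive.exec.reducers.bytes.per.reducer` so more reducers share the work.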