Is it possible to view logs during certification?


If we just invoke the following PySpark shell during the certification exam, where and how do we view the logs?
pyspark --master yarn


Which logs do you want? Do you mean the Resource Manager UI, which shows cores, executors, etc.?


Whenever there is a failed job or an exception


Yes, you can. I have used the mapred job command to debug Sqoop export failures.


Did you use the tracking URL to see the reason for the failure? I have heard that you cannot access that URL. Can you please elaborate on how you checked the Sqoop export logs?


I did not use the URL. I used the mapred job command to debug the Sqoop export failures.


Thanks. I tried the following command.
mapred job -logs job_1525279861629_9203

@dgadiraju - Can you share any troubleshooting tips and tricks in terms of checking logs, common mistakes etc?


I have tried it and it is working fine in the lab.

mapred job -logs <job_id> <task_attempt_id>
e.g., mapred job -logs job_1513875462555_0018 attempt_1513875462555_0018_m_000000_0

mapred job -logs job_1500459105861_9355 attempt_1500459105861_9355_m_000003_0

You will see details like the following:

java.lang.RuntimeException: Can't parse input data: '430542014-04-18 00:00:00.04914CLOSED'
at orders_sara.__loadFromFields(
at orders_sara.parse(
at org.apache.hadoop.mapred.MapTask.runNewMapper(
at org.apache.hadoop.mapred.YarnChild$
at Method)
at org.apache.hadoop.mapred.YarnChild.main(
Caused by: java.lang.NumberFormatException: For input string: "430542014-04-18 00:00:00.04914CLOSED"
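A NumberFormatException like the one above usually means the delimiter passed to Sqoop does not match the delimiter actually used in the input files, so all the fields of a record arrive fused into one string that cannot be parsed as a number. A minimal Python sketch of that failure mode (the comma-delimited "good" record and the field boundaries are my own reconstruction for illustration, not taken from the thread):

```python
# The fused record copied from the stack trace above.
record = "430542014-04-18 00:00:00.04914CLOSED"

# Splitting on the wrong delimiter (here: tab, as a hypothetical mismatch)
# yields a single giant field; parsing it as an integer fails, which is
# the Python analogue of Java's NumberFormatException.
fields = record.split("\t")
try:
    order_id = int(fields[0])
except ValueError as e:
    print("parse failed:", e)

# With a delimiter that actually separates the fields (assumed comma here),
# the same logical record parses cleanly into its four columns.
good_record = "43054,2014-04-18 00:00:00.0,4914,CLOSED"
order_id, order_date, customer_id, status = good_record.split(",")
assert int(order_id) == 43054
```

So before rerunning the export, check that the option you gave Sqoop for the input field delimiter matches what is really in the HDFS files.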