MapReduce with third-party APIs

I am using some third-party APIs in my MapReduce code. I have tried a few things, but none of them worked for me. (I am using Cloudera 5.9.)

  1. I tried the fat-jar approach, and my code worked, but this is not a good way to ship the job (my jar ends up very heavy).
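
For reference, this is roughly how the fat jar was built, assuming a Maven build (a hypothetical `pom.xml` fragment; the plugin version is only an example): the Shade plugin bundles all dependencies into one jar at package time, which is why the jar gets so heavy.

```xml
<!-- Hypothetical pom.xml fragment: bundles all dependencies
     into a single "fat" jar during the package phase. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
    </execution>
  </executions>
</plugin>
```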

So I thought of separating out the third-party jars and sharing them via the distributed cache.

  2. I then tried the following options:
    i) I used the "hadoop jar <jar_file_name> <main_class_name> -libjars " command. --> Did not work; I get a ClassNotFoundException.
    ii) I kept the third-party jars in a local folder and passed that path to the -libjars option. --> Did not work; same ClassNotFoundException.
    iii) I kept the third-party jars in HDFS and passed that path to the -libjars option. --> Did not work; same ClassNotFoundException.
    iv) I updated the code to use DistributedCache.addFileToClassPath(). --> Did not work; same ClassNotFoundException.
    v) I updated the code to use job.addCacheFile(). --> Did not work; same ClassNotFoundException.
    vi) I updated /etc/hadoop/conf/hadoop-env.sh, added the line "export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/home/mani/test/lib/*", and restarted the cluster, but it still did not work.
    vii) I ran the command hadoop jar <jar_file_name> <main_class_name> -D mapred.child.env="LD_LIBRARY_PATH=path/to/my/lib/*". --> Did not work; same ClassNotFoundException.
    viii) I also tried -Dyarn.application.classpath, but still the same issue.
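
For context, here is a minimal sketch of the driver pattern that the -libjars option expects (class and job names are placeholders, and the mapper/reducer wiring is elided; this is the standard Tool/ToolRunner pattern, which I may be missing). As I understand it, GenericOptionsParser only honors -libjars, -files and -D when main() delegates to ToolRunner.run():

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical driver: names are placeholders, not my actual code.
public class MyDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        // getConf() already carries the options parsed from -libjars, -D, etc.
        Job job = Job.getInstance(getConf(), "my-job");
        job.setJarByClass(MyDriver.class);
        // Alternative to -libjars: add a jar that already sits in HDFS.
        // job.addFileToClassPath(new Path("hdfs:///path/to/dep.jar"));
        // ... set mapper, reducer, and input/output paths from args ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // Without ToolRunner, -libjars is silently ignored, so the task
        // JVMs never see the extra jars -> ClassNotFoundException.
        System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
    }
}
```

The corresponding invocation would then look like this (dep1.jar/dep2.jar and the input/output paths are placeholders; the generic options must come between the main class and the job arguments):

```shell
# Client-side classpath so the driver JVM itself can load the classes:
export HADOOP_CLASSPATH=/home/mani/test/lib/*
# -libjars takes a comma-separated list of local jar paths:
hadoop jar myjob.jar com.example.MyDriver \
    -libjars /home/mani/test/lib/dep1.jar,/home/mani/test/lib/dep2.jar \
    /input /output
```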

I have gone through some forums, Cloudera blog posts, and other websites. Everyone tells the same story, but I am not able to get it working even after following their posts exactly.

Can someone please help me find a solution?

Thanks,
Sarvesh