Cleared CCA175 on 15th April 2018


#1

Hello All,

I have successfully completed CCA175, and thanks to Durga sir and Arun sir I was able to complete it without difficulty. I have a few suggestions:

  1. Practice every question in Durga sir's videos. Here are our courses:

  • Click here for a $35 coupon for CCA 175 Spark and Hadoop Developer using Python.
  • Click here for a $35 coupon for CCA 175 Spark and Hadoop Developer using Scala.
  • Click here to sign up for our state-of-the-art Hadoop and Spark Cluster.

  2. Before going into the examination, you must solve the questions on Arun sir's blog.

These two steps are a must for anyone who is going to take the exam.
Other than that, keep the following things in mind during the examination; if you like, consider them the steps to follow in the exam.

  1. Skip the Sqoop questions for now, since I have seen here that some people face issues while connecting to the database. Come back to them after going through the other questions; otherwise you might lose precious time.
  2. I got questions on Sqoop and Spark, but you should go through Flume and Kafka at least once, so that if they come up you won't face any issues.
  3. The cluster is adequate for the examination, but the remote connection makes it very slow, even when dragging the terminal window around.
  4. Use Sublime Text to write your code.
  5. Use Ctrl + '+' to increase the font size in Sublime Text. If that doesn't work, go to Preferences -> Settings (the default values), find the "font_size": 10 entry there, and change it to a larger value; you will be good to go.
  6. Do not maximize the windows in the exam, otherwise you will face issues while switching between them. Instead, keep them relatively small and arrange them so that they partially overlap; then you can switch between them just by clicking on the title bar of the one you want.
  7. There are three parts to each question: Information, Input and Output. Read all the details carefully and only then proceed to the solution.
  8. Always copy and paste the input and output paths. After copying, cross-check that the correct path was copied. Avoid Ctrl+C / Ctrl+V; instead, right-click and copy. This ensures the path is copied properly.
  9. Check the available YARN resources (example commands for points 9-11 follow this list).
  10. Before launching spark-shell, always check the file size of the input path so that you can configure your spark-shell command properly.
  11. Always run spark-shell with the --num-executors, --executor-cores and --executor-memory parameters. Keep in mind that if you request more memory it will take relatively longer to launch, so size it accordingly.
  12. Launch spark-shell first; you can read the questions while it is starting up.
  13. After running your command, check the output directory.
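
To make points 9-11 concrete, here are example commands along those lines; the input path and the executor numbers are only placeholders, so use the path from the question and size the executors to the free capacity you find:

yarn node -list                                  # lists the NodeManagers; the ResourceManager web UI also shows free memory/vcores
hdfs dfs -du -s -h /user/cert/problem1/input     # total size of the input path, in human-readable form
spark-shell --master yarn --num-executors 2 --executor-cores 2 --executor-memory 2G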

I think I have covered most of the doubts people have here. In case of any query, feel free to ask. If you feel I am violating Cloudera's confidentiality agreement, please tell me and I'll delete the content accordingly.

PS: I got 8/9 correct




#2

Congrats shubhamdewangan.

On your 11th point:

Always run spark-shell with the --num-executors, --executor-cores and --executor-memory parameters. Keep in mind that if you request more memory it will take relatively longer to launch, so size it accordingly.

Do we need to run spark-shell alone, or with parameters?

Thanks.


#3

What level of expertise in Python is needed for CCA175?


#4

Run it with the master and the resource parameters, for example:
spark-shell --master yarn --num-executors 2 --executor-cores 2 --executor-memory 2G


#5

I wrote all my code in Scala, as Scala is the only language I know, so I won't be able to help you in that area. But I think basic knowledge should suffice.


#6

Did you use Spark SQL or Hive QL? How much time did it take you to practice for the certification?


#7

@rjshekar90
You can run spark-shell only; there is no need to run spark-sql separately, since spark-shell includes it automatically.
It took me around a month to learn from scratch. You can actually learn it within 2 weeks, but it takes some time to get comfortable and remember the commands. So in my view, plan on 2 weeks of learning + 2 weeks of practice.
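
For example, a minimal sketch of running SQL from inside spark-shell (this assumes the CDH spark-shell, where sqlContext is already created for you; orders is just an illustrative Hive table):

sqlContext.sql("SELECT order_status, count(1) AS cnt FROM orders GROUP BY order_status").show()

The same sqlContext also lets you register a DataFrame as a temporary table and query it with SQL.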


#8

Same here, but on some other forums it is said to expect questions in both Scala and Python. Hence I was asking about expertise in Python… Thanks for the response.


#9

Questions may have a template in Python or Scala, but you can solve them in whichever programming language you are comfortable with…
What matters is the final output. The Cloudera people would not verify the code; they check the final output only.


#10

Congrats!! I know we cannot generalize it, but how long did it take you to prepare? I started last week.


#11

@shubhamdewangan - After you cleared the exam, in how many days did you receive the digital certificate?


#12

I think since the course changed recently, it is not mandatory to learn both programming languages. You can learn whichever one you are comfortable with.


#13

It took me about a month to prepare for the whole exam, i.e., learning the concepts + practicing.


#14

It took around 3 days.


#15

How was the difficulty level of the questions, given that the time is only 2 hours?


#16

Hi,

Can you please let me know what API documentation is accessible while writing the exam?


#17

All the API documentation available within the Cloudera distribution.


#18

The questions are not very difficult. With practice, one can complete them on time easily.


#19

Can I get your email ID? I have some queries which I would like to write to you in an email, if that's fine with you.


#20

You can write to me at dewanganshubham120@gmail.com
I'll answer your questions as long as they don't violate Cloudera's compliance.