I have cleared the CCA-175, solving 8 out of 9 questions.


#1

I documented the important points and the dos and don'ts at

https://medium.com/@keshav120/my-experience-on-cca-175-cloudera-exam-preparation-to-certification-56ca86a284a7

Please take the time to read it. Thanks.




#2

Did you get questions on Flume/Kafka/Spark Streaming?


#3

Thanks a lot for the tips… could you elaborate on #3 (screen size)? This sounds like a big concern. I have a 15-inch Mac (I use its keyboard) plus an external monitor. Do you know if we can have two screens, with the external monitor enabled and the Mac screen disabled and just used for typing? Basically, are you saying that on a 15-inch screen, half of the screen is the Cloudera cluster and half is the Chrome browser with the exam questions?


#4

No, the exam loads in approximately 60% of the monitor area and you cannot maximize it. What I meant is: suppose you have a 15-inch monitor, then the exam loads into a span of roughly 10 inches, which is small. You can use the external monitor, but your face must remain visible to the proctor at all times. Just log in 15 minutes early and you can clear up your doubts with the proctor; he/she will help you.


#5

2 Flume, 1 Hive, and 6 Spark. You can go through the link I posted for more.


#6

Bro, you got 2 questions from Flume?


#7

Sorry, I meant Sqoop. So, 2 Sqoop, 1 Hive, and 6 Spark.


#8

Thanks bro, can you give some info on what kind of Hive topics we need to focus on? Any examples?


#10

How do you launch spark-shell during the certification? Do you pass parameters like num-executors?
Can you please list a few complex commands that will be absolutely necessary during the exam?


#11

Just practice the blog below 10 times and do the Udemy course by Durga Raju sir.
http://arun-teaches-u-tech.blogspot.com/


#12

spark-shell will be enough. It does not require any parameters like --num-executors, --executor-memory, or --master yarn.

Just launching it with spark-shell will be enough.
For more info, please read the Medium blog which I mentioned in my post.
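
A minimal sketch of what that looks like (assuming the CDH 5 / Spark 1.6 gateway that was used when I took it; your cluster's version may differ):

```scala
// On the exam gateway, just run:
//   spark-shell
// No --master yarn, --num-executors or --executor-memory flags were needed;
// the default resources were enough for the exam data sizes.
// Inside the REPL, sc and sqlContext are already created for you:
sc.version    // check which Spark version the gateway gives you
sqlContext    // pre-built SQLContext (a HiveContext on CDH), ready for sqlContext.sql(...)
```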


#13

@rkeshav120 Congratulations.
Were you starting a new terminal at the end of every question?


#14

No, I did all the questions in just one terminal, but I opened two tabs in that terminal: one tab for running hadoop fs -ls / -tail and Sqoop kind of commands, and another tab for spark-shell. As I mentioned in the Medium blog, you must reset your sqlContext configuration so that a setting from one question does not carry over and affect your next answer.
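
For example, a rough sketch of the kind of reset I mean, using the Parquet compression property that often comes up (the property name is real; resultDF and the output path are just placeholders for whatever the question asks):

```scala
// Remember the current codec so it can be restored after this question:
val defaultCodec = sqlContext.getConf("spark.sql.parquet.compression.codec")

// Suppose the current question asks for gzip-compressed Parquet output:
sqlContext.setConf("spark.sql.parquet.compression.codec", "gzip")
val resultDF = sqlContext.sql("SELECT 1 AS id")           // placeholder for the real answer
resultDF.write.parquet("/user/cert/problemN/solution")    // placeholder output path

// Before starting the next question, restore the codec so the gzip setting
// does not silently carry over into your next answer:
sqlContext.setConf("spark.sql.parquet.compression.codec", defaultCodec)
```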


#15

Congrats @rkeshav120… can you please share your number with me? I have my exam next week and need some guidance. It will be really helpful. Thanks in advance.


#16

Congratulations @rkeshav120. Thanks for sharing your inputs.