Exam tips for CCA 175 takers


Hello everyone, these are great tips. I am appearing for the exam on 9th June, 2018. I just have a few doubts: 1) Does the exam cover Spark SQL, window functions in Spark SQL, etc.? I am not very confident in those concepts. Thanks in advance.


Can we access Hue during the exam?


Hello everyone, can anyone please help me with this issue: at the final stage I can't do the aggregation and I get a type mismatch error :disappointed_relieved:

val orders = sc.textFile("/user/cloudera/data-master/retail_db/orders")

val orderItems = sc.textFile("/user/cloudera/data-master/retail_db/order_items")

// Get daily revenue by product, considering CLOSED and COMPLETE orders.

orders.take(10).foreach(println)

val ClosedAndCompleted = orders.filter(order =>
  order.split(",")(3) == "CLOSED" || order.split(",")(3) == "COMPLETE")

// (order_id, order_date)
val ordersPaired = ClosedAndCompleted.map(order => {
  val a = order.split(",")
  (a(0).toInt, a(1))
})

// (order_id, (product_id, subtotal))
val orderItemspaired = orderItems.map(item => {
  val a = item.split(",")
  (a(1).toInt, (a(2).toInt, a(4).toFloat))
})

val order_join_orderItems = ordersPaired.join(orderItemspaired)

// ((order_date, product_id), subtotal)
val daily_revenue_per_product = order_join_orderItems.map(order =>
  ((order._2._1, order._2._2._1), order._2._2._2))

val calcul_daily_revenue_per_product = daily_revenue_per_product.groupByKey()

val final_calcul_daily_revenue_per_product = calcul_daily_revenue_per_product.reduceByKey((total, element) => total + element)

scala> val final_calcul_daily_revenue_per_product = calcul_daily_revenue_per_product.reduceByKey((total,element) => total+element)
<console>:43: error: type mismatch;
 found   : Iterable[Float]
 required: String
       val final_calcul_daily_revenue_per_product = calcul_daily_revenue_per_product.reduceByKey((total,element) => total+element)
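For what it's worth, the type mismatch comes from the groupByKey: after it, each value is already an Iterable[Float], so the follow-up reduceByKey tries to "+" two Iterables and the only "+" the compiler can find is String concatenation, hence "required: String". Either reduce the raw Float values directly (no groupByKey), or keep the grouping and sum each group. A plain-Scala sketch of both fixes (sample data made up; in Spark the equivalent calls would be `daily_revenue_per_product.reduceByKey((total, element) => total + element)` or `calcul_daily_revenue_per_product.mapValues(_.sum)`):

```scala
// ((order_date, product_id), subtotal) records, shaped like daily_revenue_per_product
val subtotals = List(
  (("2013-07-25", 1004), 50.0f),
  (("2013-07-25", 1004), 100.0f),
  (("2013-07-25", 957), 299.98f))

// Fix 1 -- mimics reduceByKey on the raw (key, Float) pairs:
// combine the Float values per key without grouping first.
val viaReduceByKey = subtotals.groupBy(_._1).map {
  case (key, recs) => key -> recs.map(_._2).reduce(_ + _)
}

// Fix 2 -- mimics groupByKey followed by mapValues(_.sum):
// once grouped, each value is a collection of Floats, so sum it.
val grouped = subtotals.groupBy(_._1).map { case (key, recs) => key -> recs.map(_._2) }
val viaMapValues = grouped.map { case (key, vs) => key -> vs.sum }

println(viaReduceByKey(("2013-07-25", 1004)))  // 150.0
```

Both produce the same per-(date, product) totals; on a real RDD, reduceByKey is preferred because it combines values map-side instead of shuffling whole groups.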


Hello Santy,
sorry to hear about this; I am also concerned about these environment issues.
Can you please elaborate on the problems you faced? I am also worried about it.
@pramodvspk: hello Pramod, what other way is there to open a .py file if vi doesn't work?

thank you.


I faced the same issue in the exam. Can someone shed some light on this, please? I tried opening the .py file using vi and all I could see was a blank screen.


Hello Santy, did you reach out to Cloudera about this issue? I faced the same issue in the exam. Can you help if you have any pointers on this?


I did send an email to Cloudera from their website (Contact Us) but have not received a response yet. Not sure if it was due to the holiday season. I am planning to give them a call and check on it.


@Avinash_Parida, you can use different editors such as nano or vim, but vi should be able to open a file.


@Bigdata_Aspirant, I am sorry to hear that, but if it is Cloudera’s mistake they will rectify it and give you a second chance.


Thank you, sir. They should, but they did not. I tried talking to them. I think I should quit this battle and focus on the reattempt, and this time try to score 100. But thank you for the support.


I am following the tutorials from the itversity website and I don't think this particular video is there (or perhaps I missed it).
I will check YouTube, but as I have covered almost everything and am all set for the certification exam, and considering this video is on YouTube, is there anything else I am missing that is not on the itversity website?
@itversity: Durga sir, please confirm.
Thank you.


Above video is at the end of the playlist.


@itversity: Hello Durga Sir,
this is what I find at the end of the playlist; am I going in the wrong direction?
Please confirm.


@Avinash_Parida, you will find this video at the end of the YouTube playlist, not on the itversity website.


@dksrinivasa, @itversity: is there any difference in course content between itversity and the YouTube channel, apart from the last 4 videos, which show common issues faced?
Is there any difference in the tutorials provided?


Hey Avinash, I have no idea what a Unix script format is. Also, I have not opened the file.

I have not tried ./script.sh as I wasn't aware of it. I checked Durga's videos after the exam and they were awesome. But does the last video (where he explained running shell programs) work for Spark as well?


Thanks a lot gnanaprakasam!


Hi Mike, I am yet to check Durga's last video, but yes, shell scripts can invoke commands to connect to Spark and execute them. So I expect that if they asked you to run it as a shell script, they should have taken care of all the parameters required to run Spark from the shell.
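To make the idea concrete, a shell script can carry all the Spark parameters itself, so the grader only needs to run it. A minimal sketch, with hypothetical file names (`run_job.sh`, `saveFile.py`), guarded so it fails gracefully on a machine without Spark:

```shell
#!/bin/sh
# run_job.sh (hypothetical): wrapper that submits a PySpark script.
# All Spark parameters live inside the script, so the caller just runs ./run_job.sh.
SCRIPT="saveFile.py"          # assumed name of the PySpark file
MASTER="local"                # Spark master; local mode for this sketch
echo "launching: spark-submit --master $MASTER $SCRIPT"
# Only attempt the real submit when spark-submit is on the PATH,
# so the script gives a clear message off the cluster.
if command -v spark-submit >/dev/null 2>&1; then
  spark-submit --master "$MASTER" "$SCRIPT"
fi
```

The same pattern works for a spark-shell or scala job; only the command after the guard changes.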


@Avinash_Parida, if you want a script to invoke Spark commands, then follow Durga's code:

# Save this to a file with a .py extension (saveFile.py)
from pyspark import SparkContext, SparkConf
conf = SparkConf().setAppName("pyspark")
sc = SparkContext(conf=conf)
dataRDD = sc.textFile("/user/cloudera/sqoop_import/departments")
for line in dataRDD.collect():
    print(line)

# Run using this command (--master local runs Spark in local mode)
spark-submit --master local saveFile.py

Hope this helps.

Kind Regards,


@mike, was it a .sh file or a .py file?
What I was thinking was: if it's a .sh file, then you can run it directly using ./script.sh,
but if it's a .py file, then we have to run it the way @dksrinivasa mentioned.

@dksrinivasa: so the .py or .scala files won't have the sc already created within the script?
We have to add it in order to run them?
I was thinking that, as these were skeleton copies with some code already provided, they would have created it in the script.
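On the ./script.sh point above: running a script that way requires the execute bit; otherwise the shell refuses with "Permission denied". A quick runnable sketch (the file name `demo.sh` is made up for illustration):

```shell
# Create a tiny script to show the ./script.sh mechanics.
printf '#!/bin/sh\necho hello from script\n' > demo.sh

# Without this, ./demo.sh fails with "Permission denied".
chmod +x demo.sh

./demo.sh        # runs via the shebang line
sh demo.sh       # also works, even without the execute bit
```

So if ./script.sh fails in the exam environment, `chmod +x script.sh` or invoking it as `sh script.sh` are the two things to try first.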