Facing SQLException Error while using sqlContext in Spark

I was working on the “Hadoop Certification - CCA - Spark Introduction” video. I had created a soft link to hive-site.xml in the Spark conf directory. The only thing I did differently was that my Spark installation was in the /usr/lib directory. When I run the sqlContext command mentioned in the video at 13:24, I get the following error.

java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost/metastore

I am using the Udacity Cloudera Training VM.

What changes should I make in order to remove this error?
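For context, this error generally means that the MySQL JDBC driver (Connector/J) is not on Spark's classpath, so the metastore connection URL from hive-site.xml cannot be opened. A minimal sketch of passing the driver JAR when launching spark-shell, assuming the Cloudera VM ships the connector at /usr/share/java/mysql-connector-java.jar (that path is an assumption; locate the JAR on your own VM first):

```shell
# Put the MySQL JDBC driver on the driver classpath (needed for the
# metastore connection) and ship it to executors via --jars.
# NOTE: /usr/share/java/mysql-connector-java.jar is an assumed path;
# verify it with: find / -name "mysql-connector*.jar" 2>/dev/null
spark-shell \
  --driver-class-path /usr/share/java/mysql-connector-java.jar \
  --jars /usr/share/java/mysql-connector-java.jar
```

Alternatively, the same effect can be achieved persistently by setting `spark.driver.extraClassPath` in spark-defaults.conf so every session picks up the driver without extra flags.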


I am not sure; this issue is not easy to troubleshoot remotely. You should follow up with Udacity to resolve it.