I was working through the “Hadoop Certification - CCA - Spark Introduction” video. I had placed a copy of hive-site.xml in the Spark conf directory. The only thing I did differently is that my Spark configuration lives under /usr/lib. When I run the sqlContext command shown in the video at 13:24, I get the following error:
java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost/metastore
I am using the Udacity Cloudera Training VM.
What changes should I make to resolve this error?