java.sql.SQLException Error while using sqlContext in Spark

I was working through the “Hadoop Certification - CCA - Spark Introduction” video. I had created a soft link to hive-site.xml in the Spark conf directory. The only thing I did differently is that my Spark configuration lives in the /usr/lib directory. When I run the sqlContext command mentioned in the video at 13:24, I get the following error.

java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost/metastore

I am using the Udacity Cloudera Training VM.

What changes should I make in order to remove this error?

Thanks
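A note on the error itself: “No suitable driver found” generally means the MySQL JDBC driver JAR is not on the classpath of the process that opens the metastore connection. Assuming the connector sits at the usual Cloudera location (an assumed path; adjust it to wherever mysql-connector-java is installed on the VM), one way to supply it is to start the shell like this:

spark-shell --driver-class-path /usr/share/java/mysql-connector-java.jar --jars /usr/share/java/mysql-connector-java.jar

Here --driver-class-path puts the JAR on the driver's classpath, which is where the metastore connection is made, and --jars ships it to the executors as well.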

Have you tried raising the issue with them?

No, I have not raised the issue yet.

As a workaround I downloaded Cloudera VM 5.8 and tried running it with VM Player, but I am unable to start it. There are no error logs; it simply does not power the VM on.

My system configuration is given below.

Is there anything which I am doing incorrectly?

When I open my Spark shell, it shows that a HiveContext is available, although I have now removed hive-site.xml from the Spark conf directory.

In the default database of Hive I have a countries table.

But when I run the select * command on that table, as shown in the video, it throws an error saying the countries table is not present in the database.

If I keep the soft link to hive-site.xml, then it shows the error I mentioned in the question.
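Worth noting: spark-shell creates a HiveContext whether or not hive-site.xml is present. Without that file, Spark falls back to a local embedded Derby metastore that starts out empty, so a query like the one from the video is expected to fail with “table not found”. A minimal sketch of that query, assuming the Spark 1.x shell where sqlContext is predefined:

sqlContext.sql("SELECT * FROM countries").show()

This can only see the real Hive table once hive-site.xml points the shell at the MySQL metastore and the MySQL JDBC driver is on the classpath.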

How much memory have you given to your VM?

The error is “table not found”. You have to create the table first, or prefix the table name with the database name.
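For example, if the table lives in Hive's default database, the fully qualified query would look like this (a sketch, assuming the Spark 1.x shell where sqlContext is predefined):

sqlContext.sql("SELECT * FROM default.countries").show()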

I have allocated 2 GB of memory to the VM.

But I have attached a screenshot showing that the Hive default database has the countries table.

Can it be the case that it is referring to a SQL database and not the Hive database?

I have tried prefixing the table name with the database name, but I still face the same issue.

Command: (screenshot)

Error: (screenshot)
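One way to check which metastore the shell is actually talking to is to list what it can see, for example (a diagnostic sketch, Spark 1.x shell assumed):

sqlContext.sql("SHOW DATABASES").show()
sqlContext.sql("SHOW TABLES IN default").show()

If countries does not appear in that output, the shell is reading a local, empty Derby metastore rather than the MySQL-backed one shown in the screenshot.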

I have given 2 GB of memory to the VM.