Cannot access MySQL retail_export or retail_import DBs from Spark but able to access retail_db

bigdatalabs
mysql

#1

Hi All,

I am trying to write a Spark SQL application for an incremental load from MySQL to Hive.
For this, I need write access to a MySQL table, so I created a table in retail_export.
But I am unable to access that table from Spark.

val df_mysql11 = spark.read.
  format("jdbc").
  option("url","jdbc:mysql://nn01.itversity.com:3306/retail_export").
  option("driver","com.mysql.jdbc.Driver").
  option("dbtable","customers_spark_rajeshs_mini").
  option("user","retail_user").
  option("password","itversity").
  load()

java.sql.SQLException: Access denied for user 'retail_user'@'gw02.itversity.com' (using password: YES)

Whereas I am able to access retail_db tables from Spark.

val df_mysql = spark.read.
  format("jdbc").
  option("url","jdbc:mysql://ms.itversity.com:3306/retail_db").
  option("driver","com.mysql.jdbc.Driver").
  option("dbtable","orders").
  option("user","retail_user").
  option("password","itversity").
  load()


df_mysql.show
+--------+-------------------+-----------------+---------------+
|order_id|         order_date|order_customer_id|   order_status|
+--------+-------------------+-----------------+---------------+
|       1|2013-07-25 00:00:00|            11599|         CLOSED|
|       2|2013-07-25 00:00:00|              256|PENDING_PAYMENT|
|       3|2013-07-25 00:00:00|            12111|       COMPLETE|
|       4|2013-07-25 00:00:00|             8827|         CLOSED|
|       5|2013-07-25 00:00:00|            11318|       COMPLETE|
|       6|2013-07-25 00:00:00|             7130|       COMPLETE|

But I cannot insert new records into these retail_db tables.

Please help me by letting me know a table that has write access and is also accessible from spark-shell / IntelliJ (through a Spark JDBC connection).
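For the write side of the incremental load, a Spark JDBC write would look like the sketch below. This is only a sketch: it assumes a host and a target table on which retail_user actually has INSERT privileges (the host and table name here are placeholders reusing the names from this post, not a confirmed working combination).

// Hypothetical append-mode write back to MySQL; mode("append") adds rows
// without dropping or truncating the target table.
df_mysql.write.
  format("jdbc").
  option("url","jdbc:mysql://ms.itversity.com:3306/retail_export").
  option("driver","com.mysql.jdbc.Driver").
  option("dbtable","customers_spark_rajeshs_mini").
  option("user","retail_user").
  option("password","itversity").
  mode("append").
  save()

This only succeeds if the MySQL account has been granted write privileges on the target database, which is exactly the question this thread is asking about.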



#2

@BaLu_SaI @itversity1 @Itversity_Training @dgadiraju


#3

@rajeshs903
Please use the MySQL details as mentioned in the below URL


#4

Hi Vinod,
As you can see from the original post, I just want to know which table has write access and can be accessed from spark-shell.
I know retail_db can be accessed from spark-shell, but we cannot add any new records to the retail_db tables,
and databases like retail_import and retail_export have write access but cannot be accessed from spark-shell.

My question is: how do I access retail_import or retail_export from spark-shell?
Can someone try and provide the connection string? The details of what I have already tried are in my original post.


#5

@vinodnerella can you please tell me which database has write access and can be accessed from spark-shell? @itversity1 @Itversity_Training



#7

@rajeshs903, you need to change nn01.itversity.com to ms.itversity.com. Also, your question mentions a Hive-related issue, but I do not understand how this is related to Hive.
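Applying that change to the read from post #1 gives the sketch below. It assumes retail_export is served from the same ms.itversity.com instance that retail_db was successfully read from, and that retail_user has privileges on it; neither is confirmed in this thread.

// Same options as the failing read in post #1, but pointing at
// ms.itversity.com, the host that worked for retail_db.
val df_export = spark.read.
  format("jdbc").
  option("url","jdbc:mysql://ms.itversity.com:3306/retail_export").
  option("driver","com.mysql.jdbc.Driver").
  option("dbtable","customers_spark_rajeshs_mini").
  option("user","retail_user").
  option("password","itversity").
  load()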