I have two questions below:
Q1) How do I write a record into a Hive table using PySpark?
spark.sql("insert into retail.orders_avro values (9199, '2020-05-15', 90909, 'OPEN', '2020-05')")
This code gives me the following error:
Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
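The error comes from Hive's strict dynamic-partition mode: an INSERT into a partitioned table must either name at least one partition statically or run with strict mode disabled. Below is a minimal sketch of both fixes as plain SQL strings; note the original snippet also used curly "smart" quotes, which would themselves fail to parse, so straight ASCII quotes are used here. The partition column name `order_month` is an assumption; check the table's actual DDL.

```python
# Fix 1: relax strict mode, then keep the original INSERT so the trailing
# value ('2020-05') is routed to a dynamic partition.
set_nonstrict = "SET hive.exec.dynamic.partition.mode=nonstrict"
insert_dynamic = (
    "insert into retail.orders_avro "
    "values (9199, '2020-05-15', 90909, 'OPEN', '2020-05')"
)

# Fix 2: stay in strict mode and name the partition statically
# (order_month is a hypothetical partition column name).
insert_static = (
    "insert into retail.orders_avro partition (order_month='2020-05') "
    "values (9199, '2020-05-15', 90909, 'OPEN')"
)

for stmt in (set_nonstrict, insert_dynamic, insert_static):
    print(stmt)
```

On a cluster, run each statement through `spark.sql(...)` on a SparkSession built with `.enableHiveSupport()`.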
Q2) You have been given the following MySQL database details, as well as other info:
user=retail_dba
password=cloudera
database=retail_db
JDBC URL = jdbc:mysql://quickstart:3306/retail_db
Please accomplish the following activities.
1. In the MySQL departments table, insert the following record: insert into departments values (9999, 'Data Science');
Note: This question is from the internet.
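The original statement in step 1 fails because of mismatched quotes around 'Data Science'. A minimal sketch of the corrected insert is shown below using Python's built-in sqlite3 as an in-memory stand-in, since no live MySQL server is assumed; against the real retail_db you would run the same SQL through a MySQL client (or `sqoop eval --query`) with the given credentials.

```python
import sqlite3

# In-memory stand-in for the MySQL departments table
# (column names follow the usual retail_db schema; verify against your DB).
conn = sqlite3.connect(":memory:")
conn.execute(
    "create table departments (department_id integer, department_name text)"
)

# Parameterized insert sidesteps the quoting mistake in the original statement.
conn.execute("insert into departments values (?, ?)", (9999, "Data Science"))
conn.commit()

row = conn.execute(
    "select department_name from departments where department_id = 9999"
).fetchone()
print(row[0])  # Data Science
```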