Facing an issue while connecting to SQL Server Express via PySpark code

Hi,
I am trying to connect to SQL Server Express to fetch data from a PySpark application, but I am facing some issues.

Below is my spark-submit shell script:

#!/bin/bash
export SPARK_MAJOR_VERSION=2

spark-submit --files /etc/spark2/conf/hive-site.xml --driver-class-path /home/vikct001/user/vikrant/myjars/sqljdbc4.jar --master yarn --deploy-mode client /home/vikct001/user/vikrant/spark/read_sql_etl.py
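One thing worth ruling out: `--driver-class-path` only puts the JDBC jar on the driver JVM's classpath, while the executors also need the driver class when they open JDBC connections. A hedged variant of the same submit command (same paths as above, not a verified fix) ships the jar to the executors with `--jars` as well:

```shell
#!/bin/bash
export SPARK_MAJOR_VERSION=2

# Sketch: --jars distributes the JDBC driver to the executors;
# --driver-class-path alone only covers the driver JVM.
spark-submit \
  --files /etc/spark2/conf/hive-site.xml \
  --jars /home/vikct001/user/vikrant/myjars/sqljdbc4.jar \
  --driver-class-path /home/vikct001/user/vikrant/myjars/sqljdbc4.jar \
  --master yarn \
  --deploy-mode client \
  /home/vikct001/user/vikrant/spark/read_sql_etl.py
```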

and below is the PySpark application code:
spark = SparkSession \
    .builder \
    .appName("Spark SQL Parallel load example") \
    .config("spark.jars", "/home/vikct001/user/vikrant/myjars/sqljdbc4.jar") \
    .config("spark.dynamicAllocation.enabled", "true") \
    .config("spark.shuffle.service.enabled", "true") \
    .config("hive.exec.dynamic.partition", "true") \
    .config("hive.exec.dynamic.partition.mode", "nonstrict") \
    .config("spark.sql.shuffle.partitions", "50") \
    .config("hive.metastore.uris", "thrift://rm01.itversity.com:9083") \
    .enableHiveSupport() \
    .getOrCreate()

query1 = "(select * from [csv_db].[dbo].[OrderDetailsTable]) as x"

df = spark.read.format("jdbc").options(
    url="jdbc:sqlserver://LAPTOP-IO0IES16\\SQLEXPRESS:1433;database=csv_db;user=vikct001;password=pass001",
    dbtable=query1,
    partitionColumn="Country",
    lowerBound=1,
    upperBound=5,
    numPartitions=1).load()

df.show(5)
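One detail to check in the URL above: `\SQLEXPRESS` is a named instance, and named instances usually listen on a dynamically assigned port (resolved via the SQL Browser service on UDP 1434), not necessarily 1433. The Microsoft JDBC driver supports an `instanceName` property for exactly this case. A minimal sketch of building the URL that way (the helper function is hypothetical, invented here for illustration; `instanceName` and `databaseName` are real driver properties):

```python
# Hypothetical helper: build a SQL Server JDBC URL, using instanceName for a
# named instance instead of a hard-coded port. With instanceName the SQL
# Browser service resolves the actual port at connect time.
def build_sqlserver_url(host, database, instance=None, port=None):
    url = f"jdbc:sqlserver://{host}"
    if port is not None:
        url += f":{port}"
    props = [f"databaseName={database}"]
    if instance is not None:
        props.insert(0, f"instanceName={instance}")
    return url + ";" + ";".join(props)

url = build_sqlserver_url("LAPTOP-IO0IES16", "csv_db", instance="SQLEXPRESS")
# -> "jdbc:sqlserver://LAPTOP-IO0IES16;instanceName=SQLEXPRESS;databaseName=csv_db"
```

Credentials can then be passed as separate `user`/`password` options rather than embedded in the URL.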

I am using the same server name, user id, and password to connect to SQL Server from plain Python, and it connects successfully, but I am facing this issue only when connecting via Spark.
I am getting below error message.

py4j.protocol.Py4JJavaError: An error occurred while calling o99.load.
: com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host LAPTOP-IO0IES16, port 1433 has failed. Error: "null. Verify the connection properties, check that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port, and that no firewall is blocking TCP connections to the port.".
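This is a raw TCP failure, not a Spark problem as such, so before debugging Spark it may help to verify that the port is actually reachable from the machine where the code runs (on a YARN cluster that means the cluster nodes, not your laptop). A minimal reachability check, with the hostname and port as placeholders:

```python
import socket

def port_is_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run this from the node where the Spark driver/executors run, e.g.:
# port_is_open("LAPTOP-IO0IES16", 1433)
```

Also note that SQL Server Express disables the TCP/IP protocol by default; it has to be enabled in SQL Server Configuration Manager (and the firewall opened) before any remote TCP client can connect.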

And if I run the same script in cluster mode, I get a different error message:

SPARK_MAJOR_VERSION is set to 2, using Spark2
19/07/29 15:33:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.spark.SparkException: Application application_1563337199692_8316 finished with failed status

Any help would be highly appreciated. Thanks.

Any clue or suggestion?

@Vikrant_Singh_Rana You can't use local MySQL credentials in the labs.

Refer to the link below for connecting to MySQL using PySpark.

Thanks for your kind help.
I am facing a different issue, not this one.
Also, why can't we connect to our local SQL Server from the labs?

Can we see this location using PuTTY?
/usr/share/java/mysql-connector-java.jar