How to read a CSV file with a comma (,) inside a field using PySpark?

pyspark

#1

How do I split CSV data when a field value itself contains a comma (,)?

Sample Input Data:

t = 653051,300.0,300.0,Fruits & Vegetables,Food,"To buy seasonal, fresh fruits to sell. "

Required Output:
t[0] = 653051
t[1] = 300.0
t[2] = 300.0
t[3] = "Fruits & Vegetables"
t[4] = "Food"
t[5] = "To buy seasonal, fresh fruits to sell."

Please help
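
(For reference, the target split can be checked with plain Python's csv module, which applies the same double-quote rule; this is only a local sanity check of the expected output, not the PySpark solution I'm asking for:)

import csv
from io import StringIO

line = '653051,300.0,300.0,Fruits & Vegetables,Food,"To buy seasonal, fresh fruits to sell. "'
t = next(csv.reader(StringIO(line)))
# t[5] -> 'To buy seasonal, fresh fruits to sell. '  (the embedded comma stays in one field)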


#2

df = spark.read.csv('file:///mypath…/myFile.csv', sep=',', header=True)
df.show()
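
This works because Spark's CSV reader already treats double-quoted fields as a single value, so the comma inside "To buy seasonal, fresh fruits to sell. " stays in one column. A slightly more explicit sketch (the path and appName below are placeholders, and quote='"' just spells out the default):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-quoted-fields").getOrCreate()

# Double-quoted fields are kept whole, so the embedded comma is not a delimiter.
df = spark.read.csv(
    "file:///mypath/myFile.csv",  # placeholder path
    sep=",",
    header=True,
    quote='"',  # default quote character, written out for clarity
)
df.show(truncate=False)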