Spark JSON load issue

Command used:

data = sc.textFile("people.json").map(lambda x: json.loads(x))

Input file:

{"name":"Andy", "age":30}
{"name":"Justin", "age":19}

Error:

Caused by: org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/usr/hdp/", line 111, in main
File "/usr/hdp/", line 106, in process
serializer.dump_stream(func(split_index, iterator), outfile)
File "/usr/hdp/", line 263, in dump_stream
vs = list(itertools.islice(iterator, batch))
File "", line 1, in
File "/usr/lib64/python2.7/json/", line 338, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python2.7/json/", line 365, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python2.7/json/", line 383, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
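A likely trigger for this `ValueError`, assuming the file contains stray blank lines (e.g. a trailing newline) or non-ASCII "curly" quote characters: Python 2's `json.loads` raises exactly this error for any line that is not a complete JSON value. A minimal local reproduction, with a per-line workaround for the RDD approach:

```python
import json

lines = [
    '{"name":"Andy", "age":30}',
    '{"name":"Justin", "age":19}',
    '',  # a blank line, e.g. from a trailing newline in the file
]

# json.loads raises ValueError on the empty line; on Python 2 the
# message is "No JSON object could be decoded", as in the traceback.
try:
    json.loads('')
except ValueError:
    pass

# Workaround for the map(json.loads) approach: drop blank lines first.
records = [json.loads(line) for line in lines if line.strip()]
```

In Spark the same filter would be `rdd.filter(lambda x: x.strip()).map(json.loads)`.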

Use the SQLContext for this: its JSON reader parses each line of the file as a JSON record directly, so you don't need to map json.loads over raw lines yourself.

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val df = sqlContext.read.json("examples/src/main/resources/people.json")
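Since the question itself is PySpark, a sketch of the equivalent Python call (assuming Spark 1.4+, where `sqlContext.read.json` is available; on earlier versions `sqlContext.jsonFile` plays the same role):

```python
# PySpark equivalent of the Scala answer above. Assumes an existing
# SparkContext `sc`; read.json handles the JSON-lines parsing itself.
from pyspark.sql import SQLContext

sqlContext = SQLContext(sc)
df = sqlContext.read.json("examples/src/main/resources/people.json")
df.show()
```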