Spark submit job with Apache Livy



Hi, does anyone know how to submit a Spark job using the Apache Livy interface?
If you have any knowledge about Apache Livy, please share it with me. Thank you.



I have configured Hue & Jupyter notebooks with Spark using the Livy interface and triggered jobs that way, not through the Livy REST API directly. If you need any assistance, please refer to the blog link below:


Thanks for responding, Ravi. I know the above link, but I also need the configuration and code: how do I do a Spark submit job in real time? If you have any screenshots, please send them to me. Thank you.



In real time we don't need Livy for submitting Spark jobs; we have `spark-submit` and Oozie schedulers.
Livy is for the REST API, Hue, and Jupyter notebooks. I can share details of the Hue & Jupyter setup, as I've already done it, but for that you need administration rights on the cluster. Just let me know.


Install the Livy server and configure it so that it points to the cluster.
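A minimal `livy.conf` sketch for pointing Livy at a cluster; the master and deploy mode values here are illustrative assumptions, not taken from the thread:

```
# livy.conf (illustrative values)
livy.server.port = 8998          # Livy's default REST port
livy.spark.master = yarn         # or spark://host:7077 for a standalone master
livy.spark.deploy-mode = cluster # or "client"
```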

Check out this example:

  1. Form a JSON structure with the required job parameters:

     {
       "className": "org.apache.spark.examples.SparkPi",
       "executorMemory": "20g",
       "args": [2000],
       "file": "/path/to/examples.jar"
     }

  2. Specify the master and deploy mode in the livy.conf file.
  3. To submit the SparkPi application to the Livy server, use a POST /batches request.
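The steps above can be sketched with Python's standard library. The Livy URL (port 8998 is Livy's default) and the jar path are placeholders for illustration:

```python
import json
import urllib.request

def build_batch_payload():
    # Step 1: the batch job parameters (jar path is a placeholder).
    return {
        "className": "org.apache.spark.examples.SparkPi",
        "executorMemory": "20g",
        "args": [2000],
        "file": "/path/to/examples.jar",
    }

def submit_batch(payload, livy_url="http://localhost:8998"):
    # Step 3: POST the payload to Livy's /batches endpoint.
    req = urllib.request.Request(
        url=livy_url + "/batches",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # Livy responds with the batch metadata, including its id and state.
        return json.loads(resp.read().decode("utf-8"))
```

Calling `submit_batch(build_batch_payload())` returns the created batch's metadata; you can then poll `GET /batches/<id>` on the same server to track the job's state.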