Including Kafka dependencies when starting spark-shell

Could somebody help me understand how to include Kafka dependencies such as spark-streaming-kafka_2.10 (assuming the jars are already copied to a location on the gw01 node)?

I want to be able to test a Kafka consumer with Spark Streaming from the Spark shell.

So how do I make the Kafka consumer API available when launching spark-shell?

I tried passing jars=spark-streaming-kafka_2.10.jar when invoking spark-shell, but I can't make calls to the Kafka consumer API.


You need to give the complete path to each jar.

Use this: --jars "/usr/hdp/,/usr/hdp/,/usr/hdp/,/usr/hdp/"
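
For example, on an HDP gateway node the jars usually sit under /usr/hdp/current/. The paths and versions below are placeholders, so adjust them to whatever is actually present on gw01:

spark-shell --jars "/usr/hdp/current/spark-client/lib/spark-streaming-kafka_2.10-1.6.1.jar,/usr/hdp/current/kafka-broker/libs/kafka_2.10-0.8.2.1.jar,/usr/hdp/current/kafka-broker/libs/kafka-clients-0.8.2.1.jar"

Note that spark-streaming-kafka_2.10 does not bundle the Kafka client classes, so the Kafka jars it depends on have to be passed along with it.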


It doesn't work. I get this error:

scala> import org.apache.kafka.clients.consumer.ConsumerRecord
<console>:25: error: object kafka is not a member of package org.apache
import org.apache.kafka.clients.consumer.ConsumerRecord
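
That error means no Kafka classes reached the shell's classpath at all: org.apache.kafka.clients.consumer.ConsumerRecord lives in the kafka-clients jar, which spark-streaming-kafka_2.10 depends on but does not contain (and that import really belongs to the newer 0-10 connector shipped with Spark 2.x; the 0.8-style spark-streaming-kafka_2.10 API uses decoders instead). A simpler route than hunting down jar paths is --packages, which pulls the connector plus its transitive dependencies from Maven; the version below is a placeholder and should match your Spark version:

spark-shell --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.1

Once the shell starts, a minimal direct-stream smoke test against the 0.8 connector looks like this (broker address and topic name are placeholders):

scala> import org.apache.spark.streaming.{Seconds, StreamingContext}
scala> import org.apache.spark.streaming.kafka.KafkaUtils
scala> import kafka.serializer.StringDecoder

scala> val ssc = new StreamingContext(sc, Seconds(5))  // batch every 5 seconds
scala> val kafkaParams = Map("metadata.broker.list" -> "broker1:6667")
scala> val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, Set("test_topic"))
scala> stream.map(_._2).print()  // print message values only
scala> ssc.start()

If those imports resolve, the Kafka jars are on the classpath.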