Including Kafka dependencies when starting the Spark shell

Could somebody help me understand how to include Kafka dependencies such as spark-streaming-kafka_2.10 (assuming the jars are already copied to a location on the gw01 node)?

I want to be able to test a Kafka consumer using Spark Streaming from the Spark shell.

So how do I make the Kafka consumer API available when invoking the Spark shell?

I tried passing --jars=spark-streaming-kafka_2.10.jar when invoking spark-shell, but I can't make calls to the Kafka consumer API.

Thanks.

You need to give the complete paths to the jars.

Use this when invoking spark-shell:

spark-shell --jars "/usr/hdp/2.5.0.0-1245/spark/lib/spark-streaming_2.10-1.6.2.jar,/usr/hdp/2.5.0.0-1245/kafka/libs/spark-streaming-kafka_2.10-1.6.2.jar,/usr/hdp/2.5.0.0-1245/kafka/libs/kafka_2.10-0.8.2.1.jar,/usr/hdp/2.5.0.0-1245/kafka/libs/metrics-core-2.2.0.jar"
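Once the shell is up, a minimal sketch like the one below should let you test a consumer through the Kafka 0.8 direct-stream API that these jars provide. The broker address and topic name are placeholders (it assumes a broker on the HDP default port 6667 and a topic called test); substitute your own values:

import kafka.serializer.StringDecoder
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// spark-shell already provides sc; build a streaming context with 5-second batches
val ssc = new StreamingContext(sc, Seconds(5))

// "sandbox.hortonworks.com:6667" and "test" are placeholders for your broker and topic
val kafkaParams = Map("metadata.broker.list" -> "sandbox.hortonworks.com:6667")
val topics = Set("test")

// Direct stream using the Kafka 0.8 consumer API bundled with spark-streaming-kafka_2.10
val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)

// Print the message values of each batch
stream.map(_._2).print()

ssc.start()
ssc.awaitTerminationOrTimeout(60000)  // run for a minute, then return control to the shell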

Hi,

It doesn't work. I get this error:

scala> import org.apache.kafka.clients.consumer.ConsumerRecord
<console>:25: error: object kafka is not a member of package org.apache
import org.apache.kafka.clients.consumer.ConsumerRecord
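If I'm reading the jar list correctly, that import is the likely problem: org.apache.kafka.clients.consumer.ConsumerRecord belongs to the newer Kafka consumer API (the kafka-clients jar used by the spark-streaming-kafka-0-10 connector), which isn't in the 0.8 jars listed above (spark-streaming-kafka_2.10-1.6.2 with kafka_2.10-0.8.2.1). With those jars on the classpath, the imports to use are the ones from the sketch above, for example:

import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils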