Spark foreach failing to print the RDD read from HBase table



Hi All,

In the snippet below, I'm trying to read an HBase table from a Spark application.

val conf = HBaseConfiguration.create()
// The table to read must be set on the configuration, e.g.:
// conf.set(TableInputFormat.INPUT_TABLE, "users")

val usersRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
  classOf[ImmutableBytesWritable], classOf[Result])

usersRDD.foreach { case (_, result) =>
  val key  = Bytes.toString(result.getRow)
  val name = Bytes.toString(result.getValue("basic".getBytes, "name".getBytes))
  val age  = Bytes.toString(result.getValue("basic".getBytes, "age".getBytes))
  println("Row Key : " + key + " Name : " + name + " Age : " + age)
}

In the above snippet I'm able to fetch the table data, but the foreach fails to iterate through it and print the rows.
I also checked the count of usersRDD using the count function, and it returns the correct number of rows, which confirms the table data is being fetched.
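For what it's worth, a common reason the rows never appear on screen: when the job is submitted to a cluster, println inside foreach runs on the executors, so its output goes to the executor logs rather than the driver console. A minimal sketch of printing on the driver instead, assuming the same usersRDD and the column family/qualifier names from the snippet above:

```scala
// Bring a small sample of rows back to the driver and print there.
// take(n) avoids pulling the entire table into driver memory.
usersRDD.take(10).foreach { case (_, result) =>
  val key  = Bytes.toString(result.getRow)
  val name = Bytes.toString(result.getValue("basic".getBytes, "name".getBytes))
  val age  = Bytes.toString(result.getValue("basic".getBytes, "age".getBytes))
  println(s"Row Key : $key Name : $name Age : $age")
}
```

If you genuinely need every row, collect() works the same way, but only for tables small enough to fit in driver memory.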

Could someone please tell me where I'm making a mistake?


Firstly, when I try to do this:
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
it gives the error: object mapreduce is not a member of package org.apache.hadoop.hbase
So I am unable to use TableInputFormat in the first place. Can you help me with this? Thank you.


Please find below the libraries I added to my project.
Adding hbase-server should solve your problem, because TableInputFormat is part of the hbase-server artifact.

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.1.0",
  "org.apache.hbase" % "hbase-client" % "1.1.2",
  "org.apache.hbase" % "hbase-common" % "1.1.2",
  "org.apache.hbase" % "hbase-server" % "1.1.2"
)


I am still facing this issue.

I am using these Maven dependencies:

In my code:


my spark submit script:

spark-submit --verbose --master yarn \
  --jars /usr/hdp/, \
  --class sparkhbasexample hbase-example-1.0.0.jar
Please help me resolve this.
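For comparison, a typical spark-submit invocation passes the HBase jars as a comma-separated list so they reach both the driver and executor classpaths. The jar paths below are placeholders for illustration only, not your actual installation paths:

```shell
# All jar paths are placeholders -- substitute the jars from your HBase installation.
spark-submit --verbose \
  --master yarn \
  --jars /path/to/hbase-client-1.1.2.jar,/path/to/hbase-common-1.1.2.jar,/path/to/hbase-server-1.1.2.jar \
  --class sparkhbasexample \
  hbase-example-1.0.0.jar
```

Note that --jars takes a comma-separated list with no spaces; a trailing comma (as in your script) will cause an empty entry.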


Can you please post your complete code snippet?