Object databricks is not a member of package com

#1

I got this error. May I know what is wrong here?

scala> import com.databricks.spark.avro._

<console>:25: error: object databricks is not a member of package com
       import com.databricks.spark.avro._

#2

The databricks package is not present.
Follow the steps below:

1. If you are using spark-shell, the package will be downloaded for you at launch. Start the shell like this:

spark-shell --master yarn --packages com.databricks:spark-avro_2.10:2.0.1

2. If you are using sbt and writing a standalone program, add the dependency to your build like this (see the minimal build.sbt sketch after this list):

libraryDependencies += "com.databricks" % "spark-avro_2.10" % "2.0.1"
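For reference, a minimal build.sbt sketch; the scalaVersion and Spark version here are assumptions, so match them to your cluster and keep the _2.10 artifact suffix in sync with scalaVersion:

scalaVersion := "2.10.6"

// Spark itself is supplied by the cluster at runtime, hence "provided"
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.6.2" % "provided"
libraryDependencies += "com.databricks" % "spark-avro_2.10" % "2.0.1"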

#3

If we want to add more than one third-party dependency, shall we add them with a comma separator like below?

spark-shell --master yarn --packages com.databricks:spark-avro_2.10:2.0.1 , com.xyz:abc_2.10:2.2 , com.user:use_2.11

@anirvan_sen shall we do it like the above?

Please forgive me if the question is very basic.

Thanks
Esakki

#4

@esak Yes, you can do so, but make sure you do not have any whitespace between the packages; use commas only.

I once faced this issue myself while importing packages.
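For example, the command from the question above would look like this (the coordinates other than spark-avro are the asker's hypothetical ones, and note that every package needs a complete group:artifact:version triple):

spark-shell --master yarn --packages com.databricks:spark-avro_2.10:2.0.1,com.xyz:abc_2.10:2.2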

#5

Thanks for your reply, Anirvan. It is still throwing the error.

spark-shell --master yarn --packages com.databricks:spark-avro_2.10:2.0.1
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Ivy Default Cache set to: /home/rajeshwaranb/.ivy2/cache
The jars for the packages stored in: /home/rajeshwaranb/.ivy2/jars
:: loading settings :: url = jar:file:/usr/hdp/2.5.0.0-1245/spark/lib/spark-assembly-1.6.2.2.5.0.0-1245-hadoop2.7.3.2.5.0.0-1245.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.databricks#spark-avro_2.10 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.databricks#spark-avro_2.10;2.0.1 in central
found org.apache.avro#avro;1.7.6 in central
found org.codehaus.jackson#jackson-core-asl;1.9.13 in central
found org.codehaus.jackson#jackson-mapper-asl;1.9.13 in central
found com.thoughtworks.paranamer#paranamer;2.3 in central
found org.xerial.snappy#snappy-java;1.0.5 in central
found org.apache.commons#commons-compress;1.4.1 in central
found org.tukaani#xz;1.0 in central
found org.slf4j#slf4j-api;1.6.4 in central
downloading https://repo1.maven.org/maven2/com/databricks/spark-avro_2.10/2.0.1/spark-avro_2.10-2.0.1.jar
[SUCCESSFUL ] com.databricks#spark-avro_2.10;2.0.1!spark-avro_2.10.jar (48ms)
downloading https://repo1.maven.org/maven2/org/apache/avro/avro/1.7.6/avro-1.7.6.jar
[SUCCESSFUL ] org.apache.avro#avro;1.7.6!avro.jar(bundle) (118ms)
downloading https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar …
downloading https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
[SUCCESSFUL ] org.codehaus.jackson#jackson-mapper-asl;1.9.13!jackson-mapper-asl.jar (133ms)
downloading https://repo1.maven.org/maven2/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
[SUCCESSFUL ] com.thoughtworks.paranamer#paranamer;2.3!paranamer.jar (21ms)
downloading https://repo1.maven.org/maven2/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
[SUCCESSFUL ] org.xerial.snappy#snappy-java;1.0.5!snappy-java.jar(bundle) (149ms)
downloading https://repo1.maven.org/maven2/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
[SUCCESSFUL ] org.apache.commons#commons-compress;1.4.1!commons-compress.jar (39ms)
downloading https://repo1.maven.org/maven2/org/slf4j/slf4j-api/1.6.4/slf4j-api-1.6.4.jar
[SUCCESSFUL ] org.slf4j#slf4j-api;1.6.4!slf4j-api.jar (21ms)
downloading https://repo1.maven.org/maven2/org/tukaani/xz/1.0/xz-1.0.jar
[SUCCESSFUL ] org.tukaani#xz;1.0!xz.jar (25ms)
:: resolution report :: resolve 3689ms :: artifacts dl 622ms
:: modules in use:
com.databricks#spark-avro_2.10;2.0.1 from central in [default]
com.thoughtworks.paranamer#paranamer;2.3 from central in [default]
org.apache.avro#avro;1.7.6 from central in [default]
org.apache.commons#commons-compress;1.4.1 from central in [default]
org.codehaus.jackson#jackson-core-asl;1.9.13 from central in [default]
org.codehaus.jackson#jackson-mapper-asl;1.9.13 from central in [default]
org.slf4j#slf4j-api;1.6.4 from central in [default]
org.tukaani#xz;1.0 from central in [default]
org.xerial.snappy#snappy-java;1.0.5 from central in [default]
---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      default     |   9   |   9   |   9   |   0   ||   9   |   9   |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
9 artifacts copied, 0 already retrieved (3100kB/12ms)
Welcome to

ERROR below:

scala> val dataFile = sqlContext.read.avro("/user/rajeshwaranb/problem5/avro")
<console>:25: error: value avro is not a member of org.apache.spark.sql.DataFrameReader
       val dataFile = sqlContext.read.avro("/user/rajeshwaranb/problem5/avro")

#6

You have just downloaded the files. Now you need to import the package like:

import com.databricks.avro._
val dataFile = sqlContext.read.avro("/user/rajeshwaranb/problem5/avro")

Now this will work! Check and confirm

#7

After downloading the files, I am still facing this issue when importing the package:

scala> import com.databricks.avro._
<console>:25: error: object avro is not a member of package com.databricks
       import com.databricks.avro._
                             ^

#8

I am not able to think of any other solution.

#9

@anirvan_sen @Rajarajeshwaran
You have missed spark in the import here; it should be:
import com.databricks.spark.avro._
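With that fixed, the full sequence from post #5 should work, assuming the shell was launched with the --packages option shown earlier:

scala> import com.databricks.spark.avro._
scala> val dataFile = sqlContext.read.avro("/user/rajeshwaranb/problem5/avro")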

#10

Use spark-shell with the following option:

spark-shell --master yarn --packages com.databricks:spark-avro_2.10:2.0.1

#11

The Databricks spark-avro version should be compatible with the Scala version of your Spark build.
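You can check the Scala version from inside spark-shell (it is also printed in the startup banner) and pick the matching artifact suffix:

scala> util.Properties.versionString   // prints the Scala version, e.g. "version 2.10.5"

If it reports 2.10.x, use spark-avro_2.10; for 2.11.x, use spark-avro_2.11.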

#12

@rohith_nallam how do I check whether the installed databricks package version is compatible with the Scala version? And how do I find which databricks version is installed on the cluster provided in the exam?

#13

In the exam you don’t need to worry about versions. The packages are installed correctly. The only thing you need to know is how to import them while using avro.

#14

Can you please tell me how to use the avro package in spark2-shell… I am getting an error.
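This is not confirmed in the thread, but as a sketch for Spark 2.x (which ships with Scala 2.11; treat the exact spark-avro version as an assumption — 4.0.0 is the Databricks release line targeting Spark 2.x):

spark2-shell --master yarn --packages com.databricks:spark-avro_2.11:4.0.0

scala> import com.databricks.spark.avro._
scala> val df = spark.read.avro("/user/rajeshwaranb/problem5/avro")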
