Error writing Avro data as Parquet gzip


I am trying to save an Avro file as a gzip-compressed Parquet file using the code below (the load call was cut off when posting; it reads the Avro directory via the spark-avro package):

val a = sqlContext.read.format("com.databricks.spark.avro").load("/user/ubuntuaws/problem5/avro")
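The write step the error refers to would then look something like the sketch below; the codec setting is taken from later in this thread, while the output path is illustrative:

```scala
// Request gzip compression for Parquet output
sqlContext.setConf("spark.sql.parquet.compression.codec", "gzip")

// Write the DataFrame loaded above as gzip-compressed Parquet
// (output path is illustrative)
a.write.parquet("/user/ubuntuaws/problem5/parquet-gzip")
```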

and this is the error I am getting:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

Please advise.

Even with this error, the files may still have been written to HDFS. Can you confirm whether the files exist?

It might be a temporary issue. I was able to write the compressed file using the same commands in my Cloudera VM.

scala> sqlContext.setConf("spark.sql.parquet.compression.codec", "gzip")

17/06/13 10:32:47 WARN hdfs.DFSClient: Caught exception
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(
at java.lang.Thread.join(
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(
at org.apache.hadoop.hdfs.DFSOutputStream$

Yes, the files are present.

Those warnings are related to logging; you can safely ignore them. If you want to get rid of them, download any one of slf4j-nop.jar, slf4j-simple.jar, slf4j-log4j12.jar, slf4j-jdk14.jar, or logback-classic.jar from the Maven repository and place it on the classpath, as described at the link given in the error message.
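For example, one way to put a binding on the classpath when launching spark-shell (the jar version and local path here are illustrative):

```shell
# Download one SLF4J binding jar from Maven Central
# (version is illustrative; pick a current one)
wget https://repo1.maven.org/maven2/org/slf4j/slf4j-simple/1.7.25/slf4j-simple-1.7.25.jar

# Add it to the classpath when launching spark-shell
spark-shell --jars slf4j-simple-1.7.25.jar
```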

Thanks Pranay and Pradeep for your quick help!

I got the same warning too, but if we try the option below we do not get any warnings:


Just a suggestion…

Thanks guys, much appreciated!