saveAsTable throws warning in hive

Hi,

I am trying to save a DataFrame into a Hive table. The table is saved, but a warning is thrown when I select the data from Hive. Is there any workaround for this? Please help.

sqlContext.sql("select * from vimaldoss18_hive.departments").saveAsTable("vimaldoss18_hive.departments_from_spark")

hive (vimaldoss18_hive)> select * from departments_from_spark;
OK
perator.java:427)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1762)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:236)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Jun 21, 2017 7:19:24 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 2 records.
Jun 21, 2017 7:19:24 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordReader: at row 0. reading next block
Jun 21, 2017 7:19:24 PM WARNING: org.apache.parquet.CorruptStatistics: Ignoring statistics because created_by could not be parsed (see PARQUET-251): parquet-mr version 1.6.0
org.apache.parquet.VersionParser$VersionParseException: Could not parse created_by: parquet-mr version 1.6.0 using format: (.+) version ((.*) )?\(build ?(.*)\)
at org.apache.parquet.VersionParser.parse(VersionParser.java:112)
at org.apache.parquet.CorruptStatistics.shouldIgnoreStatistics(CorruptStatistics.java:60)
at org.apache.parquet.format.converter.ParquetMetadataConverter.fromParquetStatistics(ParquetMetadataConverter.java:263)
at org.apache.parquet.hadoop.ParquetFileReader$Chunk.readAllPages(ParquetFileReader.java:583)
at org.apache.parquet.hadoop.ParquetFileReader.readNextRowGroup(ParquetFileReader.java:513)
at org.apache.parquet.hadoop.InternalParquetRecordReader.checkRead(InternalParquetRecordReader.java:130)
at org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue(InternalParquetRecordReader.java:214)
at org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue(ParquetRecordReader.java:227)

Can anyone please help me to have this question answered?

Hi, is your sqlContext a plain SQLContext or a HiveContext?

After running the select query, do you see the records printed? It is throwing a warning, not an error.
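For context: this particular message is the known PARQUET-251 issue. Files written by parquet-mr 1.6.0 (which Spark bundles here) carry a `created_by` string that newer readers cannot parse, so the reader discards the possibly-corrupt min/max statistics and logs a warning; the data itself is read correctly. If the noise bothers you, one option is to raise the `java.util.logging` threshold for the parquet loggers. A minimal sketch (the file name and path are my own choice, passed via `-Djava.util.logging.config.file`):

```properties
# parquet-logging.properties -- raise the level of the noisy parquet loggers
# so the CorruptStatistics warning is suppressed; other logging is unchanged.
org.apache.parquet.level = SEVERE
org.apache.parquet.CorruptStatistics.level = SEVERE
```

You would then start the Hive CLI with something like `HADOOP_CLIENT_OPTS="-Djava.util.logging.config.file=/path/to/parquet-logging.properties" hive` (exact mechanism depends on your distribution).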

@anirvan_sen Thanks for your reply! I have tried sqlContext pointing to both SQLContext and HiveContext. Both of them throw this.

@N_Chakote Thanks for your reply! Yes, the records are printed, but with the below error message.

hive (problem5_vimala)> select * from orders limit 10;
OK
Can't load log handler "java.util.logging.FileHandler"
java.io.FileNotFoundException: /tmp/parquet-0.log (Permission denied)
java.io.FileNotFoundException: /tmp/parquet-0.log (Permission denied)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
at java.util.logging.FileHandler.open(FileHandler.java:210)
at java.util.logging.FileHandler.rotate(FileHandler.java:661)
at java.util.logging.FileHandler.openFiles(FileHandler.java:538)
at java.util.logging.FileHandler.<init>(FileHandler.java:263)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.logging.LogManager$5.run(LogManager.java:966)
at java.security.AccessController.doPrivileged(Native Method)
at java.util.logging.LogManager.loadLoggerHandlers(LogManager.java:958)
at java.util.logging.LogManager.addLogger(LogManager.java:1165)
at java.util.logging.LogManager.demandLogger(LogManager.java:556)
at java.util.logging.Logger.demandLogger(Logger.java:455)
at java.util.logging.Logger.getLogger(Logger.java:502)
at org.apache.parquet.Log.<clinit>(Log.java:58)
at org.apache.parquet.hadoop.ParquetInputFormat.<clinit>(ParquetInputFormat.java:94)
at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.<init>(MapredParquetInputFormat.java:45)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hive.common.util.ReflectionUtil.newInstance(ReflectionUtil.java:83)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:220)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:369)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:303)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:458)
at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOp

While storing into the table, it also throws the below error message.

scala> ordersDF.saveAsTable("problem5_vimala.orders")

warning: there were 1 deprecation warning(s); re-run with -deprecation for details
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
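About the `/tmp/parquet-0.log (Permission denied)` part: the old parquet logging setup installs a `java.util.logging.FileHandler` that writes to `/tmp/parquet-<n>.log` (the trace above shows `LogManager.loadLoggerHandlers` failing to open it). On a shared gateway host, another user has usually created that file first, so your JVM cannot open it; the handler fails to load and parquet falls back to console logging, which is exactly the noise you see. It is cosmetic, but you can confirm the ownership from the shell (a diagnostic sketch; the path is taken from the trace above):

```shell
# Who owns the stale parquet log files in /tmp?
# If the owner is another user, this JVM cannot open or rotate them,
# which produces the "Permission denied" FileNotFoundException above.
ls -l /tmp/parquet-*.log 2>/dev/null || echo "no parquet logs in /tmp"
```

If the owner is someone else, there is nothing your account can do to that file; the warning can be ignored or silenced via logging configuration.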

Try this:

sqlContext.sql("select * from table").write.saveAsTable("database.table")

It won't print any warnings.
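To expand on that: `DataFrame.saveAsTable` is deprecated in Spark 1.6, and the non-deprecated path goes through `DataFrameWriter`. A sketch of the same save with an explicit save mode, using the table names from this thread (run in spark-shell; assumes sqlContext is a HiveContext):

```scala
// Read from the source Hive table
val df = sqlContext.sql("select * from vimaldoss18_hive.departments")

// DataFrameWriter replaces the deprecated DataFrame.saveAsTable;
// mode controls behaviour when the target table already exists
df.write
  .mode("overwrite")   // or "append" / "error"
  .saveAsTable("vimaldoss18_hive.departments_from_spark")
```

Note this removes only the deprecation warning on the write side; the PARQUET-251 statistics warning in the Hive CLI comes from the reader and will still appear unless the files are rewritten by a newer parquet-mr.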