Hive table with parquet file error

I have all the files in their respective paths, but I'm encountering the error below while creating the Hive table. The same approach works fine with Avro. Can you help?

hive> CREATE EXTERNAL TABLE county_par
LOCATION '/user/vvinodh6153/county_par'
TBLPROPERTIES ('avro.schema.url'='hdfs://ip-172-31-53-48.ec2.internal:8020/user/vvinodh6153/county_avsc/county_avro.avsc');
FAILED: SemanticException [Error 10043]: Either list of columns or a custom serializer should be specified

[vvinodh6153@ip-172-31-38-183 ~]$ hadoop fs -cat /user/vvinodh6153/county_avsc/county_avro.avsc
{
  "type" : "record",
  "name" : "county",
  "doc" : "Sqoop import of county",
  "fields" : [ {
    "name" : "county_id",
    "type" : [ "null", "int" ],
    "default" : null,
    "columnName" : "county_id",
    "sqlType" : "4"
  }, {
    "name" : "county_name",
    "type" : [ "null", "string" ],
    "default" : null,
    "columnName" : "county_name",
    "sqlType" : "12"
  } ],
  "tableName" : "county"
}
[vvinodh6153@ip-172-31-38-183 ~]$

[vvinodh6153@ip-172-31-38-183 ~]$ hadoop fs -ls /user/vvinodh6153/county_par
Found 3 items
drwxr-xr-x - vvinodh6153 vvinodh6153 0 2017-01-09 03:59 /user/vvinodh6153/county_par/.metadata
-rw-r--r-- 3 vvinodh6153 vvinodh6153 778 2017-01-09 04:00 /user/vvinodh6153/county_par/344ef8c1-9914-40e1-b9a5-734bbca54714.parquet
-rw-r--r-- 3 vvinodh6153 vvinodh6153 724 2017-01-09 04:00 /user/vvinodh6153/county_par/61ad8377-0785-4652-a130-6fc00a0f12ea.parquet
[vvinodh6153@ip-172-31-38-183 ~]$

Hi Vinodh,

It looks like you are trying to create a Parquet-format table in Hive while supplying an Avro schema in the table properties. Avro and Parquet are two different storage formats and cannot be mixed when creating a table: a table is STORED AS either AVRO or PARQUET, and an Avro schema (`avro.schema.url`) can only be used with a table stored as Avro.

The above CREATE TABLE statement would work fine if the underlying data were stored as Avro. Since your data files are Parquet, define the column names and data types explicitly in the DDL, remove the TBLPROPERTIES clause, and add STORED AS PARQUET.
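For example, a sketch of the Parquet version, with the two columns taken from your Avro schema (adjust names and types if your actual data differs):

hive> CREATE EXTERNAL TABLE county_par (
        county_id INT,
        county_name STRING)
      STORED AS PARQUET
      LOCATION '/user/vvinodh6153/county_par';

And for comparison, the Avro form of your original statement — note this only works if the files at LOCATION are actually Avro data files (the path `/user/vvinodh6153/county_avro` here is hypothetical):

hive> CREATE EXTERNAL TABLE county_avro
      STORED AS AVRO
      LOCATION '/user/vvinodh6153/county_avro'
      TBLPROPERTIES ('avro.schema.url'='hdfs://ip-172-31-53-48.ec2.internal:8020/user/vvinodh6153/county_avsc/county_avro.avsc');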



Thanks santy, got it to work.