I have all the files in their respective paths, but I am hitting the error below while creating the Hive table. The same approach works fine with Avro. Can you help?
hive> CREATE EXTERNAL TABLE county_par
STORED AS PARQUET
LOCATION '/user/vvinodh6153/county_par'
TBLPROPERTIES ('avro.schema.url'='hdfs://ip-172-31-53-48.ec2.internal:8020/user/vvinodh6153/county_avsc/county_avro.avsc');
FAILED: SemanticException [Error 10043]: Either list of columns or a custom serializer should be specified
hive>
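For reference, the Avro table that does work is created with a statement along these lines (a rough sketch from memory; the county_avro data directory name is an assumption, the schema URL is the same one as above):

CREATE EXTERNAL TABLE county_avro
STORED AS AVRO
LOCATION '/user/vvinodh6153/county_avro'
TBLPROPERTIES ('avro.schema.url'='hdfs://ip-172-31-53-48.ec2.internal:8020/user/vvinodh6153/county_avsc/county_avro.avsc');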
[vvinodh6153@ip-172-31-38-183 ~]$ hadoop fs -cat /user/vvinodh6153/county_avsc/county_avro.avsc
{
  "type" : "record",
  "name" : "county",
  "doc" : "Sqoop import of county",
  "fields" : [ {
    "name" : "county_id",
    "type" : [ "null", "int" ],
    "default" : null,
    "columnName" : "county_id",
    "sqlType" : "4"
  }, {
    "name" : "county_name",
    "type" : [ "null", "string" ],
    "default" : null,
    "columnName" : "county_name",
    "sqlType" : "12"
  } ],
  "tableName" : "county"
}
[vvinodh6153@ip-172-31-38-183 ~]$
[vvinodh6153@ip-172-31-38-183 ~]$ hadoop fs -ls /user/vvinodh6153/county_par
Found 3 items
drwxr-xr-x - vvinodh6153 vvinodh6153 0 2017-01-09 03:59 /user/vvinodh6153/county_par/.metadata
-rw-r--r-- 3 vvinodh6153 vvinodh6153 778 2017-01-09 04:00 /user/vvinodh6153/county_par/344ef8c1-9914-40e1-b9a5-734bbca54714.parquet
-rw-r--r-- 3 vvinodh6153 vvinodh6153 724 2017-01-09 04:00 /user/vvinodh6153/county_par/61ad8377-0785-4652-a130-6fc00a0f12ea.parquet
[vvinodh6153@ip-172-31-38-183 ~]$
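For what it's worth, the schema above maps to just two Hive columns, so a version with an explicit column list would presumably look like the untested sketch below. What I am trying to understand is why the avro.schema.url route works for the Avro table but not for the Parquet one.

CREATE EXTERNAL TABLE county_par (
  county_id   INT,
  county_name STRING
)
STORED AS PARQUET
LOCATION '/user/vvinodh6153/county_par';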