Hive - not able to create Hive table from large dataset (airlines)

Hi,

I am trying to create a Hive table from the airlines dataset. Given below are the steps I used:

val res = spark.read.parquet("/user/training/airlines_all/airlines-part")
res.write.saveAsTable("hari_n.airline")

I am getting the error below: "Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded"
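For reference, here is the same snippet in a self-contained form, in case the way the session is created matters. The SparkSession builder lines and the app name are only my assumption of the standalone-app equivalent; in spark-shell the spark object already exists with Hive support.

import org.apache.spark.sql.SparkSession

// In spark-shell, `spark` is already defined; the builder below is only the
// standalone-app equivalent (the app name is a placeholder).
val spark = SparkSession.builder()
  .appName("airlines-load")
  .enableHiveSupport()
  .getOrCreate()

// Read the full airlines parquet data and save it as a Hive table.
val res = spark.read.parquet("/user/training/airlines_all/airlines-part")
res.write.saveAsTable("hari_n.airline")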

I also created a Hive table partitioned on flight year, and I was able to create a table in Hive from a subset of the data:

res.where("year = 2008").write.saveAsTable("hari_n.airlines_2008")

Using the table created above, I was then able to load the partitioned airlines table for 2008.
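What I have in mind for the full dataset is something along these lines; partitionBy("year") is just my assumption of how to write all the years at once, based on the "year" column I filter on above:

// Read the full airlines dataset and write it as a Hive table
// partitioned by year, so each partition is roughly the size of
// the 2008 subset that did work.
val res = spark.read.parquet("/user/training/airlines_all/airlines-part")
res.write
  .mode("overwrite")
  .partitionBy("year")
  .saveAsTable("hari_n.airlines")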

Can you please help me understand the best way to load the complete file into the Hive table?