Sqoop hive-import issue in Hortonworks distribution

Hi,
I have the Hortonworks sandbox running in VirtualBox. I successfully created a table in Hive with the sqoop hive-import command, but while validating the data I found that in the HDFS files under the Hive warehouse directory (warehouse/school.tblstudent/part*) all the fields appear concatenated together, with no visible delimiter.
However, when I query the Hive table (select * from tblstudent), the output appears tab-delimited. This looks like an issue to me, because exporting the same data back into a MySQL table fails with an error about a mismatch in the number of columns.
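One thing I noticed while digging: Hive's default field delimiter is the non-printing Ctrl-A byte (\001), so if sqoop wrote the warehouse files with that delimiter, the fields would look concatenated to the naked eye even though they are actually delimited. This is just my guess at the cause; a quick plain-shell check (nothing Sqoop-specific) shows how a \001-delimited line behaves:

```shell
# A line with \001 between fields looks "concatenated" when printed,
# but the delimiter is there and tools can still split on it.
printf '1\001John\00122\n' > /tmp/part-demo
cat /tmp/part-demo                          # fields appear run together
cat -v /tmp/part-demo                       # ^A markers reveal the delimiter
awk -F'\001' '{print $2}' /tmp/part-demo    # second field splits out cleanly
```

Running `cat -v` on the actual part* files should confirm (or rule out) whether ^A is between the fields.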

Then I did a normal import into another HDFS location, and there all the fields are delimited by "," (a comma).

I am attaching both sqoop statements below; please help.
Note: this is only for practice, not work or production.

sqoop import \
  --connect "jdbc:mysql://127.0.0.1/myschool" \
  --username root \
  --password hadoop \
  --table tblstudent \
  --driver com.mysql.jdbc.Driver \
  --hive-home /apps/hive/warehouse \
  --hive-import \
  --hive-table schools.tblstudent \
  --create-hive-table \
  --outdir java_files
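If the cause is the default delimiter, I think adding an explicit delimiter to the hive-import would make the warehouse files human-readable. This is an untested sketch; the extra --fields-terminated-by flag is my assumption of the fix, not something I have verified on the sandbox:

```shell
# Sketch: same hive-import, but asking Sqoop to write comma-delimited
# warehouse files instead of Hive's default \001 (assumed fix, unverified)
sqoop import \
  --connect "jdbc:mysql://127.0.0.1/myschool" \
  --username root \
  --password hadoop \
  --table tblstudent \
  --driver com.mysql.jdbc.Driver \
  --hive-import \
  --hive-table schools.tblstudent \
  --create-hive-table \
  --fields-terminated-by ',' \
  --outdir java_files
```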

sqoop import \
  --connect "jdbc:mysql://127.0.0.1/myschool" \
  --username root \
  --password hadoop \
  --table tblstudent \
  --driver com.mysql.jdbc.Driver \
  --target-dir /user/root/school
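For completeness, this is roughly the export I expect should work against the hive-imported data. Telling Sqoop that the input files use Hive's \001 delimiter (via --input-fields-terminated-by) is my assumption for avoiding the column-count mismatch, and the --export-dir path is a guess based on the warehouse location I described above:

```shell
# Sketch of an export reading the hive-import output; the delimiter flag
# and warehouse path are assumptions, not verified on my sandbox
sqoop export \
  --connect "jdbc:mysql://127.0.0.1/myschool" \
  --username root \
  --password hadoop \
  --table tblstudent \
  --export-dir /apps/hive/warehouse/school.tblstudent \
  --input-fields-terminated-by '\001'
```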