Using Sqoop, how to export a file from the local hard drive to MySQL


#1

Hi,
Can I export a file located on my local hard drive to MySQL using Sqoop?
I have no issues exporting from HDFS to MySQL, but right now I have to manually copy the files from the local hard drive to HDFS first with the following command: hdfs dfs -copyFromLocal

This is my sqoop script:
sqoop export --connect jdbc:mysql://hostname/dbname --username user --password password --table table --columns 'DATA_SOURCE_FILE_ID,SYSTEM_ID' \
  --input-lines-terminated-by '\n' \
  --input-fields-terminated-by '\t' \
  --export-dir "file:///root/webcosts/workdir/output/prisma_details_part1/*"
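For reference, this is the two-step workaround I am currently using and want to avoid (the HDFS staging directory here is just an example path I picked; the table and column names are from my job):

```shell
# Step 1: manually stage the local files into HDFS
# (this is the extra copy step I would like to eliminate)
hdfs dfs -mkdir -p /user/root/webcosts/prisma_details_part1
hdfs dfs -copyFromLocal /root/webcosts/workdir/output/prisma_details_part1/* \
    /user/root/webcosts/prisma_details_part1/

# Step 2: export from HDFS to MySQL -- this part works fine
# (-P prompts for the password instead of putting it on the command line,
#  as the Sqoop warning below suggests)
sqoop export \
    --connect jdbc:mysql://hostname/dbname \
    --username user -P \
    --table digital_all_sources \
    --columns 'DATA_SOURCE_FILE_ID,SYSTEM_ID' \
    --input-lines-terminated-by '\n' \
    --input-fields-terminated-by '\t' \
    --export-dir /user/root/webcosts/prisma_details_part1
```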

Below is the error I am getting:
Warning: /usr/hdp/2.6.1.0-129/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/02/22 16:31:29 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.1.0-129
18/02/22 16:31:29 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/02/22 16:31:29 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/02/22 16:31:29 INFO tool.CodeGenTool: Beginning code generation
18/02/22 16:31:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM digital_all_sources AS t LIMIT 1
18/02/22 16:31:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM digital_all_sources AS t LIMIT 1
18/02/22 16:31:31 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.1.0-129/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/f2b28ac7f9cd6f082c4e0d9bbd56c6bb/digital_all_sources.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/02/22 16:31:37 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/f2b28ac7f9cd6f082c4e0d9bbd56c6bb/digital_all_sources.jar
18/02/22 16:31:37 INFO mapreduce.ExportJobBase: Beginning export of digital_all_sources
18/02/22 16:31:42 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.IllegalArgumentException: Wrong FS: file://sandbox.hortonworks.com:8020/root/webcosts/workdir/output/prisma_details_part1, expected: file:///
java.lang.IllegalArgumentException: Wrong FS: file://sandbox.hortonworks.com:8020/root/webcosts/workdir/output/prisma_details_part1, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:665)
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:87)
at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:440)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1547)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1590)
at org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:676)
at org.apache.hadoop.fs.Globber.listStatus(Globber.java:69)
at org.apache.hadoop.fs.Globber.glob(Globber.java:217)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1696)
at org.apache.sqoop.mapreduce.ExportJobBase.getFileType(ExportJobBase.java:134)
at org.apache.sqoop.mapreduce.ExportJobBase.getInputFileType(ExportJobBase.java:525)
at org.apache.sqoop.mapreduce.JdbcExportJob.configureInputFormat(JdbcExportJob.java:68)
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:436)
at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)