Created 07-31-2019 11:12 PM
I'm trying to import data from an Oracle DB and am getting this error:
....
19/07/31 13:07:10 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-myuser/compile/375d3de163797c05cd7b480fddcfe58c/QueryResult.jar
19/07/31 13:07:10 ERROR tool.ImportTool: Import failed: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "null"
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3281)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3301)
....
My sqoop command looks like...
sqoop import \
  -Dmapreduce.map.memory.mb=3144 \
  -Dmapreduce.map.java.opts=-Xmx1048m \
  -Dyarn.app.mapreduce.am.log.level=DEBUG \
  -Dmapreduce.map.log.level=DEBUG \
  -Dmapreduce.reduce.log.level=DEBUG \
  -Dmapred.job.name="Ora import table $tablename" \
  -Djava.security.egd=file:///dev/urandom \
  -Doraoop.timestamp.string=false \
  -Dmapreduce.map.max.attempts=10 \
  $oracle_cnxn_str \
  --as-parquetfile \
  --target-dir /some/hdfs/path \
  -query "$sqoop_query" \
  --split-by $splitby \
  --where "1=1" \
  --num-mappers 12 \
  --delete-target-dir
I'm not sure what to make of this error message. Any debugging suggestions or fixes?
Created 07-31-2019 11:28 PM
The issue was that the --target-dir path was built from shell variables at the start of the path, and their expansion left the path looking like
//some/hdfs/path
That "empty" leading // was what confused Sqoop and produced the "No FileSystem for scheme \"null\"" error.