Support Questions
Find answers, ask questions, and share your expertise

Oozie sqoop job failing - Unable to read options file

Rising Star

I've got a problem with running a Sqoop job in Oozie. I try to run a command like this one:

--options-file /tmp/dss_conn_parms.txt --table BD.DMS --hive-import --hive-table BD.DMS_OOZIE_1 --m=1

Options file dss_conn_parms.txt:

import
--connect 
jdbc:oracle:thin:@//123.17.8.12:1234/dspr
--username
user
--password
password
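
As far as I understand, Sqoop expands the options file in place of the --options-file argument, so the above should be roughly equivalent to running:

sqoop import --connect jdbc:oracle:thin:@//123.17.8.12:1234/dspr --username user --password password --table BD.DMS --hive-import --hive-table BD.DMS_OOZIE_1 --m=1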

I uploaded the options file to both my local file system and HDFS, but I got an error like this:

1392 [main] ERROR org.apache.sqoop.Sqoop  - Error while expanding arguments
java.lang.Exception: Unable to read options file: hdfs:///tmp/dss_conn_parms.txt
	at org.apache.sqoop.util.OptionsFileUtil.expandArguments(OptionsFileUtil.java:102)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:204)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.io.FileNotFoundException: hdfs:/tmp/dss_conn_parms.txt (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at java.io.FileReader.<init>(FileReader.java:72)
	at org.apache.sqoop.util.OptionsFileUtil.expandArguments(OptionsFileUtil.java:70)
	... 20 more
Intercepting System.exit(1)
7 REPLIES

Re: Oozie sqoop job failing - Unable to read options file

@Mateusz Grabowski

Can you please try specifying the parameter file content as below?

import

--connect

jdbc:oracle:thin:@//ip/pw

--username

DSSAPI_BD

--password

xxx

Re: Oozie sqoop job failing - Unable to read options file

That is, add a new line between each of the parameters.

Re: Oozie sqoop job failing - Unable to read options file

@Mateusz Grabowski Make sure the parameter file is located on HDFS under /tmp, as you have indicated with --options-file /tmp/dss_conn_parms.txt

Something like this:

hdfs dfs -put dss_conn_parms.txt /tmp/

And test again.
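
You can also double-check that the file is actually there and readable before re-running, for example:

hdfs dfs -ls /tmp/dss_conn_parms.txt
hdfs dfs -cat /tmp/dss_conn_parms.txt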

Re: Oozie sqoop job failing - Unable to read options file

Rising Star

@Felix Albani Let me ask you one question: where should I upload the options file? To the local file system or to HDFS? I read that the file should be located on the local file system, not HDFS, because Sqoop is not able to read an options file from HDFS.

Re: Oozie sqoop job failing - Unable to read options file

From the error you pasted I see it is trying to fetch the file from HDFS. Have you tried putting the file on HDFS yet?
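
If the Sqoop action still cannot read it straight from HDFS, another thing that might be worth trying is letting Oozie localize the file from HDFS into the action's working directory with a <file> element, and pointing --options-file at the localized copy. A rough sketch only (the action name, schema version and paths are illustrative, and the command is your original one with just the path made relative):

<action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.4">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <!-- same command as before, but the options file is now a relative path -->
        <command>--options-file dss_conn_parms.txt --table BD.DMS --hive-import --hive-table BD.DMS_OOZIE_1 --m=1</command>
        <!-- Oozie copies this HDFS file to whichever node runs the action and symlinks it as dss_conn_parms.txt -->
        <file>/tmp/dss_conn_parms.txt#dss_conn_parms.txt</file>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
</action>

That way the file only needs to exist once, on HDFS, and it shouldn't matter which node YARN picks. In the Hue editor this should correspond to adding the HDFS path under the action's Files field, if I remember correctly.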

Re: Oozie sqoop job failing - Unable to read options file

Rising Star

@Felix Albani

I did. But now I know where the problem is. I've got a multi-node cluster, and when I run the Sqoop job in Oozie using Hue, YARN selects one node from my cluster and tries to find the options file on that node. Example:

I've got a 4-node cluster:

Node1 (127.0.0.1) - options file present in /tmp/

Node2 (127.0.0.2) - no options file

Node3 (127.0.0.3) - no options file

Node4 (127.0.0.4) - options file present in /tmp/

Now when I run the Sqoop job in Oozie using Hue, YARN chooses one of those nodes. In the log file we can see a line like this:

sun.java.command=org.apache.hadoop.mapred.YarnChild 127.0.0.1 55363 attempt_1475646426219_0114_m_000000_0 21990232555522

and the job runs perfectly, because the options file is on Node1. On the other hand, when YARN chooses Node2 or Node3, the job fails.

Do you know how I can solve this problem? Can I choose the node on which I want to run a job? I want to keep the options file on only one node, not all four.


Re: Oozie sqoop job failing - Unable to read options file

New Contributor

Hi @Sindhu, even after adding a new line between each of the parameters, I am facing the same issue:

"ERROR sqoop.Sqoop: Error while expanding arguments java.lang.Exception: Unable to read options file: /home/sqoopingestiondev/dlo/ingestion_script_generator/templates/ingestion/optionsfile at org.apache.sqoop.util.OptionsFileUtil.expandArguments(OptionsFileUtil.java:102)"

Can you please elaborate on how the new line helps? Also, please let me know whether the options file can be placed in HDFS.

The Sqoop user guide mentions that the password file can be read from an HDFS path, so why can't the options file? Also, I can't see anything specific in the Sqoop user guide about keeping the options file in HDFS versus on the local file system.