
Sqoop import from SQL Server doesn't work when passing the query from a file


This is the script.

<code>query=$(cat ${SQL_SCRIPT})
where_clause=" where dateadded >= '2016-05-01' and dateadded < '2016-06-01' and \$CONDITIONS"

sqoop import -D mapreduce.job.queuename=s_sourcedata \
  --connect 'jdbc:sqlserver://connection' \
  --compression-codec \
  --username name \
  --password pas \
  --query "${query}${where_clause}" \
  --as-parquetfile \
  --split-by dateadded \
  --delete-target-dir \
  --target-dir prioritypass_history \
  -m 1</code>

It doesn't work this way, but if I change first string to

<code>query="select * FROM smth.[dbo].[tablename]"</code>

it works.
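One thing worth checking is whether the text read from the file differs from the inline string. Command substitution strips a trailing newline, but not a carriage return left behind by a Windows editor, and a stray CR inside the assembled `--query` string can break the command. A minimal sketch (the file path and SQL text here are placeholders, not the original script):

```shell
# Sketch only: simulate a SQL file saved with CRLF line endings and show how
# to strip the stray carriage return before building the query string.
SQL_SCRIPT=/tmp/query.sql                                   # placeholder path
printf 'select * FROM smth.[dbo].[tablename]\r\n' > "${SQL_SCRIPT}"

query=$(cat "${SQL_SCRIPT}")   # $(...) strips the trailing newline only
query=${query%$'\r'}           # remove a trailing carriage return, if present

cat -A "${SQL_SCRIPT}"         # cat -A shows hidden characters (CR appears as ^M)
echo "[${query}]"              # brackets reveal leading/trailing whitespace
```

If the echoed query looks clean between the brackets, the file contents are not the problem and the quoting of `$CONDITIONS` is the next thing to inspect.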

My action looks like this:

<code><action name="history" cred="hv_cred">
  <shell xmlns="uri:oozie:shell-action:0.1">

  </shell>
  <ok to="end"/>
  <error to="kill"/>
</action></code>

The thing is, I used this same code to import data from Oracle, changing only the connection details. My only guess is that Oozie doesn't like that the script sits in a folder named oracle, but I'm not sure, and I don't know what to change it to if that's the case.


I don't use the Sqoop action because some libs are missing on the cluster and it doesn't work.


Expert Contributor

Hello Gjin,

It is always better to use the specific Oozie action instead of putting commands in a shell action.

----> Can you add an echo of the sqoop command in the shell script, to check the exact command the Oozie action will execute?
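A hedged sketch of that echo step, reusing the variable names from the question (the sqoop command is abridged and the connection string is a placeholder). Building the command as a bash array and printing it before running makes the Oozie launcher log show exactly what expanded, including whether `$CONDITIONS` survived literally:

```shell
# Sketch only: variable names follow the question's script; details are placeholders.
query='select * FROM smth.[dbo].[tablename]'
where_clause=" where dateadded >= '2016-05-01' and dateadded < '2016-06-01' and \$CONDITIONS"

cmd=(sqoop import
     --connect 'jdbc:sqlserver://connection'
     --query "${query}${where_clause}"
     --target-dir prioritypass_history
     -m 1)

printf '%q ' "${cmd[@]}"; echo   # log the exact, fully-expanded command
# "${cmd[@]}"                    # uncomment to actually run sqoop
```

The `%q` format quotes each argument the way the shell would need it retyped, so an unexpected expansion of `$CONDITIONS` or a mangled date literal is visible at a glance.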

----> Also, please post the errors you see when executing the Sqoop action. Only a few jars are required for it: the Sqoop jars from the system share lib, the metastore/exec jar, and the driver jar for the database you are trying to connect to. These can be placed in the workflow/lib directory on HDFS.
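For the jar placement described above, a sketch of the HDFS commands (every path and jar name here is an example, not the exact one this cluster needs):

```shell
# Example only: adjust the workflow path and jar names to your cluster.
WF_DIR=/user/yourname/workflows/history        # hypothetical workflow directory
hdfs dfs -mkdir -p "${WF_DIR}/lib"
# SQL Server JDBC driver jar (the file name varies by driver version)
hdfs dfs -put -f sqljdbc4.jar "${WF_DIR}/lib/"
```

Oozie picks up jars in the workflow's lib directory automatically, so no extra configuration is needed once they are in place.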
