
Sqoop export sqlserver non-default schema

New Contributor

We are trying to export to a non-default SQL Server schema. We have found that this can be done with the database-specific argument '-- --schema [name]' appended to 'sqoop export --connect ...', and this works until we try to save the export as a job. So the following works:

1) sqoop export --connect "jdbc:sqlserver://xxxx\\xxxxxx;databaseName=name" --username user --password password --table tablename --export-dir /../../tablename --input-fields-terminated-by \\t --input-null-string \\\\N --input-null-non-string \\\\N -- --schema WORKSPACE

When we save it as a job, however, the job can no longer find the object in the SQL Server database:

2) sqoop job -Dmapreduce.job.name=job_name --create job_name --meta-connect jdbc:mysql://xxx.corp:xxx/sqoop?user=user\&password=password -- export --connect "jdbc:sqlserver://xxxx\\xxxxxx;databaseName=name" --username user --password password --table tablename --export-dir /../../tablename --input-fields-terminated-by \\t --input-null-string \\\\N --input-null-non-string \\\\N -- --schema WORKSPACE
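A quick way to check whether the '-- --schema WORKSPACE' part actually made it into the stored definition is to dump the saved job with --show (a sketch, reusing the metastore connection from the command above):

sqoop job --meta-connect jdbc:mysql://xxx.corp:xxx/sqoop?user=user\&password=password --show job_name

If the schema arguments are missing from the listed properties, the problem is in how the saved job stores the tool-specific arguments rather than anything on the SQL Server side.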

Prepending the schema to the --table argument (e.g. WORKSPACE.tablename) has not worked, and passing --schema as a regular (non-database-specific) argument has not worked either. We are hoping there is a solution to this issue. We are running Sqoop version 1.4.5.2.2.0.0-2041 on Hadoop version 2.6.0.2.2.0.0-2041.
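If no job-level fix exists, a possible fallback (assuming permission to alter the database user on the SQL Server side) would be to set that user's default schema to WORKSPACE, so the unqualified --table name resolves there without any --schema argument. The statement could be issued through sqoop eval; the user and connection details below are the placeholders from the commands above:

sqoop eval --connect "jdbc:sqlserver://xxxx\\xxxxxx;databaseName=name" --username user --password password --query "ALTER USER [user] WITH DEFAULT_SCHEMA = WORKSPACE"

Note this changes schema resolution for every connection made by that user, so it is only appropriate if the account is dedicated to these exports.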

Thank you for your time.

1 REPLY

Re: Sqoop export sqlserver non-default schema

Expert Contributor

Have you checked the logs under /var/log/hadoop/mapreduce? Or run the job with the --verbose option to get some insight into why it is failing?
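For the saved job above, that could look like this (a sketch, relying on Sqoop's documented ability to pass extra arguments to a saved job at execution time after a -- separator):

sqoop job --meta-connect jdbc:mysql://xxx.corp:xxx/sqoop?user=user\&password=password --exec job_name -- --verbose

The verbose output should show the exact statements Sqoop issues against SQL Server, which would reveal whether the WORKSPACE schema is being applied.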
