Using Sqoop to transfer data between HDFS and MS SQL Server

New Contributor

I have installed hadoop-0.20.2, sqoop-1.4.4.bin__hadoop-0.20, and the Microsoft JDBC Driver (sqljdbc4.jar) on my test system. Hadoop and Sqoop are installed on a Linux server. When I try to run "sqoop export" to move data from an HDFS file to a SQL Server table, I get the following error:

 

bin/sqoop export --connect 'jdbc:sqlserver://test.test.test.com;instanceName=test;username=test;password=test;database=test' --table test --export-dir /user/test.txt

Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/28 17:00:52 INFO manager.SqlManager: Using default fetchSize of 1000
13/08/28 17:00:52 INFO tool.CodeGenTool: Beginning code generation
13/08/28 17:00:53 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [test] AS t WHERE 1=0
13/08/28 17:00:53 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is .../test/hadoop/hadoop-0.20.2
13/08/28 17:00:53 INFO orm.CompilationManager: Found hadoop core jar at: /u01/app/oracle/downloads/test/hadoop/hadoop-0.20.2/hadoop-0.20.2-core.jar
Note: /tmp/sqoop-test/compile/c9e9db8f124982cb394825ba0c566499/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/28 17:00:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-test/compile/c9e9db8f124982cb394825ba0c566499/test.jar
13/08/28 17:00:55 INFO mapreduce.ExportJobBase: Beginning export of test

13/08/28 17:01:35 INFO input.FileInputFormat: Total input paths to process : 1
13/08/28 17:01:35 INFO input.FileInputFormat: Total input paths to process : 1
13/08/28 17:01:38 INFO mapred.JobClient: Running job: job_201308281647_0001
13/08/28 17:01:39 INFO mapred.JobClient:  map 0% reduce 0%

13/08/28 17:04:06 INFO mapred.JobClient: Task Id : attempt_201308281647_0001_m_000000_0, Status : FAILED
Error: org.apache.hadoop.mapred.JobConf.getCredentials()Lorg/apache/hadoop/security/Credentials;
13/08/28 17:04:09 INFO mapred.JobClient: Task Id : attempt_201308281647_0001_m_000001_0, Status : FAILED
Error: org.apache.hadoop.mapred.JobConf.getCredentials()Lorg/apache/hadoop/security/Credentials;
13/08/28 17:04:37 INFO mapred.JobClient: Task Id : attempt_201308281647_0001_m_000001_1, Status : FAILED
Error: org.apache.hadoop.mapred.JobConf.getCredentials()Lorg/apache/hadoop/security/Credentials;
13/08/28 17:04:41 INFO mapred.JobClient: Task Id : attempt_201308281647_0001_m_000000_1, Status : FAILED
Error: org.apache.hadoop.mapred.JobConf.getCredentials()Lorg/apache/hadoop/security/Credentials;
13/08/28 17:04:43 INFO mapred.JobClient: Task Id : attempt_201308281647_0001_m_000001_2, Status : FAILED
Error: org.apache.hadoop.mapred.JobConf.getCredentials()Lorg/apache/hadoop/security/Credentials;
13/08/28 17:05:00 INFO mapred.JobClient: Job complete: job_201308281647_0001
13/08/28 17:05:00 INFO mapred.JobClient: Counters: 3
13/08/28 17:05:00 INFO mapred.JobClient:   Job Counters
13/08/28 17:05:00 INFO mapred.JobClient:     Launched map tasks=7
13/08/28 17:05:00 INFO mapred.JobClient:     Data-local map tasks=7
13/08/28 17:05:00 INFO mapred.JobClient:     Failed map tasks=1
13/08/28 17:05:00 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 243.9523 seconds (0 bytes/sec)
13/08/28 17:05:00 INFO mapreduce.ExportJobBase: Exported 0 records.
13/08/28 17:05:00 ERROR tool.ExportTool: Error during export: Export job failed!

 

 

Can I get some advice?

 

Thanks a lot.

3 Replies

Re: Using Sqoop to transfer data between HDFS and MS SQL Server

Cloudera Employee

Hi sir,

Hadoop 0.20 is a very old release that lacks many features. One thing Sqoop requires is the set of security additions that were introduced in 1.x. As a result, Sqoop won't work on bare 0.20; at least CDH3u1 or Hadoop 1.x is required. I would strongly suggest upgrading your Hadoop cluster.
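Concretely, the failing call in your log, JobConf.getCredentials() returning org.apache.hadoop.security.Credentials, is part of that security API and simply is not in the 0.20.2 core jar. As a rough sanity check (the jar path below is a placeholder for your installation), a jar is an ordinary zip archive, so you can look for the missing class directly:

```python
import zipfile

def has_credentials_api(core_jar_path):
    """Return True if the hadoop-core jar contains the
    org.apache.hadoop.security.Credentials class that Sqoop's
    MapReduce jobs depend on (present in Hadoop 1.x / CDH3u1+,
    absent in 0.20.2)."""
    with zipfile.ZipFile(core_jar_path) as jar:
        return "org/apache/hadoop/security/Credentials.class" in jar.namelist()

# Example (adjust to your own path):
# has_credentials_api("/path/to/hadoop-core-1.2.1.jar")
```

If this returns False for the core jar Sqoop found in your log, the export will keep failing the same way until the cluster is upgraded.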

 

Jarcec


Re: Using Sqoop to transfer data between HDFS and MS SQL Server

New Contributor

Thanks for the advice. I will upgrade my Hadoop cluster and try again. Regards.

fw


Re: Using Sqoop to transfer data between HDFS and MS SQL Server

New Contributor

Can you provide me with this information for SQL Server 2005?

 

JDBC Driver Class: com.mysql.jdbc.Driver
JDBC Connection String: jdbc:mysql://mysql.server/database
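For reference, the usual SQL Server equivalents of those MySQL values are below (host, port, and database name are placeholders; the sqljdbc4.jar must also be on Sqoop's classpath):

```text
JDBC Driver Class:      com.microsoft.sqlserver.jdbc.SQLServerDriver
JDBC Connection String: jdbc:sqlserver://sqlserver.host:1433;databaseName=yourdb
```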
