
Sqoop cannot load the SQL Server driver class when creating a connection

Explorer

I'm using CDH 5.0.2 (automated install) on a single-node Hadoop installation. Cloudera Manager is installed on a different node.

I followed this guide to install the JDBC driver for SQL Server.

After copying sqljdbc4.jar to /var/lib/sqoop2, I restarted both the Sqoop2 and Hue services. But when I create a new Sqoop transfer job using Hue's web UI, the JDBC driver class field can't find any SQL Server driver. When I type "sql" into the JDBC driver class field, it only suggests the MySQL and PostgreSQL drivers; it cannot find a SQL Server driver.

 

I tried changing the ownership (to sqoop2:sqoop2) and the permissions on sqljdbc4.jar (to 644) to match those of postgresql-9.0-801.jdbc4.jar in /var/lib/sqoop2, then restarted the Sqoop2 and Hue services again, and still no luck. It still can't find the SQL Server driver when I type "SQL". What should the value of this field be for MS SQL Server?

 

I would really appreciate any help. I've been evaluating CDH5 and can't seem to figure out this problem.

 

1 ACCEPTED SOLUTION

Expert Contributor
It seems the issue is that your partition column can't be found. Sqoop should, by default, use the primary key as the partition column. If your table doesn't have a primary key, you'll have to set one yourself in "Partition column name".


19 REPLIES

Explorer

This sort of helps. There is no tomcat-conf/ directory or conf/catalina.properties file on the machine running Sqoop2. Not sure why; I installed the service when I originally deployed the cluster. Cloudera Manager says the service is running, and I can connect to the client at the command line just fine...

Expert Contributor
Is sqljdbc4.jar in /var/lib/sqoop2?

Explorer

Yes, it is in the directory. Here is an ls -l of what is contained there:

 

-rw-r--r-- 1 sqoop2 sqoop2 539705 Nov 13 17:08 postgresql-9.0-801.jdbc4.jar

-rw-r--r-- 1 sqoop2 sqoop2 584207 Sep 25 15:35 sqljdbc4.jar

-rw-r--r-- 1 sqoop2 sqoop2 563117 Aug 12 13:10 sqljdbc.jar

drwxr-xr-x 5 sqoop2 sqoop    4096 Nov 13 17:08 tomcat-deployment

 

Expert Contributor
What is your exact error message with stack trace? You can turn on verbose logging via "set option --name verbose --value true".
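For reference, a minimal sketch of how that might look at the Sqoop2 prompt (set the option in the same shell session before re-running whatever command failed; the confirmation output is omitted here, and the job id is simply whatever "show job --all" reports for your job):

sqoop:000> set option --name verbose --value true
sqoop:000> start job --jid 1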

Expert Contributor
Also, please remove "sqljdbc.jar" as it could be taking precedence.
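For example (assuming root/sudo access and the /var/lib/sqoop2 path from the listing above), something like:

$ sudo rm /var/lib/sqoop2/sqljdbc.jar

followed by a restart of the Sqoop2 service.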

Explorer
I’ve removed the jar.

$ ls -l /var/lib/sqoop2

total 1104

-rw-r--r-- 1 sqoop2 sqoop2 539705 Nov 15 07:07 postgresql-9.0-801.jdbc4.jar

-rw-r--r-- 1 sqoop2 sqoop2 584207 Sep 25 15:35 sqljdbc4.jar

drwxr-xr-x 5 sqoop2 sqoop    4096 Nov 15 07:07 tomcat-deployment

 

 
Before I can specify a connection and try to connect, doesn't Sqoop2 need to see the connector? When I do a "show connector", I still don't see the MS SQL connector; see below...
 

$ sqoop2

Sqoop home directory: /opt/cloudera/parcels/CDH-5.1.0-1.cdh5.1.0.p0.53/lib/sqoop2

Sqoop Shell: Type 'help' or '\h' for help.

 

sqoop:000> show connector

+----+------------------------+-----------------+------------------------------------------------------+

| Id |          Name          |     Version     |                        Class                         |

+----+------------------------+-----------------+------------------------------------------------------+

| 1  | generic-jdbc-connector | 1.99.3-cdh5.1.0 | org.apache.sqoop.connector.jdbc.GenericJdbcConnector |

+----+------------------------+-----------------+------------------------------------------------------+

sqoop:000>

Expert Contributor
Sqoop2 actually has only one connector, but it can use different JDBC drivers to connect to a relational database. There's work to be done to optimize data transfers, but the generic JDBC connector should work.

Could you try running your job with "set option --name verbose --value true"? Also, please provide the output of "show connection --all" and "show job --all".

Explorer

sqoop:000> create connection --cid 1

Creating connection for connector with id 1

Please fill following values to create new connection object

Name: leads

 

Connection configuration

 

JDBC Driver Class: com.microsoft.sqlserver.jdbc.SQLServerDriver

JDBC Connection String: jdbc:sqlserver://test:1433     

Username: test

Password: *******

JDBC Connection Properties: 

There are currently 0 values in the map:

entry# 

 

Security related configuration options

 

Max connections: 

New connection was successfully created with validation status FINE and persistent id 3

 

sqoop:000> create job --xid 3 --type import

Creating job for connection with id 3

Please fill following values to create new job object

Name: leads_import

 

Database configuration

 

Schema name: dbo

Table name: test

Table SQL statement: 

Table column names: 

Partition column name: 

Nulls in partition column: 

Boundary query: 

 

Output configuration

 

Storage type: 0

  0 : HDFS

 

Output format: 

  0 : TEXT_FILE

  1 : SEQUENCE_FILE

Choose: 0

Compression format: 

  0 : NONE

  1 : DEFAULT

  2 : DEFLATE

  3 : GZIP

  4 : BZIP2

  5 : LZO

  6 : LZ4

  7 : SNAPPY

Choose: 0

Output directory: /test

 

Throttling resources

 

Extractors: 

Loaders: 

New job was successfully created with validation status FINE  and persistent id 1

sqoop:000> start job --jid 1

Exception has occurred during processing command 

Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception

Stack trace:

at  org.apache.sqoop.client.request.Request$ServerExceptionFilter (Request.java:103)  

at  com.sun.jersey.api.client.WebResource (WebResource.java:670)  

at  com.sun.jersey.api.client.WebResource (WebResource.java:74)  

at  com.sun.jersey.api.client.WebResource$Builder (WebResource.java:563)  

at  org.apache.sqoop.client.request.Request (Request.java:67)  

at  org.apache.sqoop.client.request.SubmissionRequest (SubmissionRequest.java:64)  

at  org.apache.sqoop.client.request.SqoopRequests (SqoopRequests.java:146)  

at  org.apache.sqoop.client.SqoopClient (SqoopClient.java:436)  

at  org.apache.sqoop.shell.StartJobFunction (StartJobFunction.java:80)  

at  org.apache.sqoop.shell.SqoopFunction (SqoopFunction.java:51)  

at  org.apache.sqoop.shell.StartCommand (StartCommand.java:50)  

at  org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:127)  

at  org.codehaus.groovy.tools.shell.Command$execute (null:-1)  

at  org.codehaus.groovy.runtime.callsite.CallSiteArray (CallSiteArray.java:42)  

at  org.codehaus.groovy.tools.shell.Command$execute (null:-1)  

at  org.codehaus.groovy.tools.shell.Shell (Shell.groovy:101)  

at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:-1)  

at  sun.reflect.GeneratedMethodAccessor24 (null:-1)  

at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)  

at  java.lang.reflect.Method (Method.java:606)  

at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)  

at  groovy.lang.MetaMethod (MetaMethod.java:233)  

at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)  

at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)  

at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:173)  

at  sun.reflect.GeneratedMethodAccessor23 (null:-1)  

at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)  

at  java.lang.reflect.Method (Method.java:606)  

at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267)  

at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52)  

at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:141)  

at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:121)  

at  org.codehaus.groovy.tools.shell.Shell (Shell.groovy:114)  

at  org.codehaus.groovy.tools.shell.Shell$leftShift$0 (null:-1)  

at  org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:88)  

at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1)  

at  sun.reflect.GeneratedMethodAccessor21 (null:-1)  

at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)  

at  java.lang.reflect.Method (Method.java:606)  

at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)  

at  groovy.lang.MetaMethod (MetaMethod.java:233)  

at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)  

at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)  

at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148)  

at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:100)  

at  sun.reflect.GeneratedMethodAccessor20 (null:-1)  

at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)  

at  java.lang.reflect.Method (Method.java:606)  

at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267)  

at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52)  

at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:137)  

at  org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:57)  

at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1)  

at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2)  

at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57)  

at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)  

at  java.lang.reflect.Method (Method.java:606)  

at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)  

at  groovy.lang.MetaMethod (MetaMethod.java:233)  

at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)  

at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)  

at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148)  

at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:66)  

at  java_lang_Runnable$run (null:-1)  

at  org.codehaus.groovy.runtime.callsite.CallSiteArray (CallSiteArray.java:42)  

at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:108)  

at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:112)  

at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:463)  

at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:402)  

at  org.apache.sqoop.shell.SqoopShell (SqoopShell.java:128)  

Caused by: Exception: java.lang.Throwable Message: GENERIC_JDBC_CONNECTOR_0005:No column is found to partition data

Stack trace:

at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:148)  

at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:51)  

at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:40)  

at  org.apache.sqoop.framework.JobManager (JobManager.java:378)  

at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:152)  

at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:122)  

at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:75)  

at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  

at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  

at  javax.servlet.http.HttpServlet (HttpServlet.java:643)  

at  javax.servlet.http.HttpServlet (HttpServlet.java:723)  

at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  

at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  

at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  

at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  

at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  

at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:103)  

at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  

at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  

at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:861)  

at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:606)  

at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  

at  java.lang.Thread (Thread.java:745)  

sqoop:000>

 

 

sqoop:000> show connection --all

1 connection(s) to show: 

Connection with id 3 and name leads (Enabled: true, Created by asdf at 11/15/14 5:35 PM, Updated by asdf at 11/15/14 5:35 PM)

Using Connector id 1

  Connection configuration

    JDBC Driver Class: com.microsoft.sqlserver.jdbc.SQLServerDriver

    JDBC Connection String: jdbc:sqlserver://test

    Username: test

    Password: 

    JDBC Connection Properties: 

  Security related configuration options

    Max connections:

 

 

 

 

sqoop:000> show job --all

1 job(s) to show: 

Job with id 1 and name leads_import (Enabled: true, Created by asdf at 11/15/14 5:50 PM, Updated by asdf at 11/15/14 5:50 PM)

Using Connection id 3 and Connector id 1

  Database configuration

    Schema name: dbo

    Table name: test

    Table SQL statement: 

    Table column names: 

    Partition column name: 

    Nulls in partition column: 

    Boundary query: 

  Output configuration

    Storage type: HDFS

    Output format: TEXT_FILE

    Compression format: NONE

    Output directory: /test

  Throttling resources

    Extractors: 

    Loaders:

Expert Contributor
It seems the issue is that your partition column can't be found. Sqoop should, by default, use the primary key as the partition column. If your table doesn't have a primary key, you'll have to set one yourself in "Partition column name".
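In the Sqoop2 shell that would look roughly like the sketch below (the "id" column is only a hypothetical example of a partition column; the other prompts are elided with "..." and can be left at their current values):

sqoop:000> update job --jid 1
...
Partition column name: id
...

After that, "start job --jid 1" should get past the GENERIC_JDBC_CONNECTOR_0005 error.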

Explorer

Thanks, this solved my problem!