Sqoop import - Cannot resolve SQL type -151

Explorer

I am trying to ingest a table from MS SQL Server into HDFS through Sqoop, and I get the following error:

cause:java.io.IOException: Sqoop does not have the splitter for the given SQL data type. Please use either different split column (argument --split-by) or lower the number of mappers to 1. Unknown SQL data type: -151
19/10/30 12:42:37 ERROR tool.ImportTool: Import failed: java.io.IOException: Sqoop does not have the splitter for the given SQL data type. Please use either different split column (argument --split-by) or lower the number of mappers to 1. Unknown SQL data type: -151
at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:194)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:305)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:322)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:200)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

The Sqoop command:

sqoop import --connect 'jdbc:sqlserver://<host>;database=<dbname>' --username "xxxxx" -P --driver com.microsoft.sqlserver.jdbc.SQLServerDriver --table "<tablename>" --target-dir "/data/test/" --hive-import --hive-database <hiveschemaname> --hive-table <hivetablename> --map-column-hive report_date=String,start_time=String,stop_time=String --map-column-java report_date=String,start_time=String,stop_time=String --split-by report_date --as-parquetfile

I realised the --split-by report_date argument is causing this issue. The data type of this column in the SQL Server table is datetime. I don't get this error when I run without the --split-by argument, but then I am losing parallelism.

For some unknown reason, Sqoop is not able to resolve the data type for this column. (All of the columns listed in the --map-column arguments are of the datetime data type, and if I remove those two --map-column arguments I get the same error, Unknown SQL data type: -151, for each of them.)
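
In case it helps, here is a minimal sketch of the first workaround the error message itself suggests: splitting on a column whose type Sqoop's splitter does understand. The column name some_int_id is hypothetical and stands in for any integer (or otherwise Sqoop-splittable) column in the table; the Hive and column-mapping options are omitted for brevity.

# Sketch only: some_int_id is a hypothetical integer column used so that
# Sqoop can compute split boundaries; placeholders as in the original command.
sqoop import --connect 'jdbc:sqlserver://<host>;database=<dbname>' \
  --username "xxxxx" -P \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --table "<tablename>" --target-dir "/data/test/" \
  --split-by some_int_id \
  --as-parquetfile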

The Sqoop version is Sqoop 1.4.6-cdh5.12.1.

Any idea about this error?

1 ACCEPTED SOLUTION

Explorer

I replaced the Microsoft jdbc:sqlserver jar with the jTDS jar, and it worked!

I figured out later that the sqlserver JDBC jar version and the SQL Server engine version were not compatible.
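
For anyone who lands here later: as far as I can tell, -151 is the vendor-specific type code the Microsoft driver reports for datetime columns (microsoft.sql.Types.DATETIME), which Sqoop's split generator does not recognize, while jTDS reports the same column as the standard java.sql.Types.TIMESTAMP, which Sqoop can split on. A minimal sketch of the import over jTDS, assuming the jtds jar has been placed on Sqoop's classpath (host, port, and all names are placeholders):

# Sketch only: jTDS connect string and driver class; with jTDS the
# datetime split column is reported as TIMESTAMP, so --split-by works.
sqoop import --connect 'jdbc:jtds:sqlserver://<host>:1433/<dbname>' \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username "xxxxx" -P \
  --table "<tablename>" --target-dir "/data/test/" \
  --split-by report_date \
  --as-parquetfile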

4 REPLIES

Explorer

I get this error in PySpark too.

Expert Contributor

Hi,

Did you try using -m 1? Does that work fine?

Regards,
Nitish
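
For completeness, this is the shape of the single-mapper run being suggested, sketched from the command in the question. With -m 1 Sqoop generates a single split, so no --split-by (and no splitter for the datetime column) is needed, at the cost of all parallelism.

# Sketch only: one mapper, no split column required.
sqoop import --connect 'jdbc:sqlserver://<host>;database=<dbname>' \
  --username "xxxxx" -P \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --table "<tablename>" --target-dir "/data/test/" \
  -m 1 \
  --as-parquetfile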

Community Manager

I'm happy to see you resolved your issue. Please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future. 

Cy Jervis, Manager, Community Program