
Sqoop import into Hive as Parquet fails for decimal type


New Contributor


I am trying to import a table from MS SQL Server into Hive as Parquet, and one of the columns is a decimal type. By default, Sqoop maps the decimal column to a double, but unfortunately that is causing precision issues for some of our calculations.
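To illustrate why the default decimal-to-double mapping loses precision, here is a minimal, self-contained Java sketch (not from the original post; the class and method names are made up for the example). It contrasts accumulating a fractional value in binary floating point with doing the same in `BigDecimal`:

```java
import java.math.BigDecimal;

public class PrecisionDemo {
    // Binary doubles cannot represent 0.1 exactly, so repeated
    // addition accumulates rounding error.
    static double sumTenTenthsAsDouble() {
        double d = 0.0;
        for (int i = 0; i < 10; i++) {
            d += 0.1;
        }
        return d;
    }

    // BigDecimal performs exact decimal arithmetic, so the sum
    // is exactly 1.
    static BigDecimal sumTenTenthsAsDecimal() {
        BigDecimal b = BigDecimal.ZERO;
        for (int i = 0; i < 10; i++) {
            b = b.add(new BigDecimal("0.1"));
        }
        return b;
    }

    public static void main(String[] args) {
        System.out.println(sumTenTenthsAsDouble() == 1.0);                     // false
        System.out.println(sumTenTenthsAsDecimal().compareTo(BigDecimal.ONE)); // 0
    }
}
```

This is why keeping the column as a true decimal (rather than accepting the double mapping) matters for financial-style calculations.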

Right now, I am getting the following error when running in an HDP 2.4 sandbox:

Import command:

[root@sandbox sqoop]# sqoop import -Dsqoop.avro.logical_types.decimal.enable=true --hive-import --num-mappers 1 --connect "jdbc:sqlserver://<conn_string>" --username uname --password pass --hive-overwrite --hive-database default --table SqoopDecimalTest --driver --null-string '\\N' --as-parquetfile

Error: Failed to append {"id": 1, "price": 19.123450} to ParquetAppender{path=hdfs://, schema={"type":"record","name":"SqoopDecimalTest","doc":"Sqoop import of SqoopDecimalTest","fields":[{"name":"id","type":["null","int"],"default":null,"columnName":"id","sqlType":"4"},{"name":"price","type":["null",{"type":"bytes","logicalType":"decimal","precision":19,"scale":6}],"default":null,"columnName":"price","sqlType":"3"}],"tableName":"SqoopDecimalTest"}, fileSystem=DFS[DFSClient[clientName=DFSClient_attempt_1514513583437_0001_m_000000_0_1859161154_1, ugi=root (auth:SIMPLE)]], avroParquetWriter=org.apache.parquet.avro.AvroParquetWriter@f60f96b}
	at
	at$DatasetRecordWriter.write(
	at$DatasetRecordWriter.write(
	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(
	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(
	at$Context.write(
	at
	at
	at
	at
	at org.apache.hadoop.mapred.MapTask.runNewMapper(
	at
	at org.apache.hadoop.mapred.YarnChild$
	at Method)
	at
	at
	at org.apache.hadoop.mapred.YarnChild.main(

Caused by: java.lang.ClassCastException: java.math.BigDecimal cannot be cast to java.nio.ByteBuffer
	at org.apache.parquet.avro.AvroWriteSupport.writeValue(
	at org.apache.parquet.avro.AvroWriteSupport.writeRecordFields(
	at org.apache.parquet.avro.AvroWriteSupport.write(
	at org.apache.parquet.hadoop.InternalParquetRecordWriter.write(
	at org.apache.parquet.hadoop.ParquetWriter.write(
	at
	at
	at
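The `ClassCastException` suggests the write path handed Avro a raw `java.math.BigDecimal` where the `decimal` logical type's base type (`bytes`) expects a `ByteBuffer` holding the two's-complement big-endian unscaled value. A minimal stdlib-only sketch of that encoding follows; the helper names are hypothetical and this is not Sqoop/Kite code, just an illustration of the conversion the writer would have needed to perform:

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;

public class DecimalEncoding {
    // Encode a BigDecimal the way the Avro decimal logical type stores it:
    // a ByteBuffer containing the two's-complement big-endian bytes of the
    // unscaled value, with the scale carried in the schema (6 here, per the
    // "price" field in the error's schema).
    static ByteBuffer toAvroDecimalBytes(BigDecimal value, int scale) {
        BigDecimal scaled = value.setScale(scale);
        return ByteBuffer.wrap(scaled.unscaledValue().toByteArray());
    }

    // Decode back: reinterpret the bytes as an unscaled BigInteger and
    // reattach the schema's scale.
    static BigDecimal fromAvroDecimalBytes(ByteBuffer buf, int scale) {
        byte[] bytes = new byte[buf.remaining()];
        buf.get(bytes);
        return new BigDecimal(new BigInteger(bytes), scale);
    }

    public static void main(String[] args) {
        BigDecimal price = new BigDecimal("19.123450"); // the failing value, scale 6
        ByteBuffer encoded = toAvroDecimalBytes(price, 6);
        BigDecimal roundTrip = fromAvroDecimalBytes(encoded.duplicate(), 6);
        System.out.println(roundTrip); // prints 19.123450
    }
}
```

In other words, the exception is consistent with the `BigDecimal` never being run through this kind of conversion before reaching `AvroWriteSupport.writeValue`.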

I am running Sqoop v1.4.7 built against Kite v1.1.1-SNAPSHOT (the master branch), because I noticed that the current Kite release (1.0.0) uses parquet-avro 1.6.0, and I thought that building against parquet-avro 1.8.1 might help. I get the same error with both versions.

Does anyone know what might be wrong? Or, is the answer that this is simply not supported in Sqoop? Any ideas would be greatly appreciated!

Thank you,



Re: Sqoop import into Hive as Parquet fails for decimal type

New Contributor

Hello @subhash_sriram,

I am encountering the same issue. Did you find a solution?

Re: Sqoop import into Hive as Parquet fails for decimal type

Community Manager

@ou As this is an older post, you will have a better chance of receiving a resolution by starting a new thread. This will also give you an opportunity to provide details specific to your environment, which could help others give you a more accurate answer to your question.



Vidya Sargur,
Community Manager

Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.
