Created 08-03-2017 10:50 AM
Hi all,
I am trying to ingest a schema from SQL Server into Hive and I am getting the error below. Could you please advise?
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: columns has 47 elements while columns.types has 44 elements!)
Created 08-04-2017 03:20 AM
The issue seems to be a mismatch between the number of columns coming in from the source and the columns in the Hive table.
Please share the Sqoop import command and also the DDL of the Hive table.
Created 08-04-2017 04:14 AM
If you are importing data from a database, the default field delimiter is a comma (,). If you are using the default delimiter, make sure your data does not contain any commas; the same applies to other delimiters as well.
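To illustrate the point above, here is a small sketch (with made-up data) of how an embedded delimiter character inflates the field count that a delimited-text SerDe such as LazySimpleSerDe would see, which is the same kind of columns-vs-types count mismatch the error reports:

```python
# Hypothetical row with three logical columns: id, name, notes.
# The notes value contains a comma, so a naive split on the default
# comma delimiter produces four fields instead of three.
row = "101,ACME Corp,urgent, call back"

fields = row.split(",")
print(len(fields))  # 4 fields parsed, but only 3 columns were intended
```

This is why either the data must be free of the delimiter character, or a delimiter that cannot appear in the data (e.g. \001) should be chosen.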
Created 08-04-2017 05:08 AM
Thanks @Sindhu and @vekat. Sindhu, I am importing the tables directly into Hive using --hive-import, so I am not defining the Hive DDL myself; it is created automatically. Here is the sqoop import statement:
sqoop import \
  -D map.retry.numRetries=2 \
  -D mapreduce.map.maxattempts=2 \
  -D mapreduce.job.queuename=prod \
  -D ipc.client.connect.max.retries=1 \
  --connect 'jdbc connection string ' \
  --p 'password file ' \
  --username 'XXXXX' \
  --relaxed-isolation \
  --delete-target-dir \
  --fetch-size '10000' \
  -m '1' \
  --query 'select * from [dbo].[EUC_ExtractTextTest] where $CONDITIONS' \
  --target-dir '/apps/hive/warehouse/ngt_us_prod/tmp/ngt_us_prod/NGT_US_PROD_20170803_EUC_ExtractTextTest' \
  --compression-codec 'org.apache.hadoop.io.compress.SnappyCodec' \
  --null-string '\\N' \
  --null-non-string '\\N' \
  --hive-import \
  --hive-table 'NGT_US_PROD_20170803.EUC_ExtractTextTest' \
  --hive-delims-replacement ' '
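For reference, the --hive-delims-replacement ' ' flag in the command above tells Sqoop to replace Hive's default row and field delimiters (\n, \r, and \001) inside string values with a space, so they cannot split rows or fields on import. A minimal sketch of that behavior (the function name and sample value are hypothetical, not Sqoop internals):

```python
# Hive's default delimiters that Sqoop scrubs from string columns
# when --hive-delims-replacement is given.
HIVE_DELIMS = ["\n", "\r", "\x01"]

def replace_hive_delims(value: str, replacement: str = " ") -> str:
    """Replace embedded Hive delimiters, mimicking --hive-delims-replacement."""
    for d in HIVE_DELIMS:
        value = value.replace(d, replacement)
    return value

dirty = "line one\nline two\x01tail"
print(replace_hive_delims(dirty))  # prints "line one line two tail"
```

Note that this scrubs only Hive's default delimiters; if the table were created with a different field delimiter (such as a comma), embedded occurrences of that character would still need separate handling.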