
Sqoop export fails from hive to oracle when column names are numeric


We have a Hive table in Cloudera Data Platform (CDP) that we need to export to Oracle.

Hive:

create external table temp_fns.ABC(account_id NUMBER, `1234` number) stored as orc;

 

Oracle:

create table schema.ABC(account_id decimal(28,0), "1234" decima;(28,0)) ;

 

Sqoop export command:

sqoop export --connect jdbc:oracle:thin:@//server:1521/xyz --username pravin -P --num-mappers 5 --hcatalog-database temp_fns --hcatalog-table ABC --table schema.ABC -- --batch

 

The ORA error I am getting is:

"Error: java.io.IOException: java.sql.SQLSyntaxErrorException: ORA-01747: invalid user.table.column, table.column, or column specification"

We cannot alter the Oracle table structure, and most importantly, the same Sqoop export works from HDP but fails when exporting from CDP.

The error is caused by the numeric column name "1234". When Sqoop generates an INSERT statement such as

insert into schema.ABC (account_id,1234) values (1,234)

the unquoted column name 1234 is parsed as a numeric literal, and Oracle rejects the statement with ORA-01747: invalid user.table.column, table.column, or column specification.
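For illustration only, here is a minimal sketch of the identifier-quoting behavior described above. It uses SQLite via Python's sqlite3 module (not Oracle, which is an assumption of this sketch; SQLite happens to reject an unquoted numeric column name and accept the double-quoted form in the same way ORA-01747 describes):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Column "1234" must be created as a quoted identifier.
cur.execute('create table ABC (account_id integer, "1234" integer)')

# Unquoted, 1234 is parsed as a numeric literal, not a column name,
# so the insert is a syntax error -- analogous to ORA-01747.
unquoted_failed = False
try:
    cur.execute("insert into ABC (account_id, 1234) values (1, 2)")
except sqlite3.OperationalError:
    unquoted_failed = True

# Double-quoting the identifier makes the same insert valid.
cur.execute('insert into ABC (account_id, "1234") values (1, 2)')
```

This is why the Sqoop-generated statement fails: it emits the column name without the double quotes that Oracle requires for a numeric identifier.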

Has anyone encountered this scenario, and how did you tackle it?


Master Mentor

@pravin_speaks
I see a typo in the Oracle create table statement:


create table schema.ABC(account_id decimal(28,0), "1234" decima;(28,0)) ;

Is that an exact copy and paste?
Geoffrey