Created 07-31-2024 02:36 AM
We have a Hive table in Cloudera Data Platform (CDP) and we need to export it to Oracle.
Hive:
create external table temp_fns.ABC(account_id decimal(28,0),
`1234` decimal(28,0)) stored as orc;
Oracle:
create table schema.ABC(account_id decimal(28,0),
"1234" decima;(28,0)) ;
Sqoop export command:
sqoop export --connect jdbc:oracle:thin:@//server:1521/xyz --username pravin -P --num-mappers 5 --hcatalog-database temp_fns --hcatalog-table ABC --table schema.ABC -- --batch
The ORA error I am getting is:
"Error: java.io.IOException: java.sql.SQLSyntaxErrorException: ORA-01747: invalid user.table.column, table.column, or column specification"
We cannot alter the Oracle structure, and most importantly, the same sqoop export works from HDP but fails when exporting from CDP.
The error is caused by the numeric column name "1234": when Sqoop generates an insert statement such as
insert into schema.ABC (account_id, 1234) values (1, 234)
it fails with ORA-01747: invalid user.table.column, table.column, or column specification.
Has anyone ever encountered such a scenario, and if so, how did you tackle it?
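To illustrate the failure, here is a small Python sketch of how a statement generator would need to treat the numeric column. The `quote_ident` and `build_insert` helpers are hypothetical (not part of Sqoop); they only model Oracle's rule that an unquoted identifier must start with a letter, so a bare 1234 in the column list is invalid and must be double-quoted:

```python
import re

# Hypothetical helper modelling Oracle's identifier rules: an unquoted
# identifier must begin with a letter and contain only letters, digits,
# _, $ or #; anything else (e.g. the column name 1234) must be wrapped
# in double quotes to be syntactically valid.
def quote_ident(name: str) -> str:
    if re.fullmatch(r"[A-Za-z][A-Za-z0-9_$#]*", name):
        return name  # already a valid unquoted identifier
    return '"' + name.replace('"', '""') + '"'  # quote, escaping embedded quotes

def build_insert(table: str, columns: list[str]) -> str:
    cols = ", ".join(quote_ident(c) for c in columns)
    binds = ", ".join("?" for _ in columns)
    return f"insert into {table} ({cols}) values ({binds})"

# Without quoting, the generated statement trips ORA-01747; with the
# numeric column quoted, the same insert is syntactically valid:
print(build_insert("schema.ABC", ["account_id", "1234"]))
# insert into schema.ABC (account_id, "1234") values (?, ?)
```

The likely difference between HDP and CDP is therefore whether the generated insert quotes the Oracle column names or emits them bare.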
Created 11-21-2024 10:12 PM
Can you export the variable below before running the sqoop command and see if it helps?
export HADOOP_CLIENT_OPTS="-Dsqoop.oracle.escaping.disabled=false -Djava.security.egd=file:///dev/../dev/urandom"
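If I understand the property correctly, setting sqoop.oracle.escaping.disabled=false makes Sqoop escape (double-quote) Oracle identifiers, so the generated insert references "1234" instead of the bare 1234 that triggers ORA-01747. A sketch of the full invocation, assuming the same host, service, and schema names as in the original command:

```shell
# Enable identifier escaping so the numeric column is emitted as "1234";
# the egd setting avoids slow /dev/random reads during JDBC connection setup.
export HADOOP_CLIENT_OPTS="-Dsqoop.oracle.escaping.disabled=false -Djava.security.egd=file:///dev/../dev/urandom"

sqoop export \
  --connect jdbc:oracle:thin:@//server:1521/xyz \
  --username pravin -P \
  --num-mappers 5 \
  --hcatalog-database temp_fns \
  --hcatalog-table ABC \
  --table schema.ABC \
  -- --batch
```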
Regards,
Chethan YM
Created 08-05-2024 11:11 AM
@pravin_speaks
I see a typo in the Oracle create table statement:
create table schema.ABC(account_id decimal(28,0), "1234" decima;(28,0)) ;
Is that an exact copy and paste?
Geoffrey