07-31-2024
02:36 AM
1 Kudo
We have a Hive table in Cloudera Data Platform (CDP) and need to export it to Oracle.

Hive:
create external table temp_fns.ABC(account_id decimal(28,0), `1234` decimal(28,0)) stored as orc;

Oracle:
create table schema.ABC(account_id decimal(28,0), "1234" decimal(28,0));

Sqoop export command:
sqoop export --connect jdbc:oracle:thin:@//server:1521/xyz --username pravin -P --num-mappers 5 --hcatalog-database temp_fns --hcatalog-table ABC --table schema.ABC -- --batch

The error I am getting is:
Error: java.io.IOException: java.sql.SQLSyntaxErrorException: ORA-01747: invalid user.table.column, table.column, or column specification

We cannot alter the Oracle table structure, and, most importantly, the same Sqoop export works from HDP but fails when exporting from CDP. The error is caused by the numeric column name "1234": when Sqoop generates an INSERT statement such as insert into schema.ABC (account_id, 1234) values (1, 234), Oracle rejects it with ORA-01747. Has anyone encountered this scenario, and how did they tackle it?
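To illustrate the failure mode, here is a minimal Python sketch (a hypothetical helper, not part of Sqoop) of how such an INSERT statement is assembled. An unquoted Oracle identifier must begin with a letter, so a column named 1234 is only valid when double-quoted, but Sqoop emits column names unquoted:

```python
def build_insert(table, columns, quote_numeric=False):
    """Build an Oracle-style INSERT statement with bind placeholders.

    When quote_numeric is True, identifiers that do not start with a
    letter (e.g. "1234") are wrapped in double quotes, which is what
    Oracle requires for such names.
    """
    def q(col):
        if quote_numeric and not col[0].isalpha():
            return '"%s"' % col  # quoted identifier, e.g. "1234"
        return col               # unquoted, as Sqoop generates it
    cols = ", ".join(q(c) for c in columns)
    binds = ", ".join(":%d" % (i + 1) for i in range(len(columns)))
    return "INSERT INTO %s (%s) VALUES (%s)" % (table, cols, binds)

# What Sqoop effectively generates -- Oracle rejects this with ORA-01747:
print(build_insert("schema.ABC", ["ACCOUNT_ID", "1234"]))
# → INSERT INTO schema.ABC (ACCOUNT_ID, 1234) VALUES (:1, :2)

# The form Oracle would accept for a numeric column name:
print(build_insert("schema.ABC", ["ACCOUNT_ID", "1234"], quote_numeric=True))
# → INSERT INTO schema.ABC (ACCOUNT_ID, "1234") VALUES (:1, :2)
```

Since Sqoop itself does not quote the generated column list, a common workaround (assuming you can create objects in a staging schema) is to export into a staging table whose columns all have ordinary names, then INSERT ... SELECT into schema.ABC in Oracle with the "1234" column explicitly quoted.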
11-28-2021
10:16 PM
Hello,

1. First, add the hosts to your /etc/hosts file, e.g. in my case:

$ vi /etc/hosts
192.168.160.134 master.hadoop.com master
192.168.160.135 node1.hadoop.com node1
192.168.160.136 node2.hadoop.com node2

2. Restart cloudera-scm-agent on the CM server:

$ sudo service cloudera-scm-agent restart

This should resolve your issue.

Regards,
PC
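As a quick sanity check of the mapping above, here is a small Python sketch (hypothetical, not part of Cloudera Manager) that parses /etc/hosts-style content and looks up the IP for each hostname or alias:

```python
def parse_hosts(text):
    """Map each hostname/alias to its IP from /etc/hosts-style content."""
    mapping = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        ip, *names = line.split()
        for name in names:  # every alias on the line maps to the same IP
            mapping[name] = ip
    return mapping

# The entries from the reply above:
hosts = """\
192.168.160.134 master.hadoop.com master
192.168.160.135 node1.hadoop.com node1
192.168.160.136 node2.hadoop.com node2
"""

print(parse_hosts(hosts)["node1"])  # → 192.168.160.135
```

Both the fully qualified name (node1.hadoop.com) and the short alias (node1) resolve to the same address, which is what the Cloudera agents need for host lookups to succeed.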
02-25-2021
10:04 PM
Dear @ask_bill_brooks, is it the case that only currently supported Ambari and HDP binaries are available in the Cloudera repository? How can older versions of Ambari and HDP be downloaded once valid access has been granted by Cloudera? Regards, Pravin