Member since: 03-06-2020
Posts: 406
Kudos Received: 56
Solutions: 36
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 915 | 11-21-2024 10:40 PM |
| | 880 | 11-21-2024 10:12 PM |
| | 2695 | 07-23-2024 10:52 PM |
| | 2012 | 05-16-2024 12:27 AM |
| | 6706 | 05-01-2024 04:50 AM |
09-07-2025
10:49 PM
@AEAT, did the response help resolve your query? If it did, please mark the relevant reply as the solution, as that will help others locate the answer more easily in the future.
08-28-2025
09:19 PM
@Pratibha123 ORA-12705 is the key error here. The Oracle JDBC driver needs to set up a language and character-set session with the Oracle database. To do this, it attempts to read NLS configuration data from files on the local filesystem of the machine where Sqoop is running. The error occurs because it cannot find or access those NLS data files, or because the environment variable that points to them is invalid.

ORA-00604 is a cascading error: an internal, recursive SQL statement that Oracle runs during connection/session setup failed because the session setup itself was incomplete.

References:
- https://docs.oracle.com/en/error-help/db/ora-12705/?r=23ai
- https://stackoverflow.com/questions/7700330/ora-12705-cannot-access-nls-data-files-or-invalid-environment
- https://docs.oracle.com/en/error-help/db/ora-00604/?r=23ai
- https://stackoverflow.com/questions/30478070/how-to-solve-sql-error-ora-00604-error-occurred-at-recursive-sql-level-1

"oracle.jdbc.NLS_LANG" does not appear to be a valid driver property. Can you export it before running the job and check?

> export NLS_LANG="AMERICAN_AMERICA.AL32UTF8"

Also ensure that the libjars option points to the correct ojdbc8.jar and that it is compatible with your Oracle DB version.

Regards, Chethan YM
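As a minimal sketch of the suggestion above (the host, service name, credentials, and jar path are placeholders, not values from this thread), the export must happen in the same shell that launches the Sqoop job:

```shell
# Set the NLS environment in the shell that will launch Sqoop.
# AMERICAN_AMERICA.AL32UTF8 is a common default; adjust to your DB's character set.
export NLS_LANG="AMERICAN_AMERICA.AL32UTF8"
echo "NLS_LANG=$NLS_LANG"

# Then run the import from the same shell (placeholders, shown commented out):
# sqoop import \
#   --connect "jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLPDB" \
#   --username scott \
#   --password-file /user/scott/.oracle.pw \
#   --table EMPLOYEES \
#   -libjars /path/to/ojdbc8.jar
```

If the job is submitted from a wrapper script or scheduler, make sure the variable is exported in that environment too, not only in your interactive shell.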
05-07-2025
05:48 AM
1 Kudo
@Yigal It is not supported in Impala. Below is the Jira for your reference; it is still open and not yet resolved: https://issues.apache.org/jira/browse/IMPALA-5226 Regards, Chethan YM
05-05-2025
06:35 AM
Hi @Rich_Learner, can you try this:

```sql
SELECT
  get_json_object(product_json, '$.ProductCOde') AS product_code,
  get_json_object(product_json, '$.Type') AS product_type
FROM customer_table
LATERAL VIEW json_tuple(json_column, 'Customer') c AS customer_json
LATERAL VIEW json_tuple(customer_json, 'products') p AS products_json
LATERAL VIEW explode(from_json(products_json, 'array<map<string,string>>')) product_table AS product_json;
```

OR

```sql
WITH cleaned_json AS (
  SELECT regexp_replace(
           regexp_replace(
             get_json_object(json_column, '$.Customer.products'),
             '\\}\\s*,\\s*\\{', '}~{'
           ),
           '\\[|\\]', ''
         ) AS flat_products
  FROM customer_table
),
split_json AS (
  SELECT split(flat_products, '~') AS product_array
  FROM cleaned_json
)
SELECT
  get_json_object(item, '$.ProductCOde') AS product_code,
  get_json_object(item, '$.Type') AS product_type
FROM split_json
LATERAL VIEW explode(product_array) exploded_table AS item;
```

Ensure your JSON keys match case-sensitively and use a consistent JSON structure. If "offer" is both a number and an array in different objects, consider preprocessing or cleaning up such inconsistencies.

Regards, Chethan YM
05-05-2025
06:31 AM
Hi @rdhau You can go through the Cloudera documentation below to understand how to work with HWC:

- https://docs.cloudera.com/cdp-private-cloud-base/7.1.8/integrating-hive-and-bi/topics/hive_hivewarehouseconnector_for_handling_apache_spark_data.html
- https://docs.cloudera.com/cdp-private-cloud-base/7.3.1/integrating-hive-and-bi/topics/hive-hwc-reader-mode.html
- https://docs.cloudera.com/cdw-runtime/1.5.4/hive-metastore/topics/hive_apache_spark_hive_connection_configuration.html

Regards, Chethan YM
04-02-2025
01:44 PM
Adding to what @ChethanYM said, such traces can also arise from a third-party tool when the connection is closed abruptly, or from an issue with the proxy host/Knox if you are using one in your JDBC connection settings. If you would like to debug this further, JDBC driver DEBUG logs would be helpful. To generate them, append the below to the JDBC connection string and reproduce the issue; the driver will then write DEBUG logs that may give more detail about the problem.

LogLevel=6;LogPath=/tmp/jdbclog

Regards, Krish
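As an illustrative sketch (the hostname and base URL below are placeholders, and the helper function is hypothetical, not part of any driver API), appending the two logging options to an existing JDBC URL could look like this:

```python
def with_driver_logging(jdbc_url: str, log_dir: str = "/tmp/jdbclog") -> str:
    """Append the driver's semicolon-separated debug-log options to a JDBC URL."""
    sep = "" if jdbc_url.endswith(";") else ";"
    return f"{jdbc_url}{sep}LogLevel=6;LogPath={log_dir}"

# Placeholder URL for illustration only:
url = "jdbc:hive2://gateway.example.com:443/default;ssl=1;transportMode=http"
print(with_driver_logging(url))
```

After reproducing the issue, the log files under the `LogPath` directory can be attached when raising a support case; remember to remove the options again once debugging is done, since level 6 logging is verbose.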
02-13-2025
05:42 AM
Hi, I have not tested this, but you could try something like the following and see if it works:

```shell
curl -X POST "https://cdp.company.com/gateway/cdp-proxy-api/Impala/api/v1/query" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "SELECT * FROM sales_data WHERE date >= current_date - interval 7 days;",
    "database": "analytics_db",
    "async": false
  }'
```

Cloudera API overview: https://docs.cloudera.com/cdp-public-cloud/cloud/api/topics/mc-api-overview.html

Regards, Chethan YM
12-20-2024
01:40 AM
1 Kudo
I think the problem partly has to do with our Python 3.8 installation, which we did via Anaconda. Cloudera recommends using yum to install rh-python38 on RHEL/OL7, as I mentioned in the previous message. Documentation is here: Installing Python 3.8 standard package on RHEL 7 | CDP Private Cloud. That installation resolved most of the Web Server issues.

The Web Server issue for Impala has to do not only with the Python installation but also with the Web Server username and password. Below are the actions performed to resolve the Impala Web Server issue after enabling hadoop_secure_web_ui.

WORK PERFORMED:

Removed the below configurations from the CM UI:
- Impala > Configuration > Catalog Server > Web Server Username
- Impala > Configuration > Catalog Server > Web Server User Password
- Impala > Configuration > Impala Daemon > Web Server Username
- Impala > Configuration > Impala Daemon > Web Server User Password
- Impala > Configuration > Statestore > Web Server Username
- Impala > Configuration > Statestore > Web Server User Password

Then:
- Enabled "Enable Kerberos Authentication for HTTP Web-Consoles" under CM UI > Impala > Configurations
- Restarted the Impala service.

Also, regarding Impala, this Cloudera documentation was quite helpful: Configuring Impala Web UI | CDP Public Cloud. The issue is resolved now by following the instructions in the above documentation.
11-22-2024
04:56 AM
1 Kudo
If you suspect that TEZ-4032 is the cause, consider upgrading your cluster to CDP 7 and testing again, as the fix has been backported to CDP 7.
11-21-2024
10:40 PM
1 Kudo
Hi @mrblack To avoid a full table scan you can follow these tips:

1. Ensure proper partition pruning: https://impala.apache.org/docs/build/html/topics/impala_partitioning.html
2. Rewrite the query with subqueries.
3. Add explicit hints for join behaviour. Impala supports join hints like BROADCAST and SHUFFLE that can influence query planning.

After optimising, check the EXPLAIN plan.

Regards, Chethan YM
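As a rough sketch of partition pruning and join hints (the `sales` and `dim_store` tables and their columns are made up for illustration, and `dim_store` is assumed to exist), a pruned, hinted query could look like:

```sql
-- Hypothetical partitioned fact table.
CREATE TABLE sales (
  id BIGINT,
  store_id INT,
  amount DECIMAL(10,2)
)
PARTITIONED BY (sale_date STRING)
STORED AS PARQUET;

-- Filtering on the partition column lets Impala prune partitions
-- instead of scanning the whole table; the /* +BROADCAST */ hint
-- asks the planner to broadcast the smaller table in the join.
SELECT s.id, s.amount, d.store_name
FROM sales s
JOIN /* +BROADCAST */ dim_store d
  ON s.store_id = d.store_id
WHERE s.sale_date = '2024-11-01';

-- Verify pruning: the EXPLAIN output for the sales scan node
-- should report something like "partitions=1/N".
EXPLAIN
SELECT s.id FROM sales s WHERE s.sale_date = '2024-11-01';
```

If the plan still shows all partitions being read, check that the filter is applied directly to the partition column (not wrapped in a function), since that can defeat pruning.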