Member since
10-28-2020
624
Posts
47
Kudos Received
41
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 759 | 01-21-2026 01:59 AM |
| | 3958 | 02-17-2025 06:54 AM |
| | 9538 | 07-23-2024 11:49 PM |
| | 2024 | 05-28-2024 11:06 AM |
| | 2740 | 05-05-2024 01:27 PM |
03-24-2026
12:38 AM
@mohammad_shamim You cannot copy a managed (ACID) table using HDFS GET/PUT commands: ACID tables have writeIDs associated with them, and if that information is missing in HMS, you will not be able to read the data files. Here is the supported way to copy/move a managed table: 1. Create an external table first, on top of the new HDFS path:
CREATE EXTERNAL TABLE ext_source_table (
col1 INT,
col2 STRING,
col3 DOUBLE,
col4 DATE
)
STORED AS ORC
LOCATION '[HDFS PATH]';
2. Run MSCK REPAIR on the external table and check that you can read it.
MSCK REPAIR TABLE ext_source_table;
3. Use a CREATE TABLE AS SELECT statement to create the target managed ACID table from that external table, e.g.
CREATE TABLE target_managed_table
AS
SELECT * FROM ext_source_table;
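Once the CTAS finishes, a quick sanity check (my addition, not part of the original steps) is to compare row counts between the two tables:

```sql
-- Both counts should match if the copy succeeded.
SELECT COUNT(*) FROM ext_source_table;
SELECT COUNT(*) FROM target_managed_table;
```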
01-21-2026
01:59 AM
@gurumoorthyk Iceberg is not supported with Hive in CDP on-prem 7.1.9. You may enable it using the runtime jars, but some functionality may not work, and Cloudera does not support that setup. Iceberg with Hive is supported from 7.3.1 (on-prem). The procedure you have listed to load the Iceberg runtime jar onto the Hive classpath is correct.
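For reference, loading the runtime jar ad hoc in a Hive session would look roughly like this (the jar path below is an illustrative placeholder, and as noted above this setup is unsupported on 7.1.9):

```sql
-- Placeholder path; substitute the actual Iceberg Hive runtime jar for your build.
ADD JAR /opt/jars/iceberg-hive-runtime.jar;
```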
05-17-2025
04:34 PM
1 Kudo
@AlexDriver I tried this with the latest ODBC driver, 2.8.2, used it with pyodbc, and I can reproduce the issue. The error I get is: Traceback (most recent call last):
File "/hive/pyodbc_test.py", line 13, in <module>
for row in cursor.fetchall():
^^^^^^^^^^^^^^^^^
pyodbc.ProgrammingError: No results. Previous SQL was not a query. I have raised an enhancement request with the driver team.
05-16-2025
11:25 AM
@AlexDriver could you share the ODBC driver version you are using, so that we can test this?
05-16-2025
06:36 AM
Could you please share both versions and mark this thread as resolved?
05-15-2025
01:04 PM
Could you test with a multiline comment, just to see if it works? SELECT *
FROM (/* Request ID: ... */ SELECT version()) AS Subquery__9
LIMIT 1; What's the ODBC driver version that you are using? Is it from Cloudera?
05-15-2025
12:49 PM
@AlexDriver Is it possible to enable trace logging in the driver and share the logs, and also check how the query shows up in the HiveServer2 logs?
05-13-2025
11:47 PM
@AlexDriver So our query is essentially this: SELECT *
FROM (
SELECT version()
) AS Subquery__9
LIMIT 1; I think the driver connector is unable to parse the comment. You may enable native query mode by setting UseNativeQuery=1; this way the driver does not try to transform the query, and it runs directly in Hive. Try this and see if it works. Driver user guide - https://downloads.cloudera.com/connectors/Cloudera_Hive_ODBC_2.8.2.1002/Cloudera-ODBC-Driver-for-Apache-Hive-Install-Guide.pdf
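As a sketch, the flag can go straight into a DSN-less connection string; the host, port, and credentials below are placeholders:

```
Driver=Cloudera ODBC Driver for Apache Hive;Host=hs2-host.example.com;Port=10000;AuthMech=3;UID=user;PWD=pass;UseNativeQuery=1
```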
02-17-2025
06:54 AM
Ahh @Rich_Learner please use the following query. I just tested it; it should work. WITH json_extract AS (
SELECT
get_json_object(xml_data, '$.app.Id') AS ID,
get_json_object(xml_data, '$.app.apply[0].flag') AS Flag,
regexp_replace(regexp_replace(get_json_object(xml_data, '$.app.apply[0].Product'), '\\[|\\]', ''), '\\}\\,\\{', '\\}\\;\\{') AS products
FROM check
)
SELECT
ID,
Flag,
get_json_object(product_data, '$.Code') AS Code,
get_json_object(product_data, '$.Line') AS Line,
get_json_object(product_data, '$.status') AS Status
FROM json_extract
LATERAL VIEW explode(split(products, ';')) p AS product_data;
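The regexp_replace/split trick above can be illustrated outside Hive. A minimal Python sketch of the same transformation, using a made-up products array in the shape the query expects:

```python
import json
import re

# A record shaped like $.app.apply[0].Product from the post
# (field names from the post; the values here are made up).
products_json = '[{"Code":"A1","Line":"L1","status":"ok"},{"Code":"B2","Line":"L2","status":"new"}]'

# Mirror the Hive logic: strip the surrounding [ ], then turn the
# "},{" boundaries between objects into "};{", and split on ";".
stripped = re.sub(r'\[|\]', '', products_json)
separated = re.sub(r'\},\{', '};{', stripped)
parts = separated.split(';')

# Each part is now a standalone JSON object, i.e. one row of the LATERAL VIEW.
rows = [(json.loads(p)['Code'], json.loads(p)['Line'], json.loads(p)['status'])
        for p in parts]
print(rows)  # [('A1', 'L1', 'ok'), ('B2', 'L2', 'new')]
```

This is why the intermediate regexp_replace to ";" is needed: splitting directly on "," would cut inside each object, but after rewriting the object boundaries, every split piece parses cleanly.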
02-13-2025
08:54 AM
Try : WITH json_extract AS (
SELECT
get_json_object(xml_data, '$.app.Id') AS ID,
get_json_object(xml_data, '$.app.apply[0].flag') AS Flag,
get_json_object(xml_data, '$.app.apply[0].Product') AS products
FROM check
)
SELECT
ID,
Flag,
get_json_object(product_data, '$.Code') AS Code,
get_json_object(product_data, '$.Line') AS Line,
get_json_object(product_data, '$.status') AS Status
FROM json_extract
LATERAL VIEW OUTER explode(split(products, '},')) p AS product_data;