Member since: 10-28-2020
Posts: 578
Kudos Received: 46
Solutions: 40
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 787 | 02-17-2025 06:54 AM
 | 5005 | 07-23-2024 11:49 PM
 | 872 | 05-28-2024 11:06 AM
 | 1428 | 05-05-2024 01:27 PM
 | 907 | 05-05-2024 01:09 PM
03-10-2025
09:49 PM
2 Kudos
Sorry for my late response. I did try with the JDK version mentioned in the driver documentation, but it didn't work. However, I am now using the keytab method for connecting, and I am fine with it. @asish thanks a ton for all the support.
02-17-2025
07:55 AM
Thanks a lot. It worked like magic 😄
02-07-2025
11:50 AM
@zorrofrombrasil Did you get a chance to check the updates from @ggangadharan and @smruti? If so, did the information help fix the issue, or do you have any other concerns?
12-17-2024
10:14 AM
@Viki_Nodejs if you haven't resolved this issue yet, could you try the steps below and let us know how it goes?

1. Install the Required NPM Package
Use the hive-driver package for Node.js, which supports HiveServer2 over HTTP/HTTPS.

npm install hive-driver

2. Prerequisites
Ensure you have:
- HiveServer2 URL: includes the hostname and port.
- SSL configuration: paths to your .jks trust store and its password.
- Hive httppath: set to cliservice.
- Authentication details (if required): username/password or Kerberos configuration.

3. Configure the Connection
Here's an example of how to set up the connection using the hive-driver:

const { HiveClient, TCLIServiceTypes } = require('hive-driver');

async function connectToHive() {
  const client = new HiveClient(TCLIServiceTypes);

  // Configure the Hive connection
  const connection = client.connect({
    host: '<HIVE_SERVER_HOSTNAME>',   // e.g., hive.example.com
    port: 10001,                      // HiveServer2 port, typically 10001 for HTTPS
    options: {
      path: '/cliservice',            // HTTP path to HiveServer2
      ssl: true,                      // Enable SSL
      sslOptions: {
        rejectUnauthorized: true,     // Ensure certificates are verified
        ca: '<path/to/truststore.pem>' // Convert your JKS truststore to PEM format
      },
      // Authentication
      username: '<YOUR_USERNAME>',
      password: '<YOUR_PASSWORD>',
      // You can add session configurations here
    }
  });

  try {
    // Open the connection
    await connection.openSession();
    console.log('Connected to Hive');

    // Example query
    const result = await connection.executeStatement('SELECT * FROM your_table LIMIT 10');
    console.log(result);

    // Close the session
    await connection.closeSession();
  } catch (error) {
    console.error('Error connecting to Hive:', error);
  } finally {
    // Ensure the connection is closed
    await connection.close();
  }
}

connectToHive();

4. Key Point to Note: SSL Truststore [Very Important]
Hive uses .jks files for its truststore, but hive-driver requires a .pem file for SSL. Convert your .jks file to .pem using the following commands:

keytool -importkeystore -srckeystore truststore.jks -destkeystore truststore.p12 -deststoretype PKCS12
openssl pkcs12 -in truststore.p12 -out truststore.pem -nokeys

I also saw an EAI_FAIL error in the screenshot; that error indicates the hostname could not be resolved via DNS. Hope this helps.
11-26-2024
12:06 AM
1 Kudo
I'm 3 years late, but in case anyone else is facing the same issue, try using auth='NOSASL'. The issue may not occur with newer versions.
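For reference, here is a minimal sketch of what that looks like with PyHive, assuming that is the client in use; the hostname, port, and database below are placeholders, not values from the original thread:

# Minimal PyHive sketch, assuming HiveServer2 is configured with
# hive.server2.authentication=NOSASL; host/port/database are placeholders.
from pyhive import hive

conn = hive.Connection(
    host='hiveserver2.example.com',  # placeholder hostname
    port=10000,                      # default binary-transport port
    database='default',
    auth='NOSASL'                    # match the server's NOSASL setting
)
cursor = conn.cursor()
cursor.execute('SHOW DATABASES')
print(cursor.fetchall())
cursor.close()
conn.close()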
10-16-2024
08:00 AM
Hi @NagendraKumar, you can use PutHiveQL as described in this article: https://community.cloudera.com/t5/Support-Questions/how-to-use-puthiveql-in-NIFI/td-p/204542
10-02-2024
07:12 AM
@NagendraKumar were you able to truncate the data with the processor that my colleague Smruti mentioned?
09-18-2024
01:22 AM
1 Kudo
@zhuodongLi, did the responses help resolve your query? If so, kindly mark the relevant reply as the solution; it will help others locate the answer more easily in the future.
08-18-2024
12:52 PM
Until now, you could query JDBC data sources from Hive using JDBC Storage Handler. However, with the release of CDP 7.2.18 (Public Cloud runtime) and 7.1.9 SP 1 (Private Cloud Base), you can also query Hive tables hosted in a different environment using a similar method. Here's a brief demo:
0: jdbc:hive2://ccycloud-1.sbiswal.root.como> CREATE EXTERNAL TABLE loc_hive_table(c1 int, c2 string) STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler' TBLPROPERTIES(
. . . . . . . . . . . . . . . . . . . . . . .> "hive.sql.database.type"="HIVE",
. . . . . . . . . . . . . . . . . . . . . . .> "hive.sql.jdbc.driver"="org.apache.hive.jdbc.HiveDriver",
. . . . . . . . . . . . . . . . . . . . . . .> "hive.sql.jdbc.url"="jdbc:hive2://sbiswal-jdbctest-master0.sup-defa.xxxxxxx.a4.cloudera.site/db1;ssl=true;transportMode=http;httpPath=sbiswal-jdbctest/cdp-proxy-api/hive",
. . . . . . . . . . . . . . . . . . . . . . .> "hive.sql.dbcp.username"="user1",
. . . . . . . . . . . . . . . . . . . . . . .> "hive.sql.dbcp.password"="********",
. . . . . . . . . . . . . . . . . . . . . . .> "hive.sql.table"="src_tbl",
. . . . . . . . . . . . . . . . . . . . . . .> "hive.sql.dbcp.maxActive"="1");
0: jdbc:hive2://ccycloud-1.sbiswal.root.como> select * from loc_hive_table;
INFO : Compiling command(queryId=hive_20240818172147_9c14d21e-1802-43a6-b5a4-0846ff5bad7f): select * from loc_hive_table
INFO : Semantic Analysis Completed (retrial = false)
INFO : Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:loc_hive_table.c1, type:int, comment:null), FieldSchema(name:loc_hive_table.c2, type:string, comment:null)], properties:null)
INFO : Completed compiling command(queryId=hive_20240818172147_9c14d21e-1802-43a6-b5a4-0846ff5bad7f); Time taken: 10.033 seconds
INFO : Executing command(queryId=hive_20240818172147_9c14d21e-1802-43a6-b5a4-0846ff5bad7f): select * from loc_hive_table
INFO : Completed executing command(queryId=hive_20240818172147_9c14d21e-1802-43a6-b5a4-0846ff5bad7f); Time taken: 0.02 seconds
INFO : OK
+---------------------+---------------------+
| loc_hive_table.c1 | loc_hive_table.c2 |
+---------------------+---------------------+
| 1 | abc |
| 2 | def |
+---------------------+---------------------+
2 rows selected (12.581 seconds)
0: jdbc:hive2://ccycloud-1.sbiswal.root.como>
The main point to highlight here is "hive.sql.database.type"="HIVE".
In this example, the Hive table db1.src_tbl resides in a remote cluster. To read it from a different Hive cluster, an external table loc_hive_table was created leveraging the JdbcStorageHandler. As you can see, the data was read successfully.
If the table to be queried is under a non-default database, it can be referenced in either of two ways: mention the DB name in the JDBC connection string, as used above, or set "hive.sql.schema"="db1.test1" in the TBLPROPERTIES.
You can also query the table from a secured cluster using LDAP authentication. Please note that you can only read from, and not write to, the remote Hive table using this approach.
If you want to use any aggregate function in the query and encounter either of the following errors in the YARN application, you may need to add some JARs at the session level:
java.lang.ClassNotFoundException: org.apache.hive.jdbc.HiveDriver
java.lang.NoClassDefFoundError: org/apache/hive/service/cli/HiveSQLException
Add the equivalent versions of these JARs from your local parcel directory:
0: jdbc:hive2://ccycloud-1.sbiswal.root.como> ADD JAR /opt/cloudera/parcels/CDH/lib/hive/lib/hive-jdbc-3.1.3000.7.1.9.1000-103.jar;
0: jdbc:hive2://ccycloud-1.sbiswal.root.como> ADD JAR /opt/cloudera/parcels/CDH/lib/hive/lib/hive-service-3.1.3000.7.1.9.1000-103.jar;
Now you should be able to run your query successfully.
08-12-2024
04:46 AM
1 Kudo
Just got access to the article. By the way, this issue also occurs in MySQL setups that are not Galera clusters. Even when I set up the same MariaDB as master/slave, the same issue occurs, so this is definitely not a problem specific to the Galera cluster.