Member since: 02-07-2019
Posts: 2690
Kudos Received: 235
Solutions: 30
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1089 | 04-15-2025 10:34 PM
 | 3279 | 10-28-2024 12:37 AM
 | 1418 | 09-04-2024 07:38 AM
 | 3241 | 06-10-2024 10:24 PM
 | 1382 | 02-01-2024 10:51 PM
01-06-2025
08:15 AM
@Shelton / @MattWho, my NiFi is behind a corporate proxy, and because of that, in production, NiFi is not able to reach the Azure OIDC discovery URL. Could you please help me with this? Thanks, spiker
12-19-2024
09:23 PM
1 Kudo
Hi @SAMSAL, this works really well. Thank you so much for your solution. I understand your idea of splitting the JSON and performing the transformation in a SQL table, and I will work on that. Thank you again.
12-17-2024
09:56 PM
1 Kudo
Welcome to our community! To help you get the best possible answer, I have tagged our Airflow expert @smdas, who may be able to assist you further. Please feel free to provide any additional information or details about your query. We hope you will find a satisfactory solution to your question.
12-17-2024
10:31 AM
@denysobukhov If this issue hasn't been resolved, I suspect the HiveServer2 idle timeouts and the thread pool size. Can you please try the below and share the outcome?

1. Address server-side resource or timeout issues

Increase the HiveServer2 idle timeouts. By default, HiveServer2 may close idle connections after a certain period. Update the HiveServer2 config:
- hive.server2.idle.session.timeout (default: 600000 ms / 10 minutes). Set it to a larger value, e.g., 3600000 (1 hour).
- hive.server2.idle.operation.timeout (default: 5 minutes for operations). Increase it to match your app's use case.

SET hive.server2.idle.session.timeout=3600000;
SET hive.server2.idle.operation.timeout=3600000;

Adjust the thread pool size. If HiveServer2 runs out of threads to handle requests, it can drop connections:
- First check how many threads are currently in use against the default hive.server2.thrift.max.worker.threads:
jstack -l <HiveServer2_ProcessId> | grep ".Thread.Stat" | wc -l
- Increase hive.server2.thrift.max.worker.threads to a higher value in the HiveServer2 configuration.
- Restart HiveServer2 after the changes.

Happy hadooping
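If it helps, here is a minimal sketch for reading the current values back from Beeline before and after the change; the host name is a placeholder and 10000 is assumed to be the HiveServer2 binary-transport port:

beeline -u "jdbc:hive2://<hs2-host>:10000/default" \
  -e "SET hive.server2.idle.session.timeout; SET hive.server2.idle.operation.timeout; SET hive.server2.thrift.max.worker.threads;"
# SET <property>; without a value prints the current setting for the session.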
12-17-2024
10:14 AM
@Viki_Nodejs if you haven't resolved this issue, could you try the below steps and revert?

1. Install the Required NPM Package

Use the hive-driver package for Node.js, which supports HiveServer2 over HTTP/HTTPS.

npm install hive-driver

2. Prerequisites

Ensure you have:
- HiveServer2 URL: includes the hostname and port.
- SSL configuration: paths to your .jks trust store and its password.
- Hive httppath: set to cliservice.
- Authentication details (if required): username/password or Kerberos configuration.

3. Configure the Connection

Here's an example of how to set up the connection using hive-driver:

const { HiveClient, TCLIServiceTypes } = require('hive-driver');

async function connectToHive() {
  const client = new HiveClient(TCLIServiceTypes);

  // Configure the Hive connection
  const connection = client.connect({
    host: '<HIVE_SERVER_HOSTNAME>',   // e.g., hive.example.com
    port: 10001,                      // HiveServer2 port, typically 10001 for HTTPS
    options: {
      path: '/cliservice',            // HTTP path to HiveServer2
      ssl: true,                      // Enable SSL
      sslOptions: {
        rejectUnauthorized: true,     // Ensure certificates are verified
        ca: '<path/to/truststore.pem>' // Convert your JKS truststore to PEM format
      },
      // Authentication
      username: '<YOUR_USERNAME>',
      password: '<YOUR_PASSWORD>',
      // You can add session configurations here
    }
  });

  try {
    // Open the connection
    await connection.openSession();
    console.log('Connected to Hive');

    // Example query
    const result = await connection.executeStatement('SELECT * FROM your_table LIMIT 10');
    console.log(result);

    // Close the session
    await connection.closeSession();
  } catch (error) {
    console.error('Error connecting to Hive:', error);
  } finally {
    // Ensure the connection is closed
    await connection.close();
  }
}

connectToHive();

4. Key Point to Note: SSL Truststore [Very Important]

Hive uses .jks files for its truststore, but hive-driver requires a .pem file for SSL. Convert your .jks file to .pem using the following commands:

keytool -importkeystore -srckeystore truststore.jks -destkeystore truststore.p12 -deststoretype PKCS12
openssl pkcs12 -in truststore.p12 -out truststore.pem -nokeys

I also saw an EAI_FAIL error in the screenshot; that indicates the hostname could not be resolved in DNS.

Hope this helps
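One extra check that may be useful: after the conversion you can confirm the PEM file is readable before pointing sslOptions.ca at it (note that openssl only prints the first certificate in the bundle):

openssl x509 -in truststore.pem -noout -subject -dates   # show subject and validity of the first cert in the converted truststore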
12-16-2024
02:13 AM
3 Kudos
@JackieW, I am in charge of the Public Cloud Management Console documentation. I wrote the mentioned bit of documentation based on the current UI (which marks this option as deprecated) and in alignment with the developer team in charge of the feature. I have just consulted the head of the responsible developer team regarding your comment, and they confirmed that although the delegated subnet option is not deprecated by Microsoft, it is deprecated by Cloudera, as we now support the much more favorable Private Link option.
12-12-2024
10:08 AM
1 Kudo
@irshan When you add the Balancer as a role in the HDFS cluster, it will indeed show as not started, so that is expected. Coming to your main query: it is possible that when you run the balancer, DataNode utilization is already within the default threshold of 10 percent, so it won't move any blocks. You may have to reduce the balancer threshold and try again, as in the sketch below.
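A minimal sketch, assuming the balancer is run from the command line; the threshold and bandwidth values are only illustrative:

hdfs dfsadmin -setBalancerBandwidth 104857600   # optionally raise the per-DataNode balancing bandwidth to ~100 MB/s
hdfs balancer -threshold 5                      # rebalance until each DataNode is within 5% of the cluster average utilization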
12-12-2024
10:02 AM
1 Kudo
@Remme Though the procedure you followed might have helped you, on a larger cluster with TBs of data it is not a viable option. In that case, I would advise working with Cloudera Support.
12-12-2024
09:28 AM
@darshanhira There are not many changes to the NFS Gateway in CDP 7.1.8; the issue you are facing might be due to an underlying Linux issue. Please check whether a stale NFS process is blocking the NFS Gateway startup. Also check whether any other process is holding port 2049; if so, that can also prevent the NFS Gateway service from starting. A few checks are sketched below. Please refer to our documentation as well: https://docs.cloudera.com/cdp-private-cloud-base/7.3.1/scaling-namespaces/topics/hdfs-using-the-nfs-gateway-for-accessing-hdfs.html
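A minimal sketch of those checks on the gateway host; command availability and the nfs-server service name depend on your Linux distribution:

ps -ef | grep -i nfs            # look for stale NFS processes left over from a previous run
ss -lntp | grep ':2049'         # see which process, if any, is already bound to port 2049
rpcinfo -p localhost            # confirm what is currently registered with the portmapper
systemctl stop nfs-server       # stop the native Linux NFS server if it is running (service name may differ)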