Member since: 02-07-2019
Posts: 2713
Kudos Received: 237
Solutions: 31
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 952 | 08-21-2025 10:43 PM |
| | 1730 | 04-15-2025 10:34 PM |
| | 4552 | 10-28-2024 12:37 AM |
| | 1815 | 09-04-2024 07:38 AM |
| | 3647 | 06-10-2024 10:24 PM |
12-17-2024
10:14 AM
@Viki_Nodejs if you haven't resolved this issue, could you try the steps below and revert?

**1. Install the required npm package**

Use the hive-driver package for Node.js, which supports HiveServer2 over HTTP/HTTPS.

```shell
npm install hive-driver
```

**2. Prerequisites**

Ensure you have:

- HiveServer2 URL: includes the hostname and port.
- SSL configuration: paths to your .jks trust store and its password.
- Hive httppath: set to cliservice.
- Authentication details (if required): username/password or Kerberos configuration.

**3. Configure the connection**

Here's an example of how to set up the connection using hive-driver:

```javascript
const { HiveClient, TCLIServiceTypes } = require('hive-driver');

async function connectToHive() {
  const client = new HiveClient(TCLIServiceTypes);

  // Configure the Hive connection
  const connection = client.connect({
    host: '<HIVE_SERVER_HOSTNAME>',    // e.g., hive.example.com
    port: 10001,                       // HiveServer2 port, typically 10001 for HTTPS
    options: {
      path: '/cliservice',             // HTTP path to HiveServer2
      ssl: true,                       // Enable SSL
      sslOptions: {
        rejectUnauthorized: true,      // Ensure certificates are verified
        ca: '<path/to/truststore.pem>' // Convert your JKS truststore to PEM format
      },
      // Authentication
      username: '<YOUR_USERNAME>',
      password: '<YOUR_PASSWORD>',
      // You can add session configurations here
    }
  });

  try {
    // Open the connection
    await connection.openSession();
    console.log('Connected to Hive');

    // Example query
    const result = await connection.executeStatement('SELECT * FROM your_table LIMIT 10');
    console.log(result);

    // Close the session
    await connection.closeSession();
  } catch (error) {
    console.error('Error connecting to Hive:', error);
  } finally {
    // Ensure the connection is closed
    await connection.close();
  }
}

connectToHive();
```

**4. Key point: the SSL truststore (very important)**

Hive uses .jks files for its truststore, but hive-driver requires a .pem file for SSL. Convert your .jks file to .pem using the following commands:

```shell
keytool -importkeystore -srckeystore truststore.jks -destkeystore truststore.p12 -deststoretype PKCS12
openssl pkcs12 -in truststore.p12 -out truststore.pem -nokeys
```

I also saw an EAI_FAIL error in the screenshot; that indicates the HiveServer2 hostname could not be resolved via DNS. Hope this helps.
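The EAI_FAIL part can be checked independently of Hive: if the hostname does not resolve at the OS level, the driver fails before TLS even starts. A minimal sketch of that check (hive.example.com is a placeholder; substitute your actual HiveServer2 host):

```shell
# Resolve the HiveServer2 hostname; an EAI_FAIL/EAI_NONAME error from Node
# generally means this OS-level lookup fails too.
getent hosts hive.example.com || echo "DNS lookup failed for hive.example.com"
```

If the lookup fails, fix /etc/hosts or your DNS configuration before revisiting the SSL settings.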
12-16-2024
02:13 AM
3 Kudos
@JackieW , I am in charge of the Public Cloud Management Console documentation. I wrote the mentioned bit of documentation based on the current UI (which marks this option as deprecated) and in alignment with the developer team in charge of the feature. I have just consulted the head of the responsible developer team regarding your comment, and they have confirmed: although the delegated subnet option is not deprecated by Microsoft, it is deprecated by Cloudera, as we now support the much more favorable Private Link option.
12-12-2024
10:08 AM
1 Kudo
@irshan When you add the Balancer as a role in the HDFS cluster, it will indeed show as not started, so that is expected. Coming to your main query: when you run the balancer, the cluster may already be within the default threshold of 10 percent, in which case it won't move any blocks. You may have to reduce the balancer threshold and try again.
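As a sketch of that last step (assuming shell access to a node with the HDFS client installed; 5 is just an example value, not a recommendation):

```shell
# Run the HDFS balancer with a tighter threshold than the default 10.
# A threshold of 5 asks the balancer to bring each DataNode's utilization
# to within 5 percentage points of the overall cluster average.
hdfs balancer -threshold 5
```

The lower the threshold, the more blocks the balancer will move, so start modestly and watch the balancer log for progress.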
12-12-2024
10:02 AM
1 Kudo
@Remme Though the procedure you followed might have helped you, on a larger cluster with TBs of data it is not a viable option. In that case, I would advise working with Cloudera Support.
12-12-2024
09:28 AM
@darshanhira , There are not many changes on the NFS Gateway side in CDP 7.1.8; the issue you are facing might be due to an underlying Linux issue. Please check whether there is a stale NFS process blocking the NFS Gateway startup. Also check whether any other process is holding port 2049; if so, this can also prevent the NFS Gateway service from starting. Please refer to our documentation as well: https://docs.cloudera.com/cdp-private-cloud-base/7.3.1/scaling-namespaces/topics/hdfs-using-the-nfs-gateway-for-accessing-hdfs.html
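A quick way to check both conditions from a shell on the gateway host (a sketch assuming `ss` and `ps` are available; `netstat -tlnp` works as an alternative to `ss`):

```shell
# Check whether some process is already bound to the NFS port 2049.
ss -ltnp | grep ':2049' || echo "port 2049 is free"

# Look for stale NFS-related processes that may block the gateway startup.
ps -ef | grep -i '[n]fs' || echo "no nfs processes found"
```

If a native Linux NFS server (e.g. nfsd) shows up, it must be stopped before the HDFS NFS Gateway can bind to the port.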
12-11-2024
08:59 PM
2 Kudos
Hi Samsal, Firstly, I want to thank you for taking the time to solve my query. The solution you provided worked like magic. Secondly, yes, I am new to this platform and also to JOLT; moving forward I will follow your tips and suggestions and will go through the courses you've shared. Once again, thank you for your valuable assistance. It made a significant difference. I am grateful.
12-11-2024
06:33 AM
Hello, These are NOT ERRORS:

```
INFO conf.Configuration: resource-types.xml not found.
INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
```

As for this:

```
INFO mapreduce.Job: map 0% reduce 0%
```

How many mappers were specified for the import? Try locating the running containers in YARN and take a few jstacks to find out whether the mapper is stuck waiting on your source database; if so, make sure there are no firewall/network rules preventing the flow of data. Are you able to execute `sqoop eval` against the source DB? If so, try using the options:

```
-jt local
-m 1
--verbose
```

If the job completes, that would confirm a communication issue from your NodeManagers to the source DB.
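A hypothetical invocation putting those options together (the JDBC URL, database, table, and username are placeholders, not details from the original question):

```shell
# Debug run of the import: -jt local runs the job with the local job runner
# (bypassing YARN), -m 1 uses a single mapper, --verbose enables detailed
# logging. Note that -jt is a generic Hadoop argument and must come before
# the tool-specific options.
sqoop import \
  -jt local \
  --connect jdbc:mysql://source-db.example.com:3306/sales \
  --username sqoop_user -P \
  --table orders \
  -m 1 \
  --verbose
```

If this local run succeeds while the YARN-based run hangs at `map 0%`, the problem is almost certainly network reachability from the NodeManager hosts to the database, not Sqoop itself.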
12-11-2024
01:16 AM
1 Kudo
@VidyaSargur it somewhat helped. It was failing because we had an NFS client running on that server. Since we have a customer-facing client -> server architecture for NFS, we could not start the HDFS NFS Gateway again on the same port. So, the only solution was to stop the HDFS NFS Gateway.
12-05-2024
03:08 AM
1 Kudo
@tono425, Thank you for your participation in the Cloudera Community. I'm happy to see you resolved your issue. Please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
12-03-2024
04:36 AM
1 Kudo
Hi @Mikhai , It's hard to say what is going on without looking at the data itself or seeing the ExcelReader configuration. I know providing the data is not easy, but if you can replicate the issue using dummy data, then please share it. Also, please provide more details on how you configured the ExcelReader; for example, are you using a custom schema or inferring the schema?

I would try the following:

1. Try to find the table boundary in Excel and delete the empty rows. If you can't, then for the sake of testing, copy the table with just the rows you need into a new Excel file and see if that works.
2. If the ExcelReader works with the 545 rows, then I would provide a custom schema (if one is not provided already) and set some of the fields that should always have a value to not allow null. Doing so may help the ExcelReader avoid importing empty rows.

I tried to use the ExcelReader before but ran into issues when the Excel file had formula columns, because of a bug in the reader itself. I'm not sure if those issues have been addressed, but as a workaround I used the Python extension to develop a custom processor that takes the Excel input and converts it into JSON using the pandas library. This might be an option to consider if you are still having problems with the ExcelReader service, but you have to use NiFi 2.0 in order to use Python extensions.

If that helps, please accept the solution. Thanks!