Member since 06-26-2015
Posts: 515
Kudos Received: 138
Solutions: 114
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2271 | 09-20-2022 03:33 PM |
| | 6048 | 09-19-2022 04:47 PM |
| | 3259 | 09-11-2022 05:01 PM |
| | 3734 | 09-06-2022 02:23 PM |
| | 5822 | 09-06-2022 04:30 AM |
03-15-2022
02:14 PM
1 Kudo
@JohnnyC , I see. These instructions actually refer to JAR files taken from the actual CDP deployment, not the ones we distribute separately. These JARs can be found under the /opt/cloudera/parcels/CDH/jars directory on any of the CDP nodes:

```shell
[centos@cdp ~]$ ls -l /opt/cloudera/parcels/CDH/jars/hive-{exec,jdbc,metastore,service}-[0-9]*[0-9].jar
-rw-r--r--. 1 cloudera-scm cloudera-scm 45594524 Aug 3 2021 /opt/cloudera/parcels/CDH/jars/hive-exec-3.1.3000.7.1.7.0-551.jar
-rw-r--r--. 1 cloudera-scm cloudera-scm 129949 Aug 3 2021 /opt/cloudera/parcels/CDH/jars/hive-jdbc-3.1.3000.7.1.7.0-551.jar
-rw-r--r--. 1 cloudera-scm cloudera-scm 41223 Aug 3 2021 /opt/cloudera/parcels/CDH/jars/hive-metastore-3.1.3000.7.1.7.0-551.jar
-rw-r--r--. 1 cloudera-scm cloudera-scm 771018 Aug 3 2021 /opt/cloudera/parcels/CDH/jars/hive-service-3.1.3000.7.1.7.0-551.jar
```

Cheers,
André
--
Was your question answered? Please take some time to click on "Accept as Solution" below this post. If you find a reply useful, say thanks by clicking on the thumbs up button.
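If it helps, a sketch of copying those JARs from a CDP node into your application's classpath directory (the destination path /opt/myapp/lib is hypothetical, adjust it to your setup):

```shell
# Copy the Hive client JARs from a CDP node's parcel directory into the
# application's lib directory (/opt/myapp/lib is a hypothetical path).
cp /opt/cloudera/parcels/CDH/jars/hive-{exec,jdbc,metastore,service}-[0-9]*[0-9].jar /opt/myapp/lib/
```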
03-15-2022
02:09 PM
Hi, @Ankit88 , thanks for the info! The error you're getting happens because the cloud host where your application runs knows nothing about your Kerberos configuration, in particular where your KDC is. There are a few things you must do to configure it properly:

1. Ensure the Kerberos client libraries are installed on that host (the krb5-workstation package).
2. Copy your on-prem krb5.conf file to the cloud host. If you also have a Kerberos KDC in the cloud, there will already be a krb5.conf file on that host and the two configurations will need to be carefully merged. The [realms] and [domain_realm] sections are especially important for solving your issue.
3. Ensure that the hostname of your KDC, as well as the hostnames of ALL Kafka brokers, can be resolved from the cloud (you can test this with nslookup and/or ping). This must work correctly for Kerberos to work. If there's no integrated DNS, you will have to add entries to your /etc/hosts file to ensure the resolution is correct.
4. Ensure that any firewalls are configured to open ports between your application and your on-prem environment: all the ports required for the client to connect to Kafka, and all the ports required for the client to communicate with the KDC (typically port 88, both UDP and TCP).

With the above correctly configured, you should be able to authenticate using Kerberos. A simple test to ensure it's working, before you try the Kafka application, is to authenticate on the command line using the kinit command.

Hope this helps.

André
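The steps above can be sketched on the cloud host roughly as follows (all host and principal names here are hypothetical placeholders, not taken from the original question):

```shell
# Sketch of the setup steps; every hostname and principal is hypothetical.
# 1. Install the Kerberos client libraries
sudo yum install -y krb5-workstation
# 2. Copy the on-prem krb5.conf over (merge carefully if one already exists)
scp onprem-gateway:/etc/krb5.conf /etc/krb5.conf
# 3. Check that the KDC and every Kafka broker resolve from this host
nslookup kdc.onprem.example.com
nslookup broker1.onprem.example.com
# 4. Check that the KDC port is reachable (port 88, TCP shown here)
nc -vz kdc.onprem.example.com 88
# 5. Authenticate on the command line before trying the Kafka client
kinit appuser@EXAMPLE.COM
klist
```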
03-15-2022
01:51 PM
1 Kudo
@JohnnyC , Where are you reading these instructions from? André
03-15-2022
05:41 AM
Hi, @Ankit88 , Is your cloud Kafka running on CDP Public Cloud or is it your own deployment on AWS? What about the Kafka on-prem? Is it a Kafka on CDP or some other type of deployment? What's the version of the on-prem Kafka? André
03-15-2022
04:51 AM
Hi, @abvincita , Try using the following Command Arguments in your processor configuration: "${ERROR_DETAIL}"~"${ERROR_CODE}" Cheers, André
03-15-2022
04:27 AM
@krishna123 , Check your environment PATH and JAVA_HOME settings. NiFi will pick the first Java binary that it finds in your path. André
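As a quick check, something like the following shows which Java binary NiFi would end up using (a sketch: NiFi uses JAVA_HOME when it is set, for example in bin/nifi-env.sh, and otherwise falls back to the first java on the PATH):

```shell
# Show which Java NiFi would pick up: $JAVA_HOME if set,
# otherwise the first 'java' found on the PATH.
if [ -n "$JAVA_HOME" ]; then
  echo "JAVA_HOME is set: $JAVA_HOME"
else
  echo "JAVA_HOME not set; first java on PATH: $(command -v java || echo none)"
fi
```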
03-15-2022
03:34 AM
1 Kudo
Hi, @krishna123 , Thanks for the output! This explains the issue that you're seeing. You mentioned before that you were using Java 1.8.0_275, but the log line you just provided shows that NiFi is actually using Java 1.8.0_74. In this version of Java, cryptographic key sizes are limited and NiFi cannot handle AES keys larger than 128 bits. Because of that, NiFi fails to instantiate AES ciphers properly.

To enable larger keys in that Java version, you would have to download the Java Cryptography Extension (JCE) Unlimited Strength policy files and copy them to your Java home manually to allow AES keys of 256 bits and larger. These policies are only included by default in the JDK starting from update 1.8.0_162.

If you already have Java 1.8.0_275 installed, make sure NiFi can see it and use it. This will solve your problem. You can verify which Java version NiFi is using by checking the log line that I mentioned.

Cheers,
André
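One way to check the AES key-size limit of a given Java 8 JVM is the jrunscript tool that ships with the JDK (a sketch; run it with the same JDK NiFi is using):

```shell
# Print the maximum AES key length the JVM allows.
# A JDK without the unlimited-strength JCE policies prints 128;
# 1.8.0_162+ (or a patched JDK) prints 2147483647.
jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
```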
03-15-2022
02:16 AM
Could you please look in your nifi-bootstrap.log for a line like the one below and paste it here? 2022-03-15 16:03:24,008 INFO [main] org.apache.nifi.bootstrap.RunNiFi Runtime Java version: ...
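The line can be pulled out with grep. The sketch below simulates it against a temporary file so it is self-contained; on a real install you would grep logs/nifi-bootstrap.log under your NiFi directory instead:

```shell
# Simulate the bootstrap log line, then grep for it (on a real install,
# grep $NIFI_HOME/logs/nifi-bootstrap.log instead of this temp file).
echo '2022-03-15 16:03:24,008 INFO [main] org.apache.nifi.bootstrap.RunNiFi Runtime Java version: 1.8.0_74' > /tmp/nifi-bootstrap.log
grep "Runtime Java version" /tmp/nifi-bootstrap.log
```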
03-14-2022
11:14 PM
@krishna123 , Could you please provide the full stack trace of the exception you found in nifi-app.log? André
03-14-2022
10:10 PM
@krishna123 , What's the exact version? (java -version) Cheers, André