Member since: 06-26-2015
Posts: 515
Kudos Received: 138
Solutions: 114
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 2269 | 09-20-2022 03:33 PM |
|  | 6044 | 09-19-2022 04:47 PM |
|  | 3258 | 09-11-2022 05:01 PM |
|  | 3727 | 09-06-2022 02:23 PM |
|  | 5804 | 09-06-2022 04:30 AM |
03-10-2022
12:54 PM
1 Kudo
@dutras, please open a case with Cloudera Support so that they can investigate the problem. Cheers, André
03-10-2022
03:44 AM
1 Kudo
Hi @Griggsy,

You don't need to use regex for this. You can, for example, connect the output of the ParseSyslog5424 processor directly to an AttributesToJSON processor. The output of this will be a JSON like the one below:

```json
{
  "syslog.structuredData.SDID@0.os" : "linux",
  "syslog.structuredData.SDID@0.utilization" : "high"
}
```

You can then use a JoltTransformJSON processor to transform the above into your final product. For example, the following JOLT specification:

```json
[
  {
    "operation": "shift",
    "spec": {
      "syslog.structuredData.*.*": {
        "@": "&(1,2)"
      }
    }
  }
]
```

will produce the following output (here "&(1,2)" goes one level up in the spec and grabs the second wildcard match, i.e. the attribute name after the SDID block):

```json
{
  "os" : "linux",
  "utilization" : "high"
}
```

This is the flow: [flow screenshot in the original post]

Cheers,
André

--
Was your question answered? Please take some time to click on "Accept as Solution" below this post. If you find a reply useful, say thanks by clicking on the thumbs up button.
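P.S.: if you want to sanity-check a JOLT spec outside of NiFi, below is a minimal sketch using the open-source Jolt library (the engine behind JoltTransformJSON). The class name and inlined strings are illustrative, and it assumes jolt-core and json-utils are on the classpath:

```java
import com.bazaarvoice.jolt.Chainr;
import com.bazaarvoice.jolt.JsonUtils;

public class SyslogJoltCheck {
    public static void main(String[] args) {
        // The shift specification from the post, inlined as a JSON string.
        String spec = "[ { \"operation\": \"shift\", \"spec\": {"
                + " \"syslog.structuredData.*.*\": { \"@\": \"&(1,2)\" } } } ]";

        // Sample input mirroring the AttributesToJSON output shown above.
        String input = "{ \"syslog.structuredData.SDID@0.os\": \"linux\","
                + " \"syslog.structuredData.SDID@0.utilization\": \"high\" }";

        Chainr chainr = Chainr.fromSpec(JsonUtils.jsonToList(spec));
        Object output = chainr.transform(JsonUtils.jsonToMap(input));

        // Expected: { "os" : "linux", "utilization" : "high" }
        System.out.println(JsonUtils.toPrettyJsonString(output));
    }
}
```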
03-09-2022
05:32 PM
1 Kudo
Hi @CookieCream,

> As I updated jdk version to 17.0.2, it works!!!

Great to hear it works! However, as I mentioned in my previous post, please note that NiFi is not supported on Java 17, only on Java 8 and 11. Please bear this in mind, and if you find any strange behaviour, try using Java 11 instead.

> It might be silly but I have a couple more questions: 1. As I check the java version on the terminal, it shows the right version:
>
> ```
> java version "17.0.2" 2022-01-18 LTS
> Java(TM) SE Runtime Environment (build 17.0.2+8-LTS-86)
> Java HotSpot(TM) 64-Bit Server VM (build 17.0.2+8-LTS-86, mixed mode, sharing)
> ```
>
> But the JAVA_HOME environment variable holds the old directory and I have to type 'export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk-17.0.2.jdk/Contents/Home' every time I open the terminal to launch NiFi. Could you tell me how to set JAVA_HOME permanently?

I'm not sure how JAVA_HOME is being set in your environment; I guess this depends on how Java 8 was installed in the first place. One way to override it is to add the export command at the end of your ~/.bashrc file. This way it will be executed automatically every time you open a new terminal.

> 2. When I access NiFi, it asks me to enter a User and Password. I found an answer for the same issue saying that I could find the user info by searching for 'Generated User'/'Generated Password' in nifi-app.log, but I do not see any such information. Thank you.

If you ran the commands I listed before, you executed this as well:

```bash
./bin/nifi.sh set-single-user-credentials admin supersecret1
```

This sets your credentials to admin/supersecret1 instead of having them generated randomly. Try logging in with these credentials.

Cheers,
André
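P.S.: a quick way to see which JVM a Java process is actually running on, versus what JAVA_HOME claims, is a tiny throwaway class like the sketch below (the class name is arbitrary):

```java
// Prints the running JVM's own version and home directory, plus the
// JAVA_HOME environment variable, so mismatches are easy to spot.
public class WhichJava {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
        System.out.println("JAVA_HOME    = " + System.getenv("JAVA_HOME"));
    }
}
```

If java.home and JAVA_HOME disagree in the shell you use to launch NiFi, the export hasn't taken effect there.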
03-09-2022
01:57 PM
2 Kudos
Hi @CookieCream,

I think I have a good idea of what's going on. The bootstrap.log, which you shared in private, has an error message complaining about an invalid key size. When I saw that, I checked your Java version again and realized that you're using a very old release (1.8.0_65). In that version of Java, cryptographic key sizes were limited and you could not create long keys, so NiFi fails to create the TLS keys when it starts. To enable long keys in your Java version, you would have to download the Java Cryptography Extension (JCE) Unlimited Strength policy files and copy them into your Java home manually to allow AES keys of 256 bits and larger. These policies were only included by default in the JDK starting from update 1.8.0_162.

So, you have two options to make this work:

1. (Recommended) Upgrade your JDK to the latest Java 8 release or, preferably, to Java 11 (do not use versions higher than 11, because they are not supported by NiFi).
2. Alternatively, download the Java Cryptography Extension (JCE) Unlimited Strength policy files, unzip the archive, and follow the steps in the README.TXT to install them.

Once you've done one of the above, clean up the directories from your previous attempts and try again with the steps I copied below:

```bash
wget "https://downloads.apache.org/nifi/1.15.3/nifi-1.15.3-bin.tar.gz" -P .
tar -zxvf ./nifi-1.15.3-bin.tar.gz
cd nifi-1.15.3
./bin/nifi.sh set-single-user-credentials admin supersecret1
./bin/nifi.sh start
```

Please let us know if it works this time 😉

Cheers,
André

--
Was your question answered? Please take some time to click on "Accept as Solution" below this post. If you find a reply useful, say thanks by clicking on the thumbs up button.
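P.S.: you can verify which crypto policy is active in a given JDK with the standard javax.crypto API; a minimal sketch (the class name is arbitrary):

```java
import javax.crypto.Cipher;

// Prints the maximum AES key size the JVM's crypto policy allows:
// 128 under the old limited policy, 2147483647 (Integer.MAX_VALUE)
// once unlimited strength is in effect.
public class CryptoPolicyCheck {
    public static void main(String[] args) throws Exception {
        System.out.println("Max allowed AES key length: "
                + Cipher.getMaxAllowedKeyLength("AES"));
    }
}
```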
03-09-2022
04:56 AM
1 Kudo
@Onkar_Gagre,

The ConsumeKafka processor consumes every single message from Kafka as a separate flowfile, which is very inefficient and makes NiFi slow to process the volume of data that you have. You should rewrite your flow using record-based processors (for example, the corresponding ConsumeKafkaRecord processor), which will give you much better performance and throughput. Please watch Mark Payne's "Apache NiFi Anti-Patterns, Part 1", where he explains the concept of record-based processing and talks about what not to do in NiFi.

Cheers,
André

--
Was your question answered? Please take some time to click on "Accept as Solution" below this post. If you find a reply useful, say thanks by clicking on the thumbs up button.
03-09-2022
03:03 AM
1 Kudo
@m_hsn,

You can use the JoltTransformJSON processor to flatten this JSON structure with the following JOLT specification:

```json
[
  {
    "operation": "shift",
    "spec": {
      "sensor": {
        "*": {
          "*": "[&1].&",
          "@(2,date)": "[&1].date",
          "@(2,device_name)": "[&1].device_name",
          "@(2,device_id)": "[&1].device_id",
          "@(2,status)": "[&1].status"
        }
      }
    }
  }
]
```

Cheers,
André

--
Was your question answered? Please take some time to click on "Accept as Solution" below this post. If you find a reply useful, say thanks by clicking on the thumbs up button.
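P.S.: the original input isn't reproduced above, so purely for illustration, this spec expects the shared metadata fields as siblings of a sensor array, shaped roughly like this (the reading field names are made up):

```json
{
  "date": "2022-03-09",
  "device_name": "sensor-hub-1",
  "device_id": "D001",
  "status": "OK",
  "sensor": [
    { "temperature": 25.1, "humidity": 60 },
    { "temperature": 26.4, "humidity": 58 }
  ]
}
```

The shift then emits one flat record per sensor entry, copying the shared metadata into each:

```json
[
  { "temperature": 25.1, "humidity": 60, "date": "2022-03-09", "device_name": "sensor-hub-1", "device_id": "D001", "status": "OK" },
  { "temperature": 26.4, "humidity": 58, "date": "2022-03-09", "device_name": "sensor-hub-1", "device_id": "D001", "status": "OK" }
]
```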
03-09-2022
12:08 AM
1 Kudo
Hi @lee_yg,

In the Hive arguments, the "Permanently Delete" option is equivalent to "removeMissingFiles = True" together with "skipTrash = True", and the "Copy HDFS File" option is equivalent to "replicateData = True".

Cheers,
André

--
Was your question answered? Please take some time to click on "Accept as Solution" below this post. If you find a reply useful, say thanks by clicking on the thumbs up button.
03-08-2022
06:53 PM
@FJATP, can you share your flow definition? André
03-08-2022
06:15 PM
@na2_koihey11, yes, there is! Please take a look at the page below:

https://docs.cloudera.com/runtime/7.2.10/using-hiveql/topics/hive_hive_3_tables.html

Cheers,
André

--
Was your question answered? Please take some time to click on "Accept as Solution" below this post. If you find a reply useful, say thanks by clicking on the thumbs up button.
03-08-2022
03:28 PM
What's the error you're getting? André