Member since: 07-19-2018
Posts: 613
Kudos Received: 100
Solutions: 117

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 3121 | 01-11-2021 05:54 AM |
|  | 2235 | 01-11-2021 05:52 AM |
|  | 5955 | 01-08-2021 05:23 AM |
|  | 5530 | 01-04-2021 04:08 AM |
|  | 25598 | 12-18-2020 05:42 AM |
11-30-2020
11:51 AM
@Chigoz Your issue with that sandbox cluster is likely too many services trying to run on too small an instance/node. You will need to strategically turn on only the components you need, one at a time, starting with HDFS. If you have issues specific to the sandbox, or with certain components starting, you should open a post with those specific errors.

To install Hue, check out my management pack: https://github.com/steven-matison/dfhz_hue_mpack

A local search for "hue install" topics includes articles referencing the Hue mpack above: https://community.cloudera.com/t5/forums/searchpage/tab/message?advanced=false&allow_punctuation=false&q=install%20hue

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post. Thanks, Steven
11-23-2020
05:21 AM
@Makaveli You can use the File Filter Regex property in ListSFTP to control which files come out of that processor. Otherwise, you need to use ListSFTP -> UpdateAttribute -> RouteOnAttribute to make sure the file name ends as suggested above. Once you have a solid route of the files you want, connect it to FetchSFTP.
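As a sketch of what the File Filter Regex does, here is the same kind of pattern match in plain Python. The `.csv` extension is only an illustration, not a detail from the thread; substitute whatever suffix your files must end with:

```python
import re

# Example pattern in the style of ListSFTP's File Filter Regex:
# accept only filenames ending in ".csv" (extension chosen for illustration).
FILE_FILTER_REGEX = re.compile(r".*\.csv$")

def should_fetch(filename: str) -> bool:
    """Return True if a listed filename passes the filter."""
    return FILE_FILTER_REGEX.match(filename) is not None

listed = ["data_2020.csv", "data_2020.csv.tmp", "readme.txt"]
to_fetch = [f for f in listed if should_fetch(f)]
print(to_fetch)  # only the complete .csv file passes
```

The same anchored pattern dropped into ListSFTP's File Filter Regex property would keep in-progress `.tmp` files out of the flow entirely, so no RouteOnAttribute step is needed.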
11-17-2020
06:43 AM
@ravisro I think the only way you can do this is to build a custom process that interacts with the NiFi API to complete your actions. If you use the NiFi UI with developer tools open, you can watch/record all of the API calls the UI makes when you perform the action manually. Then duplicate the required actions in your processor. I would love to see you post this same question in the NiFi Slack...
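To illustrate the watch-and-duplicate idea, here is a minimal Python sketch that builds the request the NiFi UI sends when you start or stop a processor (the base URL and processor id are placeholders; confirm the exact payload your NiFi version sends in the browser's developer tools, as suggested above):

```python
import json

NIFI_API = "http://localhost:8080/nifi-api"  # placeholder; point at your NiFi instance

def run_status_request(processor_id: str, version: int, state: str):
    """Build the URL and JSON body for NiFi's processor run-status call,
    mirroring what the UI sends when a processor is started or stopped.
    `version` must match the processor's current revision version."""
    if state not in ("RUNNING", "STOPPED"):
        raise ValueError("state must be RUNNING or STOPPED")
    url = f"{NIFI_API}/processors/{processor_id}/run-status"
    body = {"revision": {"version": version}, "state": state}
    return url, json.dumps(body)

url, body = run_status_request("abc-123", 0, "RUNNING")
print(url)
print(body)
```

Sending that body with an HTTP PUT (plus whatever authentication your cluster requires) reproduces the same action the UI performs.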
11-16-2020
09:31 AM
@manas_mandal786 Thanks for providing the template. The next thing I wanted to look at was the configs, so you read my mind. I was able to import your template, start the context service, and add success/failure routes for each test to an output port (so I can see whether results succeed or fail). With the workflow set up, I executed two curl calls:

```
curl -L -X POST 'http://localhost:8009/report' -H 'Content-Type: multipart/form-data' -F test=test
curl -L -X POST 'http://localhost:8009/report' -H 'Content-Type: multipart/form-data' -H 'ReportType: test' -F test=test
```

My canvas looked like this: [screenshot]

I did not get any warnings; everything worked as expected. If you are still getting issues, I suspect it is something to do with your environment (Windows) not being able to store the metadata for the StandardSSLContext service. This store will hold the client connection until you send the response, and in your case that part does not seem to be working. Look for any errors/warnings in nifi-app.log while testing. You may also see warnings on the StandardSSLContext in the Controller Services window. You may need to set each processor to DEBUG in order to find more info in nifi-app.log.
11-12-2020
05:26 AM
@Namitjain This should absolutely work. I would suggest you update the post with information about the exact issue you are having, and be sure to include your configuration of the PutS3 processor. I also recommend not routing failure back to the processor itself during testing; route it to another processor or an output port instead. When a flowfile goes to failure, inspect it: the attributes on the flowfile often contain info on why it failed.

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post. Thanks, Steven
11-02-2020
05:15 AM
@P_Rat98 Per our PM discussion, use DetectDuplicate in your flow before sending an email. This should rate-limit the number of messages you send, based on your configuration of DetectDuplicate. Additionally, when this is linked into your flow and duplicates are auto-terminated, it will drain the flow and stop it from filling up the queue. As suggested, you can also choose to retain the duplicates but move them into a much bigger queue that isn't going to back up the main flow. Then, once you see the email, you can go look at the flow, see which flowfiles were causing issues, and take some corrective action.

If you really need to monitor the flow for a queue being full, you would need to use the NiFi API to check the status of the queue. This may be more work than it is worth, when you can solve it as above much more easily. However, I would recommend you check out the API; it has a lot of capabilities, and I am beginning to use NiFi API calls within my flows to monitor, stop, start, and take actions automatically that would normally require a human doing them in the UI.

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post. Thanks, Steven
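For the API-based route, here is a minimal Python sketch of checking a connection's queue depth against its back-pressure threshold. The JSON below is a hand-built, abridged example of what `GET /nifi-api/connections/{id}` returns; verify the field names against your NiFi version before relying on them:

```python
import json

# Abridged, hand-built example of the JSON returned by
# GET /nifi-api/connections/{id}; field names follow the NiFi REST API
# but should be verified against your version.
sample_response = json.dumps({
    "status": {
        "aggregateSnapshot": {
            "flowFilesQueued": 9500,
        }
    },
    "component": {
        "backPressureObjectThreshold": 10000
    }
})

def queue_nearly_full(response_text: str, ratio: float = 0.9) -> bool:
    """Return True when the queued flowfile count is at or past
    `ratio` of the connection's back-pressure object threshold."""
    conn = json.loads(response_text)
    queued = conn["status"]["aggregateSnapshot"]["flowFilesQueued"]
    threshold = conn["component"]["backPressureObjectThreshold"]
    return queued >= ratio * threshold

print(queue_nearly_full(sample_response))  # 9500 >= 0.9 * 10000 -> True
```

Run on a schedule, a check like this could trigger the same alert email, but as noted above, DetectDuplicate solves the original problem with far less machinery.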
10-29-2020
01:51 PM
Thanks @stevenmatison! Do you by chance know the answer to this question: https://community.cloudera.com/t5/Support-Questions/Extract-string-nested-in-JSON-value/m-p/305099 ? It's probably something very easy, but nothing that I tried works. Valentin
10-29-2020
07:54 AM
Thanks, I am trying some things now to parse the data using the JoltSpec/JoltTransformJSON processor that could help me with this issue. Thanks for this help; hopefully I can get things running more smoothly soon. 🙂
10-29-2020
04:46 AM
@amey84 Yes. Although yum install still provides the bundled PostgreSQL, you can choose to install it or another database separately. During ambari-server setup, choose Y at this prompt:

Enter advanced database configuration [y/n] (n)? y

The following link has more info about Ambari + PostgreSQL: https://docs.cloudera.com/HDPDocuments/Ambari-2.6.1.5/bk_ambari-administration/content/using_ambari_with_postgresql.html

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post. Thanks, Steven