- Member since: 02-27-2020
- Posts: 173
- Kudos Received: 42
- Solutions: 48
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2417 | 11-29-2023 01:16 PM |
| | 2957 | 10-27-2023 04:29 PM |
| | 2423 | 07-07-2023 10:20 AM |
| | 4938 | 03-21-2023 08:35 AM |
| | 1672 | 01-25-2023 08:50 PM |
09-17-2020
01:17 AM
1 Kudo
Hi Alex, I've resolved the issue. I had been using wf:X() in the wrong place. Initially I had added an argument to the shell action, such as ${user}, and then set ${wf:user()} as the value of that argument when submitting the workflow. I replaced ${user} with the Expression Language function directly, after which the value was resolved automatically and I was no longer asked to provide a value for the argument when submitting the workflow 🙂
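As a minimal sketch of the working setup (the script name and schema version are hypothetical, not from the post), the EL function goes inline in the workflow XML instead of being a user-supplied parameter:

```xml
<!-- Hypothetical shell action: ${wf:user()} is used directly as the
     argument value, so no parameter prompt appears at submission time. -->
<shell xmlns="uri:oozie:shell-action:0.3">
  <exec>myscript.sh</exec>
  <argument>${wf:user()}</argument>
</shell>
```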
09-16-2020
12:20 PM
Hi James, Thanks for clarifying your question. It's true that there is no native functionality for this, but it is possible to change the action name in a slightly hacky way:
1. In the edit mode of your Oozie workflow, click on the name of the node and note its ID.
2. Save and export your workflow. This gives you a JSON file that you can edit.
3. In that JSON file, search for `-[NODE ID]"` and replace it with your desired name for the node. All references to the old node ID should be replaced so they stay consistent. Save the file.
4. Import the JSON back into Hue. This updates your existing workflow, and the generated Oozie XML will now use the node name you want.
Hope this helps. Regards, Alex
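Step 3 can be sketched with `sed`. The node ID `3abc` and target name `prepare-data` below are made up for illustration; the toy JSON stands in for the real exported workflow:

```shell
# Hypothetical example: the exported node ID is "3abc" and the desired
# node name is "prepare-data". First, a toy exported-workflow JSON:
printf '%s' '{"id":"shell-3abc","ok":"end-3abc"}' > workflow.json
# Search and replace -[NODE ID]" with the new name, writing a new file
# so the original export is kept as a backup.
sed 's/-3abc"/-prepare-data"/g' workflow.json > workflow.renamed.json
cat workflow.renamed.json   # {"id":"shell-prepare-data","ok":"end-prepare-data"}
```

Because the pattern anchors on the trailing quote, every reference to the old node ID is rewritten consistently, which is what keeps the workflow's internal links valid after re-import.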
07-29-2020
03:20 AM
I was afraid of that. Yes, I am using DistCp for the migration. Thanks very much for your reply nevertheless. The bandwidth option would be a very last resort, but that will probably have to do.
07-25-2020
08:31 PM
I just could not see it! 🙁 Thank you so much. 😊
07-23-2020
02:38 AM
Can we delete Kafka consumer group data? Not the consumer group itself; I need to delete only the group's data.
07-20-2020
11:27 PM
1 Kudo
You can use NiFi to save your Kafka messages into HDFS (for instance). Something like this:
- ConsumeKafka: the flowfile content is the Kafka message itself, and you have access to some attributes: topic name, partition, offset, key... (but not the timestamp!). When I need it, I store the timestamp in the key.
- ReplaceText: build your backup line using the flowfile content and attributes.
- MergeContent: build a big file containing multiple Kafka messages.
- ExtractText: set an attribute to be used as the filename.
- PutHDFS: save the created file into HDFS.
And you can do the reverse if you need to push it back to your Kafka cluster.
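A tiny Python sketch of the record the ReplaceText step might assemble per message. The field order, tab separator, and function name are assumptions for illustration, not NiFi defaults:

```python
# Hypothetical serializer for one Kafka message plus its metadata,
# mirroring the backup line a ReplaceText processor could build.
def backup_line(topic: str, partition: int, offset: int, key: str, message: str) -> str:
    """Return a single tab-separated backup record for one Kafka message."""
    return f"{topic}\t{partition}\t{offset}\t{key}\t{message}"

print(backup_line("events", 0, 42, "2020-07-20T23:27:00", "payload"))
# events	0	42	2020-07-20T23:27:00	payload
```

Keeping the metadata in each line is what makes the reverse direction (replaying the backup into a Kafka cluster) possible later.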
06-27-2020
10:48 PM
1 Kudo
Hi Guy, Please try adjusting your command to the following: `ozone sh volume create --quota=1TB --user=hdfs o3://ozone1/tests`. Note that the documentation states the last parameter is a URI in the format `<prefix>://<Service ID>/<path>`; the Service ID is what you found in ozone-site.xml.
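For readability, the same command as a block (the service ID `ozone1` and path `tests` come from the thread; running it requires a live Ozone cluster):

```shell
# Create a 1 TB volume owned by hdfs, addressing the cluster by Service ID.
ozone sh volume create --quota=1TB --user=hdfs o3://ozone1/tests
```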
06-12-2020
09:48 AM
Hi @Maria_pl , generally speaking, the approach is as follows:
1. Generate a dummy flow file that will trigger the flow (GenerateFlowFile processor).
2. Next is an UpdateAttribute processor that sets the start date and end date as attributes on the flow file.
3. ExecuteScript comes next. This can be a Python script, or whichever language you prefer, that uses the start and end attributes to list out all the dates in between.
4. If your script produces a single output listing the dates, you can then use the SplitText processor to cut each row into its own flow file; from there, each flow file carries its own unique date in your range.
Hope that makes sense.
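The date-expansion logic from step 3 can be sketched in plain Python (the function name and the ISO output format are assumptions; inside NiFi this would live in the ExecuteScript body):

```python
from datetime import date, timedelta

# Hypothetical ExecuteScript core: expand a start/end attribute pair
# into every date in the inclusive range, one string per row.
def dates_between(start: date, end: date) -> list[str]:
    """Return every date from start to end inclusive as YYYY-MM-DD strings."""
    out = []
    current = start
    while current <= end:
        out.append(current.isoformat())
        current += timedelta(days=1)
    return out

print(dates_between(date(2020, 6, 1), date(2020, 6, 3)))
# ['2020-06-01', '2020-06-02', '2020-06-03']
```

Emitting one date per line is what lets the downstream SplitText processor turn each date into its own flow file.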
06-06-2020
09:15 PM
Glad to hear that you have finally found the root cause of this issue. Thanks for sharing @Heri