Member since: 01-02-2020
Posts: 40
Kudos Received: 3
Solutions: 5
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 11186 | 12-23-2020 09:33 AM |
| | 2829 | 05-18-2020 01:27 AM |
| | 2416 | 04-28-2020 11:02 AM |
| | 6744 | 04-23-2020 12:20 PM |
| | 3090 | 01-25-2020 11:50 PM |
01-13-2023
07:39 AM
Hi, I want to extract a CSV file from a SQL query, and one of the columns should be present in the CSV twice, at different positions. I had the same error. Is there a workaround? Thank you.
02-28-2021
10:23 PM
1 Kudo
@murali2425 Here are two possible solutions. In solution 1 the value is set into an array. In solution 2 I took the FlowFile content (JSON) and set it into an attribute. Then you can work with Expression Language to get the value. For testing the syntax I recommend this site: http://jsonpath.herokuapp.com/
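Outside NiFi, the idea behind solution 2 can be sketched with Python's standard `json` module: parse the FlowFile's JSON content and pull out a single value, the same thing the JsonPath expression would do. The field name `customerId` and the sample content are made-up examples, not from the original post.

```python
import json

# Hypothetical FlowFile content (JSON); the field names are assumptions.
flowfile_content = '{"customerId": "C-1001", "amount": 42.5}'

record = json.loads(flowfile_content)
customer_id = record["customerId"]   # equivalent JsonPath: $.customerId
print(customer_id)                   # → C-1001
```

In NiFi itself this extraction would typically be done with EvaluateJsonPath, after which the value is available as a FlowFile attribute for Expression Language.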
01-08-2021
05:23 AM
2 Kudos
@murali2425 The solution you are looking for is QueryRecord configured with a CSV Record Reader and Record Writer. You also have UpdateRecord and ConvertRecord, which can use the same Readers/Writers. This method is preferred over splitting the file and adds some nice functionality: it allows you to provide a schema for both the inbound CSV (reader) and the downstream CSV (writer). Using QueryRecord you should be able to split the file and set a filename attribute to the value of column1. At the end of the flow you should be able to leverage that filename attribute to resave the new file. You can find some specific examples and configuration screenshots here: https://community.cloudera.com/t5/Community-Articles/Running-SQL-on-FlowFiles-using-QueryRecord-Processor-Apache/ta-p/246671 If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post. Thanks, Steven
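As a rough, stdlib-only sketch of the flow described above (split an inbound CSV by record and derive an output filename from column1), the logic looks like this. The sample data and the `.csv` naming scheme are assumptions for illustration; in NiFi this is handled by QueryRecord plus an UpdateAttribute on `filename`, not by Python.

```python
import csv
import io

# Hypothetical inbound CSV; column names are assumptions.
data = "col1,col2\nalpha,1\nbeta,2\n"

reader = csv.DictReader(io.StringIO(data))
# One output "file" per record, named after column1 (as the post suggests).
outputs = {f"{row['col1']}.csv": row for row in reader}
print(sorted(outputs))  # → ['alpha.csv', 'beta.csv']
```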
12-23-2020
09:33 AM
Hi Matt, great, with your suggestion I got what I was expecting. Thank you, --Murali
05-18-2020
01:27 AM
The issue was the missing square brackets at the beginning and end. The working query is: [{ "$group": { "_id": { "X": "$X", "Y_DT": "$Y_DT", "Z": "$Z" }, "adj": { "$sum": "$adj" }, "bjc": { "$sum": "$bjc" }, "jbc": { "$sum": "$jbc" }, "mnk": { "$sum": "$mnk" } } }]
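The brackets matter because MongoDB expects an aggregation pipeline to be a JSON *array* of stages, not a bare object. A quick stdlib check of the corrected query's shape:

```python
import json

# The working pipeline from the post above: a one-stage $group,
# wrapped in square brackets so it parses as a list of stages.
pipeline = json.loads("""
[{ "$group": {
    "_id": { "X": "$X", "Y_DT": "$Y_DT", "Z": "$Z" },
    "adj": { "$sum": "$adj" },
    "bjc": { "$sum": "$bjc" },
    "jbc": { "$sum": "$jbc" },
    "mnk": { "$sum": "$mnk" }
} }]
""")
print(type(pipeline).__name__)  # → list  (a valid pipeline shape)
```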
04-28-2020
11:02 AM
It did work after adding '\t' (the separator) as the second argument to read_csv.
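For reference, the same tab-delimited parsing can be sketched with the standard library's `csv` module (the pandas call in the post would be `read_csv(path, sep='\t')`). The sample data here is a made-up example.

```python
import csv
import io

# Hypothetical tab-separated content; with the wrong delimiter each
# line would come back as a single field.
data = "id\tname\n1\tAlice\n2\tBob\n"

rows = list(csv.reader(io.StringIO(data), delimiter="\t"))
print(rows[0])  # → ['id', 'name']
```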
03-28-2020
09:26 AM
1 Kudo
Great, and thanks, it was very useful for my work. In my case I will have the data in Excel; I need to extract the column names (1st row), create the table with a primary key and foreign key in the database (MySQL), and load the data (2nd row onward) from NiFi.
01-25-2020
11:50 PM
Hi, this issue is resolved. It was an issue with the Kafka server, which had stopped due to a /tmp log issue.
01-23-2020
03:02 AM
Thanks Matt
01-10-2020
05:05 AM
Ahh, I did not pay attention to that in the original screenshot; I just tried to offer the syntax for the JSON parsing. Glad you got it to work! Isn't learning NiFi fun? I love it.