Member since: 01-15-2016
Posts: 34
Kudos Received: 0
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2822 | 08-16-2017 09:48 AM
08-16-2017
09:48 AM
Got the solution:

1. Replace String with the actual year:
Search Value: <(.*?)>
Replacement Value: ${now():format('yyyy ')}

2. Extract timestamp and message:
time = (^.{20})
message = (^.*$)

3. Replace String with the insert into clause:
Search Value: (?s)(^.*$)
Replacement Value: insert into syslog values ('${time:substring(0,20):toDate('yyyy MMM dd HH:mm:ss'):format('yyyy-MM-dd HH:mm:ss')}','${message:substring(21):substringBefore(' ')}','${message:substring(21):substringAfter(' ')}');
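As a rough walk-through of the three steps on one of the sample syslog lines from the posts below (assuming the flow runs in 2017, so ${now():format('yyyy ')} yields "2017 "):

Input:        <30>Aug 11 06:27:26 xxx.xxx.com systemd[28892]: Stopping Timers.
After step 1: 2017 Aug 11 06:27:26 xxx.xxx.com systemd[28892]: Stopping Timers.
Step 2:       time = "2017 Aug 11 06:27:26", message = the whole line
Step 3:       insert into syslog values ('2017-08-11 06:27:26','xxx.xxx.com','systemd[28892]: Stopping Timers.');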
08-14-2017
02:00 PM
Hi,

I have the following single-row input FlowFiles:

Aug 9 10:34:12 sysxyz-p2 sudo: pam_unix(sudo:session): session opened for user root by (uid=0)
Aug 10 10:34:12 sysxxx.xxx.xx sudo: pam_unix(sudo:session): session opened for user root by (uid=0)
Aug 11 10:34:12 sysxxx.xxx.xx sudo: pam_unix(sudo:session): session closed for user root
Aug 12 10:34:12 sysxxx.xxx.xyz sudo: pam_unix(sudo:session): session closed for user root
Aug 13 10:34:01 sysxxx.xxx.xx CRON[26312]: pam_unix(cron:session): session opened for user monadmin by (uid=0)

1. Split each FlowFile into timestamp, host and message fields:

timestamp | host | message
---|---|---
Aug 9 10:34:12 | sysxyz-p2 | sudo: pam_unix(sudo:session): session opened for user root by (uid=0)
Aug 10 10:34:12 | sysxxx.xxx.xx | sudo: pam_unix(sudo:session): session opened for user root by (uid=0)
Aug 11 10:34:12 | sysxxx.xxx.xx | sudo: pam_unix(sudo:session): session closed for user root
Aug 12 10:34:12 | sysxxx.xxx.xyz | sudo: pam_unix(sudo:session): session closed for user root
Aug 13 10:34:01 | sysxxx.xxx.xx | CRON[26312]: pam_unix(cron:session): session opened for user monadmin by (uid=0)

2. Convert the timestamp format: Aug 9 10:34:12 -> 'YYYY.MM.DD HH:MI:SS'

3. Concatenate the extracted fields into an "insert into ..." string, for example:

insert into table ('2017-08-09 10:34:12', 'sysxyz-p2', 'sudo: pam_unix(sudo:session): session opened for user root by (uid=0)');

Thanks for your help!
Timo
Labels:
- Apache NiFi
08-11-2017
01:44 PM
Great, works !!! Thanks!
08-11-2017
11:52 AM
Hi Bryan,

how can I merge the FlowFiles into one multiline file? Using the MergeContent processor I get a single concatenated line.

Input:
FlowFile 1: <30>Aug 11 06:27:26 xxx.xxx.com systemd[28892]: Stopping Timers.
FlowFile 2: <30>Aug 11 06:27:15 xxx.xxx.com systemd[24517]: Stopping Paths
etc...

MergeContent output:
<30>Aug 11 06:27:26 xxx.xxx.com systemd[28892]: Stopping Timers.<30>Aug 11 06:27:15 xxx.xxx.com systemd[24517]: Stopping Paths

I need the following multiline structure:
<30>Aug 11 06:27:26 xxx.xxx.com systemd[28892]: Stopping Timers.
<30>Aug 11 06:27:15 xxx.xxx.com systemd[24517]: Stopping Paths

Any solution?

Regards
Timo
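One common way to get that structure with MergeContent is a configuration along these lines (a sketch, assuming the default Bin-Packing Algorithm merge strategy; it may not be exactly what was suggested in the thread):

MergeContent
  Merge Format: Binary Concatenation
  Demarcator: a newline character (entered with Shift+Enter in the property value editor)

With a newline demarcator, MergeContent joins the individual FlowFiles with line breaks instead of concatenating them back to back.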
08-09-2017
04:02 PM
Thanks, this was really easy. I just generate "insert into table" statements around the FlowFile content with the ReplaceText processor and connect it to the PutSQL processor.

Regards
Timo
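A minimal sketch of such a ReplaceText configuration (the table name and single-column layout are assumptions for illustration; the accepted solution above shows a more complete variant):

ReplaceText
  Evaluation Mode: Entire text
  Search Value: (?s)(^.*$)
  Replacement Value: insert into syslog values ('$1');

The resulting statement is routed to PutSQL, which executes it against the configured JDBC Connection Pool.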
08-08-2017
06:04 PM
Hi,

Step 1: we ingest SYSLOG messages with ListenSyslog and PutHDFS into our data lake.
Step 2: we ingest them with the PXF external table technology into HAWQ:

CREATE EXTERNAL TABLE ext_syslog_hist (message TEXT)
LOCATION ('pxf://xxx.xxx.com:8020/apps/nifi/syslog/history/*?PROFILE=HdfsTextSimple')
FORMAT 'TEXT' (delimiter=E'\t');

Every minute thousands of small (100 byte) files are created in HDFS, so I'm looking for a way to ingest the incoming ListenSyslog data directly into a HAWQ table.

Regards
Timo
Labels:
- Apache NiFi
06-21-2017
09:34 AM
Hi,
Every day we get a dynamic list of hosts from our source system with the InvokeHTTP processor. Based on this list we have to query the source system again with InvokeHTTP, host by host, to get detailed data.

1. InvokeHTTP request:
https://xyz.abc.com/prod/check_mk/view.py?view_name=hostgroup&hostgroup=vmware-host&limit=hard&output_format=json&_username=bi&_secret=XXXXXXXXX
Output:

[{"host_state":"UP","host":"xxx.yyy.com","host_icons":"menu","num_services_ok":"28","num_services_warn":"0","num_services_unknown":"0","num_services_crit":"0","num_services_pending":"0","ctime":"Wed Jun 21 06:00:02 CEST 2017"},{"host_state":"UP","host":"yyy.yyy.com","host_icons":"menu","num_services_ok":"34","num_services_warn":"0","num_services_unknown":"0","num_services_crit":"0","num_services_pending":"0","ctime":"Wed Jun 21 06:00:02 CEST 2017"},{"host_state":"UP","host":"zzz.yyy.com","host_icons":"menu","num_services_ok":"34","num_services_warn":"0","num_services_unknown":"0","num_services_crit":"0","num_services_pending":"0","ctime":"Wed Jun 21 06:00:02 CEST 2017"}]

From this output we have to extract the host values ("xxx.yyy.com", ...) and build a request for every host:

2. InvokeHTTP requests:

https://xyz.abc.com/prod/check_mk/webapi.py?action=get_graph&_username=ansible&_secret=XXXXXXXXX&request={"specification":["template",{"service_description":"Filesystem /","site":"prod","graph_index":0,"host_name":"xxx.yyy.com"}],"data_range":{"time_range":[1491174000,1491260340]}}
https://xyz.abc.com/prod/check_mk/webapi.py?action=get_graph&_username=ansible&_secret=XXXXXXXXX&request={"specification":["template",{"service_description":"Filesystem /","site":"prod","graph_index":0,"host_name":"yyy.yyy.com"}],"data_range":{"time_range":[1491174000,1491260340]}}
https://xyz.abc.com/prod/check_mk/webapi.py?action=get_graph&_username=ansible&_secret=XXXXXXXXX&request={"specification":["template",{"service_description":"Filesystem /","site":"prod","graph_index":0,"host_name":"zzz.yyy.com"}],"data_range":{"time_range":[1491174000,1491260340]}}
...
How can we approach this?
Thanks
Timo
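One possible flow for this fan-out (a sketch of the general pattern, not a confirmed answer from this thread):

SplitJson
  JsonPath Expression: $.*                    (one FlowFile per host entry in the array)
EvaluateJsonPath
  Destination: flowfile-attribute
  host: $.host                                (store the host name in a "host" attribute)
InvokeHTTP
  Remote URL: https://xyz.abc.com/prod/check_mk/webapi.py?action=get_graph&_username=ansible&_secret=XXXXXXXXX&request={"specification":["template",{"service_description":"Filesystem /","site":"prod","graph_index":0,"host_name":"${host}"}],"data_range":{"time_range":[1491174000,1491260340]}}

The Remote URL property of InvokeHTTP supports the NiFi Expression Language, so ${host} is filled in per FlowFile.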
Labels:
- Apache NiFi
06-20-2017
12:24 PM
Thanks Matt! Works perfectly ...
06-19-2017
01:57 PM
I have the following JSON structure and want to replace the special character strings "[{" and "]}" with "{" and "}":

{"result": {"step": 1800, "start_time": 1491174000, "end_time": 1491260400, "curves": [{"color": "#a05830", "rrddata": [0.603695, 1.06903, 0.94504, 0.68786, 31.3228, 0.316447, 0.808407, 0.247655, 0.174552, 0.123072, 0.62, 0.0689, 0.30758, 0.0869783, 0.14478, 0.305993, 0.808873, 0.193055, 0.113133, 0.46116, 8.047, 1.88388, 2.62721, 0.770247, 8.06144, 2.25591, 22.3061, 57.5539, 0.270233, 1.50602, 0.819887, 5.90425, 0.43361, 0.526907, 2.46678, 0.759873, 0.451133, 0.25843, 0.224033, 0.661373, 1.1279, 0.348587, 0.277142, 0.06647, 0.16693, 0.06225, 0.0588483, 0.08057], "line_type": "area", "title": "Disk utilization"}]}, "result_code": 0}
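A sketch of one way to do this with two ReplaceText passes (it may differ from the reply Matt gave):

ReplaceText (first pass)
  Replacement Strategy: Literal Replace
  Evaluation Mode: Entire text
  Search Value: [{
  Replacement Value: {

ReplaceText (second pass)
  Replacement Strategy: Literal Replace
  Evaluation Mode: Entire text
  Search Value: ]}
  Replacement Value: }

Literal Replace avoids having to escape the brackets, which are special characters in a regex.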
Labels:
- Apache NiFi
06-09-2017
08:34 AM
We access JSON files in HDFS as an external table in HAWQ (external table: ext_chkmk_hosts).
The "insert into table check_host ... select ..." loads the new data into a HAWQ table (check_hosts) in the same database.
We want to schedule all the jobs with NiFi.
1. Load data from the system (InvokeHTTP)
2. Convert the JSON output (JoltTransformJSON)
3. Add a key:value pair with a timestamp (JoltTransformJSON)
4. Store the data in HDFS (PutHDFS)
5. Access all HDFS JSON files in HAWQ (external table)
6. Insert new data into the HAWQ table (insert into check_hosts ... select ... from ext_checkmk_hosts), as sketched below.
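A hedged sketch of what the statement in step 6 can look like (the real column list is not shown above; host and load_ts are placeholder column names, and the filter assumes the timestamp added in step 3 is used to pick up only new rows):

insert into check_hosts (host, load_ts)
select host, load_ts
from ext_checkmk_hosts
where load_ts > (select coalesce(max(load_ts), '1970-01-01') from check_hosts);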