Member since: 01-17-2016 | Posts: 42 | Kudos Received: 50 | Solutions: 4
01-17-2017
06:43 PM
7 Kudos
If you have ever tried to spawn multiple Cloudbreak shells, you may have run into an error. That is because the default "cbd util cloudbreak-shell" uses Docker containers. The fastest workaround is to use the JARs directly. These JARs can be run remotely from your personal machine or on the Cloudbreak machine itself.

Prepping the Cloudbreak machine (only needed if running the JAR locally on the AWS image)
1. Log into your Cloudbreak instance and go to /etc/yum.repos.d
2. Remove the Centos-Base.repo file (this is a Red Hat machine, and the file can cause conflicts)
3. Install Java 8: yum install java-1.8.0*
4. Change directory back to /home/cloudbreak

Downloading the JAR
1. Set an environment variable to your Cloudbreak version: export CB_SHELL_VERSION=1.6.1
2. Download the JAR: curl -o cloudbreak-shell.jar https://s3-eu-west-1.amazonaws.com/maven.sequenceiq.com/releases/com/sequenceiq/cloudbreak-shell/$CB_SHELL_VERSION/cloudbreak-shell-$CB_SHELL_VERSION.jar

Using the JAR
Interactive mode: java -jar ./cloudbreak-shell.jar --cloudbreak.address=https://<your-public-hostname> --sequenceiq.user=admin@example.com --sequenceiq.password=cloudbreak --cert.validation=false
Using a command file: java -jar ./cloudbreak-shell.jar --cloudbreak.address=https://<your-public-hostname> --sequenceiq.user=admin@example.com --sequenceiq.password=cloudbreak --cert.validation=false --cmdfile=<your-FILE>
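Putting the steps together, here is a minimal wrapper-script sketch. The hostname, user, and password are the placeholders from above; replace them with your own values before running:

#!/bin/bash
# Sketch: download and launch cloudbreak-shell directly from the JAR,
# bypassing the Docker-based "cbd util cloudbreak-shell".
set -e

export CB_SHELL_VERSION=1.6.1
CB_ADDRESS="https://<your-public-hostname>"   # placeholder: your Cloudbreak address

# Fetch the shell JAR for the chosen version (same URL as in the steps above)
curl -o cloudbreak-shell.jar \
  "https://s3-eu-west-1.amazonaws.com/maven.sequenceiq.com/releases/com/sequenceiq/cloudbreak-shell/$CB_SHELL_VERSION/cloudbreak-shell-$CB_SHELL_VERSION.jar"

# Launch in interactive mode; append --cmdfile=<your-FILE> to run a command file instead
java -jar ./cloudbreak-shell.jar \
  --cloudbreak.address="$CB_ADDRESS" \
  --sequenceiq.user=admin@example.com \
  --sequenceiq.password=cloudbreak \
  --cert.validation=false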
09-15-2016
11:28 AM
6 Kudos
In this article I will review the steps required to enrich and filter logs. It is assumed that the logs arrive one at a time as a stream into the NiFi cluster. The steps involved are:
1. Extract attributes - IP and action
2. Cold store non-IP logs
3. GeoEnrich the IP address
4. Cold store local IP addresses
5. Route the remaining logs based on threat level
6. Store the low-threat logs in HDFS
7. Place high-threat logs into an external table

Extract IP Address and Action - ExtractText Processor
This processor evaluates each log and parses the information into attributes. To create a new attribute, add a property and give it a name (soon to be the attribute name) and a Java-style regex. As the processor runs, it evaluates the regex and creates an attribute from the result; an example pair of properties is sketched below.
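As a rough sketch (the exact patterns depend on your log format, so treat these as assumptions), two ExtractText properties along these lines would capture the source IP and the deny marker; the first capture group lands in the attribute named after the property:

Property: ipaddr      Value: (\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})
Property: IsDenied    Value: (iptables denied)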
If there is no match, the log is sent to the 'unmatched' relationship, which is a simple way of filtering out different logs.

GeoEnrichIP - GeoEnrichIP Processor
This processor takes the ipaddr attribute generated in the previous step and looks it up in a geo-database ('mmdb'). I am using the GeoLite - City database found here.

Route on Threat - RouteOnAttribute Processor
This processor takes the IsDenied attribute from the previous step and tests whether it is present. The attribute will only exist if the "Extract IP Address" processor found "iptables denied" in the log. The log is then routed to a connection with that property's name. More properties can be added with their own rules following the NiFi Expression Language, as in the sketch below.
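As a sketch, the routing property can be a simple presence test in Expression Language (assuming the IsDenied attribute name from the ExtractText step above):

IsDenied    ${IsDenied:notNull()}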
Note: I plan on adding location filtering, but did not want to obscure the demo with too many steps.

Cold and Medium Storage - Processor Groups
These two processor groups are very similar in function. Eventually they could be combined into one shared group using attributes for rules, but for now they are separate.
- Merge Content - Takes each individual line and combines them into a larger aggregated file. This helps avoid the too-many-small-files problem that arises in large clusters.
- Compress Content - Simply saves disk space by compressing the aggregates.
- Set Filename As Timestamp - UpdateAttribute Processor - Sets the 'filename' attribute of each aggregate to the current time. This will allow us to sort the aggregates by when they were written for later review.
- PutHDFS Processor - Takes the aggregate and saves it to HDFS.

High Threat - Processor Group
In order to be read by a Hive external table, we need to convert the data to a JSON format and save it to the correct directory.
- Rename Attributes - UpdateAttribute Processor - Renames the fields to match the Hive field format.
- Put Into JSON - AttributesToJSON - Takes the renamed fields and saves them in a JSON string that the Hive SerDe can read natively.
- Set Filename As Timestamp - UpdateAttribute Processor - Once again sets the filename to the timestamp. This may be better served as system name + timestamp moving forward.
- PutHDFS - Stores the data to the Hive external table location.

Hive Table Query
Using the Ambari Hive view I am now able to query my logs with SQL-style queries:

CREATE TABLE `securitylogs`(
  `ctime` varchar(255) COMMENT 'from deserializer',
  `country` varchar(255) COMMENT 'from deserializer',
  `city` varchar(255) COMMENT 'from deserializer',
  `ipaddr` varchar(255) COMMENT 'from deserializer',
  `fullbody` varchar(5000) COMMENT 'from deserializer')
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION 'hdfs://sandbox.hortonworks.com:8020/user/nifi/High_Threat'
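With the table in place, ad hoc questions become one-liners. For example, a sketch of a query (column names taken from the DDL above) that ranks high-threat sources by location:

-- Count high-threat hits by geo-enriched location
SELECT country, city, COUNT(*) AS hits
FROM securitylogs
GROUP BY country, city
ORDER BY hits DESC
LIMIT 10;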
02-27-2019
09:37 AM
Thanks Chris, this is also my use case for NiFi. Can you provide a download link for the template? Thanks.
08-26-2016
08:31 PM
I set the inner path within the mounted directory. Then I started the flow, and this time it wrote the XMLs to HDFS. So it seems that the Recurse Subdirectories property is not working. Were you able to use the Recurse Subdirectories property correctly? I still can't get it to write all the XMLs in all subdirectories automatically!
07-01-2018
01:59 PM
Hi, XML trees are complex because they are hierarchical, and you most likely want a flat structure for easier access to the data. I just wrapped up the second article on this yesterday, and the code is available at the GitHub link included in the article. http://max.bback.se/index.php/2018/06/30/xml-to-tables-csv-with-nifi-and-groovy-part-2-of-2/ The article describes the problem and provides an implementation of the conversion from XML to CSV by flattening out the XML files. My example XML is flattened into four tables; how many you end up with depends on how many one-to-many branches you have. /Max
02-07-2019
02:00 PM
Another tutorial is available at https://medium.com/@julienlaurenceau/monitoring-nifi-with-graylog-22ef1493310a