11-22-2016
10:44 PM
3 Kudos
Download URLCrazy (http://www.morningstarsecurity.com/downloads/urlcrazy-0.5.tar.gz)
An Example Command Line Run for URLCrazy
[root@tspanndev13 security]# ./url.sh dataflowdeveloper.com
Typo Type,Typo,Valid,Pop,DNS-A,CC-A,Country-A,DNS-MX,Extn
Character Omission,daaflowdeveloper.com,true,,,?,,com,
Character Omission,dataflodeveloper.com,true,,,?,,com,
Character Omission,dataflowdeeloper.com,true,,,?,,com,
Character Omission,dataflowdeveloer.com,true,,,?,,com,
Character Omission,dataflowdevelope.com,true,,,?,,com,
Character Omission,dataflowdeveloper.cm,true,,,?,,cm,
Character Omission,dataflowdeveloper.co,false,,,?,,,
Character Omission,dataflowdeveloper.om,false,,,?,,,
Character Omission,dataflowdevelopercom,false,,,?,,,
...
Shell Script to Call From Apache NiFi
/opt/demo/security/urlcrazy-0.5/urlcrazy -i -f csv -p $@
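For reference, a minimal sketch of what a url.sh wrapper could look like, assuming the URLCrazy install path above and that NiFi's ExecuteStreamCommand passes the domain as the script argument; the path and flags are carried over from the one-liner, everything else is illustrative:

#!/bin/bash
# url.sh - sketch of a wrapper for URLCrazy called from NiFi's ExecuteStreamCommand.
# The domain(s) to check arrive as script arguments ($@); CSV goes to stdout,
# which NiFi can route into flowfile content or an attribute.
URLCRAZY=/opt/demo/security/urlcrazy-0.5/urlcrazy
exec "$URLCRAZY" -i -f csv -p "$@"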
An Example Command Line Run for NSLookup
Non-authoritative answer:
sparkdeveloper.com text = "v=spf1 ip4:00.000.0.0/24 ip4:00.000.00.0/24 ip4:11.111.111.0/19 ?all"
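The command behind that output isn't shown in the post; a TXT record query like the following is the usual way to pull the SPF record (domain taken from the output above):

# Query the TXT (SPF) record for the domain; this produces the
# "Non-authoritative answer" output shown above.
nslookup -type=TXT sparkdeveloper.com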
The Final JSON Output:
{
"path" : "./",
"execution.command" : "/opt/demo/security/url.sh",
"urlcrazy" : "Typo Type,Typo,Valid,Pop,DNS-A,CC-A,Country-A,DNS-MX,Extn\nCharacter Omission,sarkdeveloper.com,true,,,?,,com,\nCharacter Omission,spakdeveloper.com,true,,,?,,com,\nCharacter Omission,spardeveloper.com,true,,,?,,com,\nCharacter Omission,sparkdeeloper.com,true,,,?,,com,\nCharacter Omission,sparkdeveloer.com,true,,,?,,com,\nCharacter Omission,sparkdevelope.com,true,543,,?,,com,\nCharacter Omission,sparkdeveloper.cm,true,214000,,?,,cm,\nCharacter Omission,sparkdeveloper.co,false,,,?,,,\nCharacter Omission,sparkdeveloper.om,false,,,?,,,\nCharacter Omission,sparkdevelopercom,false,,,?,,,\nCharacter Omission,sparkdevelopr.com,true,,,?,,com,\nCharacter Omission,sparkdevelper.com,true,2190,,?,,com,\nCharacter Omission,sparkdeveoper.com,true,,,?,,com,\nCharacter Omission,sparkdevloper.com,true,2230,,?,,com,\nCharacter Omission,sparkdveloper.com,true,,,?,,com,\nCharacter Omission,sparkeveloper.com,true,,,?,,com,\nCharacter Omission,sprkdeveloper.com,true,,,?,,com,\nCharacter Repeat,spaarkdeveloper.com,true,,,?,,com,\nCharacter Repeat,sparkddeveloper.com,true,,,?,,com,\nCharacter Repeat,sparkdeeveloper.com,true,,,?,,com,\nCharacter Repeat,sparkdeveeloper.com,true,,,?,,com,\nCharacter Repeat,sparkdevelloper.com,true,,,?,,com,\nCharacter Repeat,sparkdevelooper.com,true,,,?,,com,\nCharacter Repeat,sparkdevelopeer.com,true,,,?,,com,\nCharacter Repeat,sparkdeveloper..com,false,,,?,,com,\nCharacter Repeat,sparkdeveloper.ccom,false,,,?,,,\nCharacter Repeat,sparkdeveloper.comm,false,,,?,,,\nCharacter Repeat,sparkdeveloper.coom,false,,,?,,,\nCharacter Repeat,sparkdeveloperr.com,true,2120,,?,,com,\nCharacter Repeat,sparkdevelopper.com,true,203,,?,,com,\nCharacter Repeat,sparkdevveloper.com,true,,,?,,com,\nCharacter Repeat,sparkkdeveloper.com,true,,,?,,com,\nCharacter Repeat,sparrkdeveloper.com,true,,,?,,com,\nCharacter Repeat,spparkdeveloper.com,true,,,?,,com,\nCharacter Repeat,ssparkdeveloper.com,true,,,?,,com,\nCharacter Swap,psarkdeveloper.com,true,,,?,,com,\nCharacter Swap,saprkdeveloper.com,true,,,?,,com,\nCharacter Swap,spakrdeveloper.com,true,,,?,,com,\nCharacter Swap,spardkeveloper.com,true,,,?,,com,\nCharacter Swap,sparkdeevloper.com,true,,,?,,com,\nCharacter Swap,sparkdeveloepr.com,true,,,?,,com,\nCharacter Swap,sparkdevelope.rcom,false,,,?,,,\nCharacter Swap,sparkdeveloper.cmo,false,,,?,,,\nCharacter Swap,sparkdeveloper.ocm,false,,,?,,,\nCharacter Swap,sparkdeveloperc.om,false,,,?,,,\nCharacter Swap,sparkdevelopre.com,true,,,?,,com,\nCharacter Swap,sparkdevelpoer.com,true,,,?,,com,\nCharacter Swap,sparkdeveolper.com,true,,,?,,com,\nCharacter Swap,sparkdevleoper.com,true,,,?,,com,\nCharacter Swap,sparkdveeloper.com,true,,,?,,com,\nCharacter Swap,sparkedveloper.com,true,,,?,,com,\nCharacter Swap,sprakdeveloper.com,true,18,,?,,com,\nCharacter Replacement,aparkdeveloper.com,true,129,,?,,com,\nCharacter Replacement,dparkdeveloper.com,true,,,?,,com,\nCharacter Replacement,soarkdeveloper.com,true,,,?,,com,\nCharacter Replacement,spaekdeveloper.com,true,,,?,,com,\nCharacter Replacement,sparjdeveloper.com,true,,,?,,com,\nCharacter Replacement,sparkdebeloper.com,true,,,?,,com,\nCharacter Replacement,sparkdeceloper.com,true,,,?,,com,\nCharacter Replacement,sparkdevekoper.com,true,,,?,,com,\nCharacter Replacement,sparkdeveliper.com,true,,,?,,com,\nCharacter Replacement,sparkdevelooer.com,true,92,,?,,com,\nCharacter Replacement,sparkdevelopee.com,true,,,?,,com,\nCharacter Replacement,sparkdeveloper.cim,false,,,?,,,\nCharacter Replacement,sparkdeveloper.con,false,,,?,,,\nCharacter 
Replacement,sparkdeveloper.cpm,false,,,?,,,\nCharacter Replacement,sparkdeveloper.vom,false,,,?,,,\nCharacter Replacement,sparkdeveloper.xom,false,,,?,,,\nCharacter Replacement,sparkdevelopet.com,true,,,?,,com,\nCharacter Replacement,sparkdeveloprr.com,true,,,?,,com,\nCharacter Replacement,sparkdevelopwr.com,true,,,?,,com,\nCharacter Replacement,sparkdevelpper.com,true,,,?,,com,\nCharacter Replacement,sparkdevrloper.com,true,,,?,,com,\nCharacter Replacement,sparkdevwloper.com,true,,,?,,com,\nCharacter Replacement,sparkdrveloper.com,true,,,?,,com,\nCharacter Replacement,sparkdwveloper.com,true,,,?,,com,\nCharacter Replacement,sparkfeveloper.com",
"filename" : "4963644600105857",
"execution.command.args" : "sparkdeveloper.com",
"execution.status" : "0",
"spf" : "Server:\t\t10.42.1.20\nAddress:\t10.42.1.20#53\n\nNon-authoritative answer:\nsparkdeveloper.com\ttext = \"v=spf1 ip4:38.113.1.0/24 ip4:38.113.20.0/24 ip4:65.254.224.0/19 ?all\"\n\nAuthoritative answers can be found from:\n\n",
"execution.error" : "",
"uuid" : "f13ca0f5-bac7-4da7-b5c3-8b1c145591bf",
"url" : "sparkdeveloper.com",
"enrich.dns.record0.group0" : "\"v=spf1 ip4:00.000.0.0/24 ip4:00.000.00.0/24 ip4:11.111.111.0/19 ?all\""
}
You can grab lots of different command-line and REST results to augment existing data, tools, and feeds.
A URLCrazy report is useful intelligence on other domains people may be squatting on that are close to yours. These are often used by spammers, for malware, and for other nefarious purposes.
11-20-2016
07:20 PM
1 Kudo
Or call the API from a script that can send a message or mail.
11-20-2016
07:20 PM
1 Kudo
See the NiFi REST API docs: https://nifi.apache.org/docs/nifi-docs/rest-api/ You could look for values with:
GET /flowfile-queues/{id}/listing-requests/{listing-request-id}
This gets the current status of a listing request for the specified connection. You can call that from a NiFi InvokeHTTP processor and check the status.
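As a rough sketch of how that could be scripted outside of InvokeHTTP, assuming an unsecured NiFi on localhost:8080 and placeholder IDs (the angle-bracketed values are yours to substitute):

# 1) Ask NiFi to start listing the flowfiles in a connection (queue);
#    the response contains a listing-request id.
curl -s -X POST "http://localhost:8080/nifi-api/flowfile-queues/<connection-id>/listing-requests"

# 2) Poll the listing request to get its status and the queue contents.
curl -s "http://localhost:8080/nifi-api/flowfile-queues/<connection-id>/listing-requests/<listing-request-id>"

# 3) Remove the listing request when finished.
curl -s -X DELETE "http://localhost:8080/nifi-api/flowfile-queues/<connection-id>/listing-requests/<listing-request-id>"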
11-20-2016
06:31 PM
1 Kudo
See https://community.hortonworks.com/content/kbentry/7882/hdfnifi-best-practices-for-setting-up-a-high-perfo.html Where do you want the threshold: on the queue or on each processor? Set your Run Schedule or Scheduling Strategy accordingly.
11-19-2016
04:35 PM
I installed HDP 2.5 on CentOS 7.2 with no issues. Make sure it's the correct repo and follow the prerequisites for the install.
11-18-2016
09:37 PM
1 Kudo
Use Case: Store Log Data in a Hadoop Data Lake and Send a Curated, Reduced Set to Sumologic via REST API Integration

The integration point for sending log data to Sumologic is their HTTP Source. To send data you must set up an HTTP Source in Sumologic from your web console as shown below. Take the HTTP URL they give you and put it into an InvokeHTTP processor configured for POST; it will look something like this: https://endpoint1.collection.us2.sumologic.com/receiver/v1/http/ZaLongCodeLong

I noticed an IP address in the data, so I decided to parse it out: ${regex.6:substringAfterLast('source ip: '):replaceAll('\)','')}

Then I send it to MaxMind for processing. The free MaxMind GeoIP database is easy to download and use with NiFi: just add the GeoIP processor and point it at the field and the database file location. Finally, displaying and charting the data is easy in Zeppelin: just query my Phoenix data.
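Before wiring up InvokeHTTP, a quick way to sanity-check the HTTP Source is a hand-rolled POST. This is only a sketch; the collector URL below is the placeholder from above, not a real endpoint, and the JSON body is an arbitrary test payload:

# Send one test log line to the Sumologic HTTP Source collector URL.
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d '{"level":"INFO","msg":"test log line from the NiFi flow"}' \
  "https://endpoint1.collection.us2.sumologic.com/receiver/v1/http/ZaLongCodeLong"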
The flow is a bit long, as I am using regex to convert the logs from NiFi's Log4j format into individual attributes, then turn them into a JSON file and convert that into a SQL upsert for the Phoenix insert. I log all failures to a file.
Transmitting Log Data

It's pretty easy to integrate with Sumologic. They have a nice HTTP endpoint to send this data to, and they will accept JSON and many other text formats. They also have a native agent, which can be interfaced with via several logging mechanisms; I asked them about it and may work on that in the future.

Apache Phoenix Table for Log Data

0: jdbc:phoenix:tspannserver> !describe nifilogs
+------------+--------------+-------------+--------------------+------------+------------+--------------+----------------+-----------------+-----------------+-----------+---------+
| TABLE_CAT | TABLE_SCHEM | TABLE_NAME | COLUMN_NAME | DATA_TYPE | TYPE_NAME | COLUMN_SIZE | BUFFER_LENGTH | DECIMAL_DIGITS | NUM_PREC_RADIX | NULLABLE | REMARKS |
+------------+--------------+-------------+--------------------+------------+------------+--------------+----------------+-----------------+-----------------+-----------+---------+
| | | NIFILOGS | SDATE | 12 | VARCHAR | null | null | null | null | 1 | |
| | | NIFILOGS | FRAGID | 12 | VARCHAR | null | null | null | null | 0 | |
| | | NIFILOGS | MSG | 12 | VARCHAR | null | null | null | null | 1 | |
| | | NIFILOGS | MODULE | 12 | VARCHAR | null | null | null | null | 1 | |
| | | NIFILOGS | STIME | 12 | VARCHAR | null | null | null | null | 1 | |
| | | NIFILOGS | STYPE | 12 | VARCHAR | null | null | null | null | 1 | |
| | | NIFILOGS | SCLASS | 12 | VARCHAR | null | null | null | null | 1 | |
| | | NIFILOGS | GEOCITY | 12 | VARCHAR | 255 | null | null | null | 1 | |
| | | NIFILOGS | GEOLATITUDE | 12 | VARCHAR | 255 | null | null | null | 1 | |
| | | NIFILOGS | GEOLONGITUDE | 12 | VARCHAR | 255 | null | null | null | 1 | |
| | | NIFILOGS | GEOCOUNTRY | 12 | VARCHAR | 255 | null | null | null | 1 | |
| | | NIFILOGS | GEOPOSTALCODE | 12 | VARCHAR | 255 | null | null | null | 1 | |
| | | NIFILOGS | GEOCOUNTRYISOCODE | 12 | VARCHAR | 255 | null | null | null | 1 | |
| | | NIFILOGS | IPADDRESS | 12 | VARCHAR | 255 | null | null | null | 1 | |
+------------+--------------+-------------+--------------------+------------+------------+--------------+----------------+-----------------+-----------------+-----------+---------+
NiFi Apache Phoenix (HBase) SQL Upsert (ReplaceText) upsert into nifilogs (sdate,fragid,msg,module,stime,stype,sclass, geocity, geolatitude, geolongitude, geocountry, geopostalcode, geocountryisocode, ipaddress, geostate)
values ('${'date'}','${'fragment.identifier'}', '${'msg'}','${'module'}','${'time'}','${'type'}','${'class'}','${'ipaddress.geo.city'}',
'${'ipaddress.geo.latitude'}','${'ipaddress.geo.longitude'}','${'ipaddress.geo.country'}','${'ipaddress.geo.postalcode'}','${'ipaddress.geo.country.isocode'}','${'ipaddress'}','${'ipaddress.geo.subdivision.isocode.0'}')
Note the use of stime, stype, sclass, and sdate; I am trying to avoid built-in SQL keywords. I added some fields for the geo enrichment that will come from the MaxMind database, and I parse the IP address out of the main log record.

References:
Fun with Regex
Sumologic
Download a MaxMind GeoLite Database for Geo Enrichment
11-17-2016
02:28 PM
1 Kudo
Have you tried this in Spark or NiFi? How much memory is configured in your app? How much is configured in YARN for your job's resources? Can you post additional logs, code, and submit details? Why is the key an Avro record and not the value? You should make sure you have enough space in HDFS and also on the regular file system, as some of the reduce stage will spill to local disk. Can you post df output for both HDFS and the local file system?
11-16-2016
04:10 AM
Make sure SELinux is off, iptables is stopped, JDK 1.8 is installed, DNS is set up properly, and you are running as root. Also make sure you have passwordless SSH, lots of space under /var, all the latest versions, and tty-less sudo allowed in visudo.
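A rough, illustrative set of checks for those prerequisites on each node (commands vary a bit by OS version; run as root):

getenforce                    # SELinux should report Permissive or Disabled
systemctl status firewalld    # firewalld/iptables should be stopped and disabled
java -version                 # expect JDK 1.8
hostname -f; cat /etc/hosts   # forward and reverse DNS should be consistent
df -h /var                    # plenty of free space for packages and logs
ssh <other-node> hostname     # passwordless SSH between cluster nodes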
11-15-2016
04:23 PM
A big note: run visudo and make sure there is no "Defaults requiretty" line. That setting will block the Ambari agent from using sudo during installs, which it needs.
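A quick, illustrative way to spot the offending line before editing it out with visudo:

# If this prints an uncommented "Defaults requiretty" line, remove it via visudo.
grep -n 'requiretty' /etc/sudoers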
11-11-2016
06:13 PM
8 Kudos
It's easier than I would have thought to add images to your SQL result tables in Apache Zeppelin. It's pretty simple to do this in HDP 2.5's version of Apache Zeppelin: you use the %html tag to output HTML instead of plain text.

Use Case: Displaying an Image with TensorFlow Inception Image Recognition Results in the Same List

Example SQL:

SELECT user_name, handle, concat('%html <img width=50 height=60 src="', media_url, '">') as media, substring(inception,0,150) as inception, msg, sentiment, stanfordsentiment, location, time
FROM twitterorc where inception not like '%Not found%' and inception is not null and trim(inception) != ''