Member since: 09-06-2016
Posts: 108
Kudos Received: 36
Solutions: 11
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2528 | 05-11-2017 07:41 PM
 | 1190 | 05-06-2017 07:36 AM
 | 6639 | 05-05-2017 07:00 PM
 | 2321 | 05-05-2017 06:52 PM
 | 6367 | 05-02-2017 03:56 PM
05-07-2017
01:11 PM
@Daniel Kozlowski Does installing those libraries, e.g. yum install libtre-devel tre-devel, help with matching the prerequisites?
05-06-2017
07:36 AM
There is an article, "Creating a HANA Workflow using HADOOP Oozie", on the SAP blog. Perhaps easier is to connect to SAP HANA via NIFI. The preferred method is via the REST API, not via JDBC (see the previous question on this topic). However, you can connect to HANA over JDBC with NIFI, as explained here: https://community.hortonworks.com/articles/81153/basic-cdc-using-apache-nifi-and-sap-hana.html
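For the JDBC route, the connection details go on a DBCPConnectionPool controller service that processors such as ExecuteSQL or QueryDatabaseTable reference. A rough sketch of the settings, assuming the SAP HANA JDBC driver jar (ngdbc.jar) is available on the NIFI host and using placeholder host/port values (verify against your own HANA instance and the linked article):
Database Connection URL: jdbc:sap://<hana-host>:30015
Database Driver Class Name: com.sap.db.jdbc.Driver
Database Driver Location(s): /path/to/ngdbc.jar
Database User / Password: <your HANA credentials>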
05-06-2017
06:12 AM
3 Kudos
Mindwave Neurosky
The Mindwave Neurosky is a headset that allows you to record your brainwaves using EEG technology. In this article we show you how to ingest these brainwaves with NIFI.
Mindwave Neurosky driver installation for OSX Sierra
Download and install the latest driver from http://download.neurosky.com/public/Products/MindWave%20headset/RF%20driver%20for%20Mac/MindWaveDriver5.1.pkg
After the driver is installed, download and install the latest MindWave Manager from http://download.neurosky.com/public/Products/MindWave%20headset/RF%20driver%20for%20Mac/MindWave%20Manager4.0.4.zip
Launch the MindWave Manager, navigate to the "Pairing" section, click "Search for MindWave", then follow the instructions to pair the headset.
Install NIFI on OSX Sierra with Homebrew
Install Homebrew from the terminal:
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Install NIFI (v1.1.2 at the time of writing):
brew install nifi
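Before importing the template in the next step, NIFI needs to be running. With the Homebrew install this should be something along the lines of the following (the exact wrapper command depends on how Homebrew links the NIFI scripts, so treat it as an assumption):
nifi start
Once it is up, the UI is reachable at http://localhost:8080/nifi by default.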
Import NIFI Flow Template
An example flow template can be downloaded with curl:
curl -O https://gist.githubusercontent.com/wardbekker/a80cbe7d12bc1866f393c5a74bf417a0/raw/9d64daa748dec352ebe8cc350e3fb34e65130ec3/mindwave_nifi_ingest_template.xml
The most important processor here is the ListenTCP processor, which listens on port 20000 and receives the JSON payload. The flow also contains a Site-to-Site connection to a remote NIFI process group with the URL http://wbekkerhdf0.field.hortonworks.com:9090/nifi. You can change it to point to your own remote NIFI cluster.
Get Ruby 'forward' script
The Mindwave ThinkGear driver creates a socket where we can consume the sensor data as JSON messages. To ingest it with the current vanilla version of NIFI, we need to 'forward' the messages from the ThinkGear port to the port number of the NIFI ListenTCP processor. Upcoming versions of NIFI will have a GetTCP processor, making this Ruby script obsolete.
Save the Ruby script below as thinkgear.rb. Run it with ruby thinkgear.rb AFTER you have connected your headset AND started the ListenTCP processor in the NIFI flow; otherwise you will run into connection errors.
require 'socket'
require 'json'
require 'date'
thinkgear_server_socket = TCPSocket.new 'localhost', 13854 # default ThinkGear Connector port
nifi_server_socket = TCPSocket.new 'localhost', 20000      # port the NIFI ListenTCP processor listens on
# trigger json output
thinkgear_server_socket.puts "{\"enableRawOutput\": true, \"format\": \"Json\"}\n"
while line = thinkgear_server_socket.gets # Read lines from socket
hash = JSON.parse(line)
  hash['timestamp'] = DateTime.now.strftime('%Q') # epoch timestamp in milliseconds
  hash['user_id'] = 1 # hard-coded user id; change if you track multiple users
json = JSON.generate(hash)
puts json
nifi_server_socket.puts json
end
thinkgear_server_socket.close
nifi_server_socket.close
Start ingestion of your brainwaves
Connect your headset by launching the MindWave Manager, navigating to the "Pairing" section and clicking "Search for MindWave", then follow the instructions to pair the headset. Start the NIFI flow, or at least the ListenTCP processor. Start the Ruby script with ruby thinkgear.rb.
At this point you should see JSON output from your Mindwave headset in your terminal, and new flowfiles arriving in NIFI. Have fun with your brainwaves!
05-05-2017
07:03 PM
Is there something in the NIFI log regarding the HBase service?
05-05-2017
07:00 PM
1 Kudo
I don't know of a processor that directly converts JSON to XML. One option would be to use ExecuteScript with a Groovy script that does the conversion. It looks like there are some examples out there of converting JSON to XML with Groovy. Example Groovy script (the net.sf.json classes come from the json-lib library, which needs to be on the classpath):
import net.sf.json.JSON
import net.sf.json.JSONSerializer
import net.sf.json.xml.XMLSerializer
String str = '''{ "glossary": { "title": "example glossary", "GlossDiv": { "title": "S", "GlossList": { "GlossEntry": { "ID": "SGML", "SortAs": "SGML", "GlossTerm": "Standard Generalized Markup Language", "Acronym": "SGML", "Abbrev": "ISO 8879:1986", "GlossDef": { "para": "A meta-markup language, used to create markup languages such as DocBook.", "GlossSeeAlso": ["GML", "XML"] }, "GlossSee": "markup" } } } } }'''
JSON json = JSONSerializer.toJSON( str )
XMLSerializer xmlSerializer = new XMLSerializer()
xmlSerializer.setTypeHintsCompatibility( false )
String xml = xmlSerializer.write( json )
System.out.println(xml)
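If you go the ExecuteScript route, the same conversion could be wrapped so that it rewrites the incoming flowfile content. This is only a sketch of how an ExecuteScript Groovy body might look (it assumes json-lib and its dependencies are added via the processor's Module Directory property, and it has not been tested):
import net.sf.json.JSONSerializer
import net.sf.json.xml.XMLSerializer
import org.apache.commons.io.IOUtils
import org.apache.nifi.processor.io.StreamCallback
import java.nio.charset.StandardCharsets

def flowFile = session.get()
if (flowFile == null) return

// Rewrite the flowfile content: read the JSON text, serialize it as XML
flowFile = session.write(flowFile, { inputStream, outputStream ->
    def text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
    def json = JSONSerializer.toJSON(text)
    def xmlSerializer = new XMLSerializer()
    xmlSerializer.setTypeHintsCompatibility(false)
    outputStream.write(xmlSerializer.write(json).getBytes(StandardCharsets.UTF_8))
} as StreamCallback)

// Mark the content as XML and route it onward
flowFile = session.putAttribute(flowFile, 'mime.type', 'application/xml')
session.transfer(flowFile, REL_SUCCESS)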
05-05-2017
06:52 PM
1 Kudo
Hi @Alexander Daher. In the commits dropdown (see image) you can probably select a previous commit to return to that version.
05-04-2017
12:45 PM
Hi, how can I disable logging to the HDFS audit log? My current config:
hdfs.audit.logger=INFO,console
log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=${hdfs.audit.logger}
log4j.additivity.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=false
log4j.appender.DRFAAUDIT=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
log4j.appender.DRFAAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
log4j.appender.DRFAAUDIT.DatePattern=.yyyy-MM-dd
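For reference, what I had in mind (an assumption on my side, not yet verified) is raising the audit logger's level so nothing gets written, roughly:
hdfs.audit.logger=OFF,NullAppender
log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=${hdfs.audit.logger}
where NullAppender would be the org.apache.log4j.varia.NullAppender appender from Hadoop's default log4j.properties.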
Labels:
Apache Hadoop
05-03-2017
08:12 PM
Hi, as a workaround you could prevent the generation of the hprof files by setting the JVM option HeapDumpPath to /dev/null instead of /tmp. Obviously this will not resolve the root cause.
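Assuming the dumps come from the standard HotSpot heap-dump-on-OOM mechanism, the relevant JVM options would look something like this (verify the flags against your actual java command line):
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dev/null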
05-03-2017
07:47 PM
Can you post the full date format used?