Member since: 10-09-2015
Posts: 86
Kudos Received: 179
Solutions: 8
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 25202 | 12-29-2016 05:19 PM |
| | 1857 | 12-17-2016 06:05 PM |
| | 14771 | 08-24-2016 03:08 PM |
| | 2163 | 07-14-2016 02:35 AM |
| | 3995 | 07-08-2016 04:29 PM |
05-18-2016
09:45 PM
2 Kudos
Hi @mayki wogno, I just created a tutorial in HCC regarding this; please check if it answers your questions: https://community.hortonworks.com/articles/34147/nifi-security-user-authentication-with-kerberos.html Hope this helps. Thanks, Jobin George
05-17-2016
03:59 PM
1 Kudo
Hi @Marco G, please try adding hdfs-site.xml to the Configuration Resources as well, with the entries separated by a comma (e.g. /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml).
05-13-2016
07:19 PM
Thanks! My bad, I wanted to know about decimals (e.g. 0.25). Glad to know it's a work in progress.
05-13-2016
06:44 PM
3 Kudos
Hi, does the latest version of NiFi/HDF support fractional numbers in the Expression Language? If not, are there any plans to add it? Thanks, Jobin George
Labels:
- Apache NiFi
- Cloudera DataFlow (CDF)
05-12-2016
02:08 PM
1 Kudo
Hi @Andrew Grande, thanks! I actually did that :) I had it in the demo video, but disabled and removed it in the flow attached. [Kathy will say, if it's going down based on the value, "You may want to buy some today", or Alex will say, if it's going up, "You may want to sell some" ;)] Thanks again!
05-12-2016
02:00 AM
6 Kudos
Introduction:
Here is a small demo of how NiFi can let you retrieve stock quotes on voice commands and respond back with the help of Mac Dictation.
- Here you can view the screen recording session that demonstrates how it works: NiFi + Mac Dictation Demo
Prerequisites:
1) Assuming you already have the latest version of NiFi [0.5.x or later] downloaded on your Mac.
2) Make sure you have the Mac Dictation feature, and that it is working!
Steps:
1) Use a GetHttp processor to retrieve any stock price from "Google Finance"; you can use the below URL to fetch the YAHOO stock price:
http://finance.google.com/finance/info?client=ig&q=YHOO
- Processor properties would look like:
2) Use ReplaceText and EvaluateJsonPath processors to trim the JSON result and extract the stock value and today's variation in the stock price. Use an UpdateAttribute processor to save the values to attributes.
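As a rough sketch of what step 2's trimming does, assuming the Google Finance endpoint prefixes its JSON with a `// ` comment (the sample payload and field names below are illustrative, not a live response):

```shell
# Sample of what GetHttp would fetch; the leading "// " prefix is what
# ReplaceText strips so EvaluateJsonPath can parse the JSON.
raw='// [ { "t" : "YHOO", "l" : "36.60", "c" : "+0.35" } ]'
json=$(printf '%s' "$raw" | sed 's|^// ||')
# Pull out the last-trade price ("l") and the day's change ("c"), the two
# values the flow saves to attributes with UpdateAttribute.
price=$(printf '%s' "$json" | sed -n 's/.*"l" : "\([^"]*\)".*/\1/p')
change=$(printf '%s' "$json" | sed -n 's/.*"c" : "\([^"]*\)".*/\1/p')
echo "$price $change"
```

In the flow itself this is done declaratively by the processors; the shell above is only to show the shape of the transformation.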
3) Now you can use RouteOnAttribute to decide when to trigger mail and/or give voice feedback.
4) Finally, use a PutEmail processor for the email alert and/or use ExecuteStreamCommand for voice feedback using Mac OS X's "say" command; the values can be extracted from attributes.
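A sketch of the kind of command ExecuteStreamCommand could run for step 4; the attribute values are hard-coded placeholders here (in the flow they would be substituted from the attributes saved in step 2), and the actual `say` call is left commented out:

```shell
# Placeholder values; in the flow these come from the saved attributes.
ticker="YHOO"; price="36.60"; change="-0.35"
case "$change" in
  -*) msg="$ticker is down to $price, you may want to buy some today" ;;
  *)  msg="$ticker is up to $price, you may want to sell some" ;;
esac
# On Mac OS X, ExecuteStreamCommand would invoke: say "$msg"
echo "$msg"
```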
I am attaching my flow template, which will help you understand better:
retrieving-real-time-quotes.xml
5) You may dry run the flow to see if it works as expected!
6) Now we have to link Mac Dictation to communicate with NiFi. For that, we have to create custom commands in Dictation that fire NiFi API calls to make changes in the GetHttp processor based on the stock requested via voice command. [Now you would be thinking about better speech recognition software 😉]
A sample API call to update the processor based on a command would look like the below [update it with your own processor and client IDs]:
curl -i -X PUT -H 'Content-Type: application/json' -d '{"revision":{"version":176,"clientId":"93a9b515-1643-4ccf-8db1-71ef38238ae5"},"processor":{"id":"7f033b6f-e13e-4295-85de-33444aae6c4b","parentGroupId":"02888a59-ee2b-483e-8860-b0fdf948de97","config":{"properties":{"URL":"http://finance.google.com/finance/info?client=ig&q=AAPL","Filename":"APPLE"}}}}' http://localhost:8080/nifi-api/controller/process-groups/02888a59-ee2b-483e-8860-b0fdf948de97/processors/7f033b6f-e13e-4295-85de-33444aae6c4b
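If you define several Dictation commands, the call can be scripted so only the ticker symbol changes. This is a hypothetical helper, not part of the original flow; the IDs and revision number are placeholders you would replace with your own, as above, and the actual curl call is left commented out so the snippet only builds and prints the payload:

```shell
# Hypothetical values - substitute your own processor/group/client IDs.
VERSION=176
CLIENT_ID="93a9b515-1643-4ccf-8db1-71ef38238ae5"
PROC_ID="7f033b6f-e13e-4295-85de-33444aae6c4b"
GROUP_ID="02888a59-ee2b-483e-8860-b0fdf948de97"

# Builds the JSON body that points GetHttp at a given ticker symbol.
build_payload() {
  printf '{"revision":{"version":%s,"clientId":"%s"},"processor":{"id":"%s","parentGroupId":"%s","config":{"properties":{"URL":"http://finance.google.com/finance/info?client=ig&q=%s","Filename":"%s"}}}}' \
    "$VERSION" "$CLIENT_ID" "$PROC_ID" "$GROUP_ID" "$1" "$1"
}

payload=$(build_payload AAPL)
echo "$payload"
# curl -i -X PUT -H 'Content-Type: application/json' -d "$payload" \
#   "http://localhost:8080/nifi-api/controller/process-groups/$GROUP_ID/processors/$PROC_ID"
```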
7) Now you can launch Dictation and try it out!
Thanks, Jobin George
03-02-2016
05:31 PM
1 Kudo
Yes, it is expected behavior; the port will only be valid if you add it to the root canvas.
03-01-2016
09:16 PM
1 Kudo
Hi Obins, please make sure you added an 'output port' instead of an input port. I assume you are running HDF-1.1 or NiFi-0.4.0. Below is a screenshot of the input and output ports; an input port will throw an error without any incoming connection. Thanks, Jobin
01-31-2016
08:46 PM
19 Kudos
Introduction
Spark doesn't supply a mechanism to have data pushed to it - instead, it wants to pull data from other sources. In NiFi, this data can be exposed in such a way that a receiver can pull from it by adding an Output Port to the root process group. For Spark, we will use this same mechanism - we will use the Site-to-Site protocol to pull data from NiFi's Output Ports.
Prerequisites
1) Assuming you already have the latest version of NiFi-0.4.1/HDF-1.1.1 downloaded on your HW Sandbox; else execute the below once ssh connectivity to the sandbox is established:
# cd /opt/
# wget http://public-repo-1.hortonworks.com/HDF/1.1.1.0/nifi-1.1.1.0-12-bin.tar.gz
# tar -xvf nifi-1.1.1.0-12-bin.tar.gz
2) Download the compatible version [in our case 0.4.1] of "nifi-spark-receiver" and "nifi-site-to-site-client" to the Sandbox in a specific location:
# mkdir /opt/spark-receiver
# cd /opt/spark-receiver
# wget http://central.maven.org/maven2/org/apache/nifi/nifi-site-to-site-client/0.4.1/nifi-site-to-site-client-0.4.1.jar
# wget http://central.maven.org/maven2/org/apache/nifi/nifi-spark-receiver/0.4.1/nifi-spark-receiver-0.4.1.jar
Steps:
1) Configure Spark to load the NiFi-specific libraries below; edit spark-defaults.conf to add the jars to the classpath, appending the below lines at the bottom:
# vi /usr/hdp/current/spark-client/conf/spark-defaults.conf
spark.driver.extraClassPath /opt/spark-receiver/nifi-spark-receiver-0.4.1.jar:/opt/spark-receiver/nifi-site-to-site-client-0.4.1.jar:/opt/nifi-1.1.1.0-12/lib/nifi-api-1.1.1.0-12.jar:/opt/nifi-1.1.1.0-12/lib/bootstrap/nifi-utils-1.1.1.0-12.jar:/opt/nifi-1.1.1.0-12/work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-client-dto-1.1.1.0-12.jar
spark.driver.allowMultipleContexts = true
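Since the exact jar versions change between releases, one way to avoid re-typing the long classpath value is to build it from whatever jars are present. This is a sketch under stated assumptions: it uses a temporary directory with empty stand-in files rather than the sandbox's /opt/spark-receiver, and on a real install you would still append the NiFi api/utils/client-dto jars shown above.

```shell
# Demo directory standing in for /opt/spark-receiver on the sandbox;
# the touch'd files are empty stand-ins for the downloaded jars.
JAR_DIR=$(mktemp -d)
touch "$JAR_DIR/nifi-spark-receiver-0.4.1.jar" \
      "$JAR_DIR/nifi-site-to-site-client-0.4.1.jar"

# Join every jar in the directory with ':' to form the classpath value.
CP=$(printf '%s\n' "$JAR_DIR"/*.jar | paste -sd: -)
echo "spark.driver.extraClassPath $CP"
```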
2) Open nifi.properties to update the configuration:
# vi /opt/nifi-1.1.1.0-12/conf/nifi.properties
3) Change the NiFi HTTP port to run on 8090, as the default 8080 will conflict with the Ambari web UI:
# web properties #
nifi.web.http.port=8090
4) Configure the NiFi instance for site-to-site by changing the below configuration: add a port, say 8055, and set "nifi.remote.input.secure" to "false":
# Site to Site properties
nifi.remote.input.socket.host=
nifi.remote.input.socket.port=8055
nifi.remote.input.secure=false
5) Now start [restart, if already running, for the configuration change to take effect] NiFi on your Sandbox:
# /opt/nifi-1.1.1.0-12/bin/nifi.sh start
6) Let us build a small flow on the NiFi canvas that reads the app log generated by NiFi itself and feeds it to Spark:
a) Connect to the below URL in your browser: http://<your_vm_ip>:8090/nifi/
b) Drop an "ExecuteProcess" processor onto the canvas [or you can use a TailFile processor] to read lines added to "nifi-app.log". Auto-terminate the Failure relationship. The configuration of the processor would look like below:
c) Drop an OutputPort onto the canvas and name it 'spark'. Once added, connect "ExecuteProcess" to the port for the Success relationship. This simple flow will look like below:
7) Now let's go back to the VM command line and create the Scala application that pulls data from the NiFi output port we just created. Change directory to "/opt/spark-receiver" and create a shell script file "spark-data.sh":
# cd /opt/spark-receiver
# vi spark-data.sh
8) Add the below lines, required for the application to pull data from the NiFi output port, to the script file and save it:
// Import all the libraries required
import org.apache.nifi._
import java.nio.charset._
import org.apache.nifi.spark._
import org.apache.nifi.remote.client._
import org.apache.spark._
import org.apache.nifi.events._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._
import org.apache.nifi.remote._
import org.apache.nifi.remote.protocol._
import org.apache.spark.storage._
import org.apache.spark.streaming.receiver._
import java.io._
import org.apache.spark.serializer._
object SparkNiFiAttribute {
def main(args: Array[String]) {
// Build a Site-to-site client config with NiFi web url and output port name[spark created in step 6c]
val conf = new SiteToSiteClient.Builder().url("http://localhost:8090/nifi").portName("spark").buildConfig()
// Set an App Name
val config = new SparkConf().setAppName("Nifi_Spark_Data")
// Create a StreamingContext
val ssc = new StreamingContext(config, Seconds(10))
// Create a DStream using a NiFi receiver so that we can pull data from specified Port
val lines = ssc.receiverStream(new NiFiReceiver(conf, StorageLevel.MEMORY_ONLY))
// Map the data from NiFi to text, ignoring the attributes
val text = lines.map(dataPacket => new String(dataPacket.getContent, StandardCharsets.UTF_8))
// Print the first ten elements of each RDD generated
text.print()
// Start the computation
ssc.start()
}
}
SparkNiFiAttribute.main(Array())
9) Let's go back to the NiFi Web UI and start the flow we created; make sure nothing is wrong, and you shall see data flowing.
10) Now load the script into spark-shell with the below command and start streaming:
# spark-shell -i spark-data.sh
11) In the screenshot below, you can see the NiFi logs being pulled and printed on the console:
12) In the same way, we can pull data from NiFi and extract the associated attributes:
// Import all the libraries required
import org.apache.nifi._
import java.nio.charset._
import org.apache.nifi.spark._
import org.apache.nifi.remote.client._
import org.apache.spark._
import org.apache.nifi.events._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._
import org.apache.nifi.remote._
import org.apache.nifi.remote.protocol._
import org.apache.spark.storage._
import org.apache.spark.streaming.receiver._
import java.io._
import org.apache.spark.serializer._
object SparkNiFiData {
def main(args: Array[String]) {
// Build a Site-to-site client config with NiFi web url and output port name
val conf = new SiteToSiteClient.Builder().url("http://localhost:8090/nifi").portName("spark").buildConfig()
// Set an App Name
val config = new SparkConf().setAppName("Nifi_Spark_Attributes")
// Create a StreamingContext
val ssc = new StreamingContext(config, Seconds(5))
// Create a DStream using a NiFi receiver so that we can pull data from specified Port
val lines = ssc.receiverStream(new NiFiReceiver(conf, StorageLevel.MEMORY_ONLY))
// Extract the 'uuid' attribute
val text = lines.map(dataPacket => dataPacket.getAttributes.get("uuid"))
text.print()
ssc.start()
ssc.awaitTermination()
}
}
SparkNiFiData.main(Array())
13) In the screenshot below, you can see the FlowFile attribute "uuid" being extracted and printed on the console:
14) You can create multiple output ports to transmit data to different Spark applications from the same NiFi instance at the same time.
Thanks, Jobin George
01-23-2016
08:52 PM
3 Kudos
Hi, if you are running spark-shell, pyspark, or spark-sql locally on the sandbox, you can try accessing http://sandbox:4040 to see the UI [only while the application is running]. Spark launches the UI starting from port 4040 by default; if you run other local instances in parallel, they will take 4041, 4042, and so on. Thanks, Jobin George
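The port arithmetic can be sketched as below, assuming spark.ui.port is left at its default:

```shell
# The nth concurrent local Spark app (0-based) gets UI port 4040 + n
# when spark.ui.port keeps its default value.
ui_port() { echo $((4040 + $1)); }
ui_port 0   # first app: 4040
ui_port 2   # third app: 4042
```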