Created 07-27-2016 12:30 AM
I enjoy the power of Groovy scripting FlowFiles using the ExecuteScript processor. Can I import external libraries to fully leverage the Groovy framework in NiFi? If so, how do I use a JAR file with the ExecuteScript processor?
Created 07-27-2016 01:34 AM
Certainly! The Module Directory property in the ExecuteScript processor is for exactly this purpose: you can give it a comma-separated list of directories and/or JAR files, and it will add them to the script's classpath. I have a blog post with an example (bringing in Hazelcast to get data into flowfiles):
http://funnifi.blogspot.com/2016/02/executescript-using-modules.html
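As a rough illustration of the pattern in that post, here is a minimal ExecuteScript Groovy body sketch that pulls values out of a Hazelcast map and emits them as flowfiles. It assumes the Hazelcast client JAR sits in a directory listed in Module Directory, and that a Hazelcast server with a `customers` map is reachable; `session` and `REL_SUCCESS` are bindings ExecuteScript provides.

```groovy
// Sketch only: Hazelcast client classes come from the Module Directory JARs.
import com.hazelcast.client.HazelcastClient
import com.hazelcast.client.config.ClientConfig
import org.apache.nifi.processor.io.OutputStreamCallback

def client = HazelcastClient.newHazelcastClient(new ClientConfig())
try {
    // 'customers' is an illustrative map name
    client.getMap('customers').each { id, value ->
        def flowFile = session.create()
        flowFile = session.write(flowFile, { outputStream ->
            outputStream.write(value.toString().getBytes('UTF-8'))
        } as OutputStreamCallback)
        session.transfer(flowFile, REL_SUCCESS)
    }
} finally {
    client.shutdown()
}
```

The blog post linked above walks through the full setup, including how to point Module Directory at the Hazelcast distribution.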
Also, if you add the Apache Ivy JAR to your NiFi lib/ folder (normally a no-no, but OK in this case), you can even leverage the @Grab annotation to bring in dependencies. I have a post with an example here:
http://funnifi.blogspot.com/2016/05/using-groovy-grab-with-executescript.html
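For a flavor of what @Grab looks like inside an ExecuteScript body, here is a short sketch; the commons-lang3 artifact and version are illustrative, and `session`/`REL_SUCCESS` are ExecuteScript bindings. With Ivy on the classpath, Grape resolves the dependency into the Grapes cache when the script is loaded.

```groovy
// Sketch only: @Grab needs Ivy available to NiFi, per the post above.
@Grab('org.apache.commons:commons-lang3:3.4')
import org.apache.commons.lang3.StringUtils

def flowFile = session.get()
if (flowFile != null) {
    // Pad the filename attribute as a trivial demonstration of the grabbed library
    def padded = StringUtils.leftPad(flowFile.getAttribute('filename') ?: '', 20, '*')
    flowFile = session.putAttribute(flowFile, 'padded.name', padded)
    session.transfer(flowFile, REL_SUCCESS)
}
```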
Created 08-01-2018 01:56 PM
But how do I include the JARs that are already shipped with NiFi, like the Hadoop JARs? I feel it is safer to use the ones packaged with NiFi. We have a use case where we need to access some configuration written in the Hadoop configuration-file pattern. We want to access it through NiFi, but from a Groovy script I cannot do so without supplying external module dependencies. Is there a way to safely refer to the NiFi JARs?
Some of the NARs already contain those Hadoop JARs, so we should be able to make use of them.
We have some Hadoop configuration files, but since I cannot make use of the NiFi Hadoop NARs from the script, I would need to write a custom processor, and I do not want to take on dependencies on external Hadoop JARs.
Created 08-01-2018 07:48 PM
If you need a number of dependencies like Hadoop for a script, you may want to consider creating an actual processor/NAR; that way you can inherit the nifi-hadoop-libraries NAR from your NAR, which gives your code access to the Hadoop JARs.
Another alternative is to use Groovy Grab in your script to bring in the Hadoop dependencies you need. It will download another set of them to the Grapes cache, but you won't have to worry about getting all the transitive dependencies manually.
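For the Grab route with Hadoop specifically, a sketch might look like the following; the hadoop-common version and the core-site.xml path are illustrative, and `log` is an ExecuteScript binding. Grape pulls hadoop-common and its transitive dependencies into the Grapes cache automatically.

```groovy
// Sketch only: version and config path are assumptions for illustration.
@Grab('org.apache.hadoop:hadoop-common:2.7.3')
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path

def conf = new Configuration()
// Load a Hadoop-style configuration file from a local path
conf.addResource(new Path('/etc/hadoop/conf/core-site.xml'))
log.info("fs.defaultFS = ${conf.get('fs.defaultFS')}")
```

The trade-off is a second copy of the Hadoop JARs on disk, but no manual dependency management.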
A more fragile alternative is to add a NAR's working directory to your Module Directory property in ExecuteScript, for example the nifi-hadoop-libraries NAR's working directory for dependencies is something like:
<NiFi location>/work/nar/extensions/nifi-hadoop-libraries-nar-<version>.nar-unpacked/META-INF/bundled-dependencies/
This directory doesn't exist until NiFi has been started and extracts the contents of the corresponding NAR to its working directory location.
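To make that concrete, a Module Directory value for this approach might look like the following; the install location and version number here are purely illustrative, and the path will change whenever NiFi is upgraded (which is what makes this option fragile).

```
Module Directory: /opt/nifi/work/nar/extensions/nifi-hadoop-libraries-nar-1.7.1.nar-unpacked/META-INF/bundled-dependencies/
```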