Issues with PutElasticsearch5 processor using NiFi version nifi-1.2.0.3.0.0.0-453

Expert Contributor

Hello,

I am trying to send a feed from NiFi version nifi-1.2.0.3.0.0.0-453 to Elasticsearch version 5.5.1; however, I keep getting the following error:

PutElasticsearch5[id=f4da516d-2b44-1955-0000-00000a063fa3] 
PutElasticsearch5[id=f4da516d-2b44-1955-0000-00000a063fa3] failed to process due 
to org.apache.nifi.processor.exception.ProcessException: 
org.apache.nifi.processor.exception.ProcessException: X-Pack plugin could not be 
loaded and/or configured; rolling back session: 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.nifi.processor.exception.ProcessException: X-Pack plugin could not be 
loaded and/or configured

The full error log is as follows:

ERROR [Timer-Driven Process Thread-16] o.a.n.p.elasticsearch.PutElasticsearch5 PutElasticsearch5[id=f4da516d-2b44-1955-0000-00000a063fa3] Failed to create Elasticsearch client due to org.apache.nifi.processor.exception.ProcessException: X-Pack plugin could not be loaded and/or configured: org.apache.nifi.processor.exception.ProcessException: X-Pack plugin could not be loaded and/or configured
org.apache.nifi.processor.exception.ProcessException: X-Pack plugin could not be loaded and/or configured
	at org.apache.nifi.processors.elasticsearch.AbstractElasticsearch5TransportClientProcessor.getTransportClient(AbstractElasticsearch5TransportClientProcessor.java:222)
	at org.apache.nifi.processors.elasticsearch.AbstractElasticsearch5TransportClientProcessor.createElasticsearchClient(AbstractElasticsearch5TransportClientProcessor.java:170)
	at org.apache.nifi.processors.elasticsearch.AbstractElasticsearch5Processor.setup(AbstractElasticsearch5Processor.java:94)
	at org.apache.nifi.processors.elasticsearch.PutElasticsearch5.onTrigger(PutElasticsearch5.java:175)
	at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
	at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
	at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
	at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
	at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.nifi.processors.elasticsearch.AbstractElasticsearch5TransportClientProcessor.getTransportClient(AbstractElasticsearch5TransportClientProcessor.java:215)
	... 15 common frames omitted
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.elasticsearch.xpack.XPackPlugin
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.elasticsearch.plugins.PluginsService.loadPlugin(PluginsService.java:451)
	at org.elasticsearch.plugins.PluginsService.<init>(PluginsService.java:116)
	at org.elasticsearch.client.transport.TransportClient.newPluginService(TransportClient.java:81)
	at org.elasticsearch.client.transport.TransportClient.buildTemplate(TransportClient.java:106)
	at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:228)
	at org.elasticsearch.transport.client.PreBuiltTransportClient.<init>(PreBuiltTransportClient.java:69)
	at org.elasticsearch.xpack.client.PreBuiltXPackTransportClient.<init>(PreBuiltXPackTransportClient.java:50)
	at org.elasticsearch.xpack.client.PreBuiltXPackTransportClient.<init>(PreBuiltXPackTransportClient.java:46)
	... 20 common frames omitted

I tried following the suggestions in this thread: https://community.hortonworks.com/questions/81606/where-should-i-store-files-used-by-processors-in-a... , but I am unable to fix the issue. I tried pointing the X-Pack Transport Location property of the PutElasticsearch5 processor at the x-pack JAR and transport JAR for Elasticsearch 5.5.1, and I also tried the x-pack JAR and transport JAR for Elasticsearch 5.0.1, but nothing seems to work. Any suggestions would be greatly appreciated.

1 ACCEPTED SOLUTION

Master Guru

There could be a couple of things going on here; each is discussed in the thread you mentioned:

1) The X-Pack JAR has multiple dependencies that are not included. When you install the X-Pack plugin into an Elasticsearch node, these dependencies are extracted and added to the ES path so the ES code can find them. On a NiFi node this must be done manually. Check the other thread for the X-Pack ZIP (not JAR); you will need to unzip it somewhere and point to the elasticsearch/ folder underneath it. Your "X-Pack Transport Location" property should be set to a comma-delimited list with two items: the transport JAR, and the elasticsearch/ subfolder that contains the x-pack JAR and all its dependencies (see the sketch after this list).

2) The Elasticsearch native client (used by all the ES processors that don't end in "Http") is VERY particular about versions, meaning there is no guarantee that the one used by NiFi will be compatible with your ES cluster unless they share the same major and minor versions (I think dot releases -- X.Y.1 vs. X.Y.2 -- are OK). PutES5 comes with the 5.0.1 client, which means it should work with all ES 5.0.x clusters; however, there is no guarantee that it will work with a 5.5.x cluster. In fact, I believe Elastic has replaced the native client in 5.5 with a Java one that wraps the REST API. You can try the 5.0.1 X-Pack and Transport JARs (as one person in the other thread did) to see if that works.
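
To make option 1 concrete, here is a rough sketch of what the layout on the NiFi node might look like. The paths and file names below are only examples (I'm assuming the 5.0.1 X-Pack ZIP and x-pack-transport JAR); adjust them to wherever you actually place the files:

    # unzip the X-Pack ZIP somewhere the NiFi service user can read
    unzip x-pack-5.0.1.zip -d /opt/nifi/es-client/
    # this should leave /opt/nifi/es-client/elasticsearch/ containing the x-pack JAR
    # and all of its dependency JARs

    # "X-Pack Transport Location" property (comma-delimited: transport JAR, then that folder)
    /opt/nifi/es-client/x-pack-transport-5.0.1.jar,/opt/nifi/es-client/elasticsearch/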

If you don't require the native client, you may be better served by using PutElasticsearchHttp and enabling TLS/SSL for your Elasticsearch cluster. This (plus setting up access controls for authorization) should give you a robust way to work with secure Elasticsearch clusters of any version. With that approach you can still have X-Pack installed on your ES cluster but interact with the cluster from NiFi using the Http versions of the processors; this is also how you'd interact with other X-Pack capabilities such as Marvel and Watcher. In that case you shouldn't need the X-Pack plugin or the transport JAR on the NiFi node, since PutElasticsearchHttp doesn't use the native client.
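
As a rough illustration of the Http route, the key PutElasticsearchHttp properties would look something like the following. The host name, index, and credentials are placeholders, and property names may vary slightly between NiFi versions:

    Elasticsearch URL:    https://es-node-1.example.com:9200
    SSL Context Service:  StandardSSLContextService (truststore with the ES cluster's CA cert)
    Username:             nifi_writer          (placeholder)
    Password:             ********
    Index:                my_index
    Type:                 my_type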


2 REPLIES

Expert Contributor

Hi @Matt Burgess,

I was able to connect successfully by following your guidelines. I had to reference the elasticsearch/ folder from the X-Pack ZIP for ES version 5.0.1, along with the 5.0.1 transport JAR, in the X-Pack Transport Location property as a comma-delimited list. Thanks for all the help; I greatly appreciate it!
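
In case it helps anyone else hitting the same error, the working X-Pack Transport Location value was along these lines (the paths shown here are illustrative, not my actual ones):

    /path/to/x-pack-transport-5.0.1.jar,/path/to/x-pack-5.0.1/elasticsearch/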