Member since: 10-17-2016
Posts: 93
Kudos Received: 10
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4880 | 09-28-2017 04:38 PM
 | 7330 | 08-24-2017 06:12 PM
 | 1898 | 07-03-2017 12:20 PM
12-14-2016
08:22 AM
1 Kudo
This might come across as a very naive question, as I have never used Linux before. Where can I view the files inside the sandbox, such as the Hadoop configuration files? What is the path? Where are the data files on the DataNode? I assume I should log onto the sandbox via the command line as root, but where should I navigate from there? I was also thinking of configuring a GUI (GNOME) for the sandbox, but unfortunately every simple task has proven more difficult than I expected. I do find answers to my problems online, but most of the solutions do not work for me: I have trouble turning off the firewall in the sandbox (CentOS 6.8), and I have trouble installing GNOME because I get "no package available", etc. Lastly, is a GUI even necessary, or is Ambari enough on its own? I also have trouble pinging my machine from the guest and trouble copying files to the guest machine (connection refused) 😞 Thanks
12-10-2016
01:20 AM
Hi Matt, I had used the URL suggested in this tutorial: PutHiveQl by @Timothy Spann. I have updated the URL and now get the following error, which I will explore further:
2016-12-08 17:21:30,527 WARN [LeaseRenewer:ARSI@sandbox.hortonworks.com:8020] org.apache.hadoop.hdfs.LeaseRenewer Failed to renew lease for [DFSClient_NONMAPREDUCE_620414152_117, DFSClient_NONMAPREDUCE_-186846005_117] for 3487 seconds. Will retry shortly ...
java.net.ConnectException: Call From PCARSI01/10.20.115.30 to sandbox.hortonworks.com:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.GeneratedConstructorAccessor92.newInstance(Unknown Source) ~[na:na]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_111]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_111]
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.ipc.Client.call(Client.java:1473) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.ipc.Client.call(Client.java:1400) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.2.jar:na]
at com.sun.proxy.$Proxy128.renewLease(Unknown Source) ~[na:na]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.renewLease(ClientNamenodeProtocolTranslatorPB.java:571) ~[hadoop-hdfs-2.6.2.jar:na]
at sun.reflect.GeneratedMethodAccessor308.invoke(Unknown Source) ~[na:na]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_111]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_111]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.2.jar:na]
at com.sun.proxy.$Proxy129.renewLease(Unknown Source) ~[na:na]
at org.apache.hadoop.hdfs.DFSClient.renewLease(DFSClient.java:882) ~[hadoop-hdfs-2.6.2.jar:na]
at org.apache.hadoop.hdfs.LeaseRenewer.renew(LeaseRenewer.java:423) [hadoop-hdfs-2.6.2.jar:na]
at org.apache.hadoop.hdfs.LeaseRenewer.run(LeaseRenewer.java:448) [hadoop-hdfs-2.6.2.jar:na]
at org.apache.hadoop.hdfs.LeaseRenewer.access$700(LeaseRenewer.java:71) [hadoop-hdfs-2.6.2.jar:na]
at org.apache.hadoop.hdfs.LeaseRenewer$1.run(LeaseRenewer.java:304) [hadoop-hdfs-2.6.2.jar:na]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_111]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_111]
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:608) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:706) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:369) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522) ~[hadoop-common-2.6.2.jar:na]
at org.apache.hadoop.ipc.Client.call(Client.java:1439) ~[hadoop-common-2.6.2.jar:na]
... 16 common frames omitted
2016-12-08 17:21:32,559 WARN [LeaseRenewer:ARSI@sandbox.hortonworks.com:8020] org.apache.hadoop.hdfs.LeaseRenewer Failed to renew lease for [DFSClient_NONMAPREDUCE_620414152_117, DFSClient_NONMAPREDUCE_-186846005_117] for 3489 seconds. Will retry shortly ...
java.net.ConnectException: Call From PCARSI01/10.20.115.30 to sandbox.hortonworks.com:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.GeneratedConstructorAccessor92.newInstance(Unknown Source) ~[na:na]
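For reference, the root cause each time is the same ConnectException to sandbox.hortonworks.com:8020. A plain-Java reachability probe like the sketch below (hypothetical class name, no Hadoop jars needed) is what I can use to check whether the NameNode RPC port is reachable from my machine at all; if this also fails, the problem is port forwarding or the firewall rather than NiFi itself:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) {
        // Host and port mirror the values from the error above.
        String host = "sandbox.hortonworks.com";
        int port = 8020;
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("TCP connect to " + host + ":" + port + " succeeded");
        } catch (IOException e) {
            // Same symptom as the lease-renewer log: the port is not reachable
            // from this machine (port forwarding, firewall, or wrong hostname).
            System.out.println("TCP connect failed: " + e.getMessage());
        }
    }
}
```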
12-08-2016
05:25 PM
Thank you so much! It is working now. I am surprised that even a space at the beginning of the URL would prevent the processor from working.
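For anyone else who hits this, here is a tiny sketch (hypothetical class name, assuming the Hive JDBC driver jar is on the classpath) that shows why the leading space matters: DriverManager matches drivers against the URL prefix, so a URL that starts with a space is not recognised by any driver and you get "No suitable driver".

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class UrlSpaceCheck {
    public static void main(String[] args) throws ClassNotFoundException {
        Class.forName("org.apache.hive.jdbc.HiveDriver"); // register the driver explicitly
        String good = "jdbc:hive2://localhost:10000/default";
        String bad  = " jdbc:hive2://localhost:10000/default"; // note the leading space
        for (String url : new String[] { good, bad }) {
            try {
                DriverManager.getDriver(url); // the same lookup commons-dbcp performs
                System.out.println("Driver found for [" + url + "]");
            } catch (SQLException e) {
                // The leading space stops HiveDriver from accepting the URL,
                // so DriverManager reports "No suitable driver".
                System.out.println("No driver for [" + url + "]: " + e.getMessage());
            }
        }
    }
}
```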
12-08-2016
04:48 PM
Hi @Matt Burgess, I had used the format specified by @Timothy Spann in his post: PutHiveQl. I have changed the URL to the one you specified and get the following error:
2016-12-08 17:39:31,880 INFO [Timer-Driven Process Thread-1] o.a.nifi.dbcp.hive.HiveConnectionPool HiveConnectionPool[id=de18f06b-0158-1000-758a-359b8b94716b] Simple Authentication
2016-12-08 17:39:31,881 ERROR [Timer-Driven Process Thread-1] o.a.nifi.dbcp.hive.HiveConnectionPool HiveConnectionPool[id=de18f06b-0158-1000-758a-359b8b94716b] Error getting Hive connection
2016-12-08 17:39:31,883 ERROR [Timer-Driven Process Thread-1] o.a.nifi.dbcp.hive.HiveConnectionPool
org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL ' jdbc:hive2://localhost:10000/default'
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1452) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1371) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:269) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
at sun.reflect.GeneratedMethodAccessor467.invoke(Unknown Source) ~[na:na]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_111]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_111]
at org.apache.nifi.controller.service.StandardControllerServiceProvider$1.invoke(StandardControllerServiceProvider.java:177) [nifi-framework-core-1.0.0.jar:1.0.0]
at com.sun.proxy.$Proxy132.getConnection(Unknown Source) [na:na]
at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:152) [nifi-hive-processors-1.0.0.jar:1.0.0]
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) [nifi-api-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.0.0.jar:1.0.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_111]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_111]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
Caused by: java.sql.SQLException: No suitable driver
at java.sql.DriverManager.getDriver(DriverManager.java:315) ~[na:1.8.0_111]
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1437) ~[commons-dbcp-1.4.jar:1.4]
... 21 common frames omitted
2016-12-08 17:39:31,883 ERROR [Timer-Driven Process Thread-1] o.apache.nifi.processors.hive.PutHiveQL PutHiveQL[id=de0c64f1-0158-1000-b1fd-9c33f9c8e7e0] PutHiveQL[id=de0c64f1-0158-1000-b1fd-9c33f9c8e7e0] failed to process due to org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL ' jdbc:hive2://localhost:10000/default'; rolling back session: org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL ' jdbc:hive2://localhost:10000/default'
2016-12-08 17:39:31,885 ERROR [Timer-Driven Process Thread-1] o.apache.nifi.processors.hive.PutHiveQL
org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL ' jdbc:hive2://localhost:10000/default'
at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:273) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
at sun.reflect.GeneratedMethodAccessor467.invoke(Unknown Source) ~[na:na]
12-08-2016
04:01 PM
Hi, I have deployed the HDP 2.5 sandbox inside VirtualBox and have NiFi 1.0.0 installed on my machine. Earlier I had tried to read and write files to HDFS and ran into connectivity issues (PutHDFS/GetHDFS issue with data transfer), which are still unsolved. I thought I would move forward and try my luck with Hive instead, so I tried to execute an INSERT statement using PutHiveQL, but I seem to be back to square one and am still getting errors. First I tried to connect to Hive inside the sandbox using the URL jdbc://hive2://localhost:10000/default, which gives the following error:
2016-12-08 13:15:42,902 ERROR [Timer-Driven Process Thread-3] o.a.nifi.dbcp.hive.HiveConnectionPool
org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL 'jdbc://hive2://localhost:10000/default'
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1452) ~[commons-dbcp-1.4.jar:1.4]
Caused by: java.sql.SQLException: No suitable driver
at java.sql.DriverManager.getDriver(DriverManager.java:315) ~[na:1.8.0_111]
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1437) ~[commons-dbcp-1.4.jar:1.4]
... 22 common frames omitted
2016-12-08 13:15:42,902 ERROR [Timer-Driven Process Thread-3] o.a.nifi.processors.hive.SelectHiveQL SelectHiveQL[id=de0cad2e-0158-1000-fd58-ecc08beadcb0] Unable to execute HiveQL select query SELECT * FROM drivers WHERE event='overspeed'; due to org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL 'jdbc://hive2://localhost:10000/default'. No FlowFile to route to failure: org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL 'jdbc://hive2://localhost:10000/default'
2016-12-08 13:15:42,903 ERROR [Timer-Driven Process Thread-3] o.a.nifi.processors.hive.SelectHiveQL
org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL 'jdbc://hive2://localhost:10000/default'
at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:273) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL 'jdbc://hive2://localhost:10000/default'
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1452) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1371) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:269) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
... 19 common frames omitted
Caused by: java.sql.SQLException: No suitable driver
Then I checked the Ambari dashboard and found the following URL: jdbc:hive2://sandbox.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2. With this URL I think I have moved forward, but now I am stuck with this error:
2016-12-08 13:20:31,101 ERROR [Timer-Driven Process Thread-10] o.a.nifi.processors.hive.SelectHiveQL SelectHiveQL[id=de0cad2e-0158-1000-fd58-ecc08beadcb0] SelectHiveQL[id=de0cad2e-0158-1000-fd58-ecc08beadcb0] failed to process due to java.lang.NullPointerException; rolling back session: java.lang.NullPointerException
2016-12-08 13:20:31,104 ERROR [Timer-Driven Process Thread-10] o.a.nifi.processors.hive.SelectHiveQL
java.lang.NullPointerException: null
at org.apache.thrift.transport.TSocket.open(TSocket.java:170) ~[hive-exec-1.2.1.jar:1.2.1]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:266) ~[hive-exec-1.2.1.jar:1.2.1]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) ~[hive-exec-1.2.1.jar:1.2.1]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204) ~[hive-jdbc-1.2.1.jar:1.2.1]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176) ~[hive-jdbc-1.2.1.jar:1.2.1]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) ~[hive-jdbc-1.2.1.jar:1.2.1]
at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:269) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_111]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_111]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_111]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_111]
at org.apache.nifi.controller.service.StandardControllerServiceProvider$1.invoke(StandardControllerServiceProvider.java:177) ~[nifi-framework-core-1.0.0.jar:1.0.0]
at com.sun.proxy.$Proxy132.getConnection(Unknown Source) ~[na:na]
at org.apache.nifi.processors.hive.SelectHiveQL.onTrigger(SelectHiveQL.java:158) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.0.0.jar:1.0.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_111]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_111]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
Can anyone please help me with this? I have tried so many things, but it seems impossible to get data through to the sandbox in the virtual machine. What should I do? Any possible direction would be helpful. I have disabled the firewall. The ReplaceText processor yields the following statement, which runs successfully in the Ambari Hive view:
INSERT INTO driveres (truckid,driverid,city,state,velocity,event)
VALUES ('N02','N02','Lahore','Punjab','100','overspeeding');
I have also downloaded the Hive configs from Ambari and added them to the connection pool. The last option for me would be to run NiFi inside the HDP sandbox.
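In the meantime, a minimal standalone test along these lines (a sketch with a hypothetical class name and credentials, assuming the hive-jdbc jar and its dependencies are on the classpath and that port 10000 is forwarded from the sandbox) is one way to check whether HiveServer2 is reachable from my machine at all, independent of NiFi:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveConnectionCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Note the scheme: jdbc:hive2, not jdbc://hive2 as in the first attempt above.
        String url = "jdbc:hive2://sandbox.hortonworks.com:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "hive", ""); // hypothetical credentials
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```

If this fails with the same ConnectException, the problem is network or port forwarding rather than the NiFi HiveConnectionPool configuration.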
12-05-2016
03:20 PM
Thank you @Matt Burgess, I will use the HTTP processors in the meantime then!
12-05-2016
02:12 PM
I got the PutElasticsearchHttp processor to work, but when I try to use the PutElasticsearch processor with the same data I get the following error:
2016-12-05 15:06:23,209 ERROR [Timer-Driven Process Thread-8] o.a.n.p.elasticsearch.PutElasticsearch PutElasticsearch[id=ce4b90b8-0158-1000-4eab-a028c319db21] Failed to insert into Elasticsearch due to None of the configured nodes are available: [{#transport#-1}{127.0.0.1}{localhost/127.0.0.1:9300}]. More detailed information may be available in the NiFi logs.: NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{127.0.0.1}{localhost/127.0.0.1:9300}]]
2016-12-05 15:06:23,209 ERROR [Timer-Driven Process Thread-8] o.a.n.p.elasticsearch.PutElasticsearch
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: [{#transport#-1}{127.0.0.1}{localhost/127.0.0.1:9300}]
at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:290) ~[elasticsearch-2.1.0.jar:2.1.0]
at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:207) ~[elasticsearch-2.1.0.jar:2.1.0]
at org.elasticsearch.client.transport.support.TransportProxyClient.execute(TransportProxyClient.java:55) ~[elasticsearch-2.1.0.jar:2.1.0]
at org.elasticsearch.client.transport.TransportClient.doExecute(TransportClient.java:283) ~[elasticsearch-2.1.0.jar:2.1.0]
at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:347) ~[elasticsearch-2.1.0.jar:2.1.0]
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:85) ~[elasticsearch-2.1.0.jar:2.1.0]
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:59) ~[elasticsearch-2.1.0.jar:2.1.0]
at org.apache.nifi.processors.elasticsearch.PutElasticsearch.onTrigger(PutElasticsearch.java:212) ~[nifi-elasticsearch-processors-1.0.0.jar:1.0.0]
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) [nifi-api-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.0.0.jar:1.0.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_111]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_111]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
And in the Elasticsearch logs:
[2016-12-05T15:06:23,113][WARN ][o.e.t.n.Netty4Transport ] [KXBLBSs] exception caught on transport layer [[id: 0xeb720e09, L:/127.0.0.1:9300 - R:/127.0.0.1:54211]], closing connection
java.lang.IllegalStateException: Received message from unsupported version: [2.0.0] minimal compatible version is: [5.0.0]
at org.elasticsearch.transport.TcpTransport.messageReceived(TcpTransport.java:1199) ~[elasticsearch-5.0.2.jar:5.0.2]
at org.elasticsearch.transport.netty4.Netty4MessageChannelHandler.channelRead(Netty4MessageChannelHandler.java:74) ~[transport-netty4-5.0.2.jar:5.0.2]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:350) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293) [netty-codec-4.1.5.Final.jar:4.1.5.Final]
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:280) [netty-codec-4.1.5.Final.jar:4.1.5.Final]
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:396) [netty-codec-4.1.5.Final.jar:4.1.5.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:248) [netty-codec-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:350) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:350) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1334) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:926) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:129) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:610) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:513) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:467) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:437) [netty-transport-4.1.5.Final.jar:4.1.5.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:873) [netty-common-4.1.5.Final.jar:4.1.5.Final]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_111]
12-05-2016
01:59 PM
Ahhhh, thank you so much, it works! I will try the PutElasticsearch processor now that this one is working.
12-05-2016
01:49 PM
Hi @Devin Pinkston, thanks for your reply. I had made the mistake of using port 9300 instead of 9200. How can I resolve firewall issues? At the moment I have disabled the firewall, but it has not helped. I have checked the Elasticsearch logs and I don't see any errors there. Now that I have changed the port to 9200 I get the following error:
2016-12-05 14:45:39,130 ERROR [Timer-Driven Process Thread-6] o.a.n.p.e.PutElasticsearchHttp PutElasticsearchHttp[id=cef732f0-0158-1000-4c78-e6fd2c642944] PutElasticsearchHttp[id=cef732f0-0158-1000-4c78-e6fd2c642944] failed to process due to org.apache.nifi.processor.exception.ProcessException: java.net.ConnectException: Failed to connect to localhost/127.0.0.1:9002; rolling back session: org.apache.nifi.processor.exception.ProcessException: java.net.ConnectException: Failed to connect to localhost/127.0.0.1:9002
2016-12-05 14:45:39,130 ERROR [Timer-Driven Process Thread-6] o.a.n.p.e.PutElasticsearchHttp
org.apache.nifi.processor.exception.ProcessException: java.net.ConnectException: Failed to connect to localhost/127.0.0.1:9002
at org.apache.nifi.processors.elasticsearch.PutElasticsearchHttp.onTrigger(PutElasticsearchHttp.java:315) ~[nifi-elasticsearch-processors-1.0.0.jar:1.0.0]
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.0.0.jar:1.0.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_111]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_111]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
Caused by: java.net.ConnectException: Failed to connect to localhost/127.0.0.1:9002
at okhttp3.internal.io.RealConnection.connectSocket(RealConnection.java:187) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.io.RealConnection.buildConnection(RealConnection.java:170) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.io.RealConnection.connect(RealConnection.java:111) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.http.StreamAllocation.findConnection(StreamAllocation.java:187) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.http.StreamAllocation.findHealthyConnection(StreamAllocation.java:123) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.http.StreamAllocation.newStream(StreamAllocation.java:93) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.http.HttpEngine.connect(HttpEngine.java:296) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.http.HttpEngine.sendRequest(HttpEngine.java:248) ~[okhttp-3.3.1.jar:na]
at okhttp3.RealCall.getResponse(RealCall.java:243) ~[okhttp-3.3.1.jar:na]
at okhttp3.RealCall$ApplicationInterceptorChain.proceed(RealCall.java:201) ~[okhttp-3.3.1.jar:na]
at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:163) ~[okhttp-3.3.1.jar:na]
at okhttp3.RealCall.execute(RealCall.java:57) ~[okhttp-3.3.1.jar:na]
at org.apache.nifi.processors.elasticsearch.AbstractElasticsearchHttpProcessor.sendRequestToElasticsearch(AbstractElasticsearchHttpProcessor.java:166) ~[nifi-elasticsearch-processors-1.0.0.jar:1.0.0]
at org.apache.nifi.processors.elasticsearch.PutElasticsearchHttp.onTrigger(PutElasticsearchHttp.java:313) ~[nifi-elasticsearch-processors-1.0.0.jar:1.0.0]
... 12 common frames omitted
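To rule NiFi out, a quick sanity check like the sketch below (hypothetical class name, plain JDK only) hits the Elasticsearch HTTP endpoint on port 9200 directly (the error above actually reports port 9002, so the port configured in the processor is also worth re-checking):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class ElasticsearchHttpCheck {
    public static void main(String[] args) throws Exception {
        // Hit the Elasticsearch HTTP port directly, outside of NiFi.
        URL url = new URL("http://localhost:9200/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);
        System.out.println("HTTP " + conn.getResponseCode());
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // cluster name / version banner JSON
            }
        }
    }
}
```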
12-04-2016
06:58 PM
1 Kudo
Hi, I am reading a CSV file, converting it into JSON, and splitting it into individual records. When I try to write these files into Elasticsearch, I get the following error:
2016-12-04 19:42:18,008 ERROR [Timer-Driven Process Thread-10] o.a.n.p.e.PutElasticsearchHttp PutElasticsearchHttp[id=ca053faa-0158-1000-8397-413cc421f011] PutElasticsearchHttp[id=ca053faa-0158-1000-8397-413cc421f011] failed to process due to org.apache.nifi.processor.exception.ProcessException: java.net.SocketException: Connection reset by peer: socket write error; rolling back session: org.apache.nifi.processor.exception.ProcessException: java.net.SocketException: Connection reset by peer: socket write error
2016-12-04 19:42:18,017 ERROR [Timer-Driven Process Thread-10] o.a.n.p.e.PutElasticsearchHttp
org.apache.nifi.processor.exception.ProcessException: java.net.SocketException: Connection reset by peer: socket write error
at org.apache.nifi.processors.elasticsearch.PutElasticsearchHttp.onTrigger(PutElasticsearchHttp.java:315) ~[nifi-elasticsearch-processors-1.0.0.jar:1.0.0]
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.0.0.jar:1.0.0]
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.0.0.jar:1.0.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_101]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_101]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_101]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_101]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_101]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_101]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
Caused by: java.net.SocketException: Connection reset by peer: socket write error
at java.net.SocketOutputStream.socketWrite0(Native Method) ~[na:1.8.0_101]
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109) ~[na:1.8.0_101]
at java.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:1.8.0_101]
at okio.Okio$1.write(Okio.java:80) ~[okio-1.8.0.jar:na]
at okio.AsyncTimeout$1.write(AsyncTimeout.java:180) ~[okio-1.8.0.jar:na]
at okio.RealBufferedSink.emitCompleteSegments(RealBufferedSink.java:171) ~[okio-1.8.0.jar:na]
at okio.RealBufferedSink.write(RealBufferedSink.java:41) ~[okio-1.8.0.jar:na]
at okhttp3.internal.http.Http1xStream$FixedLengthSink.write(Http1xStream.java:286) ~[okhttp-3.3.1.jar:na]
at okio.RealBufferedSink.emitCompleteSegments(RealBufferedSink.java:171) ~[okio-1.8.0.jar:na]
at okio.RealBufferedSink.write(RealBufferedSink.java:91) ~[okio-1.8.0.jar:na]
at okhttp3.RequestBody$2.writeTo(RequestBody.java:96) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.http.HttpEngine$NetworkInterceptorChain.proceed(HttpEngine.java:756) ~[okhttp-3.3.1.jar:na]
at okhttp3.internal.http.HttpEngine.readResponse(HttpEngine.java:613) ~[okhttp-3.3.1.jar:na]
at okhttp3.RealCall.getResponse(RealCall.java:244) ~[okhttp-3.3.1.jar:na]
at okhttp3.RealCall$ApplicationInterceptorChain.proceed(RealCall.java:201) ~[okhttp-3.3.1.jar:na]
at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:163) ~[okhttp-3.3.1.jar:na]
at okhttp3.RealCall.execute(RealCall.java:57) ~[okhttp-3.3.1.jar:na]
at org.apache.nifi.processors.elasticsearch.AbstractElasticsearchHttpProcessor.sendRequestToElasticsearch(AbstractElasticsearchHttpProcessor.java:166) ~[nifi-elasticsearch-processors-1.0.0.jar:1.0.0]
at org.apache.nifi.processors.elasticsearch.PutElasticsearchHttp.onTrigger(PutElasticsearchHttp.java:313) ~[nifi-elasticsearch-processors-1.0.0.jar:1.0.0]
... 12 common frames omitted
Any help is highly appreciated.
Labels:
- Apache NiFi