Member since: 01-23-2018
Posts: 12
Kudos Received: 0
Solutions: 0
02-02-2018
06:30 PM
Hi @Scott Shaw, the error is already stated in the first post. But here is another one:
2018-02-02 19:25:47,557 ERROR [Timer-Driven Process Thread-6] o.apache.nifi.processors.hive.PutHiveQL PutHiveQL[id=286bb204-0161-1000-9486-e320fcd50082] org.apache.nifi.processors.hive.PutHiveQL$$Lambda$228/655414801@168e589c failed to process due to org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=7cf6a2c4-8a46-4270-a5f1-3d0dde289eee,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1517497889134-1, container=default, section=1], offset=1156, length=89],offset=0,name=3026792823336151,size=89] due to java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask; rolling back session: {}
org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=7cf6a2c4-8a46-4270-a5f1-3d0dde289eee,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1517497889134-1, container=default, section=1], offset=1156, length=89],offset=0,name=3026792823336151,size=89] due to java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnGroupError$2(ExceptionHandler.java:226)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnError$1(ExceptionHandler.java:179)
at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$4(PutHiveQL.java:199)
at org.apache.nifi.processor.util.pattern.Put.putFlowFiles(Put.java:59)
at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:101)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:255)
at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:255)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296)
at org.apache.hive.jdbc.HivePreparedStatement.execute(HivePreparedStatement.java:98)
at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$null$3(PutHiveQL.java:218)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
... 18 common frames omitted
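As an aside: the JDBC error only reports "return code 1"; the actual reason for the Tez failure usually has to be pulled from the YARN application logs. A minimal check, assuming the application id is copied from the HiveServer2 log:
# fetch the full logs of the failed Tez application (application_<id> is a placeholder)
yarn logs -applicationId application_<id>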
02-01-2018
03:20 PM
Hi,
I want to use PutHiveQL to execute the following INSERT (the content of the incoming FlowFile):
INSERT INTO sensehat VALUES ('2018-01-24 05:26:21', 29.84, 30.78, 25.26, '192.168.16.3');
However, I get the following error:
2018-02-01 15:27:37,256 ERROR [Timer-Driven Process Thread-10] o.apache.nifi.processors.hive.PutHiveQL PutHiveQL[id=286bb204-0161-1000-9486-e320fcd50082] Failed to process session due to org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=b9e4155e-d7ef-4817-9835-721d7f108584,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1516880313990-13, container=default, section=13], offset=717915, length=89],offset=0,name=3026792823336151,size=89] due to java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask: {}
org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=b9e4155e-d7ef-4817-9835-721d7f108584,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1516880313990-13, container=default, section=13], offset=717915, length=89],offset=0,name=3026792823336151,size=89] due to java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnGroupError$2(ExceptionHandler.java:226)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnError$1(ExceptionHandler.java:179)
at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$4(PutHiveQL.java:199)
at org.apache.nifi.processor.util.pattern.Put.putFlowFiles(Put.java:59)
at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:101)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:255)
at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:255)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296)
at org.apache.hive.jdbc.HivePreparedStatement.execute(HivePreparedStatement.java:98)
at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$null$3(PutHiveQL.java:218)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
... 18 common frames omitted
My DataFlow looks as follows:
The SelectHiveQL processor is receiving data from the table "sensehat", but the INSERT statement is not working. I am using NiFi 1.4 and HDP 2.6.
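(A way to narrow this down: the same statement can be run outside NiFi to see whether the failure lies in Hive/Tez itself rather than in PutHiveQL. A minimal check, assuming beeline is available on the HDP node and HiveServer2 listens on the default port 10000; <hiveserver2-host> and <user> are placeholders:
# run the INSERT directly against HiveServer2
beeline -u "jdbc:hive2://<hiveserver2-host>:10000/default" -n <user> -e "INSERT INTO sensehat VALUES ('2018-01-24 05:26:21', 29.84, 30.78, 25.26, '192.168.16.3');"
If the statement also fails in beeline, the problem is in Hive/Tez and not in the NiFi flow.)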
Kind regards,
Jan
Labels: Apache Hive, Apache NiFi
01-23-2018
05:51 PM
Hi, I want to connect my Pi with NiFi as described in https://de.hortonworks.com/tutorial/analyze-iot-weather-station-data-via-connected-data-architecture/section/3/#build-nifi-flow-store-minifi-data-to-hdfs-3. I secured NiFi with certificate authentication and generated two truststores and keystores as @Matt Clarke suggested:
"I would execute the 'hostname -f' command on both the server where NiFi is installed and the server where MiNiFi is/will be installed. Then use those full hostnames in the tls-toolkit.sh script to create two keystores for use on these two instances:
./tls-toolkit.sh standalone -n '<nifi.hostname>,<minifi.hostname>' -C 'CN=<nifiuser>, OU=NIFI' -P <truststore password> -S <keystore password>
You will end up with a directory for each server being created that contains the needed TLS configuration info/files. If you want to include SANs in your new certificates, you will need to create each one at a time:
./tls-toolkit.sh standalone -n '<nifi.hostname>' -C 'CN=<nifiuser>, OU=NIFI' -P <truststore password> -S <keystore password> --subjectAlternativeNames '<nifi-ip-address>,<etc>'
./tls-toolkit.sh standalone -n '<minifi.hostname>' -P <truststore password> -S <keystore password> --subjectAlternativeNames '<minifi-ip-address>,<etc>'
You should not need any SAN entry if you generate your keystores using the actual hostnames assigned to your servers. Also make sure that the following properties have been set to the appropriate hostnames as well in the nifi.properties file:
nifi.remote.input.host=
nifi.web.https.host=
Once your NiFi server is using the new keystore and truststore files, you can use the NiFi server hostname in the RPG. Thanks, Matt"
I copied the files into the conf folders of the NiFi and MiNiFi servers. The yml file of MiNiFi looks like:
MiNiFi Config Version: 3
Flow Controller:
name: MiNiFi
comment: ''
Core Properties:
flow controller graceful shutdown period: 10 sec
flow service write delay interval: 500 ms
administrative yield duration: 30 sec
bored yield duration: 10 millis
max concurrent threads: 1
variable registry properties: ''
FlowFile Repository:
partitions: 256
checkpoint interval: 2 mins
always sync: false
Swap:
threshold: 20000
in period: 5 sec
in threads: 1
out period: 5 sec
out threads: 4
Content Repository:
content claim max appendable size: 10 MB
content claim max flow files: 100
always sync: false
Provenance Repository:
provenance rollover time: 1 min
implementation: org.apache.nifi.provenance.MiNiFiPersistentProvenanceRepository
Component Status Repository:
buffer size: 1440
snapshot frequency: 1 min
Security Properties:
keystore: './conf/keystore.jks'
keystore type: 'jks'
keystore password: 'xxxxxx'
key password: 'xxxxxx'
truststore: './conf/truststore.jks'
truststore type: 'jks'
truststore password: 'xxxxx'
ssl protocol: TLS
Sensitive Props:
key:
algorithm: PBEWITHMD5AND256BITAES-CBC-OPENSSL
provider: BC
Processors:
- id: ab316f87-2c97-3fe4-0000-000000000000
name: ExecuteProcess
class: org.apache.nifi.processors.standard.ExecuteProcess
max concurrent tasks: 1
scheduling strategy: TIMER_DRIVEN
scheduling period: 0 sec
penalization period: 30 sec
yield period: 1 sec
run duration nanos: 0
auto-terminated relationships list: []
Properties:
Argument Delimiter: ' '
Batch Duration: 5 sec
Command: python
Command Arguments: /home/pi/Documents/sensehat.py
Redirect Error Stream: 'false'
Controller Services: []
Process Groups: []
Input Ports: []
Output Ports: []
Funnels: []
Connections:
- id: 7aba8512-3f8c-3dd9-0000-000000000000
name: ExecuteProcess/success/1e4831a5-0161-1000-53c4-30f108c20272
source id: ab316f87-2c97-3fe4-0000-000000000000
source relationship names:
- success
destination id: 1e4831a5-0161-1000-53c4-30f108c20272
max work queue size: 10000
max work queue data size: 1 GB
flowfile expiration: 0 sec
queue prioritizer class: ''
Remote Process Groups:
- id: db36c66b-4e7b-316b-0000-000000000000
name: ''
url: https://<nifi hostname>:8011/nifi
comment: ''
timeout: 30 sec
yield period: 10 sec
transport protocol: RAW
proxy host: ''
proxy port: ''
proxy user: ''
proxy password: ''
local network interface: ''
Input Ports:
- id: 1e4831a5-0161-1000-53c4-30f108c20272
name: MiNiFi
comment: ''
max concurrent tasks: 1
use compression: false
Output Ports: []
NiFi Properties Overrides: {}
But I get the following error:
2018-01-23 18:30:46,323 ERROR [Timer-Driven Process Thread-3] o.a.n.c.t.ContinuallyRunConnectableTask RemoteGroupPort[name=MiNiFi,targets=https://gcvhdp01.ad.german-mgmt.de:8011/nifi] failed to process session due to java.lang.RuntimeException: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
2018-01-23 18:30:46,325 ERROR [Timer-Driven Process Thread-3] o.a.n.c.t.ContinuallyRunConnectableTask
java.lang.RuntimeException: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
at org.apache.nifi.controller.AbstractPort.onTrigger(AbstractPort.java:257)
at org.apache.nifi.controller.tasks.ContinuallyRunConnectableTask.call(ContinuallyRunConnectableTask.java:81)
at org.apache.nifi.controller.tasks.ContinuallyRunConnectableTask.call(ContinuallyRunConnectableTask.java:40)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.setupClient(SiteToSiteRestApiClient.java:278)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.getHttpClient(SiteToSiteRestApiClient.java:219)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.execute(SiteToSiteRestApiClient.java:1189)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.execute(SiteToSiteRestApiClient.java:1237)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.fetchController(SiteToSiteRestApiClient.java:419)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.getController(SiteToSiteRestApiClient.java:394)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.getController(SiteToSiteRestApiClient.java:361)
at org.apache.nifi.remote.client.SiteInfoProvider.refreshRemoteInfo(SiteInfoProvider.java:69)
at org.apache.nifi.remote.client.SiteInfoProvider.getActiveClusterUrl(SiteInfoProvider.java:247)
at org.apache.nifi.remote.client.socket.EndpointConnectionPool.getEndpointConnection(EndpointConnectionPool.java:163)
at org.apache.nifi.remote.client.socket.SocketClient.createTransaction(SocketClient.java:127)
at org.apache.nifi.remote.StandardRemoteGroupPort.onTrigger(StandardRemoteGroupPort.java:238)
at org.apache.nifi.controller.AbstractPort.onTrigger(AbstractPort.java:250)
... 10 common frames omitted
Where do I need to change something?
Kind regards,
Jan
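(For reference: the NoSuchMethodError suggests that an older Apache HttpClient jar, from before 4.5 where HttpClientBuilder.setSSLContext does not yet exist, is being picked up ahead of the one the site-to-site client expects. A quick check of the MiNiFi classpath, assuming MiNiFi is installed under /opt/minifi:
# list the HttpClient/HttpCore jars MiNiFi loads; a conflicting older version would show up here
ls /opt/minifi/lib | grep -iE 'httpclient|httpcore'
)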
Labels: Apache MiNiFi
01-23-2018
04:26 PM
Thank you @Matt Clarke! Now it seems to be working, but the RPG throws another exception. Do I need to create a specific user and set the "retrieve data via site-to-site" policy on the Input Port of the NiFi flow?
01-23-2018
02:36 PM
@Matt Clarke, I used the command:
./bin/tls-toolkit.sh standalone -n '<ip address of the nifi server>' -C 'CN=<username>, OU=NIFI' -o './target'
and followed this tutorial (but with only one node instead of two): https://bryanbende.com/development/2016/08/17/apache-nifi-1-0-0-authorization-and-multi-tenancy
Do I need another command/argument to generate the keystore and truststore?
Kind regards,
Jan
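(Note: a certificate generated for an IP address, or for 'localhost', will generally not pass hostname verification when the RPG connects by hostname. A variant of the same command using the server's FQDN, along the lines of the reply quoted in the post above, might look like:
# generate the keystore/truststore for the server's fully qualified hostname instead of its IP
./bin/tls-toolkit.sh standalone -n "$(hostname -f)" -C 'CN=<username>, OU=NIFI' -o './target'
)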
01-23-2018
02:15 PM
Thank you for your answer. I would like to follow this tutorial: https://de.hortonworks.com/tutorial/analyze-iot-weather-station-data-via-connected-data-architecture/section/3/#build-nifi-flow-store-minifi-data-to-hdfs-3, but I am struggling with step 3.1. So yes, I want to "create a dataflow to convert it into a yml file which will include a RPG that is pointing back to your main NiFi instance". That means the main NiFi instance is talking to itself, and the error occurs during that configuration. But as long as I cannot configure the RPG, I cannot connect the "ExecuteProcess" processor to the RPG. That is why I need to configure the RPG to talk to the main NiFi instance.
Kind regards,
Jan
01-23-2018
01:33 PM
Hi @Matt Clarke, thank you for your answer! Actually I am trying to build an RPG on the server itself, so it tries to connect to itself (I want to configure the dataflow for MiNiFi). I verified point 1. The keystore has only one entry. How do I configure the SAN? The owner CN is configured as 'localhost'. That might be the problem.
Kind regards,
Jan
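(For reference: the CN and any SAN entries of the generated certificate can be inspected with keytool. A quick check, assuming the keystore sits at ./conf/keystore.jks and the keystore password is at hand:
# print the certificate owner (CN) and any DNS/IP subject alternative names
keytool -list -v -keystore ./conf/keystore.jks -storepass <keystore-password> | grep -E 'Owner:|DNSName:|IPAddress:'
If the Owner line shows CN=localhost, the certificate will not match the hostname used in the RPG URL.)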
01-23-2018
12:13 PM
Hi, I want to configure a Remote Process Group in my NiFi flow, but I get an SSLPeerUnverifiedException. Do I have to configure a certificate for the RPG? How can I import the certificate for the RPG? Thank you in advance!
Kind regards,
Jan
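(One way to see which certificate the NiFi server actually presents on its HTTPS port, assuming openssl is available and <nifi-hostname>/<https-port> are the values from nifi.properties:
# print the subject and issuer of the server certificate presented by NiFi
openssl s_client -connect <nifi-hostname>:<https-port> </dev/null 2>/dev/null | openssl x509 -noout -subject -issuer
The subject CN, or a SAN entry if present, has to match the hostname used in the RPG URL, otherwise the peer cannot be verified.)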
Labels: Apache NiFi