Member since: 07-03-2017
Posts: 48
Kudos Received: 3
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 296 | 07-17-2017 11:34 PM
11-29-2018
03:29 PM
Hello All, I am moving data from one Oracle database to another. My flow is: ExecuteSQL -> SplitAvro -> ConvertRecord (to generate SQL statements) -> ConvertCharacterSet (UTF-8 to ISO-8859-1) -> PutSQL. The source Oracle DB has character set AL32UTF8 and the target Oracle DB has character set WE8ISO8859P1, but I see lots of ?? in the target database. Any clues on what I am doing wrong here? P.S.: My flow runs slowly because I am not using prepared statements; I will work on that once this character set issue is resolved. Thanks in advance!!
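For what it's worth, the ?? symptom is consistent with lossy character-set conversion: ISO-8859-1 has no code point for many characters that AL32UTF8 can store, so they get replaced. A minimal Python sketch of how that replacement happens (illustrative only, not the ConvertCharacterSet processor; the sample string is made up):

```python
# Converting UTF-8 text to ISO-8859-1 replaces any character outside
# Latin-1 with '?', which is one common source of ?? in the target DB.
source_text = "naïve – €10"   # hypothetical sample; '–' and '€' are not in ISO-8859-1

utf8_bytes = source_text.encode("utf-8")            # as stored in AL32UTF8
decoded = utf8_bytes.decode("utf-8")                # back to text, lossless
latin1_bytes = decoded.encode("iso-8859-1", errors="replace")

print(latin1_bytes)  # characters with no Latin-1 mapping become b'?'
```

Checking whether the source rows contain characters outside Latin-1 (smart quotes, en/em dashes, the euro sign) would be a reasonable first diagnostic step.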
03-20-2018
04:52 PM
Hello Rahul, thanks for the prompt response. Process A and Process B run like this: ExecuteSQL -> SplitAvro -> ConvertRecord -> PutSQL, so I end up with one flow file per record. Should I use PutDatabaseRecord instead of PutSQL here? Will PutDatabaseRecord give me one flow file as output upon successful completion of all INSERTs? Thanks!!
03-20-2018
04:14 PM
Hello All, my process flow is like this:
Process A: Read data from system A (ExecuteSQL) and load it into staging table A (PutSQL)
Process B: Read data from system B (ExecuteSQL) and load it into staging table B (PutSQL)
Process C: Join data between tables A and B (ExecuteSQL) and generate a file
I want Process C to run only after Process A and Process B are done. How can I achieve this in NiFi? I looked into the Wait/Notify and MergeContent processors and nothing seems to be working. Any ideas?
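For context, the Wait/Notify pattern that usually solves this reduces to a counter: the Wait side releases Process C only after Notify has fired once for Process A and once for Process B under a shared signal identifier. A rough Python sketch of that logic (illustrative only, not NiFi configuration; all names are made up):

```python
# Sketch of the Wait/Notify idea: Process C is released only after both
# upstream processes have signalled completion under a shared identifier,
# much like NiFi's Notify counter and Wait target-count.
class Barrier:
    def __init__(self, required):
        self.required = required
        self.signals = set()

    def notify(self, process_name):
        """Record that one upstream process finished (the Notify side)."""
        self.signals.add(process_name)

    def ready(self):
        """Wait-side check: release only once all signals have arrived."""
        return len(self.signals) >= self.required

barrier = Barrier(required=2)
barrier.notify("process_A")
assert not barrier.ready()      # B has not finished yet
barrier.notify("process_B")
assert barrier.ready()          # now Process C may run
```

In NiFi terms, the "shared identifier" corresponds to the Release Signal Identifier and the required count to Wait's Target Signal Count.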
11-21-2017
06:24 PM
Thanks this solution worked.
11-17-2017
05:01 PM
Hello All, I am getting this error in the PublishKafka_0_10 processor for a message of size 2.08 MB. I have updated Max Request Size to 10 MB in the processor properties and max.request.size to 10 MB in Kafka's server.properties. After rebooting the Kafka broker I can see max.request.size = 10 MB in the Kafka logs, but I am still getting the error below. What am I missing here?

2017-11-17 11:07:47,966 ERROR [Timer-Driven Process Thread-4] o.a.n.p.kafka.pubsub.PublishKafka_0_10 PublishKafka_0_10[id=e6d932d9-97ae-1647-aa8f-86d07791ce25] Failed to send all message for StandardFlowFileRecord[uuid=fa2399e5-bea5-4113-b58b-6cdef228733c,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1510934860019-132, container=default, section=132], offset=0, length=2160613],offset=0,name=12337127439954063,size=2160613] to Kafka; routing to failure due to org.apache.nifi.stream.io.exception.TokenTooLargeException: A message in the stream exceeds the maximum allowed message size of 1048576 bytes.: {}
org.apache.nifi.stream.io.exception.TokenTooLargeException: A message in the stream exceeds the maximum allowed message size of 1048576 bytes.
    at org.apache.nifi.stream.io.util.AbstractDemarcator.extractDataToken(AbstractDemarcator.java:157)
    at org.apache.nifi.stream.io.util.StreamDemarcator.nextToken(StreamDemarcator.java:129)
    at org.apache.nifi.processors.kafka.pubsub.PublisherLease.publish(PublisherLease.java:78)
    at org.apache.nifi.processors.kafka.pubsub.PublishKafka_0_10$1.process(PublishKafka_0_10.java:334)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2136)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2106)
    at org.apache.nifi.processors.kafka.pubsub.PublishKafka_0_10.onTrigger(PublishKafka_0_10.java:330)
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)

Thanks and Regards
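As a hedged aside on where each size limit lives: the TokenTooLargeException above is raised on the NiFi side by the stream demarcator using the processor's Max Request Size, so a limit that still reads 1048576 bytes suggests the running processor instance has not actually picked up the 10 MB setting. On the broker side, the relevant property is message.max.bytes; max.request.size is a producer-side setting and is not read from server.properties. A sketch of the distinction (all values are examples):

```properties
# NiFi side (PublishKafka_0_10 processor property, not a Kafka file):
#   Max Request Size = 10 MB   <- the source of the 1048576-byte
#                                 TokenTooLargeException if left at default

# Broker side (server.properties) -- note the property name:
message.max.bytes=10485760
replica.fetch.max.bytes=10485760

# Per-topic override, if preferred (command syntax for Kafka 0.10-era tools;
# verify against your distribution):
#   kafka-topics --alter --topic mytopic --config max.message.bytes=10485760
```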
10-18-2017
04:22 PM
Thanks Matt.
10-12-2017
06:20 PM
Hello All, can someone please help me understand what I am doing wrong here? I am passing the following values to the PutSQL processor:
sql.args.1.type = 2005
sql.args.1.value = V99|1.0|
E99|BILLING|1234567|11049532|350|201706|201709|
R03|350|11049532|4| |KK|174|A|07/27/1948| |322| |0|F| | |201712|201709|0|168900.56|0|0|0|0|40134|201712|0|0|316| |200402|0| | | | | | | | | |POPR_CUPDT| | | | | | | | | | | |0|0|2|200402|0| | | |
P03|0|1|0|0|199404|317|201711|08/17/2017| |0|201712|08/22/2017|322|0|
K06|201706|111.53|111.53|14.8|96.73|111.53|15||0|0|
K06|201707|117.48|117.48|15.85|101.63|117.48|15||0|0|
K06|201708|227.98|227.98|44.34|183.64|227.98|30||0|0|
The SQL query is: select obds_loader.getmain(?) from dual

The PutSQL processor is throwing the error below.

2017-10-12 13:56:39,129 ERROR [Timer-Driven Process Thread-1] o.apache.nifi.processors.standard.PutSQL PutSQL[id=0beb1a31-980d-38f9-fc0e-6c0ac7a68984] Failed to update database due to a failed batch update, java.sql.BatchUpdateException: invalid batch command. There were a total of 1 FlowFiles that failed, 0 that succeeded, and 0 that were not execute and will be routed to retry; : java.sql.BatchUpdateException: invalid batch command
java.sql.BatchUpdateException: invalid batch command
    at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10358)
    at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
    at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:297)
    at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:297)
    at org.apache.nifi.processors.standard.PutSQL.lambda$null$10(PutSQL.java:348)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
    at org.apache.nifi.processors.standard.PutSQL.lambda$new$12(PutSQL.java:346)
    at org.apache.nifi.processor.util.pattern.PutGroup.putFlowFiles(PutGroup.java:94)
    at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:101)
    at org.apache.nifi.processors.standard.PutSQL.lambda$onTrigger$20(PutSQL.java:554)
    at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
    at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
    at org.apache.nifi.processors.standard.PutSQL.onTrigger(PutSQL.java:554)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)

Thanks and Regards
09-16-2017
12:00 AM
I am able to connect to the Ambari DB:

sudo su postgres
bash-4.2$ psql -U ambari -d ambari
could not change directory to "/home/kfkadm": Permission denied
Password for user ambari:
psql (9.6.5)
Type "help" for help.
ambari=>

The ambari.properties entries are mostly the same except for the ones noted below:

server.jdbc.connection-pool=internal   (this one is extra in my setup)
server.jdbc.database=postgres
server.jdbc.database_name=ambari
server.jdbc.driver=org.postgresql.Driver
server.jdbc.hostname=localhost   (this one points to localhost instead of the server name)
server.jdbc.port=5432
server.jdbc.postgres.schema=ambarischema
server.jdbc.rca.driver=org.postgresql.Driver
server.jdbc.rca.url=jdbc:postgresql://*******:5432/ambari
server.jdbc.rca.user.name=ambari
server.jdbc.rca.user.passwd=/etc/ambari-server/conf/password.dat
server.jdbc.url=jdbc:postgresql://**********:5432/ambari
server.jdbc.user.name=ambari
server.jdbc.user.passwd=/etc/ambari-server/conf/password.dat

I can see the correct password at /etc/ambari-server/conf/password.dat.
09-15-2017
11:38 AM
@Sridhar Reddy Thanks for replying. This is what I see:

[kfkadm@*********** ~]$ service postgresql status
Redirecting to /bin/systemctl status postgresql.service
● postgresql.service
   Loaded: not-found (Reason: No such file or directory)
   Active: inactive (dead)

[kfkadm@********** ~]$ sudo systemctl status postgresql-9.6
[sudo] password for kfkadm:
● postgresql-9.6.service - PostgreSQL 9.6 database server
   Loaded: loaded (/usr/lib/systemd/system/postgresql-9.6.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2017-09-14 15:46:05 EDT; 15h ago
 Main PID: 7428 (postmaster)
   CGroup: /system.slice/postgresql-9.6.service
           ├─7428 /usr/pgsql-9.6/bin/postmaster -D /var/lib/pgsql/9.6/data/
           ├─7429 postgres: logger process
           ├─7431 postgres: checkpointer process
           ├─7432 postgres: writer process
           ├─7433 postgres: wal writer process
           ├─7434 postgres: autovacuum launcher process
           └─7435 postgres: stats collector process

Sep 14 15:46:05 ******* systemd[1]: Starting PostgreSQL 9.6 database server...
Sep 14 15:46:05 ********** postmaster[7428]: < 2017-09-14 15:46:05.333 EDT > LOG: redirecting log output to loggi...ocess
Sep 14 15:46:05 ************* postmaster[7428]: < 2017-09-14 15:46:05.333 EDT > HINT: Future log output will appear ...log".
Sep 14 15:46:05 ************* systemd[1]: Started PostgreSQL 9.6 database server.
Hint: Some lines were ellipsized, use -l to show in full.

[kfkadm@********** ~]$ netstat -ntpl | grep 5432
(No info could be read for "-p": geteuid()=30009 but you should be root.)
tcp        0      0 127.0.0.1:5432    0.0.0.0:*    LISTEN    -
tcp6       0      0 ::1:5432          :::*         LISTEN    -
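One hedged observation from the netstat output above: PostgreSQL is listening only on 127.0.0.1/::1, so a JDBC URL that uses the server's real hostname will get "Connection refused". If that matches your ambari.properties, the usual fix is along these lines (paths are for a PGDG 9.6 install and the CIDR is an example; tighten both for your environment):

```properties
# /var/lib/pgsql/9.6/data/postgresql.conf -- listen beyond loopback:
listen_addresses = '*'            # or a specific interface address

# /var/lib/pgsql/9.6/data/pg_hba.conf -- allow the ambari user over TCP:
#   host  ambari  ambari  0.0.0.0/0  md5

# then restart the service:
#   sudo systemctl restart postgresql-9.6
```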
09-15-2017
11:32 AM
@Geoffrey Shelton Okot Thanks for replying. Yes, I chose option 4 and then provided the values I had configured for the standalone PostgreSQL DB.
09-15-2017
01:38 AM
Hello All, I just finished setting up Ambari 2.5.1. I am using standalone PostgreSQL 9.6. Setup finished successfully but I am getting an error while starting the Ambari server. This is what I see on the Linux command line:

Ambari Server 'setup' completed successfully.
[kfkadm@******didkfkw ~]$ ambari-server start
Using python /usr/bin/python
Starting ambari-server
Organizing resource files at /var/lib/ambari-server/resources...
Unable to check firewall status when starting without root privileges.
Please do not forget to disable or adjust firewall if needed
Ambari database consistency check started...
Server PID at: /var/run/ambari-server/ambari-server.pid
Server out at: /var/log/ambari-server/ambari-server.out
Server log at: /var/log/ambari-server/ambari-server.log
/usr/bin/sh: line 0: ulimit: open files: cannot modify limit: Operation not permitted
Waiting for server start.........Unable to determine server PID. Retrying...
......Unable to determine server PID. Retrying...
......Unable to determine server PID. Retrying...
ERROR: Exiting with exit code -1.
REASON: Ambari Server java process died with exitcode 1. Check /var/log/ambari-server/ambari-server.out for more information.
This is what I see in the ambari-server.out file:

Error injecting constructor, java.lang.RuntimeException: Error while creating database accessor
  at org.apache.ambari.server.orm.DBAccessorImpl.<init>(DBAccessorImpl.java:85)
  at org.apache.ambari.server.orm.DBAccessorImpl.class(DBAccessorImpl.java:73)
  while locating org.apache.ambari.server.orm.DBAccessorImpl
  while locating org.apache.ambari.server.orm.DBAccessor
    for field at org.apache.ambari.server.orm.dao.DaoUtils.dbAccessor(DaoUtils.java:36)
  at org.apache.ambari.server.orm.dao.DaoUtils.class(DaoUtils.java:36)
  while locating org.apache.ambari.server.orm.dao.DaoUtils
    for field at org.apache.ambari.server.orm.dao.UserDAO.daoUtils(UserDAO.java:45)
  at org.apache.ambari.server.orm.dao.UserDAO.class(UserDAO.java:45)
  while locating org.apache.ambari.server.orm.dao.UserDAO
    for field at org.apache.ambari.server.controller.internal.ActiveWidgetLayoutResourceProvider.userDAO(ActiveWidgetLayoutResourceProvider.java:61)
Caused by: java.lang.RuntimeException: Error while creating database accessor
  at org.apache.ambari.server.orm.DBAccessorImpl.<init>(DBAccessorImpl.java:118)
  at org.apache.ambari.server.orm.DBAccessorImpl$$FastClassByGuice$$86dbc63e.newInstance(<generated>)
  at com.google.inject.internal.cglib.reflect.$FastConstructor.newInstance(FastConstructor.java:40)
  at com.google.inject.internal.DefaultConstructionProxyFactory$1.newInstance(DefaultConstructionProxyFactory.java:60)
  at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
  at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
  at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
  at com.google.inject.Scopes$1$1.get(Scopes.java:65)
  at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
  at com.google.inject.internal.FactoryProxy.get(FactoryProxy.java:54)
  at com.google.inject.internal.SingleFieldInjector.inject(SingleFieldInjector.java:53)
  at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:110)
  at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:94)
  at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
  at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
  at com.google.inject.Scopes$1$1.get(Scopes.java:65)
  at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
  at com.google.inject.internal.SingleFieldInjector.inject(SingleFieldInjector.java:53)
  at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:110)
  at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:94)
  at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
  at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
  at com.google.inject.Scopes$1$1.get(Scopes.java:65)
  at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
  at com.google.inject.internal.SingleFieldInjector.inject(SingleFieldInjector.java:53)
  at com.google.inject.internal.InjectionRequestProcessor$StaticInjection$1.call(InjectionRequestProcessor.java:116)
  at com.google.inject.internal.InjectionRequestProcessor$StaticInjection$1.call(InjectionRequestProcessor.java:110)
  at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1024)
  at com.google.inject.internal.InjectionRequestProcessor$StaticInjection.injectMembers(InjectionRequestProcessor.java:110)
  at com.google.inject.internal.InjectionRequestProcessor.injectMembers(InjectionRequestProcessor.java:78)
  at com.google.inject.internal.InternalInjectorCreator.injectDynamically(InternalInjectorCreator.java:170)
  at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:109)
  at com.google.inject.Guice.createInjector(Guice.java:95)
  at com.google.inject.Guice.createInjector(Guice.java:72)
  at com.google.inject.Guice.createInjector(Guice.java:62)
  at org.apache.ambari.server.controller.AmbariServer.main(AmbariServer.java:992)
Caused by: org.postgresql.util.PSQLException: Connection refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
  at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:207)
  at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:64)
  at org.postgresql.jdbc2.AbstractJdbc2Connection.<init>(AbstractJdbc2Connection.java:138)
  at org.postgresql.jdbc3.AbstractJdbc3Connection.<init>(AbstractJdbc3Connection.java:29)
  at org.postgresql.jdbc3g.AbstractJdbc3gConnection.<init>(AbstractJdbc3gConnection.java:21)
  at org.postgresql.jdbc4.AbstractJdbc4Connection.<init>(AbstractJdbc4Connection.java:31)
  at org.postgresql.jdbc4.Jdbc4Connection.<init>(Jdbc4Connection.java:24)
  at org.postgresql.Driver.makeConnection(Driver.java:410)
  at org.postgresql.Driver.connect(Driver.java:280)
  at java.sql.DriverManager.getConnection(DriverManager.java:664)
  at java.sql.DriverManager.getConnection(DriverManager.java:247)
  at org.apache.ambari.server.orm.DBAccessorImpl.<init>(DBAccessorImpl.java:91)
  ... 41 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
  at java.net.PlainSocketImpl.socketConnect(Native Method)
  at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
  at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
  at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
  at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
  at java.net.Socket.connect(Socket.java:589)
  at java.net.Socket.connect(Socket.java:538)
  at org.postgresql.core.PGStream.<init>(PGStream.java:60)
  at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:101)
  ... 52 more

How do I resolve this error?
09-15-2017
01:33 AM
Thank you all. All three answers were correct and pointed me in the right direction.
09-14-2017
06:35 PM
Hello All, I am getting the error below while installing Ambari. How do I resolve this?

Completing setup...
Configuring database...
Enter advanced database configuration [y/n] (n)? n
Configuring database...
Default properties detected. Using built-in database.
Configuring ambari database...
Checking PostgreSQL...
Running initdb: This may take up to a minute.
About to start PostgreSQL
[sudo] password for kfkadm:
ERROR: Exiting with exit code 3.
REASON: Unable to start PostgreSQL server. Exiting

I had installed PostgreSQL by running:

sudo yum install https://download.postgresql.org/pub/repos/yum/9.6/redhat/rhel-7-x86_64/pgdg-redhat96-9.6-3.noarch.rpm

and then ran the following command to set up Ambari:

sudo yum install ambari-server

Thanks!!
09-05-2017
11:06 PM
Hello All, Logged https://issues.apache.org/jira/browse/NIFI-4352 for this issue. Thanks!!
09-05-2017
08:33 PM
Thanks. My issue is resolved now.
09-05-2017
04:14 PM
I am also exploring ways to deliver data in order via NiFi. Please note that even on a single node your data can get out of order if NiFi penalizes a flow file or if you implement custom "retry" logic.
09-04-2017
07:31 PM
Hello All, I dragged an Output Port onto the canvas inside a Process Group and I am getting an error: "Port X is invalid because Output Connection for port X is not defined." What configuration am I missing? Thanks!!
09-04-2017
02:21 PM
Hello All, I am passing sql.args.1.type = 2005 and sql.args.1.value as a CLOB value to PutSQL and it is throwing the error below. How do I resolve it?

2017-09-04 10:17:24,924 INFO [StandardProcessScheduler Thread-5] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled PutSQL[id=69534f73-821e-1782-ffff-fffffabb94fb] to run with 1 threads
2017-09-04 10:17:25,009 ERROR [Timer-Driven Process Thread-4] o.apache.nifi.processors.standard.PutSQL PutSQL[id=69534f73-821e-1782-ffff-fffffabb94fb] org.apache.nifi.processors.standard.PutSQL$$Lambda$653/696434096@771333e failed to process due to org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=14895cdd-3eac-4f2a-adae-f014526e24b1,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1502123289026-125, container=default, section=125], offset=71441, length=39],offset=0,name=5936554321553480,size=39] due to java.lang.ClassCastException: java.lang.String cannot be cast to oracle.sql.CLOB; rolling back session: {}
org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=14895cdd-3eac-4f2a-adae-f014526e24b1,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1502123289026-125, container=default, section=125], offset=71441, length=39],offset=0,name=5936554321553480,size=39] due to java.lang.ClassCastException: java.lang.String cannot be cast to oracle.sql.CLOB
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnGroupError$14(ExceptionHandler.java:226)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnError$13(ExceptionHandler.java:179)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$11(ExceptionHandler.java:54)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$11(ExceptionHandler.java:54)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
    at org.apache.nifi.processors.standard.PutSQL.lambda$new$72(PutSQL.java:283)
    at org.apache.nifi.processors.standard.PutSQL.lambda$new$75(PutSQL.java:324)
    at org.apache.nifi.processor.util.pattern.PutGroup.putFlowFiles(PutGroup.java:91)
    at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:101)
    at org.apache.nifi.processors.standard.PutSQL.lambda$onTrigger$86(PutSQL.java:544)
    at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
    at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
    at org.apache.nifi.processors.standard.PutSQL.onTrigger(PutSQL.java:544)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to oracle.sql.CLOB
    at oracle.jdbc.driver.OraclePreparedStatement.setObjectCritical(OraclePreparedStatement.java:8874)
    at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:8396)
    at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:8980)
    at oracle.jdbc.driver.OraclePreparedStatementWrapper.setObject(OraclePreparedStatementWrapper.java:242)
    at org.apache.commons.dbcp.DelegatingPreparedStatement.setObject(DelegatingPreparedStatement.java:166)
    at org.apache.commons.dbcp.DelegatingPreparedStatement.setObject(DelegatingPreparedStatement.java:166)
    at org.apache.nifi.processors.standard.PutSQL.setParameter(PutSQL.java:888)
    at org.apache.nifi.processors.standard.PutSQL.setParameters(PutSQL.java:677)
    at org.apache.nifi.processors.standard.PutSQL.lambda$null$71(PutSQL.java:285)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
    ... 19 common frames omitted
09-04-2017
01:34 PM
Hello All, can I pass a CLOB as sql.args.N.value? Are there any size limitations for attributes? Thanks!!
09-04-2017
04:46 AM
Hello All, I want to run the query below in PutSQL. This query calls an Oracle function and accepts a CLOB value as a parameter: select obds_loader.getmain(:clob) from dual Does PutSQL support SELECT queries? Can I pass a CLOB as an attribute in NiFi? If yes, what would be the value of sql.args.n.type for a CLOB? Thanks in advance
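For reference, sql.args.N.type takes the numeric constant defined in java.sql.Types, and 2005 is indeed CLOB. A small Python lookup table of the commonly used constants (values copied from the java.sql.Types class):

```python
# Numeric java.sql.Types constants that PutSQL expects in sql.args.N.type.
JDBC_TYPES = {
    "CHAR": 1,
    "INTEGER": 4,
    "VARCHAR": 12,
    "TIMESTAMP": 93,
    "BLOB": 2004,
    "CLOB": 2005,
}

print(JDBC_TYPES["CLOB"])  # 2005 -> the sql.args.n.type value for a CLOB
```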
08-28-2017
03:59 PM
Hello, the Data Provenance UI allows replay of data. Is there any way to fix "bad data" in NiFi before replay? For example, say a data record is missing a value in a mandatory field. Instead of fixing it in the source and re-sending the data, is it possible to modify the data by adding a value to the mandatory field in NiFi and then replaying it? Thanks!!
08-28-2017
03:54 PM
Hello, we build lots of dataflows moving data in and out of Confluent Kafka using NiFi. Can Atlas be used to track end-to-end lineage in such flows (NiFi --> Confluent Kafka --> NiFi)? Thanks!!
08-16-2017
10:51 PM
Sales team confirmed that Kafka Streams, Connect and MirrorMaker are supported in HDF.
08-10-2017
04:41 PM
Hello All, how can I pivot record-based data from columns to rows? Here is an example of what I want to achieve:
Input:
A,B,C
M,N,O
Output:
A,B
A,C
M,N
M,O
Thanks!!
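The transformation itself can be expressed compactly: pair each row's first column with every remaining column. A plain Python sketch of that logic (independent of which NiFi processor ends up hosting it):

```python
def pivot_pairs(rows):
    """Pair each row's first column with every remaining column."""
    out = []
    for row in rows:
        key, rest = row[0], row[1:]
        for value in rest:
            out.append([key, value])
    return out

rows = [line.split(",") for line in ["A,B,C", "M,N,O"]]
result = pivot_pairs(rows)
print(result)  # [['A', 'B'], ['A', 'C'], ['M', 'N'], ['M', 'O']]
```

In a NiFi flow this could live, for example, in an ExecuteScript processor; the function above is just the core pivot logic.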
08-07-2017
06:34 PM
Thanks Bryan !!
08-07-2017
01:55 PM
Hello All, how can I add a new NiFi consumer to a Kafka topic and consume data from the beginning? I do not see any setting in the ConsumeKafkaRecord_0_10 processor to accomplish this. I am looking for functionality similar to the --from-beginning parameter of the Kafka console consumer: "./bin/kafka-console-consumer --zookeeper zk1:2181,zk2:2181,zk3:2181/kafka --topic LCL_LANG_DSTRBTN1 --from-beginning" Thanks!!
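For context, Kafka decides the starting position per consumer group: a group with no committed offsets starts wherever auto.offset.reset points. To the best of my knowledge this maps onto two processor properties, so a sketch of the usual setup would be (property names per the 0.10-era processors; the group id is an example, verify against your NiFi version):

```properties
# ConsumeKafkaRecord_0_10 processor properties (sketch):
Group ID     = my-new-consumer-group   # a group with no committed offsets yet
Offset Reset = earliest                # rough equivalent of --from-beginning
```

Note this only takes effect for a group without committed offsets; an existing group resumes from its last commit regardless of the reset setting.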
08-06-2017
12:51 AM
I was able to accomplish this by using SplitAvro --> QueryRecord --> PutSQL
08-05-2017
08:41 PM
Hello All, I have an Avro message coming out of the SplitAvro processor. This message contains a field whose value is VAL. I want to run an UPDATE SQL statement against a database table as follows: Update table X set Status='Processed' where ID = VAL; How can I achieve this in NiFi? Thanks!!
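One hedged sketch of how this usually maps onto PutSQL: extract the field's value into a flow-file attribute (e.g. via a record-aware processor in the real flow), then supply a parameterized statement plus the sql.args.N attributes PutSQL expects. In Python terms (table_x and the record field are placeholders; this only illustrates the attribute convention, not a NiFi script):

```python
# Given the value extracted from the Avro record, build the flowfile
# content (the parameterized SQL) and the attributes PutSQL reads.
record = {"ID": 12345}          # hypothetical record field

sql = "UPDATE table_x SET status = 'Processed' WHERE id = ?"
attributes = {
    "sql.args.1.type": "4",                 # java.sql.Types.INTEGER
    "sql.args.1.value": str(record["ID"]),
}

print(sql, attributes)
```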