Member since: 07-19-2016
Posts: 31
Kudos Received: 5
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1563 | 07-21-2016 08:02 PM
05-08-2018
09:06 PM
I have a MiNiFi process that does site-to-site with a NiFi 1.6 instance. MiNiFi 0.1.0 works just fine connecting to the upgraded NiFi 1.6. Now I'm upgrading MiNiFi, and I'm unable to get any data to flow from MiNiFi 0.4.0 to NiFi 1.6. I see no errors in the NiFi log, and the only error I see in the MiNiFi logs is the following:
2018-05-08 14:52:12,535 WARN [Timer-Driven Process Thread-2] o.a.n.c.t.ContinuallyRunConnectableTask RemoteGroupPort[name=svc_perf_team_port01,targets=https://abe-hadoop-nifi01.local:8181/nifi] Administratively Pausing for 10 seconds due to processing failure: java.lang.RuntimeException: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
java.lang.RuntimeException: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
at org.apache.nifi.controller.AbstractPort.onTrigger(AbstractPort.java:257)
at org.apache.nifi.controller.tasks.ContinuallyRunConnectableTask.call(ContinuallyRunConnectableTask.java:81)
at org.apache.nifi.controller.tasks.ContinuallyRunConnectableTask.call(ContinuallyRunConnectableTask.java:40)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.setupClient(SiteToSiteRestApiClient.java:278)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.getHttpClient(SiteToSiteRestApiClient.java:219)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.execute(SiteToSiteRestApiClient.java:1189)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.execute(SiteToSiteRestApiClient.java:1237)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.fetchController(SiteToSiteRestApiClient.java:419)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.getController(SiteToSiteRestApiClient.java:394)
at org.apache.nifi.remote.util.SiteToSiteRestApiClient.getController(SiteToSiteRestApiClient.java:361)
at org.apache.nifi.remote.client.SiteInfoProvider.refreshRemoteInfo(SiteInfoProvider.java:69)
at org.apache.nifi.remote.client.SiteInfoProvider.getSiteToSiteHttpPort(SiteInfoProvider.java:160)
at org.apache.nifi.remote.client.http.HttpClient.getBootstrapPeerDescription(HttpClient.java:90)
at org.apache.nifi.remote.client.PeerSelector.fetchRemotePeerStatuses(PeerSelector.java:379)
at org.apache.nifi.remote.client.PeerSelector.refreshPeers(PeerSelector.java:352)
at org.apache.nifi.remote.client.PeerSelector.createPeerStatusList(PeerSelector.java:316)
at org.apache.nifi.remote.client.PeerSelector.getNextPeerStatus(PeerSelector.java:275)
at org.apache.nifi.remote.client.http.HttpClient.createTransaction(HttpClient.java:129)
at org.apache.nifi.remote.StandardRemoteGroupPort.onTrigger(StandardRemoteGroupPort.java:238)
at org.apache.nifi.controller.AbstractPort.onTrigger(AbstractPort.java:250)
... 10 common frames omitted
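This NoSuchMethodError usually means an older Apache HttpClient jar (setSSLContext only appeared in httpclient 4.4) is being picked up ahead of the expected one on the MiNiFi classpath. A minimal diagnostic sketch in Python, assuming a local install whose lib path is illustrative, that lists every bundled jar containing the conflicting class:

```python
import zipfile
from pathlib import Path

# Hypothetical diagnostic: list every jar under the MiNiFi lib directory
# (the path is an assumption) that bundles HttpClientBuilder, to spot
# duplicate or pre-4.4 httpclient versions on the classpath.
TARGET = "org/apache/http/impl/client/HttpClientBuilder.class"

for jar in Path("minifi-0.4.0/lib").rglob("*.jar"):
    try:
        with zipfile.ZipFile(jar) as zf:
            if TARGET in zf.namelist():
                print(jar)
    except zipfile.BadZipFile:
        pass  # skip files that are not valid jars
```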
Labels: Apache MiNiFi, Apache NiFi
04-18-2018
02:46 PM
Yes, we managed to get this to work; however, the solution did not use NiFi. We used an Oracle view with an INSTEAD OF trigger.
04-10-2018
06:49 PM
Sorry, I did not get this to work in XQuery. I ended up using XSLT to do the transformation.
11-07-2017
04:59 PM
So PutDatabaseRecord looks very promising. I had to make some small changes to the input file: first I placed a [ at the beginning of the file and a ] at the end, then I appended a comma ',' to every record but the last, using the command sed '$!s/$/,/'. Is there a NiFi processor to make this happen? So the file was:
{...}
{...}
{...}
{...}
The new file is:
[{...},
{...},
{...},
{...}]
If I route this new file (a JSON array) into both ConvertJSONToSQL/PutSQL and PutDatabaseRecord: ConvertJSONToSQL outputs a new flowfile for each row (much like SplitRecord), while PutDatabaseRecord ingests the file but is currently creating an empty record for each row. I haven't figured out why that is yet.
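For reference, the same newline-delimited-JSON-to-array rewrite the sed command performs can be sketched in a few lines of Python (file names are illustrative assumptions):

```python
import json

# Minimal sketch: turn a file of newline-delimited JSON records into a
# single JSON array, equivalent to wrapping the file in [ ... ] plus the
# sed '$!s/$/,/' comma insertion. File names are illustrative.
with open("records.ndjson") as src:
    records = [json.loads(line) for line in src if line.strip()]

with open("records_array.json", "w") as dst:
    json.dump(records, dst)
```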
11-03-2017
08:12 PM
Hm, learning something new here; I've never used the ScriptedRecordSetWriter and need to read up on that. Below are three sample records; there is a LF at the end of each record. Ideally the NetworkPolicy array goes into a single column as-is.
{ "ServiceInstanceReferenceId": "9143992365913643215", "ServiceInstanceStatus": "active", "AccessNetworkTechnologyType": "vdsl2", "NetworkPolicy": [ "conn-conf-set:ttv|hsia", "dn-speed-kbps:150000", "up-speed-kbps:30000" ], "createTimeStamp": "2016-03-03 19:35:06-05", "modifiersName": "cn=directory manager", "modifyTimeStamp": "2016-05-13 06:10:45-04" }
{ "ServiceInstanceReferenceId": "9143992365913643271", "ServiceInstanceStatus": "active", "AccessNetworkTechnologyType": "vdsl2", "NetworkPolicy": [ "conn-conf-set:ttv|hsia", "dn-speed-kbps:150000", "up-speed-kbps:30000" ], "createTimeStamp": "2016-03-03 19:37:21-05", "modifiersName": "cn=directory manager", "modifyTimeStamp": "2016-05-13 06:10:45-04" }
{ "ServiceInstanceReferenceId": "7030000000000000001", "ServiceInstanceStatus": "active", "AccessNetworkTechnologyType": "gpon", "NetworkPolicy": [ "up-speed-kbps:50000", "dn-speed-kbps:50000", "redir-policy:ssp", "conn-conf-set:hsia" ], "createTimeStamp": "2015-12-04 16:20:16-05", "modifiersName": "cn=directory manager", "modifyTimeStamp": "2016-05-13 05:29:20-04" }
11-02-2017
08:20 PM
I have a large file that contains JSON records, and I currently use SplitContent to break every record into its own flowfile before sending it into a ConvertJSONToSQL processor. This causes NiFi large data-provenance overhead. Is there a way I can bypass SplitContent and instead convert the file into an array of JSON elements? I've attempted using the new ConvertRecord processor with JsonTreeReader and JsonRecordSetWriter; however, it only outputs a single record containing nulls for each element.
Labels: Apache NiFi
10-06-2017
06:33 PM
I'm hoping someone can point me in the right direction. I have a SimpleKeyValueLookupService configured with some sample values I'm attempting to look up. The way I understand it so far (from all the searches I've done) is that in LookupAttribute I create a dynamic property of 'key' and set its value to, in my case, ${column_name}. So my question is: where does the result get returned to?
Labels: Apache NiFi
06-06-2017
06:12 PM
I have a JSON flowfile and I need to determine whether I should be doing an INSERT or an UPDATE. The trick is to only update the columns that match the JSON attributes. I have an ExecuteSQL working and it returns executesql.row.count; however, I've lost the original JSON flowfile, which I was planning to use in a RouteOnAttribute. I'm trying to get MergeContent to join the ExecuteSQL output (dumping the Avro content, since I only need the executesql.row.count attribute) with the JSON flow. I've set the following before I do the ExecuteSQL: fragment.count=2, fragment.identifier=${UUID()}, fragment.index=${nextInt()}. Alternatively, I could create a MERGE, if there is a way to loop through the list of JSON attributes that match the Oracle table? (A rough sketch of that idea follows below.)
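On the MERGE idea, here is a hypothetical sketch of building an Oracle MERGE statement whose UPDATE clause touches only the columns actually present in the incoming JSON; the table name, key column, and helper names are all illustrative assumptions:

```python
import json

# Hypothetical helper: emit an Oracle MERGE whose UPDATE clause covers
# only the columns present in the JSON record. Table name, key column,
# and bind-variable style are illustrative assumptions.
def build_merge(table, key_col, record):
    cols = [c for c in record if c != key_col]
    updates = ", ".join(f"t.{c} = :{c}" for c in cols)
    insert_cols = ", ".join([key_col] + cols)
    insert_vals = ", ".join(f":{c}" for c in [key_col] + cols)
    return (
        f"MERGE INTO {table} t "
        f"USING (SELECT :{key_col} AS {key_col} FROM dual) s "
        f"ON (t.{key_col} = s.{key_col}) "
        f"WHEN MATCHED THEN UPDATE SET {updates} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

record = json.loads('{"ID": "42", "STATUS": "active", "NODE_NAME": "LTHAB0113422"}')
print(build_merge("RCS_STG.NIFI", "ID", record))
```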
Labels: Apache NiFi
12-19-2016
09:06 PM
Sorry, same error.
12-19-2016
05:01 PM
I did test the connectivity; the Oracle port is open and the DB is up. However, I was using a newer driver (ojdbc7.jar), so I'm going to fall back to the previous version, which was working on the other host.
12-16-2016
10:17 PM
1 Kudo
I have moved NiFi from a Windows platform to Linux and installed NiFi 2.0.1. I've also installed ojdbc7.jar in the NiFi lib directory (yes, NiFi was restarted). I've exported the Windows template and imported it into the Linux NiFi. I'm testing this new configuration; however, I'm getting the following error:
2016-12-16 12:05:57,178 ERROR [Timer-Driven Process Thread-10] o.a.n.p.standard.ConvertJSONToSQL ConvertJSONToSQL[id=89164c79-43c8-4067-4859-cfeab7df7c70] ConvertJSONToSQL[id=89164c79-43c8-4067-4859-cfeab7df7c70] failed to process due to
org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (IO Error: Connection reset); rolling back session: org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (IO Error: Connection reset)
2016-12-16 12:05:57,187 ERROR [Timer-Driven Process Thread-10] o.a.n.p.standard.ConvertJSONToSQL
org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (IO Error: Connection reset)
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:234) ~[na:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_102]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_102]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_102]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_102]
at org.apache.nifi.controller.service.StandardControllerServiceProvider$1.invoke(StandardControllerServiceProvider.java:177) ~[nifi-framework-core-1.0.0.2.0.1.0-12.jar:1.0.0.2.0.1.0-12]
at com.sun.proxy.$Proxy82.getConnection(Unknown Source) ~[na:na]
at org.apache.nifi.processors.standard.ConvertJSONToSQL.onTrigger(ConvertJSONToSQL.java:267) ~[nifi-standard-processors-1.0.0.2.0.1.0-12.jar:1.0.0.2.0.1.0-12]
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-1.0.0.2.0.1.0-12.jar:1.0.0.2.0.1.0-12]
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) [nifi-framework-core-1.0.0.2.0.1.0-12.jar:1.0.0.2.0.1.0-12]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.0.0.2.0.1.0-12.jar:1.0.0.2.0.1.0-12]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.0.0.2.0.1.0-12.jar:1.0.0.2.0.1.0-12]
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.0.0.2.0.1.0-12.jar:1.0.0.2.0.1.0-12]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_102]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_102]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_102]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_102]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_102]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_102]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_102]
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (IO Error: Connection reset)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549) ~[na:na]
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388) ~[na:na]
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044) ~[na:na]
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:231) ~[na:na]
... 19 common frames omitted
Caused by: java.sql.SQLRecoverableException: IO Error: Connection reset
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:752) ~[ojdbc7.jar:12.1.0.2.0]
at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:666) ~[ojdbc7.jar:12.1.0.2.0]
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:32) ~[ojdbc7.jar:12.1.0.2.0]
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:566) ~[ojdbc7.jar:12.1.0.2.0]
at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38) ~[na:na]
at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582) ~[na:na]
at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556) ~[na:na]
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545) ~[na:na]
... 22 common frames omitted
Caused by: java.net.SocketException: Connection reset
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) ~[na:1.8.0_102]
at java.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:1.8.0_102]
at oracle.net.ns.DataPacket.send(DataPacket.java:209) ~[ojdbc7.jar:12.1.0.2.0]
at oracle.net.ns.NetOutputStream.flush(NetOutputStream.java:215) ~[ojdbc7.jar:12.1.0.2.0]
Labels: Apache NiFi
10-19-2016
05:06 PM
I have a NiFi host doing ETL processing outside the Hadoop cluster. The cluster is secured using Knox/Ranger, and the only ports open are SSH to the Hadoop edge nodes and the Kafka queue. My question is: what are the best options for writing data into either HBase or Hive? Ideas I have are:
- Deploy a NiFi inside the cluster and do site-to-site (requires opening a firewall port).
- From NiFi, write to the Kafka queue, and from inside the cluster write a Java process to pull from the queue and output the data to the target (HBase or Hive); a rough consumer sketch follows below.
Any other suggestions?
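For the second option, a minimal consumer sketch (in Python rather than Java, using the kafka-python and happybase client libraries; the topic, hosts, table, and record layout are all assumptions):

```python
import json

import happybase                 # HBase Thrift client
from kafka import KafkaConsumer  # kafka-python

# Sketch of option 2: consume records from the Kafka queue inside the
# cluster and write them to HBase. All names and hosts are illustrative.
consumer = KafkaConsumer("etl-topic", bootstrap_servers="edge-node:9092")
hbase = happybase.Connection("hbase-master")
table = hbase.table("etl_data")

for msg in consumer:
    rec = json.loads(msg.value)  # assumes JSON payloads with an "id" field
    table.put(rec["id"].encode(), {b"cf:payload": msg.value})
```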
Labels: Apache NiFi
08-24-2016
04:57 PM
1 Kudo
I have a JSON document that I'm attempting to feed into ConvertJSONToSQL; however, I'm getting the following error: "None of the fields in the JSON map to the columns defined". I would like the process to be dynamic rather than use the AttributesToJSON processor. Is that possible? My table does have the EndTime, H1526728411, etc., columns. My JSON looks like:
{
"Table":{
"TableName":"HH_CELL_VQI",
"EndTime":"2016-07-12T05:30:00-06:00",
"H1526728411":"0",
"H1526728412":"0",
"H1526728413":"0",
"H1526728414":"0",
"H1526728415":"0",
"H1526728416":"0",
"H1526728417":"0",
"H1526728418":"0",
"H1526728419":"0",
"H1526728420":"0"
}
}
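ConvertJSONToSQL maps top-level JSON fields to columns, so the nested "Table" wrapper is likely why none of the fields match. A minimal flattening sketch, assuming the wrapper should simply be unwrapped (the file name is illustrative, and TableName is dropped on the assumption it names the target table rather than a column):

```python
import json

# Sketch: unwrap the nested "Table" object into a flat JSON record so
# top-level keys line up with column names. TableName is dropped here
# on the assumption that it names the target table, not a column.
doc = json.load(open("cell_vqi.json"))
flat = {k: v for k, v in doc["Table"].items() if k != "TableName"}
print(json.dumps(flat))
```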
Labels: Apache NiFi
08-17-2016
07:29 PM
I have a flow that works on my desktop, and I would like to move it to a single server that has 16 cores. My question is whether there is anything I need to configure in NiFi to light up all the cores. I see documentation on clustering servers, which doesn't apply to me; or does NiFi use everything available, which is the behavior I'm looking for?
Labels: Apache NiFi
07-25-2016
02:48 PM
Wow, I didn't know XQuery could do this. Based on quick research, it appears that fn:tokenize might be just what I'm looking for. I'll have to play with this to see if I can get it to work.
07-22-2016
04:10 PM
I'm still learning the ropes of NiFi. I have some XML (see below):
<measInfo measInfoId="1542455297">
<measTypes>1542455297 1542455298 1542455299 1542455300 1542455301 1542455302 1542455303 1542455304 1542455305 1542455306 1542455307 1542460296 1542460297 </measTypes>
<measValue measObjLdn="LTHAB0113422/ETHPORT:Cabinet No.=0, Subrack No.=1, Slot No.=7, Port No.=0, Subboard Type=BASE_BOARD">
<measResults>116967973 585560 496041572 682500 0 12583680 72080 520454 46670568 73432 2205837 1000000 1000000 </measResults>
</measValue>
<measValue measObjLdn="LTHAB0113422/ETHPORT:Cabinet No.=0, Subrack No.=1, Slot No.=7, Port No.=1, Subboard Type=BASE_BOARD">
<measResults>0 0 0 0 0 0 0 0 0 0 0 0 0 </measResults>
</measValue>
</measInfo>
I successfully parsed the XML but need further processing, and I'm looking for suggestions as to the best way to handle this. Ultimately I would like key/value pairs between measTypes and measResults to insert into HBase, or to pivot the data into a Hive table (the measType would get translated into a column name). The exact number of measTypes/measResults pairs is variable:
measTypes measResults
1542455297 116967973
1542455298 585560
1542455299 496041572
1542455300 682500
1542455301 0
1542455302 12583680
1542455303 72080
1542455304 520454
1542455305 46670568
1542455306 73432
1542455307 2205837
1542460296 1000000
1542460297 1000000
I have 10 GB of XML arriving every 30 minutes, so efficiency is also a concern. I'm able to take the measTypes, put it into a file, and parse it with ExtractText and ReplaceText (using a regex), so the next challenge is to replace the GetFile with a feed from my existing XML template; how do I do it? Another option is InvokeScriptedProcessor (maybe Python)? Sample code would be greatly appreciated (a rough sketch follows below). 🙂
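Since the post asks for sample code, here is a rough Python sketch of the pairing step as it might look inside an ExecuteScript/InvokeScriptedProcessor; the plain file I/O stands in for NiFi's session API, which works differently:

```python
import xml.etree.ElementTree as ET

# Rough sketch: pair each measValue's measResults with the shared
# measTypes list to produce key/value rows. Plain file I/O stands in
# for the NiFi scripted-processor session API; the file name is assumed.
root = ET.parse("measInfo.xml").getroot()
for info in root.iter("measInfo"):
    types = info.findtext("measTypes").split()
    for mv in info.iter("measValue"):
        obj = mv.get("measObjLdn")
        for meas_type, result in zip(types, mv.findtext("measResults").split()):
            print(obj, meas_type, result)
```

Note that this sample XML has no namespace; namespaced documents (like the measCollecFile sample elsewhere on this page) would need a namespace-aware lookup.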
Labels: Apache NiFi
07-21-2016
08:02 PM
Doh, found the problem: I forgot I was in the Java world, where things are case sensitive, unlike Oracle 😞
07-21-2016
07:43 PM
OK, I learned something new: there is a LogAttribute processor, and below is what is in the logs now. The problem is that nothing is being inserted into the table now?
Standard FlowFile Attributes
Key: 'entryDate'
Value: 'Thu Jul 21 13:40:06 MDT 2016'
Key: 'lineageStartDate'
Value: 'Thu Jul 21 13:40:05 MDT 2016'
Key: 'fileSize'
Value: '117'
FlowFile Attribute Map Content
Key: 'RouteOnAttribute.Route'
Value: 'Filter_Attributes'
Key: 'absolute.path'
Value: 'C:\HDF-1.2.0.1-1\nifi\.\data-in/'
Key: 'begin_time'
Value: '2016-07-12T05:00:00-06:00'
Key: 'file.creationTime'
Value: '2016-07-21T13:40:05-0600'
Key: 'file.lastAccessTime'
Value: '2016-07-21T13:40:05-0600'
Key: 'file.lastModifiedTime'
Value: '2016-07-18T14:20:14-0600'
Key: 'file.owner'
Value: 'BUILTIN\Administrators'
Key: 'filename'
Value: 'A20160712.0500-0600-0530-0600_LTHAB0113422.xml'
Key: 'fragment.count'
Value: '1'
Key: 'fragment.identifier'
Value: '50540eda-0902-4b5a-bf7a-2b45d956fe3a'
Key: 'fragment.index'
Value: '0'
Key: 'infoid'
Value: '1526726706'
Key: 'measResults'
Value: '0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 NIL NIL NIL 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 '
Key: 'measTypes'
Value: '1526726986 1526726987 1526726988 1526726989 1526726990 1526726991 1526726992 1526726993 1526726994 1526727226 1526727227 1526727228 1526728306 1526728307 1526728308 1526728309 1526728310 1526728311 1526728312 1526728313 1526728314 1526728315 1526728316 1526728317 1526728318 1526728321 1526728322 1526728323 1526728324 1526728326 1526728327 1526728328 1526728329 1526728330 1526728331 1526728386 1526728387 1526728388 1526728400 1526728401 1526728402 1526728403 1526728404 1526728405 1526728406 1526728407 1526728408 1526728409 1526728410 1526728441 1526728442 1526728468 1526728469 1526728470 1526728471 1526728472 1526728497 1526728498 1526728499 1526728500 1526728501 1526728502 1526728503 1526728504 1526728505 1526728506 1526728507 1526728508 1526728509 1526728510 1526728511 1526728512 1526728513 1526728529 1526728530 1526728531 1526728532 1526728533 1526728534 1526728535 1526728536 1526728542 1526728543 1526728544 1526728560 1526728561 1526728705 1526728706 1526728707 1526728708 1526728709 1526728710 1526728746 1526728747 1526728748 1526728749 1526728750 1526728751 1526728752 1526728753 1526728754 1526728755 1526728756 1526728757 1526728758 1526728759 1526728760 1526728761 1526728889 1526728890 1526728891 1526728892 1526728893 1526728894 1526728895 1526728896 1526728897 1526728898 1526728899 1526728900 1526728901 1526729054 1526729055 1526729260 1526729422 1526729423 1526729424 1526729425 1526729426 1526729427 1526729428 1526729429 1526729432 1526729433 1526729434 1526729485 1526729486 1526729487 1526729488 1526729489 1526729490 1526729491 1526729492 1526729505 1526729506 1526729507 1526729508 1526729509 1526729510 1526729511 1526729512 1526729513 1526729514 1526729515 1526729516 1526729544 1526729573 1526730017 1526730018 1526730019 1526730020 1526730021 1526730022 1526730023 1526730024 1526730025 1526730026 1526730027 1526730028 1526730029 1526730030 1526730031 1526730032 1526730033 1526730034 1526730035 1526730036 1526730037 1526730038 1526730039 1526730040 1526730041 1526730076 1526730077 1526730078 1526730079 1526730080 1526730081 1526730082 1526730083 1526730084 1526730085 1526730086 1526730087 1526730088 1526730089 1526730090 1526730091 1526730092 1526730093 1526730094 1526730095 1526730096 1526730097 1526730098 1526730099 1526730146 1526730147 1526730148 1526730848 1526730849 1526733006 1526733007 1526733008 1526733009 1526733190 1526733191 1526733192 '
Key: 'measValue'
Value: 'LTHAB0113422/Cell:eNodeB Function Name=LTHAB0113422, Local Cell ID=110, Cell Name=LTHAB0113422-110-2600-1-1, eNodeB ID=113422, Cell FDD TDD indication=CELL_FDD'
Key: 'mime.type'
Value: 'text/plain'
Key: 'node_name'
Value: 'LTHAB0113422'
Key: 'path'
Value: '/'
Key: 'sql.args.1.type'
Value: '12'
Key: 'sql.args.1.value'
Value: '1526726706'
Key: 'sql.args.2.type'
Value: '12'
Key: 'sql.args.2.value'
Value: '2016-07-12T05:00:00-06:00'
Key: 'sql.args.3.type'
Value: '12'
Key: 'sql.args.3.value'
Value: 'LTHAB0113422/Cell:eNodeB Function Name=LTHAB0113422, Local Cell ID=110, Cell Name=LTHAB0113422-110-2600-1-1, eNodeB ID=113422, Cell FDD TDD indication=CELL_FDD'
Key: 'sql.args.4.type'
Value: '12'
Key: 'sql.args.4.value'
Value: '0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 NIL NIL NIL 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 '
Key: 'sql.args.5.type'
Value: '12'
Key: 'sql.args.5.value'
Value: '1526726986 1526726987 1526726988 1526726989 1526726990 1526726991 1526726992 1526726993 1526726994 1526727226 1526727227 1526727228 1526728306 1526728307 1526728308 1526728309 1526728310 1526728311 1526728312 1526728313 1526728314 1526728315 1526728316 1526728317 1526728318 1526728321 1526728322 1526728323 1526728324 1526728326 1526728327 1526728328 1526728329 1526728330 1526728331 1526728386 1526728387 1526728388 1526728400 1526728401 1526728402 1526728403 1526728404 1526728405 1526728406 1526728407 1526728408 1526728409 1526728410 1526728441 1526728442 1526728468 1526728469 1526728470 1526728471 1526728472 1526728497 1526728498 1526728499 1526728500 1526728501 1526728502 1526728503 1526728504 1526728505 1526728506 1526728507 1526728508 1526728509 1526728510 1526728511 1526728512 1526728513 1526728529 1526728530 1526728531 1526728532 1526728533 1526728534 1526728535 1526728536 1526728542 1526728543 1526728544 1526728560 1526728561 1526728705 1526728706 1526728707 1526728708 1526728709 1526728710 1526728746 1526728747 1526728748 1526728749 1526728750 1526728751 1526728752 1526728753 1526728754 1526728755 1526728756 1526728757 1526728758 1526728759 1526728760 1526728761 1526728889 1526728890 1526728891 1526728892 1526728893 1526728894 1526728895 1526728896 1526728897 1526728898 1526728899 1526728900 1526728901 1526729054 1526729055 1526729260 1526729422 1526729423 1526729424 1526729425 1526729426 1526729427 1526729428 1526729429 1526729432 1526729433 1526729434 1526729485 1526729486 1526729487 1526729488 1526729489 1526729490 1526729491 1526729492 1526729505 1526729506 1526729507 1526729508 1526729509 1526729510 1526729511 1526729512 1526729513 1526729514 1526729515 1526729516 1526729544 1526729573 1526730017 1526730018 1526730019 1526730020 1526730021 1526730022 1526730023 1526730024 1526730025 1526730026 1526730027 1526730028 1526730029 1526730030 1526730031 1526730032 1526730033 1526730034 1526730035 1526730036 1526730037 1526730038 1526730039 1526730040 1526730041 1526730076 1526730077 1526730078 1526730079 1526730080 1526730081 1526730082 1526730083 1526730084 1526730085 1526730086 1526730087 1526730088 1526730089 1526730090 1526730091 1526730092 1526730093 1526730094 1526730095 1526730096 1526730097 1526730098 1526730099 1526730146 1526730147 1526730148 1526730848 1526730849 1526733006 1526733007 1526733008 1526733009 1526733190 1526733191 1526733192 '
Key: 'sql.args.6.type'
Value: '12'
Key: 'sql.args.6.value'
Value: 'LTHAB0113422'
Key: 'sql.table'
Value: 'NIFI'
Key: 'uuid'
Value: 'bc5aa92e-de98-4516-b6a3-674f67e4787b'
--------------------------------------------------
INSERT INTO RCS_STG.NIFI (INFOID, BEGIN_TIME, MEASVALUE, MEASRESULTS, MEASTYPES, NODE_NAME) VALUES (?, ?, ?, ?, ?, ?)
07-21-2016
06:46 PM
Where will I find the "sql.args.1.type" and "sql.args.1.value" information logged?
07-21-2016
05:09 PM
I have some XML files that I've successfully parsed, converted to JSON (using AttributesToJSON), and then run through ConvertJSONToSQL. If I PutFile from AttributesToJSON, I have the following sample row:
{"INFOID":"1526726757","BEGIN_TIME":"2016-07-12T05:00:00-06:00","MEASVALUE":"LTHAB0113422/ECELL_WCELL:eNodeB Function Name=LTHAB0113422, RNC cell ID=30005, Local cell ID=41, Mobile country code=302, Mobile network code=220, RNC ID=1209","MEASRESULTS":"0 0 0 ","MEASTYPES":"1526729565 1526729566 1526729567 ","NODE_NAME":"LTHAB0113422"}
Yet the SQL looks like:
INSERT INTO RCS_STG.NIFI (MEASRESULTS, BEGIN_TIME, NODE_NAME, INFOID, MEASTYPES, MEASVALUE) VALUES (?, ?, ?, ?, ?, ?)
The table, however, only has rows that are all nulls. What am I missing?
Labels: Apache NiFi
07-19-2016
01:15 PM
Here is my sample; if I remove the xmlns parts, XPath works just fine.
<measCollecFile xmlns="http://latest/nmc-omc/cmNrm.doc#measCollec" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://latest/nmc-omc/cmNrm.doc#measCollec schema\pmResultSchedule.xsd">
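For reference, a minimal namespace-aware sketch using Python's ElementTree; the searched-for child element and file name are illustrative assumptions, and XPath in other tools needs the same namespace binding (or a local-name() workaround):

```python
import xml.etree.ElementTree as ET

# Sketch: bind the document's default namespace to a prefix so the
# search matches namespaced elements. 'measData' is an assumed child.
ns = {"mc": "http://latest/nmc-omc/cmNrm.doc#measCollec"}
root = ET.parse("measCollecFile.xml").getroot()
for elem in root.findall(".//mc:measData", ns):
    print(elem.tag)
```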
07-19-2016
12:54 PM
I have an XML document with a namespace; how do I parse it with XPath or XQuery?
Labels: Apache NiFi