Member since: 06-19-2017
Posts: 62
Kudos Received: 1
Solutions: 7
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2522 | 03-17-2022 10:37 AM
 | 1759 | 12-10-2021 04:25 AM
 | 2324 | 08-18-2021 02:20 PM
 | 6278 | 07-16-2021 08:41 AM
 | 1153 | 07-13-2021 07:03 AM
10-13-2021
05:41 AM
Hello All, I am trying to drop partitions older than 14 days in Hive, and the query below is throwing an error.

Hive query: ALTER TABLE audit_logs DROP PARTITION (evt_date < 'date_format(date_sub(from_unixtime(unix_timestamp('20211013','yyyyMMdd'),'yyyy-MM-dd'),14),'yyyyMMdd')');

The table is partitioned on evt_date, a string column stored in yyyyMMdd format (e.g. 20211013).

Error: Error while compiling statement: FAILED: ParseException line 4:118 cannot recognize input near ''date_format(date_sub(from_unixtime(unix_timestamp('' '20211013' '','' in constant

The input to unix_timestamp('20211013','yyyyMMdd') also arrives as a yyyyMMdd string from an Oozie parameter/another program.

The standalone SELECT works: select date_format(date_sub(from_unixtime(unix_timestamp('20211013','yyyyMMdd'),'yyyy-MM-dd'),14),'yyyyMMdd'); Output: 20210929

And with a literal, the Hive query works fine: ALTER TABLE audit_logs DROP PARTITION (evt_date < '20210929');

Could someone assist with this? Thanks
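The ParseException hints at the root cause: Hive's DROP PARTITION spec only accepts constants, so the quoted date_format(...) expression is parsed as a malformed string literal rather than evaluated. A minimal workaround sketch, assuming the run date arrives as a yyyyMMdd string (e.g. from the Oozie parameter) and beeline is on the PATH — compute the cutoff outside Hive and splice it in as a literal (the JDBC URL is a placeholder):

```python
# Compute the 14-day cutoff outside Hive and pass it as a constant literal.
from datetime import datetime, timedelta
import subprocess

run_date = "20211013"  # yyyyMMdd string supplied by Oozie/another program
cutoff = (datetime.strptime(run_date, "%Y%m%d") - timedelta(days=14)).strftime("%Y%m%d")
query = f"ALTER TABLE audit_logs DROP PARTITION (evt_date < '{cutoff}')"
subprocess.run(["beeline", "-u", "jdbc:hive2://hiveserver:10000", "-e", query], check=True)
```

This produces the same 20210929 value the standalone SELECT returns, in a form DROP PARTITION can parse.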
Labels:
- Apache Hive
10-03-2021
05:26 AM
Hi All, I am running a distcp command that copies the audit-log HDFS folder to another HDFS folder for further processing. The command worked fine until two weeks ago and started failing last week. From the detailed MR logs I understand that only particular files fail to copy; the other audit-log folders/files (Kafka, Hive, NiFi, HBase) are copied. Only some specific files fail.

distcp command: hadoop distcp -filters $filter_file_loc ranger/audit /data/audit_logs/staging

Distribution: Cloudera Data Platform version 7.1.7

Please find the detailed error messages:

java.io.IOException: File copy failed: hdfs://namenode/ranger/audit/kafka/kafka/20210927/kafka_ranger_audit_svl.host.int.log --> hdfs://namenode/data/audit_logs/staging/audit/kafka/kafka/20210927/kafka_ranger_audit_svl.host.int.log
Caused by: org.apache.hadoop.hdfs.CannotObtainBlockLengthException: Cannot obtain block length for LocatedBlock{BP-1024772623-10.107.146.29-1593441936031:blk_1183449574_109711397; getBlockSize()=64553182; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.107.145.208:9866,DS-b11e932b-0460-47b7-a281-3743ecf9c581,DISK]]} of /ranger/audit/kafka/kafka/20210927/kafka_ranger_audit_svl.host.int.log
at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:370)
at org.apache.hadoop.hdfs.DFSInputStream.getLastBlockLength(DFSInputStream.java:279)
at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:260)
at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:203)
at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:187)
at org.apache.hadoop.hdfs.DFSClient.openInternal(DFSClient.java:1056)
at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1019)
at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:338)
at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:334)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:351)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:954)
at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.getInputStream(RetriableFileCopyCommand.java:331)
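A hedged diagnostic sketch: CannotObtainBlockLengthException typically means the source file is still open for write, so its last block length was never finalized. Something like the following (illustrative only — it shells out to the standard hdfs CLI, which must be on the PATH) can list the open files and recover the lease before re-running distcp:

```python
# Find audit files still open for write, then recover the lease on the stuck
# file so its last block length can be finalized.
import subprocess

stuck = "/ranger/audit/kafka/kafka/20210927/kafka_ranger_audit_svl.host.int.log"
# List files under /ranger/audit that are still open for write
subprocess.run(["hdfs", "fsck", "/ranger/audit", "-files", "-openforwrite"], check=True)
# Force lease recovery on the file distcp failed to copy
subprocess.run(["hdfs", "debug", "recoverLease", "-path", stuck, "-retries", "3"], check=True)
```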
Labels:
- Apache Hadoop
- MapReduce
09-23-2021
03:20 AM
Hi All, I am able to launch an Oozie workflow through the Oozie web service API using the curl utility, i.e., I can perform HTTP GET and POST requests.

Example curl command: curl -u guestuser:guestpwd -X POST -H "Content-Type: application/xml;charset=UTF-8" -d @Job.xml "https://knowxhost:8443/gateway/cdp-proxy-api/oozie/v1/jobs?action=start"

url: https://knowxhost:8443/gateway/cdp-proxy-api/oozie/v1/job/0000009-210910123241282-oozie-oozi-W

In Python, I can also fetch the executed Oozie workflow:

response = requests.get(url, auth=HTTPBasicAuth('guestuser','guestpwd'), verify=False).text

But I am unable to perform the POST request and I get a 500 error:

url = 'https://knowxhost:8443/gateway/cdp-proxy-api/oozie/v1/jobs?action=start'
response = requests.post(url, auth=HTTPBasicAuth('guestuser','guestpwd'), verify=False, data=xml, headers=headers).text

Exception: HTTP ERROR 500
URI: /gateway/cdp-proxy-api/oozie/v1/jobs
STATUS: 500
MESSAGE: javax.servlet.ServletException: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
SERVLET: cdp-proxy-api-knox-gateway-servlet
CAUSED BY: javax.servlet.ServletException: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
CAUSED BY: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
CAUSED BY: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
CAUSED BY: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
CAUSED BY: java.io.IOException: java.io.IOException: Service connectivity error.
CAUSED BY: java.io.IOException: Service connectivity error.

Detailed stack trace: GatewayFilter.java:doFilter(169)) - Gateway processing failed: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
at org.apache.shiro.web.servlet.AdviceFilter.cleanup(AdviceFilter.java:196)
at org.apache.shiro.web.filter.authc.AuthenticatingFilter.cleanup(AuthenticatingFilter.java:155)
at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:148)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:66)
at org.apache.shiro.web.servlet.AdviceFilter.executeChain(AdviceFilter.java:108)
at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:137)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:66)
at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:450)
at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365)
at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:387)
at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.knox.gateway.filter.ResponseCookieFilter.doFilter(ResponseCookieFilter.java:49)
at org.apache.knox.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:58)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.knox.gateway.filter.XForwardedHeaderFilter.doFilter(XForwardedHeaderFilter.java:50)
at org.apache.knox.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:58)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.knox.gateway.GatewayFilter.doFilter(GatewayFilter.java:167)
at org.apache.knox.gateway.GatewayFilter.doFilter(GatewayFilter.java:92)
at org.apache.knox.gateway.GatewayServlet.service(GatewayServlet.java:135)
at org.eclipse.jetty.servlet.ServletHolder$NotAsync.service(ServletHolder.java:1443)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:791)
at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1626)
at org.eclipse.jetty.websocket.server.WebSocketUpgradeFilter.doFilter(WebSocketUpgradeFilter.java:228)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:548)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:602)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1435)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1350)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.apache.knox.gateway.trace.TraceHandler.handle(TraceHandler.java:51)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.apache.knox.gateway.filter.CorrelationHandler.handle(CorrelationHandler.java:41)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.apache.knox.gateway.filter.PortMappingHelperHandler.handle(PortMappingHelperHandler.java:106)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.websocket.server.WebSocketHandler.handle(WebSocketHandler.java:115)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:516)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:388)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:633)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:380)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:279)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
at org.eclipse.jetty.io.ssl.SslConnection$DecryptedEndPoint.onFillable(SslConnection.java:540)
at org.eclipse.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:395)
at org.eclipse.jetty.io.ssl.SslConnection$2.succeeded(SslConnection.java:161)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:383)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:882)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1036)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:389)
at org.apache.knox.gateway.filter.ShiroSubjectIdentityAdapter.doFilter(ShiroSubjectIdentityAdapter.java:71)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:61)
at org.apache.shiro.web.servlet.AdviceFilter.executeChain(AdviceFilter.java:108)
at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:137)
... 77 more
Caused by: java.security.PrivilegedActionException: java.io.IOException: java.io.IOException: Service connectivity error.
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.knox.gateway.filter.ShiroSubjectIdentityAdapter$CallableChain.call(ShiroSubjectIdentityAdapter.java:142)
at org.apache.knox.gateway.filter.ShiroSubjectIdentityAdapter$CallableChain.call(ShiroSubjectIdentityAdapter.java:74)
at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:387)
... 83 more
Caused by: java.io.IOException: java.io.IOException: Service connectivity error.
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.failoverRequest(ConfigurableHADispatch.java:269)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.executeRequest(ConfigurableHADispatch.java:163)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.failoverRequest(ConfigurableHADispatch.java:263)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.executeRequest(ConfigurableHADispatch.java:163)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.failoverRequest(ConfigurableHADispatch.java:263)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.executeRequest(ConfigurableHADispatch.java:163)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.failoverRequest(ConfigurableHADispatch.java:263)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.executeRequest(ConfigurableHADispatch.java:163)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.executeRequestWrapper(ConfigurableHADispatch.java:125)
at org.apache.knox.gateway.dispatch.DefaultDispatch.doPost(DefaultDispatch.java:343)
at org.apache.knox.gateway.dispatch.GatewayDispatchFilter$PostAdapter.doMethod(GatewayDispatchFilter.java:182)
at org.apache.knox.gateway.dispatch.GatewayDispatchFilter.doFilter(GatewayDispatchFilter.java:125)
at org.apache.knox.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:58)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.ranger.authorization.knox.RangerPDPKnoxFilter.doFilter(RangerPDPKnoxFilter.java:171)
at org.apache.ranger.authorization.knox.RangerPDPKnoxFilter.doFilter(RangerPDPKnoxFilter.java:110)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.knox.gateway.identityasserter.common.filter.AbstractIdentityAssertionFilter.doFilterInternal(AbstractIdentityAssertionFilter.java:193)
at org.apache.knox.gateway.identityasserter.common.filter.AbstractIdentityAssertionFilter.access$000(AbstractIdentityAssertionFilter.java:53)
at org.apache.knox.gateway.identityasserter.common.filter.AbstractIdentityAssertionFilter$1.run(AbstractIdentityAssertionFilter.java:161)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.knox.gateway.identityasserter.common.filter.AbstractIdentityAssertionFilter.doAs(AbstractIdentityAssertionFilter.java:156)
at org.apache.knox.gateway.identityasserter.common.filter.AbstractIdentityAssertionFilter.continueChainAsPrincipal(AbstractIdentityAssertionFilter.java:146)
at org.apache.knox.gateway.identityasserter.common.filter.CommonIdentityAssertionFilter.doFilter(CommonIdentityAssertionFilter.java:94)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.knox.gateway.filter.rewrite.api.UrlRewriteServletFilter.doFilter(UrlRewriteServletFilter.java:57)
at org.apache.knox.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:58)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.knox.gateway.filter.ShiroSubjectIdentityAdapter$CallableChain$1.run(ShiroSubjectIdentityAdapter.java:90)
at org.apache.knox.gateway.filter.ShiroSubjectIdentityAdapter$CallableChain$1.run(ShiroSubjectIdentityAdapter.java:87)
... 90 more
Caused by: java.io.IOException: Service connectivity error.
at org.apache.knox.gateway.dispatch.DefaultDispatch.executeOutboundRequest(DefaultDispatch.java:188)
at org.apache.knox.gateway.ha.dispatch.ConfigurableHADispatch.executeRequest(ConfigurableHADispatch.java:159)
	... 123 more

Can someone assist in resolving this issue? The user has access to all the Hadoop components via Knox.
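For comparison, a hedged sketch of the POST that mirrors the working curl call exactly — the first thing worth ruling out is a difference in the Content-Type header or request body between curl and the Python client (host and credentials are the placeholders already used in this thread):

```python
# Mirror the working curl invocation: explicit XML Content-Type header and the
# raw Job.xml file as the request body.
import requests
from requests.auth import HTTPBasicAuth

url = "https://knowxhost:8443/gateway/cdp-proxy-api/oozie/v1/jobs?action=start"
headers = {"Content-Type": "application/xml;charset=UTF-8"}
with open("Job.xml", "rb") as f:
    xml = f.read()
response = requests.post(url, auth=HTTPBasicAuth("guestuser", "guestpwd"),
                         headers=headers, data=xml, verify=False)
print(response.status_code, response.text)
```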
Labels:
- Apache Oozie
09-18-2021
10:00 PM
Hi @Daggers You can store the Avro schema file in HDFS and point your Hive table at it: the avro.schema.url table property takes the schema's HDFS location and can be set while creating the table. This approach also lets you version the Avro schema file in HDFS for the respective table. You can explore Schema Registry as well.
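A minimal DDL sketch of that property in use — the table name, location, and schema path here are illustrative placeholders, issued through beeline for consistency with the rest of the thread:

```python
# Create a Hive table whose Avro schema is read from an .avsc file in HDFS via
# avro.schema.url; replacing that file versions the schema without new DDL.
import subprocess

ddl = """
CREATE EXTERNAL TABLE audit_events
STORED AS AVRO
LOCATION '/data/audit_events'
TBLPROPERTIES ('avro.schema.url'='hdfs:///schemas/audit_events/audit_events.avsc')
"""
subprocess.run(["beeline", "-u", "jdbc:hive2://hiveserver:10000", "-e", ddl], check=True)
```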
09-05-2021
05:30 AM
Hi, We are orchestrating NiFi flow status (completed/failed) through Control-M. These are the steps to follow (a polling sketch for step 3 appears after the list):

1. Control-M puts the files in the source folder (SFTP server) and waits for the status through a status API (you have to build a separate REST API flow, using a request processor, to expose the status message).
2. NiFi's ListSFTP keeps listening for the file (.txt); once a new file has been placed by Control-M, NiFi processes it and a database processor loads the content into the database. The database processor has success and failure relationships. For success flow files, capture the status as success with the value 1 using an UpdateAttribute processor, and store it in a distributed cache (or other storage area) using the relevant processor. The same applies to the failure flow, with the value -1.
3. With the status message now stored in the distributed cache/other storage area, you can query it using a fetch processor and pass it to the response processor for the waiting Control-M job (Control-M should keep polling the status until it receives 1 or -1).
4. When Control-M finds the value 1, the flow succeeded; if -1, processing has failed.

Thanks
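A hedged sketch of the Control-M side of step 3 — polling the status endpoint exposed by the NiFi request/response flow until a terminal value comes back. The URL, query parameter, and polling interval are made up for illustration:

```python
# Poll the NiFi status endpoint until it returns 1 (success) or -1 (failure).
import time
import requests

status_url = "https://nifi-host:9443/status?file=batch_20210905.txt"
while True:
    status = requests.get(status_url, verify=False).text.strip()
    if status in ("1", "-1"):
        break
    time.sleep(30)  # keep polling until NiFi reports a terminal status
print("flow succeeded" if status == "1" else "flow failed")
```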
08-18-2021
02:20 PM
Hello, If your input JSON comes in the format below:

{ "TOT_NET_AMT" : "55.00", "H_OBJECT" : "File", "H_GROSS_AMNT" : ["55.00","58.00"], "TOT_TAX_AMT" : "9.55" }

If the value of H_GROSS_AMNT is a list of strings, ["55.00","58.00"], rather than a single string "55.00,58.00", then you can use the JoltTransformJSON NiFi processor with a Jolt spec that converts it into the required output:

{ "TOT_NET_AMT" : "55.00", "H_OBJECT" : "File", "H_GROSS_AMNT" : "58.00", "TOT_TAX_AMT" : "9.55" }

JoltTransformJSON configuration — Jolt specification:

[
  {
    "operation": "modify-overwrite-beta",
    "spec": {
      "H_GROSS_AMNT": "=lastElement(@(1,H_GROSS_AMNT))"
    }
  }
]

If H_GROSS_AMNT is not a list of strings but a single comma-separated string, then follow these steps in order: extract the whole JSON into a single attribute using the ExtractText processor (the attribute will hold the entire JSON content); use EvaluateJsonPath to extract each JSON element into its own attribute; and once you have all four attributes with values, construct the JSON in a ReplaceText processor. H_GROSS_AMNT will still hold the comma-separated string, and you can use ${H_GROSS_AMNT:substringAfterLast(',')} to extract the last element from the string value.

jsonPath / expression language examples: https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#jsonpath
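Both variants can be sanity-checked outside NiFi; a plain-Python equivalent of the two transformations described above (last element of a list, or the substring after the last comma):

```python
# Plain-Python mirror of the Jolt lastElement / substringAfterLast logic.
import json

doc = json.loads('{"TOT_NET_AMT": "55.00", "H_OBJECT": "File",'
                 ' "H_GROSS_AMNT": ["55.00", "58.00"], "TOT_TAX_AMT": "9.55"}')
v = doc["H_GROSS_AMNT"]
# List input -> last element; comma-separated string input -> last segment
doc["H_GROSS_AMNT"] = v[-1] if isinstance(v, list) else v.rsplit(",", 1)[-1]
print(json.dumps(doc))  # H_GROSS_AMNT becomes "58.00"
```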
07-30-2021
01:09 PM
Hi, Once you have extracted the header into a flowfile attribute using the ExtractText processor, you can either convert that attribute into flowfile content or keep the header value as an attribute. The Stack Overflow post explains extracting the header into a flowfile attribute and then passing the headers as a file to the destination. To convert a flowfile attribute to flowfile content, use the ReplaceText processor, where you can reference flowfile attributes. The success relationship of ReplaceText will contain only the header as flowfile content; the original CSV content is replaced with the header. That flowfile content can then be transferred to the destination or to the next processor in the flow. Hope this is the information you are looking for. Thanks
07-19-2021
07:59 AM
Hi, If controller services are created at a parent process group level, they will not be included when you create a template for only a child process group. You will have to create the template at the parent process group level, or define the DistributedMapCache controller service inside the process group you want to template. For example, 'Stream Ingestion' is the immediate parent process group of 'TestSample'. If I create a template for the 'TestSample' process group, the template cannot see the DistributedMapCacheServer controller service, because that controller service is scoped to the 'Stream Ingestion' process group. Controller services created within a process group are available to (and can be referenced by) all descendant components. Thanks
07-19-2021
07:22 AM
Hi alexmarco, I did not find an 'Upload/Attach' option to upload the template file. Could you please follow the steps/screenshots mentioned? It should work for your example as well. Thanks
07-16-2021
08:41 AM
Hi alexmarco, If your final JSON format is fixed and the input JSON always arrives in the same format (only the values differ), then you can extract the keyword value ("foo" or a new value) into a flowfile attribute and use that attribute in a following ReplaceText processor to pass the value along:

ExtractText processor --> UpdateAttribute processor --> ReplaceText processor

1. Add a new property (keyword_value) in ExtractText with the value/expression ("keyword":.*)
2. Remove spaces and double quotes from the extracted keyword_value attribute in the UpdateAttribute processor. (You can also put the UpdateAttribute logic for deriving keyword_value directly in the ReplaceText processor, since ReplaceText supports NiFi expression language; that is optional and lets you drop the UpdateAttribute processor from this flow if you choose.)
3. Append keyword_value in the ReplaceText processor (keeping the final JSON) as in the sample below.
4. Connect the success relationship to the InvokeHTTP processor.

* CreateKeywordvalueAttribute (ExtractText processor) expression: ("keyword":.*)
* UpdateAttribute processor: ${keyword_value:substringAfter(':'):trim():replace('"', '')}
* FinalReplaceText processor: place the JSON below into the Replacement Value section of the processor:

{
  "requests": [
    {
      "source": "blablabla",
      "params": {
        "keywords": [
          "${keyword_value}"
        ],
        "sub-field1": "1"
      }
    }
  ],
  "field1": "1",
  "field2": "2",
  "field3": false
}

I have attached the sample tested flow (.xml) for your reference. Please accept the solution if it works as expected. Thanks Adhi
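A hedged plain-Python mirror of steps 1-3 above, to make the extract/trim/splice logic concrete (the input flowfile content is a made-up example):

```python
# Mirrors the ExtractText capture ("keyword":.*), the UpdateAttribute
# trim/replace cleanup, and the ReplaceText splice into the final JSON body.
import json
import re

flowfile = '{"keyword": "foo"}'
raw = re.search(r'"keyword":(.*)', flowfile).group(1)      # ExtractText capture
keyword_value = raw.strip().replace('"', '').rstrip('},')  # trim + drop quotes/braces
body = {
    "requests": [
        {"source": "blablabla",
         "params": {"keywords": [keyword_value], "sub-field1": "1"}}
    ],
    "field1": "1", "field2": "2", "field3": False,
}
print(json.dumps(body))  # keywords -> ["foo"]
```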