Member since: 06-08-2017
Posts: 1049
Kudos Received: 517
Solutions: 312

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 9904 | 04-15-2020 05:01 PM
 | 5936 | 10-15-2019 08:12 PM
 | 2414 | 10-12-2019 08:29 PM
 | 9580 | 09-21-2019 10:04 AM
 | 3506 | 09-19-2019 07:11 AM
07-07-2020
02:42 AM
Hi friends, I tried running the curl command below from the Linux session on which NiFi is installed. This is a secured NiFi instance, and I got a Forbidden error:

```
-bash-4.2$ curl -i -X PUT -H 'Content-Type: application/json' -d '{"id":"e3d11d1d-327b-1b5b-96a7-3cb59c48df17","state":"RUNNING"}' https://XXXXX:8443/nifi//nifi-api/flow/process-groups/e3d11d1d-327b-1b5b-96a7-3cb59c48df17
HTTP/1.1 403 Forbidden
Server: squid/3.5.20
Mime-Version: 1.0
Date: Tue, 07 Jul 2020 09:34:57 GMT
Content-Type: text/html;charset=utf-8
Content-Length: 3347
X-Squid-Error: ERR_ACCESS_DENIED 0
Vary: Accept-Language
Content-Language: en
X-Cache: MISS from XXXXX
X-Cache-Lookup: NONE from XXXX
Via: 1.1 XXXXX (squid/3.5.20)
Connection: keep-alive

curl: (56) Received HTTP code 403 from proxy after CONNECT
```

@Shu_ashu, can you please help me here? Do I need to specify the certs somewhere?
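For reference, the `X-Squid-Error: ERR_ACCESS_DENIED` header and the "403 from proxy after CONNECT" message indicate the rejection comes from the squid proxy in between, not from NiFi itself. A minimal sketch of a direct call that bypasses the proxy and presents client certificates — the PEM paths are hypothetical placeholders, and note that the NiFi REST API base path is `/nifi-api`, not `/nifi//nifi-api`:

```
# Sketch: bypass the HTTP(S) proxy for this request and authenticate with a
# client certificate. All file paths below are hypothetical examples.
curl -i -X PUT \
  --noproxy '*' \
  --cacert /path/to/nifi-ca.pem \
  --cert /path/to/client-cert.pem \
  --key /path/to/client-key.pem \
  -H 'Content-Type: application/json' \
  -d '{"id":"e3d11d1d-327b-1b5b-96a7-3cb59c48df17","state":"RUNNING"}' \
  https://XXXXX:8443/nifi-api/flow/process-groups/e3d11d1d-327b-1b5b-96a7-3cb59c48df17
```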
06-17-2020
04:59 AM
How can I do this for the PutHDFS processor? I don't see a Last Modified Time property in it.
06-02-2020
05:14 AM
Hi, would you please elaborate on why the Hive configuration is needed? Thanks.
05-29-2020
08:11 AM
@Shu_ashu, this approach has a problem: clear-state works only on a stopped processor. I am using the ScrollElasticsearch processor, and its state needs to be cleared before it can be executed again. I tried

```
curl -i -X POST http://localhost:8080/nifi-api/processors/0172101b-be82-11aa-1249-d1383cb1ceba/state/clear-requests
```

but it ends up with a Conflict status; I must stop the processor in order to clear its state. Do I really have to stop the processor, manually or via the API? That doesn't seem like good design to me. Could you help or give any advice, please? Thank you. Petr
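For what it's worth, the stop → clear → start sequence over the REST API looks roughly like the sketch below. It assumes an unsecured instance and that `jq` is available, with error handling omitted; the host and processor ID are taken from the post above.

```
# Sketch: stop the processor, clear its state, then start it again.
ID=0172101b-be82-11aa-1249-d1383cb1ceba
BASE=http://localhost:8080/nifi-api

# 1. Fetch the current revision (required by the run-status endpoint).
REV=$(curl -s "$BASE/processors/$ID" | jq '{revision: .revision}')

# 2. Stop the processor.
curl -s -X PUT -H 'Content-Type: application/json' \
  -d "$(echo "$REV" | jq '. + {state: "STOPPED"}')" \
  "$BASE/processors/$ID/run-status"

# 3. Clear its state (allowed now that it is stopped).
curl -s -X POST "$BASE/processors/$ID/state/clear-requests"

# 4. Re-fetch the revision (it changed on stop) and start the processor again.
REV=$(curl -s "$BASE/processors/$ID" | jq '{revision: .revision}')
curl -s -X PUT -H 'Content-Type: application/json' \
  -d "$(echo "$REV" | jq '. + {state: "RUNNING"}')" \
  "$BASE/processors/$ID/run-status"
```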
05-05-2020
11:45 AM
@Shu_ashu, I don't understand how ScanHBase would work as an alternative when it has an input requirement, i.e. it can't be used as a root node in a graph for gathering records from HBase indiscriminately. Would you agree? It seems the only viable solution, then, is to use the REST API as you've suggested.
04-22-2020
08:39 AM
I am also facing the same issue of not being able to update Mongo records using an existing _id. I just need to update 2-3 fields in an existing Mongo document.
04-19-2020
10:41 PM
1 Kudo
I have written a blog post on this; kindly refer to it to set up a DBCPConnectionPoolLookup controller service and execute the same query against multiple databases. It walks through an example with step-by-step setup instructions: https://bigdata-galaxy.blogspot.com/2020/04/nifi-querying-multiple-databases-using.html
04-19-2020
05:13 PM
Hi @DarkStar,
As this thread was marked 'Solved' in March 2018, you would have a better chance of receiving a resolution by starting a new thread. A new thread would also give you the opportunity to include details specific to your XML source, which could help others provide a more targeted answer to your question.
04-15-2020
05:01 PM
Hi @ChineduLB, you can use `groupBy` with `concat_ws(",", collect_list(...))` to build the comma-separated department list, and the `row_number` window function to generate `ID`:

```
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._
import spark.implicits._

val df = Seq(("1","User1","Admin"),
             ("2","User1","Accounts"),
             ("3","User2","Finance"),
             ("4","User3","Sales"),
             ("5","User3","Finance")).toDF("ID","USER","DEPT")

// Window for row_number; ordering by USER assigns a sequential ID per grouped row.
val w = Window.orderBy("USER")

df.groupBy("USER").
  agg(concat_ws(",", collect_list("DEPT")).alias("DEPARTMENT")).
  withColumn("ID", row_number().over(w)).
  select("ID","USER","DEPARTMENT").
  show()
```
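With this small dataset, the snippet above should print something like:

```
+---+-----+--------------+
| ID| USER|    DEPARTMENT|
+---+-----+--------------+
|  1|User1|Admin,Accounts|
|  2|User2|       Finance|
|  3|User3| Sales,Finance|
+---+-----+--------------+
```

(Note that the order of values inside `DEPARTMENT` comes from `collect_list` and is not guaranteed in general.)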
04-15-2020
02:14 PM
If you don't care about the Avro format and want to go directly to JSON, you can use the ExecuteSQLRecord processor, where you can specify the output format via the configured record writer.