Member since: 03-03-2017
Posts: 74
Kudos Received: 9
Solutions: 2

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2134 | 06-13-2018 12:02 PM |
| | 3653 | 11-28-2017 10:32 AM |
11-25-2020
11:57 AM
Hi Simon, I am facing the same issue, but my case might be different. Are you using temporary credentials, i.e., assuming a role? If so, unfortunately you can't provide the aws_session_token property in NiFi, and that will cause the error you are facing. There is an open issue here: https://issues.apache.org/jira/browse/NIFI-7900 /Mahmoud
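For context, a minimal Python sketch of why this matters: an STS AssumeRole call returns temporary credentials as a three-part set, and the session token is not optional. The response dict below mirrors the shape of the STS AssumeRole output with made-up placeholder values, and the property names in the result are illustrative, not real NiFi property names.

```python
# Shape of an STS AssumeRole response (values are placeholders).
# Temporary credentials are only valid as a triple; dropping the
# SessionToken (as happens when a client has no session-token
# property to set) makes AWS reject the request.
assume_role_response = {
    "Credentials": {
        "AccessKeyId": "ASIAEXAMPLE",        # temporary keys start with ASIA
        "SecretAccessKey": "secret-example",
        "SessionToken": "token-example",     # required for temporary creds
        "Expiration": "2020-11-25T12:57:00Z",
    }
}

def to_credential_properties(response):
    """Map an AssumeRole response to the three values a client must send."""
    creds = response["Credentials"]
    return {
        "aws.access.key": creds["AccessKeyId"],
        "aws.secret.key": creds["SecretAccessKey"],
        "aws.session.token": creds["SessionToken"],  # the piece NiFi can't take
    }

props = to_credential_properties(assume_role_response)
```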
10-06-2020
08:47 AM
Do you have this flow available for download? Would be an excellent template
08-13-2020
04:23 AM
Only a partial answer, but in general I do not think regexp_replace truncates large strings. It will be hard to figure this out in more detail unless you can share a reproducible example. Here is what I tested just now:

1. Create a table that contains a string of 60,000+ characters (lorem ipsum).
2. Create a new table by selecting the regex replace of that string (I replaced every "a" with "b").
3. Count the length of the field in the new table.

As said, it may well be that you are using a very specific string or regex that together create this problem; it would be interesting to see if this could be reduced to a minimal example. Also keep in mind that, though regex dialects are very similar, there are many ways a regex can be parsed; perhaps the test you did is simply slightly different from the implementation in Hive.
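The three steps above can be sketched outside Hive as well. This is a Python stand-in for the same experiment (Hive's regexp_replace uses Java regexes, so this only shows that the replacement operation itself has no inherent length limit):

```python
import re

# Build a string of 60,000+ characters, mimicking the lorem ipsum table.
lorem = "lorem ipsum dolor sit amet " * 2500   # 27 chars * 2500 = 67,500

# Replace every "a" with "b", as in step 2 of the test.
replaced = re.sub("a", "b", lorem)

# Replacing "a" with "b" is one-to-one, so the length must be unchanged:
# if a regex engine were cutting large strings, this check would fail.
assert len(lorem) > 60000
assert len(replaced) == len(lorem)
assert "a" not in replaced
```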
12-18-2018
07:32 AM
That's great news, thank you Dan.
07-20-2018
08:03 AM
If you have to convert simple XML files then this approach works well. However, if you have a very large volume of XML files that are based on an industry data standard such as FpML, HL7, etc., then this manual approach becomes very time-consuming. What I did instead was use third-party tools that automate the whole XML conversion process on various big data frameworks such as Hive, Impala, Spark, etc.
06-13-2018
12:02 PM
I found the solution myself; it is much simpler than I first thought. Just cast the JSON type to text and Avro will accept it: select cast(json_column as text) columnName from table
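The same idea in miniature: serializing a JSON value to a plain string turns it into an ordinary text field, which an Avro string type accepts. This Python sketch illustrates the principle only; it is not the SQL cast itself, and the record shown is a made-up example.

```python
import json

# A record whose "payload" field holds structured JSON, which an
# Avro "string" field will not accept directly.
record = {"id": 1, "payload": {"a": [1, 2], "b": "x"}}

# The cast-to-text trick: serialize the JSON to a plain string.
# The field is now ordinary text, and the original value still
# survives a round trip through json.loads on the consumer side.
record["payload"] = json.dumps(record["payload"])
```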
03-02-2018
02:01 PM
2 Kudos
@Simon Jespersen It states that "myuser" is not authorized to access this resource. You would need to look at your authorizer to determine why. Assuming you are using NiFi's built-in file-based authorizer, you would need to check the users.xml and authorizations.xml files, or carefully inspect what permissions have been created for "myuser" within the NiFi UI.
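For reference, a rough sketch of the kind of entries to look for in those two files. The identifiers below are made-up UUID placeholders and the resource/action values are illustrative; compare against the actual contents of your own users.xml and authorizations.xml.

```xml
<!-- users.xml: "myuser" must appear as a tenant -->
<tenants>
  <users>
    <user identifier="11111111-1111-1111-1111-111111111111" identity="myuser"/>
  </users>
</tenants>

<!-- authorizations.xml: a policy granting that user read access to a resource -->
<authorizations>
  <policies>
    <policy identifier="22222222-2222-2222-2222-222222222222"
            resource="/flow" action="R">
      <user identifier="11111111-1111-1111-1111-111111111111"/>
    </policy>
  </policies>
</authorizations>
```

If the user identifier referenced by a policy does not match the identifier in users.xml, the user is effectively unauthorized even though both entries exist.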
06-22-2018
02:16 PM
1 Kudo
The process group name can actually be found with the attached Groovy code (e.g., in an ExecuteScript processor): def flowFile = session.get()
if (!flowFile) return
// Walk from the script's processor node up to its enclosing process group
def processGroupName = context.procNode?.getProcessGroup()?.getName()
flowFile = session.putAttribute(flowFile, 'processGroupName', processGroupName)
session.transfer(flowFile, REL_SUCCESS)
01-16-2018
03:41 PM
1 Kudo
Thank you this worked for me