Member since
01-02-2020
40
Posts
3
Kudos Received
5
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 155 | 12-23-2020 09:33 AM |
| | 252 | 05-18-2020 01:27 AM |
| | 342 | 04-28-2020 11:02 AM |
| | 538 | 04-23-2020 12:20 PM |
| | 272 | 01-25-2020 11:50 PM |
03-03-2021
07:54 PM
Hi, I have configured PutEmail to send mail via mailtrap.io. The exception:

2021-03-04 03:08:56,560 ERROR [Timer-Driven Process Thread-9] o.a.nifi.processors.standard.PutEmail PutEmail[id=5744df4c-7f6d-3fdb-3243-eb175358dd83] PutEmail[id=5744df4c-7f6d-3fdb-3243-eb175358dd83] failed to process session due to java.lang.NoClassDefFoundError: com/sun/activation/registries/LogSupport; Processor Administratively Yielded for 1 sec:
java.lang.NoClassDefFoundError: com/sun/activation/registries/LogSupport
    at javax.activation.MailcapCommandMap.<init>(MailcapCommandMap.java:179)
    at javax.activation.CommandMap.getDefaultCommandMap(CommandMap.java:85)
    at javax.activation.DataHandler.getCommandMap(DataHandler.java:167)
    at javax.activation.DataHandler.getDataContentHandler(DataHandler.java:629)
    at javax.activation.DataHandler.writeTo(DataHandler.java:329)
    at javax.mail.internet.MimeUtility.getEncoding(MimeUtility.java:340)
    at javax.mail.internet.MimeBodyPart.updateHeaders(MimeBodyPart.java:1575)
    at javax.mail.internet.MimeMessage.updateHeaders(MimeMessage.java:2271)
    at javax.mail.internet.MimeMessage.saveChanges(MimeMessage.java:2231)
    at javax.mail.Transport.send(Transport.java:123)
    at org.apache.nifi.processors.standard.PutEmail.send(PutEmail.java:541)
    at org.apache.nifi.processors.standard.PutEmail.onTrigger(PutEmail.java:395)
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1174)
    at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
    at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.ClassNotFoundException: com.sun.activation.registries.LogSupport
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)

I know the LogSupport class is missing. I tried adding javax.mail and activation-1.0.2, but they do not contain the LogSupport class. Any idea what the solution could be? Thanks --Murali
02-28-2021
10:23 PM
1 Kudo
@murali2425 Here are two possible solutions. In solution 1 the value is set into an array. In solution 2 I took the flowfile content (JSON) and set it into an attribute; then you can work with Expression Language to get the value. For testing the JsonPath syntax I recommend this site: http://jsonpath.herokuapp.com/
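For a quick sense of what solution 2 is doing, here is a minimal Python sketch (not NiFi itself) of pulling one value out of a JSON flowfile the way EvaluateJsonPath would with a path like $.order.id; the field names and sample document are invented for illustration.

```python
import json

def extract(content: str, path: str):
    """Walk a dotted JsonPath-style path like '$.order.id' through a JSON document."""
    doc = json.loads(content)
    for key in path.lstrip("$.").split("."):
        doc = doc[key]
    return doc

flowfile = '{"order": {"id": "A-17", "total": 99.5}}'
order_id = extract(flowfile, "$.order.id")
```

Once the value is in an attribute, Expression Language such as ${order.id} can use it downstream.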
02-25-2021
07:23 PM
Hi, I have a situation where I will be reading a CSV file with different fields and values using NiFi. I want to extract the value of a particular field (in my case, description) and create a new CSV flowfile with that value. The new flowfile should contain only the value of the description, like below: Eldon Base for stackable storage shelf, platinum Thanks -Murali
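Outside NiFi, the extraction itself is a few lines of Python with the standard csv module; the sample row below is invented apart from the description value quoted in the question. Note the result keeps CSV quoting, since the value contains a comma.

```python
import csv
import io

def extract_description(csv_text: str) -> str:
    """Read a CSV with a header row and return a one-column CSV
    holding only the 'description' value of the first record."""
    reader = csv.DictReader(io.StringIO(csv_text))
    row = next(reader)
    out = io.StringIO()
    csv.writer(out).writerow([row["description"]])
    return out.getvalue().strip()

sample = 'id,description,price\n1,"Eldon Base for stackable storage shelf, platinum",20.5\n'
result = extract_description(sample)
```

In NiFi the same effect comes from a record processor (e.g. QueryRecord with SELECT description FROM FLOWFILE) rather than a script.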
01-13-2021
04:47 AM
Did you already think about transforming your CSV to JSON and then rebuilding it with JOLT?
01-08-2021
05:23 AM
1 Kudo
@murali2425 The solution you are looking for is QueryRecord configured with a CSV record reader and record writer. You also have UpdateRecord and ConvertRecord, which can use the same readers/writers. This method is preferred over splitting the file and adds some nice functionality: it allows you to provide a schema for both the inbound CSV (reader) and the downstream CSV (writer). Using QueryRecord you should be able to split the file and set the filename attribute to column1. At the end of the flow you should be able to leverage that filename attribute to resave the new file. You can find some specific examples and configuration screenshots here: https://community.cloudera.com/t5/Community-Articles/Running-SQL-on-FlowFiles-using-QueryRecord-Processor-Apache/ta-p/246671 If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic please comment here, or feel free to private message me. If you have new questions related to your use case please create a separate topic and feel free to tag me in your post. Thanks, Steven
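QueryRecord effectively runs SQL over the records inside a flowfile. As a rough stand-in outside NiFi, the same idea can be sketched with Python's sqlite3 over parsed CSV rows; the column names and data here are invented for illustration.

```python
import csv
import io
import sqlite3

# Pretend this is the flowfile content QueryRecord would receive.
csv_text = "filename_col,value\nreport_a,1\nreport_a,2\nreport_b,5\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))

# Load the records into an in-memory table, then query them with SQL,
# much as QueryRecord queries the FLOWFILE "table".
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE flowfile (filename_col TEXT, value INTEGER)")
con.executemany("INSERT INTO flowfile VALUES (?, ?)",
                [(r["filename_col"], int(r["value"])) for r in rows])

names = [n for (n,) in con.execute(
    "SELECT DISTINCT filename_col FROM flowfile ORDER BY filename_col")]
```

Each distinct value of the chosen column would correspond to one output flowfile, whose filename attribute UpdateAttribute can then set.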
12-23-2020
09:33 AM
Hi Matt, Great, with your suggestion, I got what I was expecting. Thank You, --Murali
12-22-2020
10:31 PM
Hi, @murali2425 @vchhipa It seems to be a dependency issue while building your custom NiFi processor: the org.apache.nifi:nifi-standard-services-api-nar dependency needs to be added to the pom.xml of the nifi-*-nar module. Ref here

<dependency>
    <groupId>org.apache.nifi</groupId>
    <artifactId>nifi-standard-services-api-nar</artifactId>
    <version>1.11.3</version>
    <type>nar</type>
</dependency>

Please modify your pom.xml, rebuild, and see whether that fixes the issue. Please accept the answer you found most useful.
11-22-2020
08:24 AM
Hi All, I have a situation where I have to pull Jira ticket information using NiFi. I have been googling this but am not finding the right information. Has anyone tried this? Do we have to use any drivers to connect to Jira from NiFi? It looks like there are some DB drivers available from Jira; do we have to install those to integrate with NiFi? https://confluence.atlassian.com/doc/database-jdbc-drivers-171742.html Thanks --Murali
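Jira also exposes a REST API, which is usually easier to call from NiFi (via InvokeHTTP) than going through JDBC drivers. A hedged Python sketch of assembling such a request is below; the host, token, and JQL query are placeholders, and the request is only built here, not sent.

```python
import urllib.parse
import urllib.request

def build_search_request(base_url: str, jql: str, token: str) -> urllib.request.Request:
    """Build (but do not send) a Jira REST issue-search request."""
    query = urllib.parse.urlencode({"jql": jql})
    return urllib.request.Request(
        f"{base_url}/rest/api/2/search?{query}",
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/json"})

req = build_search_request("https://jira.example.com", "project = DEMO", "TOKEN")
```

In NiFi, InvokeHTTP would carry the same URL and headers, and the JSON response can be split and flattened with record processors.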
06-02-2020
02:38 AM
Do we have to take any kind of licence when we use Apache NiFi at the enterprise level? How securely can we use it for financial data in a client environment? What kinds of data transformation does the NiFi and Spark combination support? Does it only support streaming data, or does it also support historical big data? How do we integrate NiFi and Spark? Is two-way communication possible?
05-18-2020
01:27 AM
The issue was the missing square brackets at the start and the end. The working query is: [{ "$group": { "_id": { "X": "$X", "Y_DT": "$Y_DT", "Z": "$Z" }, "adj": {"$sum": "$adj" }, "bjc": {"$sum": "$bjc" }, "jbc": {"$sum": "$jbc" }, "mnk": {"$sum": "$mnk"} } }]
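The point about the brackets is that a MongoDB aggregation pipeline must be a JSON array of stages, even when there is only one stage. A quick Python check that the corrected query parses as such:

```python
import json

query = '''[{ "$group": { "_id": { "X": "$X", "Y_DT": "$Y_DT", "Z": "$Z" },
              "adj": {"$sum": "$adj"}, "bjc": {"$sum": "$bjc"},
              "jbc": {"$sum": "$jbc"}, "mnk": {"$sum": "$mnk"} } }]'''

pipeline = json.loads(query)           # parses only with the outer [ ]
is_stage_list = isinstance(pipeline, list)
```

Without the outer brackets the same text parses as a single object, not a stage list, which is why the processor rejected it.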
05-14-2020
10:59 AM
Hi Friends, I have a scenario where I am using the GetMongo processor to read all documents (rows) present in a MongoDB collection. GetMongo reads one document as one flowfile, so if a collection has 50 documents, each document becomes one flowfile (50 flowfiles). Now I want to convert each flowfile into CSV and put them into only one flowfile (one CSV file). How to do that? GetMongo --> ConvertRecord?
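Inside NiFi the usual answer is MergeRecord (or MergeContent) with a JSON reader and a CSV writer. The merge itself, sketched outside NiFi in Python with invented field names:

```python
import csv
import io
import json

# Stand-ins for the per-document flowfiles GetMongo emits.
flowfiles = ['{"name": "a", "qty": 1}', '{"name": "b", "qty": 2}']

docs = [json.loads(f) for f in flowfiles]
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["name", "qty"])
writer.writeheader()
writer.writerows(docs)           # all documents land in one CSV body
merged_csv = out.getvalue()
```

MergeRecord does the same thing declaratively: many JSON flowfiles in, one CSV flowfile out, with the schema supplied by the reader/writer services.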
- Tags:
- NiFi
05-11-2020
07:54 AM
Hi Friends, I have a scenario: I need to import big documents (.xlsx), convert them to CSV and then to JSON, then load them with PutMongo using NiFi. Now I want to query these two collections, like any select SQL query with a where clause. The SQL query: select t1.X, t1.Y_DT,t1.Z,t1.adj,t1.bjc,t1.jbc,t1.mnk,t2.adj1,t2.bjc1,t2.jbc1,t2.mnk1 from inpt1 t1, input2 t2 where t1.X = t2.X AND t1.Y_DT=t2.Y_DT AND t1.Z = t2.Z; A similar MongoDB query is needed. How to do this using NiFi? If there is any sample workflow, please point me to it.
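In MongoDB a multi-column SQL join like the one above maps to an aggregation with $lookup. A hedged sketch of that pipeline as a Python structure (it could be serialized to JSON for a NiFi aggregation processor; collection and field names follow the question, but the exact shape is an assumption about your data):

```python
# Join inpt1 with input2 on X, Y_DT, Z, roughly like the SQL above.
pipeline = [
    {"$lookup": {
        "from": "input2",
        "let": {"x": "$X", "y": "$Y_DT", "z": "$Z"},
        "pipeline": [
            {"$match": {"$expr": {"$and": [
                {"$eq": ["$X", "$$x"]},
                {"$eq": ["$Y_DT", "$$y"]},
                {"$eq": ["$Z", "$$z"]},
            ]}}},
        ],
        "as": "t2",
    }},
    {"$unwind": "$t2"},                       # inner-join semantics
    {"$project": {"X": 1, "Y_DT": 1, "Z": 1,
                  "adj": 1, "bjc": 1, "jbc": 1, "mnk": 1,
                  "adj1": "$t2.adj1", "bjc1": "$t2.bjc1",
                  "jbc1": "$t2.jbc1", "mnk1": "$t2.mnk1"}},
]
```

The $lookup form with let/pipeline needs MongoDB 3.6 or later; on older servers the join would have to be done on the NiFi side instead.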
- Tags:
- NiFi
05-04-2020
11:17 AM
Hi, I think your data format (.xlsx) is already the content of your flowfile, so you don't have to convert it first. You can simply add an UpdateAttribute processor and add the dynamic property "filename". With this property you overwrite the filename that is already set (you can see there is already an attribute called "filename" from the beginning). As the value you can set the name via an attribute and append the format there, e.g. ${filename:append('.xlsx')}, or just hard-code it, like file123.xlsx. After that you add a PutFile (or FTP, or whatever) processor and send the file to the path you set there.
04-30-2020
09:27 AM
Hi Friends, I have a scenario where I am reading the flowfile with a Python script from ExecuteStreamCommand; after doing some operations on the dataframe I want to send the data back as a flowfile of type xlsx. In my current scenario the Python script simply passes the repr of the xlsx writer object to ExecuteStreamCommand, as "<pandas.io.excel._xlsxwriter._XlsxWriter object at 0x000001C856D99E88>". The code snippet is:

import sys, csv
import pandas as pd

class CustomPythonScript():
    pd.set_option('display.max_rows', None)
    pd.set_option('display.max_columns', None)
    pd.set_option('display.width', None)
    pd.set_option('display.max_colwidth', -1)

    df = pd.read_csv(sys.stdin, sep='\t')
    data = df.groupby(['X_temp250', 'Y_DT_temp250', 'Z_temp250',
                       'X_temp350', 'Y_DT_temp350', 'Z_temp350']) \
        .sum(numeric_only=True) \
        .reindex(columns=set(df.columns) - {'X_temp250', 'Y_DT_temp250', 'Z_temp250',
                                            'X_temp350', 'Y_DT_temp350', 'Z_temp350'}) \
        .reset_index()
    df1 = pd.DataFrame(data)
    writerExcel = pd.ExcelWriter('report_groupby.xlsx', engine='xlsxwriter')
    df1.to_excel(writerExcel, sheet_name='report_groupby', index=False)
    writerExcel.save()
    print(writerExcel)
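The print(writerExcel) at the end is the likely culprit: it emits the repr of the writer object, not the workbook. For ExecuteStreamCommand, the script has to write the raw .xlsx bytes to stdout. A minimal standard-library sketch of that pattern follows; zipfile stands in for the xlsx engine here (an .xlsx file is a zip container), since the real script would point pandas' ExcelWriter at the buffer instead.

```python
import io
import zipfile

def build_workbook_bytes() -> bytes:
    """Build the output file in memory instead of on disk.
    With pandas this buffer would be passed to pd.ExcelWriter(buf, ...)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:        # stand-in for the xlsx writer
        zf.writestr("sheet1.txt", "grouped data here")
    return buf.getvalue()

payload = build_workbook_bytes()
# In the real script, stream the bytes (not a print of an object):
# sys.stdout.buffer.write(payload)
```

ExecuteStreamCommand then captures those stdout bytes as the new flowfile content, and an UpdateAttribute can set filename to something ending in .xlsx.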
- Tags:
- NiFi
04-28-2020
11:02 AM
It did work after adding '\t' as the sep argument to read_csv.
04-23-2020
12:20 PM
Hi Faerballert, it really did work, thank you very much.
04-20-2020
03:14 AM
Is it possible for you to send clean code, without confusing code lines and lost brackets?
04-07-2020
12:36 PM
Would anyone help point me to a custom processor project on GitHub that uses the NiFi DBCP service controller, for reference? I am struggling to build the NAR file for that combination and am getting build issues from the Maven plugin side. I am looking for any existing, successfully built custom processor (sample) project to take it from there. Great thanks.
03-29-2020
06:09 AM
@murali2425 After you get the data from QueryDatabaseTableRecord, next use UpdateAttribute and set the filename attribute accordingly — I think ${table-name}.xls, assuming you have the attribute table-name. Next you would store the contents of the flowfile (PutFile, PutHDFS, S3 bucket, etc.).
03-28-2020
09:26 AM
1 Kudo
Great, and thanks, it was very useful for my work. Here I will have data in Excel; I need to extract the column names (1st row) and create the table, with primary key and foreign key, and load the data (from the 2nd row on) into the database (MySQL) from NiFi.
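Deriving the DDL from the header row can be sketched in Python; the table name, column names, types, and key choice below are invented, since the real schema depends on your sheet (and foreign keys would need to be added by hand).

```python
def create_table_sql(table: str, columns: list[str], primary_key: str) -> str:
    """Build a MySQL CREATE TABLE statement from a header row,
    typing every column as VARCHAR for simplicity."""
    cols = ",\n  ".join(f"`{c}` VARCHAR(255)" for c in columns)
    return (f"CREATE TABLE `{table}` (\n  {cols},\n"
            f"  PRIMARY KEY (`{primary_key}`)\n)")

header = ["id", "name", "dept_id"]           # hypothetical first row of the sheet
ddl = create_table_sql("employees", header, "id")
```

In a flow, that generated statement could be sent through PutSQL (or ExecuteSQL) before the row data is inserted.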
01-25-2020
11:50 PM
Hi, this issue is resolved. It was an issue with the Kafka server, which had stopped due to a /tmp log issue.
01-23-2020
03:02 AM
Thanks Matt
01-23-2020
03:01 AM
Failed to start web server:
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'accessPolicyResource' defined in file [F:\2019\softwares\nifi-registry-0.5.0-bin\nifi-registry-0.5.0\work\jetty\nifi-registry-web-api-0.5.0.war\webapp\WEB-INF\classes\org\apache\nifi\registry\web\api\AccessPolicyResource.class]: Unsatisfied dependency expressed through constructor parameter 0;
nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'authorizerFactory' defined in URL [jar:file:/F:/2019/softwares/nifi-registry-0.5.0-bin/nifi-registry-0.5.0/work/jetty/nifi-registry-web-api-0.5.0.war/webapp/WEB-INF/lib/nifi-registry-framework-0.5.0.jar!/org/apache/nifi/registry/security/authorization/AuthorizerFactory.class]: Unsatisfied dependency expressed through constructor parameter 3;
nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'registryService' defined in URL [jar:file:/F:/2019/softwares/nifi-registry-0.5.0-bin/nifi-registry-0.5.0/work/jetty/nifi-registry-web-api-0.5.0.war/webapp/WEB-INF/lib/nifi-registry-framework-0.5.0.jar!/org/apache/nifi/registry/service/RegistryService.class]: Unsatisfied dependency expressed through constructor parameter 1;
nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'getFlowPersistenceProvider' defined in class path resource [org/apache/nifi/registry/provider/StandardProviderFactory.class]: Bean instantiation via factory method failed;
nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.nifi.registry.flow.FlowPersistenceProvider]: Factory method 'getFlowPersistenceProvider' threw exception;
nested exception is org.apache.nifi.registry.provider.ProviderCreationException: Failed to load a git repository "F:\2019\nifiRegistry"
the updated class in the providers.xml is ...
the java version is ..
C:\Users\saiteju>java -version
java version "1.8.0_231"
Java(TM) SE Runtime Environment (build 1.8.0_231-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.231-b11, mixed mode)
What's wrong with the above?
Thanks
--Murali
01-12-2020
05:44 AM
2 Kudos
@murali2425 You could build a NiFi flow that "copies" files to GitHub. This would need to be done by creating a custom processor, or maybe even just ExecuteScript running a custom Python script. Either route you take, you would need to make sure that all NiFi nodes are set up with permissions to write to the GitHub repo. Then inside your custom processor or script you would execute the required git commands to commit ("copy") the file(s).
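A hedged sketch of the ExecuteScript idea: assemble the git commands such a script would run (via subprocess) to commit a flowfile into a local clone. The repo path, file name, and branch are placeholders, and the commands are only built here, not executed.

```python
def git_commit_commands(repo: str, filename: str, message: str) -> list[list[str]]:
    """Commands a script would pass to subprocess.run, one by one."""
    return [
        ["git", "-C", repo, "add", filename],
        ["git", "-C", repo, "commit", "-m", message],
        ["git", "-C", repo, "push", "origin", "main"],
    ]

cmds = git_commit_commands("/data/repo-clone", "flowfile.json", "nifi: add flowfile")
```

In the real script each command list would go through subprocess.run(cmd, check=True), with the flowfile content written into the clone first.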
01-10-2020
05:05 AM
Ahh, I did not pay attention to that in the original screenshot; I just tried to offer the syntax for the JSON parsing. Glad you got it to work! Isn't learning NiFi fun? I love it.
01-02-2020
09:56 AM
Hi, I have data (say, a key) generated from a machine learning model. I read this key from Kafka and passed it to NiFi; based on the key I need to execute a file or script (say, Python) or a bot script (this might restart the system) from NiFi. Please provide me some input, it would be a great help.
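One common pattern is ConsumeKafka, then RouteOnAttribute on the key, feeding an ExecuteStreamCommand per route. The routing decision itself is just a lookup, sketched here in Python with made-up keys and script paths:

```python
# Map the key consumed from Kafka to the command NiFi should launch.
DISPATCH = {
    "retrain": ["python", "/opt/scripts/retrain.py"],
    "restart": ["/opt/scripts/restart_bot.sh"],
}

def command_for_key(key: str):
    """Return the command for a known key, or None for unknown keys."""
    return DISPATCH.get(key)
```

In the flow, unknown keys would go to an "unmatched" relationship instead of launching anything, which is safer when one of the scripts can restart the system.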