Member since: 01-02-2020
Posts: 40
Kudos Received: 3
Solutions: 5

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2977 | 12-23-2020 09:33 AM |
| | 961 | 05-18-2020 01:27 AM |
| | 1070 | 04-28-2020 11:02 AM |
| | 2427 | 04-23-2020 12:20 PM |
| | 1189 | 01-25-2020 11:50 PM |
03-03-2021
07:54 PM
Hi, I have configured PutEmail to send to mailtrap.io. The PutEmail configuration: (screenshot). The exception:
2021-03-04 03:08:56,560 ERROR [Timer-Driven Process Thread-9] o.a.nifi.processors.standard.PutEmail PutEmail[id=5744df4c-7f6d-3fdb-3243-eb175358dd83] PutEmail[id=5744df4c-7f6d-3fdb-3243-eb175358dd83] failed to process session due to java.lang.NoClassDefFoundError: com/sun/activation/registries/LogSupport; Processor Administratively Yielded for 1 sec: java.lang.NoClassDefFoundError: com/sun/activation/registries/LogSupport
java.lang.NoClassDefFoundError: com/sun/activation/registries/LogSupport
at javax.activation.MailcapCommandMap.<init>(MailcapCommandMap.java:179)
at javax.activation.CommandMap.getDefaultCommandMap(CommandMap.java:85)
at javax.activation.DataHandler.getCommandMap(DataHandler.java:167)
at javax.activation.DataHandler.getDataContentHandler(DataHandler.java:629)
at javax.activation.DataHandler.writeTo(DataHandler.java:329)
at javax.mail.internet.MimeUtility.getEncoding(MimeUtility.java:340)
at javax.mail.internet.MimeBodyPart.updateHeaders(MimeBodyPart.java:1575)
at javax.mail.internet.MimeMessage.updateHeaders(MimeMessage.java:2271)
at javax.mail.internet.MimeMessage.saveChanges(MimeMessage.java:2231)
at javax.mail.Transport.send(Transport.java:123)
at org.apache.nifi.processors.standard.PutEmail.send(PutEmail.java:541)
at org.apache.nifi.processors.standard.PutEmail.onTrigger(PutEmail.java:395)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1174)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.ClassNotFoundException: com.sun.activation.registries.LogSupport
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
I know the LogSupport class is missing. I tried adding javax.mail and activation-1.0.2, but neither contains the LogSupport classes. Any idea what the solution could be? Thanks --Murali
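For reference, a small diagnostic sketch (the lib directory path is a placeholder, and this is only a way to check jars, not a confirmed fix): activation "API-only" jars typically do not ship the com.sun.activation.registries classes, so it can help to check which jar, if any, actually contains the missing class.

import java.io.File;
import java.util.jar.JarFile;

public class FindLogSupport {
    public static void main(String[] args) throws Exception {
        // Placeholder path: point this at the directory where the extra jars were dropped
        File libDir = new File("/opt/nifi/lib");
        File[] jars = libDir.listFiles((dir, name) -> name.endsWith(".jar"));
        if (jars == null) {
            return;
        }
        for (File jar : jars) {
            try (JarFile jarFile = new JarFile(jar)) {
                // Look for the exact class the NoClassDefFoundError complains about
                if (jarFile.getEntry("com/sun/activation/registries/LogSupport.class") != null) {
                    System.out.println("LogSupport found in " + jar.getName());
                }
            }
        }
    }
}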
Labels:
- Apache NiFi
02-28-2021
07:16 PM
Hi, I have a JSON flowfile: [{"prediction":"Test2"},{"prediction":"Test2"}]. I want to extract the first element of the array, i.e. "prediction":"Test2", into a flowfile attribute using EvaluateJsonPath (destination is flowfile-attribute). I added a custom property mlresult -> $.prediction[0] to extract "Test2" into mlresult, but I am getting an empty string for mlresult in the flowfile attributes. Flowfile: [{"prediction":"Test2"},{"prediction":"Test2"}]. EvaluateJsonPath configuration: (screenshot). How do I get the single value Test2 into mlresult? Thanks --Murali
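For reference, a minimal sketch using the Jayway JsonPath library (which EvaluateJsonPath is built on) against the sample content; because the root of the flowfile is a JSON array, the index has to come before the property name.

import com.jayway.jsonpath.JsonPath;

public class JsonPathCheck {
    public static void main(String[] args) {
        String json = "[{\"prediction\":\"Test2\"},{\"prediction\":\"Test2\"}]";
        // "$[0].prediction" selects the "prediction" value of the first array element
        String first = JsonPath.read(json, "$[0].prediction");
        System.out.println(first); // Test2
        // "$.prediction[0]" does not resolve here, because the array root has no
        // top-level "prediction" property, which is why mlresult comes back empty.
    }
}

So setting the mlresult property to $[0].prediction (with Destination = flowfile-attribute) should populate the attribute.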
Labels:
- Apache NiFi
01-12-2021
07:09 PM
Hi, I have a scenario where I read a CSV file, extract column value(s) from it, and assign them to custom attribute(s). From those custom attributes I then have to form a JSON file. I have looked at the ExtractText processor; it needs a regular expression to get the text, which I don't quite understand. Is there a better or easier way to meet this requirement?
Sample CSV file:
ID | Description | Status | project
8075 | John | Done | xyz100
Extract ID and project and create custom attributes like projectID=ID and modelProject=project. Thanks --Murali
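For reference, a minimal sketch of the kind of Java regular expression ExtractText would use for this (the row below is a hypothetical comma-separated version of the sample; adjust the delimiter if the file is tab-separated):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CsvFieldRegex {
    public static void main(String[] args) {
        // Hypothetical data row matching the sample layout: ID, Description, Status, project
        String row = "8075,John,Done,xyz100";
        // Capture group 1 = ID (first field), group 2 = project (last field)
        Pattern pattern = Pattern.compile("^([^,]+),[^,]*,[^,]*,([^,]+)$");
        Matcher matcher = pattern.matcher(row);
        if (matcher.matches()) {
            String projectID = matcher.group(1);    // would become the projectID attribute
            String modelProject = matcher.group(2); // would become the modelProject attribute
            System.out.println(projectID + " / " + modelProject);
        }
    }
}

In ExtractText, a dynamic property with a pattern like this writes the capture groups into attributes (e.g. projectID.1), and AttributesToJSON can then turn those attributes into the JSON file.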
Labels:
- Apache NiFi
01-07-2021
09:29 AM
Hi All,
I have a scenario where I get a number of records from a CSV file. The task is to read the CSV, split it so that each record becomes one file, and name each file (and its sheet) after the value in the first column of that record (excluding the first row, which is the header).
1. What I have done is read the CSV file using GetFile,
2. then used the SplitText processor to split each record into its own CSV file by setting the Header Line Count property to 1.
3. Then I need to extract the first column of the record (2nd row, 1st column) and use that value as the file name and sheet name for each individual file.
Original csv file: (screenshot)
After the split there should be two files, one named ab123.csv and the other c35ks.csv, and the sheet names should also be changed accordingly.
ab123.csv:
ID | Description | status
ab123 | Eldon Base for stackable storage shelf, platinum |

c35ks.csv:
ID | Description | status
c35ks | 1.7 Cubic Foot Compact "Cube" Office Refrigerators |
How do I get the above outputs from the workflow?
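For reference, a minimal sketch of the file-naming logic each split file needs (the content string is hypothetical; in the flow this is typically done with ExtractText to capture the first column and UpdateAttribute to set the filename attribute):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class SplitFileName {
    public static void main(String[] args) throws IOException {
        // Hypothetical content of one split flowfile: the header plus a single record
        String split = "ID,Description,status\nab123,Eldon Base for stackable storage shelf platinum,ok";
        BufferedReader reader = new BufferedReader(new StringReader(split));
        reader.readLine();                                 // skip the header line
        String record = reader.readLine();                 // the single data record
        String filename = record.split(",")[0] + ".csv";   // first column value + .csv
        System.out.println(filename);                      // ab123.csv
    }
}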
Labels:
- Apache NiFi
12-23-2020
09:33 AM
Hi Matt, Great, with your suggestion, I got what I was expecting. Thank You, --Murali
12-23-2020
05:29 AM
I have a scenario where a list of files comes from the previous processor, and for each file I have to create a JSON file from the flowfile's attributes. In the AttributesToJSON processor configuration there is an option to extract flowfile attributes into a JSON object; if Include Core Attributes is set to true, it reads some of the file properties and forms the JSON. The output for the above case in my scenario is:
{"fragment.size":"125"
file.group:"root",
file.lastModifiedTime:"2020-12-22T15:09:13+0000",
fragment.identifier:"ee5770ea-8406-400a-a2fd-2362bd706fe0",
fragment.index:"1",
file.creationTime:"2020-12-22T15:09:13+0000",
file.lastAccessTime:"2020-12-22T17:34:22+0000",
segment.original.filename:"Sample-Spreadsheet-10000-rows.csv",
file.owner:"root",
fragment.count:"2",
file.permissions:"rw-r--r--",
text.line.count:"1"}
}
But the flowfile has other attributes, such as absolute.path, filename, and uuid, which are missing from the JSON above.
My requirement is to get absolute.path, filename, and uuid, concatenate absolute.path + "/" + filename into a custom attribute, say filepath, and also add uuid to the JSON object, so the JSON file should look like { "uuid": "file uuid value", "filepath": "absolute.path/filename" }. Any inputs on how to get this form of JSON?
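For reference, one way this is usually wired in the flow is an UpdateAttribute property such as filepath = ${absolute.path}/${filename}, followed by AttributesToJSON with its Attributes List set to uuid,filepath. A minimal sketch of the resulting concatenation (the attribute values below are hypothetical):

import java.util.LinkedHashMap;
import java.util.Map;

public class FilepathJson {
    public static void main(String[] args) {
        // Hypothetical flowfile attributes as they would arrive from the previous processor
        Map<String, String> attrs = new LinkedHashMap<>();
        attrs.put("uuid", "ee5770ea-8406-400a-a2fd-2362bd706fe0");
        attrs.put("absolute.path", "/data/in");
        attrs.put("filename", "Sample-Spreadsheet-10000-rows.csv");

        // The concatenation the question asks for: filepath = absolute.path + "/" + filename
        String filepath = attrs.get("absolute.path") + "/" + attrs.get("filename");
        String json = "{\"uuid\":\"" + attrs.get("uuid") + "\",\"filepath\":\"" + filepath + "\"}";
        System.out.println(json);
    }
}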
Labels:
- Apache NiFi
05-18-2020
01:27 AM
The issue was the missing square brackets at the start and end. The working query is:
[{ "$group": { "_id": { "X": "$X", "Y_DT": "$Y_DT", "Z": "$Z" }, "adj": {"$sum": "$adj"}, "bjc": {"$sum": "$bjc"}, "jbc": {"$sum": "$jbc"}, "mnk": {"$sum": "$mnk"} } }]
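For reference, a minimal sketch of why the brackets matter (an interpretation based on the error text, not confirmed from the processor source): the processor expects a list of pipeline stages, and Jackson can only bind a JSON array, not a bare object, to a list.

import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class PipelineShapeCheck {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // With the surrounding brackets, the pipeline parses as a list of stages.
        List<?> stages = mapper.readValue("[{\"$group\":{}}]", List.class);
        System.out.println(stages.size()); // 1
        // Without the brackets, "{\"$group\":{}}" is a single object, and binding it to a
        // List fails with "Cannot deserialize instance of java.util.ArrayList out of
        // START_OBJECT token", the same error RunMongoAggregation reported.
    }
}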
05-14-2020
09:44 AM
Hi friends, please help me get out of this situation.
05-13-2020
12:29 PM
Hi friends, I have a Mongo query which runs perfectly fine from the mongo shell:
db.test650.aggregate( [ { $group: { "_id": { X: "$X", Y_DT: "$Y_DT", Z: "$Z" }, adj: {$sum: "$adj"}, bjc: {$sum: "$bjc"}, jbc: {$sum: "$jbc"}, mnk: {$sum: "$mnk"} } } ] )
When I run the same query from NiFi, the RunMongoAggregation processor throws an error, even though I changed the aggregation to a JSON-style query:
{ "$group": { "_id": { "X": "$X", "Y_DT": "$Y_DT", "Z": "$Z" }, "adj": {"$sum": "$adj"}, "bjc": {"$sum": "$bjc"}, "jbc": {"$sum": "$jbc"}, "mnk": {"$sum": "$mnk"} } }
I am getting the following error:
Error running MongoDB aggregation query: com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of java.util.ArrayList out of START_OBJECT token
NiFi workflow: (screenshot). RunMongoAggregation processor configuration: (screenshot).
What change do I need to make to the JSON query that is executed by the RunMongoAggregation processor?
Tags:
- NiFi
Labels:
- Apache NiFi
05-03-2020
05:25 AM
Hi friends, I have a scenario where I need to write a flowfile out to disk as an .xlsx file. Is that possible with NiFi? Thanks
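For reference, a minimal Apache POI sketch (hypothetical file and sheet names, tab-delimited input assumed) of writing rows out as an .xlsx; in NiFi this kind of code would typically live in a scripted or custom processor, or in an external script called via ExecuteStreamCommand.

import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class CsvToXlsxSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical flowfile content: two tab-delimited lines
        String[] lines = { "ID\tDescription", "ab123\tEldon Base" };

        try (XSSFWorkbook workbook = new XSSFWorkbook();
             FileOutputStream out = new FileOutputStream("table-name.xlsx")) {
            Sheet sheet = workbook.createSheet("data");
            for (int r = 0; r < lines.length; r++) {
                Row row = sheet.createRow(r);
                String[] cells = lines[r].split("\t", -1);
                for (int c = 0; c < cells.length; c++) {
                    row.createCell(c).setCellValue(cells[c]);
                }
            }
            workbook.write(out); // the .xlsx file lands on disk
        }
    }
}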
Labels:
- Apache NiFi
04-28-2020
11:02 AM
It worked after passing '\t' as the second argument (the separator) to read_csv.
04-28-2020
07:47 AM
I have a scenario where I get a file (a NiFi flowfile stream) of type CSV, create a dataframe from it, and dump it. But after creating the dataframe, the structure of the file is disturbed: if I open the same flowfile on disk I can see a clear structure with columns separated by tabs, but in the Python dataframe I do not get the same structure. If I could get the same structure I could perform row manipulation. Here is what I am doing:
1. Using the ExecuteSQL processor, I get a database record,
2. then pass this record to a ConvertRecord processor to convert the Avro record into a tab-separated CSV file.
ConvertRecord record set writer settings: (screenshot)
The output of the flowfile is: (screenshot). 3. Then I read the flowfile (from step 2) into a Python dataframe using ExecuteStreamCommand, because I want to perform some actions on the database records, and at this point the record structure gets changed in the dataframe.
Labels:
- Apache NiFi
04-23-2020
12:20 PM
Hi Faerballert, it really did work, thank you very much.
04-22-2020
10:49 AM
I have a situation where I have to get combined data from two tables in a MySQL database. The ExecuteSQL processor throws the error below, whereas the same query works fine from MySQL Workbench.
2020-04-22 14:55:06,597 ERROR [Timer-Driven Process Thread-8] o.a.nifi.processors.standard.ExecuteSQL ExecuteSQL[id=9d81c7ae-0171-1000-f638-2f1e60bfe48e] Unable to execute SQL select query select t1.CODE_CLOCK_ID, t1.COMPANY_CD, t1.GROUP_SEGMENT_L1, t1.GROUP_SEGMENT_CD, t2.CODE_CLOCK_ID, t2.COMPANY_CD, t2.GROUP_SEGMENT_L1, t2.GROUP_SEGMENT_CD from sheet26 t1 INNER JOIN sheet27 t2 on t1.CODE_CLOCK_ID = t2.CODE_CLOCK_ID and t1.COMPANY_CD = t2.COMPANY_CD LIMIT 0, 1000; due to org.apache.nifi.processor.exception.ProcessException: org.apache.avro.AvroRuntimeException: Duplicate field CODE_CLOCK_ID in record any.data.sheet26: CODE_CLOCK_ID type:UNION pos:4 and CODE_CLOCK_ID type:UNION pos:0.. No FlowFile to route to failure: org.apache.nifi.processor.exception.ProcessException: org.apache.avro.AvroRuntimeException: Duplicate field CODE_CLOCK_ID in record any.data.sheet26: CODE_CLOCK_ID type:UNION pos:4 and CODE_CLOCK_ID type:UNION pos:0.
org.apache.nifi.processor.exception.ProcessException: org.apache.avro.AvroRuntimeException: Duplicate field CODE_CLOCK_ID in record any.data.sheet26: CODE_CLOCK_ID type:UNION pos:4 and CODE_CLOCK_ID type:UNION pos:0.
at org.apache.nifi.processors.standard.AbstractExecuteSQL.lambda$onTrigger$1(AbstractExecuteSQL.java:301)
at org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2746)
at org.apache.nifi.processors.standard.AbstractExecuteSQL.onTrigger(AbstractExecuteSQL.java:297)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1176)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:830)
Caused by: org.apache.avro.AvroRuntimeException: Duplicate field CODE_CLOCK_ID in record any.data.sheet26: CODE_CLOCK_ID type:UNION pos:4 and CODE_CLOCK_ID type:UNION pos:0.
at org.apache.avro.Schema$RecordSchema.setFields(Schema.java:651)
at org.apache.avro.SchemaBuilder$FieldAssembler.endRecord(SchemaBuilder.java:2013)
at org.apache.nifi.util.db.JdbcCommon.createSchema(JdbcCommon.java:636)
at org.apache.nifi.util.db.JdbcCommon.convertToAvroStream(JdbcCommon.java:239)
at org.apache.nifi.processors.standard.sql.DefaultAvroSqlWriter.writeResultSet(DefaultAvroSqlWriter.java:49)
at org.apache.nifi.processors.standard.AbstractExecuteSQL.lambda$onTrigger$1(AbstractExecuteSQL.java:299)
... 13 common frames omitted
What could the issue be, and what do I have to change in the query to get the record set from the database?
I am using NiFi 1.11.4.
This issue happens even when excluding INNER, or when replacing the INNER JOIN with a WHERE clause in the select query.
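For reference, a hedged sketch (not a confirmed fix): the Avro writer behind ExecuteSQL builds one field per result column, so two columns both named CODE_CLOCK_ID collide. Aliasing each selected column so the result set has unique names would avoid the duplicate-field error. The rewritten query below is only illustrative.

public class AliasedJoinQuery {
    // Each column gets a unique alias so the generated Avro schema has unique field names
    public static final String QUERY =
          "SELECT t1.CODE_CLOCK_ID AS t1_CODE_CLOCK_ID, t1.COMPANY_CD AS t1_COMPANY_CD, "
        + "       t1.GROUP_SEGMENT_L1 AS t1_GROUP_SEGMENT_L1, t1.GROUP_SEGMENT_CD AS t1_GROUP_SEGMENT_CD, "
        + "       t2.CODE_CLOCK_ID AS t2_CODE_CLOCK_ID, t2.COMPANY_CD AS t2_COMPANY_CD, "
        + "       t2.GROUP_SEGMENT_L1 AS t2_GROUP_SEGMENT_L1, t2.GROUP_SEGMENT_CD AS t2_GROUP_SEGMENT_CD "
        + "FROM sheet26 t1 "
        + "INNER JOIN sheet27 t2 ON t1.CODE_CLOCK_ID = t2.CODE_CLOCK_ID AND t1.COMPANY_CD = t2.COMPANY_CD "
        + "LIMIT 0, 1000";
}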
Labels:
- Apache NiFi
04-15-2020
11:25 AM
I am trying to send the original flowfile as input to the next processor, but I end up getting an error. Please help me out of this situation.

public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
    FlowFile flowfile = session.get();
    if (flowfile == null) {
        return;
    }
    final ArrayList<String> headData = new ArrayList<String>();
    try {
        session.read(flowfile, new InputStreamCallback() {
            final DBCPService dbcpService = context.getProperty(CONNECTION_POOL).asControllerService(DBCPService.class);
            String query = "CREATE TABLE MODEL (";

            @SuppressWarnings("deprecation")
            public void process(InputStream inputStream) throws IOException {
                try {
                    OPCPackage pkg = OPCPackage.open(inputStream);
                    XSSFWorkbook workbook = new XSSFWorkbook(pkg);
                    workbook.getAllNames();
                    String dateHeader = "date";
                    XSSFSheet sheet = workbook.getSheetAt(0);   // first sheet
                    Row row = sheet.getRow(0);                  // header row
                    for (Cell cell : row) {
                        switch (cell.getCellType()) {
                            case NUMERIC:
                                if (DateUtil.isCellDateFormatted(cell)) {
                                    DataFormatter dataFormatter = new DataFormatter();
                                    headData.add(dataFormatter.formatCellValue(cell));
                                    query += dataFormatter.formatCellValue(cell) + " " + "INT,";
                                } else {
                                    headData.add(String.valueOf(cell.getNumericCellValue()));
                                }
                                break;
                            case STRING:
                                headData.add(cell.getStringCellValue());
                                if (cell.getStringCellValue().toLowerCase().contains(dateHeader)) {
                                    query += cell.getStringCellValue() + " " + "TIMESTAMP,";
                                } else {
                                    query += cell.getStringCellValue() + " " + "VARCHAR(50),";
                                }
                                break;
                            case BOOLEAN:
                                headData.add(String.valueOf(cell.getBooleanCellValue()));
                                break;
                            default:
                                headData.add("");
                                break;
                        }
                    }
                    query = query.substring(0, query.length() - 1); // drop the trailing comma
                    query += ")";
                    workbook.close();

                    final Connection con = dbcpService.getConnection();
                    try {
                        PreparedStatement statement = con.prepareStatement(query);
                        statement.execute();
                        con.commit();
                        session.transfer(flowfile, REL_SUCCESS);
                    } catch (SQLException e) {
                        e.printStackTrace();
                        session.transfer(flowfile, REL_FAILURE);
                    }
                } catch (InvalidFormatException ife) {
                    getLogger().error("Only .xlsx excel files are supported", ife);
                    throw new UnsupportedOperationException("Only .xlsx OOXML files are supported", ife);
                }
            }
        });
    } catch (RuntimeException ex) {
        getLogger().error("Failed to process incoming Excel document. " + ex.getMessage(), ex);
        FlowFile failedFlowFile = session.putAttribute(flowfile, "error.message", ex.getMessage()); // illustrative attribute key
    }

    final StringBuilder stringBuilder = new StringBuilder();
    flowfile = session.write(flowfile, new StreamCallback() {
        public void process(InputStream in, OutputStream out) throws IOException {
            stringBuilder.append(IOUtils.copy(in, out));
        }
    });
}

The problematic code:

final StringBuilder stringBuilder = new StringBuilder();
flowfile = session.write(flowfile, new StreamCallback() {
    public void process(InputStream in, OutputStream out) throws IOException {
        stringBuilder.append(IOUtils.copy(in, out));
    }
});

If I don't add the OutputStream write, I get the exception "transfer relationship not specified".
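For reference, a minimal sketch of how the original flowfile can be passed on unchanged (assuming REL_SUCCESS and REL_FAILURE are this processor's relationships, as in the code above): the content does not need to be rewritten with session.write(); the flowfile only has to be transferred (or removed) on every path before onTrigger returns, otherwise NiFi reports "transfer relationship not specified".

FlowFile flowFile = session.get();
if (flowFile == null) {
    return;
}
try {
    // Read the content without modifying it (e.g. open the workbook and build the DDL here)
    session.read(flowFile, in -> {
        // inspect the InputStream as needed
    });
    // Pass the original, unmodified flowfile to the next processor
    session.transfer(flowFile, REL_SUCCESS);
} catch (RuntimeException e) {
    getLogger().error("Failed to process incoming Excel document", e);
    session.transfer(flowFile, REL_FAILURE);
}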
Labels:
- Apache NiFi
04-07-2020
06:52 AM
I am using the latest NiFi version: [INFO] Generating documentation for NiFi extensions in the NAR... [INFO] Found a dependency on version 1.11.4 of NiFi API
04-06-2020
10:48 AM
Hi, I am trying to build a custom NiFi processor with a controller service. I have configured the project to build NAR files, but the build fails with the following error, which looks to come from the NiFi libraries. Please let me know if there is any solution or workaround for this.
[ERROR] Could not generate extensions' documentation
org.apache.maven.plugin.MojoExecutionException: Failed to create Extension Documentation
at org.apache.nifi.NarMojo.generateDocumentation (NarMojo.java:596)
at org.apache.nifi.NarMojo.execute (NarMojo.java:499)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:567)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: java.lang.NullPointerException
at org.apache.nifi.NarMojo.getRequiredServiceDefinitions (NarMojo.java:708)
at org.apache.nifi.NarMojo.writeDocumentation (NarMojo.java:634)
at org.apache.nifi.NarMojo.writeDocumentation (NarMojo.java:605)
at org.apache.nifi.NarMojo.generateDocumentation (NarMojo.java:577)
at org.apache.nifi.NarMojo.execute (NarMojo.java:499)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:567)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for custom-processor 1.0-SNAPSHOT:
[INFO]
[INFO] custom-processor ................................... SUCCESS [ 6.729 s]
[INFO] nifi-sample-api .................................... SUCCESS [ 4.566 s]
[INFO] nifi-sample ........................................ SUCCESS [ 12.118 s]
[INFO] nifi-custom-processors ............................. SUCCESS [ 10.350 s]
[INFO] nifi-custom-nar .................................... FAILURE [ 4.535 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 44.067 s
[INFO] Finished at: 2020-04-06T22:23:28+05:30
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.nifi:nifi-nar-maven-plugin:1.3.1:nar (default-nar) on project nifi-custom-nar: Failed to create Extension Documentation: NullPointerException -> [Help 1]
Labels:
- Apache NiFi
03-28-2020
09:26 AM
1 Kudo
Great, and thanks, it was very useful for my work. Here I will have data in Excel; I need to extract the column names (1st row), create the table with primary key and foreign key, and load the data (2nd row) into the database (MySQL) from NiFi.
03-28-2020
09:20 AM
I have a configuration where QueryDatabaseTableRecord queries a table from MySQL; I have used CSVRecordSetWriter as the Record Writer with CSV Format set to Tab-Delimited. I want to store this flowfile under the table name with the extension .xlsx; any help is really appreciated.
Labels:
- Apache NiFi
03-27-2020
12:22 AM
Hi,
I have set up NiFi and MySQL on a standalone machine. My scenario is that the input will be in Excel form and I need to create a table in the MySQL DB.
I have seen processors that can run UPDATE, INSERT, or DELETE SQL; is CREATE possible as well?
Also, do I need to write a custom processor to extract the column names from the Excel file?
Labels:
- Apache NiFi
01-25-2020
11:50 PM
Hi, this issue is resolved; it was an issue with the Kafka server, which had stopped due to a /tmp log issue.
01-25-2020
10:59 PM
Hi,
I have a scenario where I get the flowfile from NLPProcessor as a JSON object and then pass it to PublishKafka, but PublishKafka throws the following TimeoutException. What is wrong in the configuration?
PublishKafka configuration: (screenshot)
The issue looks to be configuration related; any help is really appreciated.
Thank you,
--Murali
Labels:
- Apache Kafka
01-23-2020
03:02 AM
Thanks Matt
01-21-2020
09:54 PM
The Apache NiFi Registry installation guide does not mention support for Windows. Has anyone tried installing it on Windows?
Labels:
- Apache NiFi
01-12-2020
04:48 AM
Hi Charan, thanks for the information provided. What I am looking for is something like GetFile and PutFile to GitHub, not really upload/download or push/pull; I am looking for something more like copy and paste. Thanks --Murali
01-11-2020
09:27 AM
Hi,
I have a scenario where I have to copy files from a local directory to GitHub. Is this possible?
If any reference templates are available, please share.
Thanks
--Murali
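For reference, one possible approach (an assumption, not a NiFi template): GitHub's Contents API accepts a PUT with the file content base64-encoded, so either InvokeHTTP or a small script/custom processor can push a local file into a repository. A minimal sketch with hypothetical repo, token, and path values:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class GitHubUploadSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical values: replace with a real repo, personal access token, and file path
        String repo = "myorg/myrepo";
        String token = "ghp_xxx";
        Path local = Path.of("/data/out/report.csv");

        // The Contents API expects the file content base64-encoded inside a JSON body
        String content = Base64.getEncoder().encodeToString(Files.readAllBytes(local));
        String body = "{\"message\":\"upload from NiFi\",\"content\":\"" + content + "\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.github.com/repos/" + repo + "/contents/" + local.getFileName()))
                .header("Authorization", "token " + token)
                .header("Accept", "application/vnd.github+json")
                .PUT(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}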
Labels:
- Apache NiFi
01-10-2020
03:17 AM
2 Kudos
The issue was that I was not using the right value for the Destination property; the value is supposed to be flowfile-attribute in my case. Thanks, Steven, for the suggestions.
01-09-2020
07:12 PM
Hi Steven, many thanks. I tried the workaround but it failed; the problem is that the attribute is not coming out of EvaluateJsonPath, and that is the attribute I am supposed to use in RouteOnAttribute for the comparison. The following is the output of the attributes from the EvaluateJsonPath processor. If I could get the attribute HR.finance.user out of EvaluateJsonPath, I could then use it in RouteOnAttribute to compare its value.
01-09-2020
11:43 AM
Hi Steven, I am doing a PoC (learning NiFi as I go). I am getting the data successfully from EvaluateJsonPath, and it goes to the matched relationship with only the extracted data, but the rest of the flowfile is missing when it reaches RouteOnAttribute. I can see only the extracted data, while the full flowfile content is missing; because of this my condition is not validated and it goes to unmatched.
01-09-2020
11:17 AM
ohh my BAD, thank you Steven.