Member since: 10-19-2016
Posts: 12
Kudos Received: 1
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 268 | 01-13-2017 02:41 PM
07-03-2018
06:27 AM
There are around 15 tables in my sentry database. I am not able to make sense of a few of the tables, and I also couldn't find proper documentation. Tables in the sentry database: AUTHZ_PATH, AUTHZ_PATHS_MAPPING, AUTHZ_PATHS_SNAPSHOT_ID, SENTRY_GM_PRIVILEGE, SENTRY_PATH_CHANGE, SENTRY_PERM_CHANGE, SEQUENCE_TABLE. It would be great if I could find some documentation or experienced opinions 🙂
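For context, this is roughly how I am poking at the tables, a minimal sketch assuming the Sentry backend sits on MySQL (the host, credentials and database name are placeholders, not my real values):

```python
# Minimal sketch (assumes the Sentry backend DB is MySQL; connection details are
# placeholders): list every table in the sentry database and its columns, which
# is how I am trying to work out what the less obvious tables hold.
import pymysql

conn = pymysql.connect(host="db-host", user="sentry", password="***", database="sentry")
with conn.cursor() as cur:
    cur.execute("SHOW TABLES")
    for (table,) in cur.fetchall():
        cur.execute(f"DESCRIBE `{table}`")
        print(table)
        for column in cur.fetchall():
            print("   ", column[0], column[1])   # column name and type
conn.close()
```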
01-13-2017
02:41 PM
No, there are no related logs in the nifi-app.log file. The issue is solved now, thanks for replying. The issue was with authorizations.xml, which had policies for the old instance. I cleaned that up and it's working fine now.
01-13-2017
09:38 AM
I am using the NiFi REST API to upload a template, create an instance of it, modify the processors, and run it. I am able to do this over HTTP. Now I have secured my NiFi with HTTPS and user credentials, and I am trying to achieve the same (upload, create instance, etc.) through the REST API. I need to change the policies on the uploaded template before creating the instance every time, and then again on the template instance in order to modify it. Is this the correct behaviour of NiFi, or am I missing something? Also, after creating the instance of the template it gives "500 Internal Server Error" and I cannot get the id of the instance from the API.
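Roughly, the secured calls I am making look like the sketch below (a minimal Python/requests sketch against the NiFi 1.x REST API; the base URL, ids, credentials and certificate path are placeholders, not my real values):

```python
# Minimal sketch: obtain a JWT from a NiFi instance secured with username/password,
# then instantiate an already-uploaded template. All names below are placeholders.
import requests

NIFI = "https://nifi-host:9443/nifi-api"   # placeholder base URL
PG_ID = "root-process-group-id"            # placeholder process group id
TEMPLATE_ID = "uploaded-template-id"       # placeholder template id
CA_CERT = "/path/to/nifi-ca.pem"           # placeholder CA cert for the HTTPS endpoint

# 1. Exchange credentials for an access token (form-encoded, returned as plain text).
token = requests.post(
    f"{NIFI}/access/token",
    data={"username": "myuser", "password": "mypassword"},
    verify=CA_CERT,
).text

headers = {"Authorization": f"Bearer {token}"}

# 2. Instantiate the template inside the process group.
resp = requests.post(
    f"{NIFI}/process-groups/{PG_ID}/template-instance",
    json={"templateId": TEMPLATE_ID, "originX": 0.0, "originY": 0.0},
    headers=headers,
    verify=CA_CERT,
)
resp.raise_for_status()
# The response body (the "flow" snippet) is where the ids of the newly created
# components should come back; this is the call that returns the 500 for me.
print(resp.json())
```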
11-23-2016
12:25 PM
I want to dump my data in an Oracle database to HDFS in Avro format (later I am using a CREATE EXTERNAL TABLE command to create a table in Hive over it). I used ExecuteSQL -> PutHDFS processors and the data was successfully dumped onto HDFS in Avro (default) format. But when I check the schema in the Avro files, it shows the datatypes of all columns as string. This will create a problem when I create the external Hive table over this dump. I want to know if this is a bug or if I am doing something wrong.
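For what it's worth, this is how I am checking the embedded schema, a minimal sketch using the fastavro library on a file copied back from HDFS (the file name is a placeholder):

```python
# Minimal sketch: print the schema embedded in an Avro file written by the flow,
# to confirm which column types ExecuteSQL actually emitted.
# "part-00000.avro" is a placeholder for a file pulled back from HDFS.
from fastavro import reader

with open("part-00000.avro", "rb") as fo:
    avro_reader = reader(fo)
    # writer_schema is the schema the producer (ExecuteSQL) wrote into the file.
    print(avro_reader.writer_schema)
    # Peek at the first record as well.
    print(next(avro_reader))
```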
11-19-2016
09:22 AM
@Matt Burgess Yes, using the "success" relationship I would only know whether the current (single) flowfile has been written successfully onto HDFS. How would I know that all my files have finished processing exactly once?
11-18-2016
07:48 AM
I am trying to copy files from my local machine to a remote HDFS. I am using GetFile -> PutHDFS processors. My exact use case is:
- I want to know as soon as the copy is done (currently I am using the REST API to track bytes transferred to know this; a rough sketch of that polling is below)
- Copy just once
- Keep the source files
Problems I am getting:
- If I configure it to keep the source files and set the scheduling time to 0 secs, the GetFile processor creates flowfiles again and again for the same files
- I don't think I should set the scheduling time to a large value, as each task processes only one file and then waits for the next schedule
Please help. Open to trying other approaches, thanks.
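The status polling I mentioned above looks roughly like this minimal sketch (assuming the NiFi 1.x /flow/process-groups/{id}/status endpoint; base URL and process group id are placeholders):

```python
# Rough sketch of the status polling: read the aggregate stats for a process group
# over the NiFi REST API and watch the flowfile/byte counters.
import time
import requests

NIFI = "http://nifi-host:8080/nifi-api"   # placeholder base URL
PG_ID = "my-process-group-id"             # placeholder process group id

while True:
    status = requests.get(f"{NIFI}/flow/process-groups/{PG_ID}/status").json()
    snapshot = status["processGroupStatus"]["aggregateSnapshot"]
    # flowFilesQueued dropping to 0 (and staying there) is what I treat as "copy done".
    print(snapshot["flowFilesQueued"], snapshot["bytesTransferred"])
    if snapshot["flowFilesQueued"] == 0:
        break
    time.sleep(5)
```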
11-18-2016
07:27 AM
Can you update with the logs from the log file? It will show the actual error trace.
10-20-2016
11:13 AM
Thank you for the information. So, if I set "Run Schedule" to the default, i.e. 0 sec, it will run tasks one after another again and again. And if I want it to execute only once, I would set it to some huge value, or use an event- or CRON-driven scheduling strategy.
10-19-2016
01:50 PM
I am using the PutHiveQL processor. How can I make sure that the content of my flow file is a HiveQL statement? Also, if I use the QueryDatabaseTable -> PutHiveQL flow for my use case, would it work? And what processor should I use to create flow files with HiveQL? I thought that the PutHiveQL processor takes flow files with records as input and converts them into HiveQL statements.
10-19-2016
01:41 PM
I did not change anything on the Scheduling tab, as I wanted it to run asap. The problem is that the processor is running continuously. So if I have 10 records in the source table, I see the count on HDFS keep increasing (it dumps these 10 again and again). I expect it to stop after moving just these initial 10 records onto HDFS.
10-19-2016
10:33 AM
1 Kudo
I am trying to move a table from Oracle to Hive. After referring to some use cases here, I used the QueryDatabaseTable -> ConvertAvroToORC -> PutHDFS -> ReplaceText -> PutHiveQL processors for my use case, but the flow is failing at the PutHiveQL processor. I am getting the following exception:
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: ParseException line 1:0 cannot recognize input near 'ORC' 'P' '"\nNIFI1NIFI3P\00NIFI1NIFI2NIFI3\00\n\00\n\n\n\00Asia/Kolkata\n\nP\00\n"'
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:315) ~[hive-service-1.2.1.jar:1.2.1]
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:112) ~[hive-service-1.2.1.jar:1.2.1]
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:181) ~[hive-service-1.2.1.jar:1.2.1]
at org.apache.hive.service.cli.operation.Operation.run(Operation.java:257) ~[hive-service-1.2.1.jar:1.2.1]
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:419) ~[hive-service-1.2.1.jar:1.2.1]
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:406) ~[hive-service-1.2.1.jar:1.2.1]
PS: The table in Oracle has just one column (varchar(100)) with only 3 records, values NIFI1, NIFI2, NIFI3.
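From the ParseException it looks like the raw ORC bytes are reaching PutHiveQL as the statement. For reference, a minimal sketch of the kind of single HiveQL statement I understand the ReplaceText step should be emitting instead (the table name and paths are placeholders, not my actual values):

```python
# Minimal sketch (placeholders, not my real values): the flow file content that
# reaches PutHiveQL should be a single HiveQL statement, e.g. a LOAD DATA built
# from the path PutHDFS wrote to, rather than the ORC bytes themselves.
hdfs_dir = "/user/nifi/orc_dump"      # placeholder: directory configured in PutHDFS
filename = "table_dump.orc"           # placeholder: flow file name
target_table = "my_hive_table"        # placeholder: Hive table stored as ORC

statement = f"LOAD DATA INPATH '{hdfs_dir}/{filename}' INTO TABLE {target_table}"
print(statement)
# -> LOAD DATA INPATH '/user/nifi/orc_dump/table_dump.orc' INTO TABLE my_hive_table
```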
10-19-2016
10:10 AM
I am trying to move a table from Oracle to HDFS. I used the QueryDatabaseTable -> PutHDFS processors and configured them. I can see data coming into HDFS, but the process is running continuously and records are being added again and again. Am I doing anything wrong or missing something?