Member since: 09-22-2016
Posts: 25
Kudos Received: 3
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2148 | 04-28-2017 07:39 PM
06-05-2017 05:45 PM

@vperiasamy I added this after my post:

nifi.security.identity.mapping.pattern.kerb = ^(.*?)@(.*?)$
nifi.security.identity.mapping.value.kerb = $1

The policy is now working, but I get the following error: Untrusted proxy corenifi01-vm.zzzzz.com. Do I have to add the nodes of my cluster in Active Directory as well, or do I have to add them in Ranger (the principal is corenifi01-vm.zzzzz.com@ZZZZZ.COM)? I added them at the beginning, but with this name: corenifi01-vm.zzzzz.com@AA.ZZZZ.COM
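For context: in NiFi, "Untrusted proxy" usually means the node's own identity is not authorized to proxy user requests, and with Ranger as the authorizer that maps to a policy on the /proxy resource. A minimal sketch, assuming the identity mapping above is applied so the node identity is the short form (one user entry per cluster node):

-----------------------------------------
Ranger NiFi policy (sketch)
  Resource    : /proxy
  Users       : corenifi01-vm.zzzzz.com   (plus the other node)
  Permissions : WRITE
-----------------------------------------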
06-05-2017 04:52 PM

Hi, I am using a cluster with LDAP and Ranger, and I just installed Kerberos. I can authenticate with a user defined in my AD, but I get an error: Untrusted proxy nifi01-vm.zzzzz.com. Do I have to add the nodes of my cluster in the AD as well? Thank you.
06-05-2017 02:38 PM

Hi, I have a cluster with 2 nodes, installed HDF, and use Ranger for security policies. I just installed Kerberos on my cluster using an existing AD. I am now trying to connect to the NiFi UI, but I have insufficient privileges (login/password is OK). I created a READ/WRITE policy on /* for my user raphael.mary (existing in AD). When I try to connect to NiFi I get insufficient privileges, and the Ranger audit shows the user trying to connect as raph.mary@ZZZZ.COM. 1. Is it normal that the user name appears with the realm name in the audit log? 2. When I try to connect I use raphael.mary as the login; do I need to specify another user name? Thank you for your help.
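On the realm question, a minimal identity-mapping sketch for nifi.properties, assuming the ZZZZ.COM realm shown in the audit entry; it strips @REALM so the audited identity matches the plain login name:

-----------------------------------------
nifi.security.identity.mapping.pattern.kerb = ^(.*?)@(.*?)$
nifi.security.identity.mapping.value.kerb = $1
-----------------------------------------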
05-08-2017 01:20 PM

@Ward Bekker yes, the settings in NiFi are 512 MB for nifi.initial_mem and nifi.max_mem. Is there a way to determine the best values for these parameters? Something like 1/2 * amount of RAM?
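For reference, these settings correspond to the JVM heap arguments in NiFi's conf/bootstrap.conf; a sketch with the 512 MB values mentioned above (the right size depends on the flow, so this is a starting point, not a recommendation):

-----------------------------------------
# conf/bootstrap.conf
java.arg.2=-Xms512m   # initial JVM heap (nifi.initial_mem)
java.arg.3=-Xmx512m   # maximum JVM heap (nifi.max_mem)
-----------------------------------------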
05-02-2017 04:07 PM

Thank you @Ward Bekker. Can I use it with any database?
05-02-2017 03:44 PM

Hi, I would like to create an external table in Hive on top of different databases (MySQL, Oracle, DB2...) because I do not want to move the data, either into HDFS or into Hive directly. How can I do that?
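One option is Hive's JDBC storage handler. A sketch, assuming a Hive release that ships it (it is not in every version) and hypothetical table, column, and connection values:

-----------------------------------------
CREATE EXTERNAL TABLE mysql_users (
  id   INT,
  name STRING
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
  "hive.sql.database.type" = "MYSQL",
  "hive.sql.jdbc.driver"   = "com.mysql.jdbc.Driver",
  "hive.sql.jdbc.url"      = "jdbc:mysql://dbhost:3306/mydb",
  "hive.sql.dbcp.username" = "hive",
  "hive.sql.dbcp.password" = "secret",
  "hive.sql.table"         = "users"
);
-----------------------------------------

The data stays in the source database; Hive queries it over JDBC.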
04-28-2017 07:39 PM

1 Kudo

It turned out that I had imported a template containing the GetTwitter processor, and the version of the processor seemed to be old. I removed it and added a new one from the toolbar. It works now!
04-28-2017 06:31 PM

I did, but I get no additional information in debug mode:

2017-04-28 14:05:35,507 ERROR [Timer-Driven Process Thread-8] o.a.nifi.processors.twitter.GetTwitter GetTwitter[id=60003148-812e-14ce-9c95-0df056d3325d] Received error HTTP_ERROR: HTTP/1.1 401 Authorization Required. Will attempt to reconnect
04-28-2017 06:14 PM

@apsaltis I already did, and I even tested with another application (in Java) and it works. Do I need to test the connection to Twitter from the command line, like a ping or something? I can ping dev.twitter.com.
04-28-2017 06:06 PM

@Matt Clarke Even at debug level I get only this in nifi-app.log:

2017-04-28 14:05:35,507 ERROR [Timer-Driven Process Thread-8] o.a.nifi.processors.twitter.GetTwitter GetTwitter[id=60003148-812e-14ce-9c95-0df056d3325d] Received error HTTP_ERROR: HTTP/1.1 401 Authorization Required. Will attempt to reconnect
04-28-2017 05:33 PM

@Michael Young I added the Twitter certificate (twittercom.crt) to the NiFi keystore (with keytool) but get the same error. Do I have to add it to the keystore or the truststore? I did not find any doc on that. The error mentions HTTP_ERROR, but if I am using SSL, why is it not HTTPS?
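In NiFi's SSL configuration the keystore holds the server's own identity and the truststore holds certificates it trusts, so a remote service's certificate generally belongs in a truststore. A keytool sketch (the truststore path and password are placeholders, not the actual values):

-----------------------------------------
keytool -import -trustcacerts \
  -alias twitter \
  -file twittercom.crt \
  -keystore /opt/nifi/conf/truststore.jks \
  -storepass changeit
-----------------------------------------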
04-28-2017 02:50 PM

@Michael Young I tried the sample and filter endpoints but I am still getting the same error. Do I need to use certificates because I activated SSL?
04-28-2017 02:45 PM

@bhagan I just changed the replacement value to:

"${id:replaceEmpty('')}"||"${social_id:replaceEmpty('')}"||"${name:replaceEmpty('')}"||"${screenName:replaceEmpty('')}"||"${location:replaceEmpty('')}"

An event payload would be (JSON format):

{"id":"1759930","social_id":"116161419","name":"Lucy Lloyd","screenName":"lucy_lloyd1","location":"Tunbridge Wells, England ","bio":"","lang":"en","created_date":"2010-02-21 11:35:07.0","image_url":"http://pbs.twimg.com/profile_images/787703929786232833/FRoV7x0L_normal.jpg","backgroud_image":"http://pbs.twimg.com/profile_background_images/600155425/xgqomh222q1tiq387bch.jpeg","followers":1011,"friends":862,"status":11429,"timezone":"London","url":null,"poids_social":23.98221972589741,"age":null,"gender":null,"smile":null,"glasses":null,"moustache":null,"beard":null,"sideburns":null,"country":null,"source":"LOCATION"}

I am trying to convert JSON to CSV format. I guess that adding double quotes "" in the replacement value helped to solve the problem. I think there were special characters in group 5 (like the character $). Using double quotes happens to solve my problem, but is there another solution to handle special characters like $ and carriage returns? Some of my fields contain them.
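One way to handle this in the expression itself is to strip the troublesome characters with replaceAll before the value reaches the replacement. A sketch using the location attribute from the payload above (the character class is an illustrative assumption):

-----------------------------------------
"${location:replaceEmpty(''):replaceAll('[\r\n$]', ' ')}"
-----------------------------------------

Also, a literal dollar sign written directly in the Replacement Value can be escaped as $$ in NiFi Expression Language.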
04-28-2017 01:08 PM

Hi, I am extracting a big table using a ReplaceText processor, but I am getting an error; the error message and the processor configuration were attached as screenshots (not shown). I extended Maximum Buffer Size to 100 MB. Any idea why I am getting this error?
04-27-2017 01:19 PM

@mqureshi Hi, the variable hive.security.authorization.createtable.owner.grants is not in the hive-site.xml file. However, I noticed there is a difference between CREATE TABLE and CREATE EXTERNAL TABLE: I am able to create an internal (managed) table but cannot create an external table; in the latter case, I have to be the owner of the source file. Is that right? Is there a way for me to be a member of the group of the source file instead of the owner?
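If the goal is group access rather than ownership, HDFS ACLs are one possibility. A sketch, assuming ACLs are enabled on the cluster (dfs.namenode.acls.enabled=true); the path and group are hypothetical:

-----------------------------------------
hdfs dfs -setfacl -m group:developer:rwx /data/source_file
-----------------------------------------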
04-26-2017 01:32 PM

Hi, I would like to create a user that can read/write/create tables in a database. I am using a role ("developer") so that every user who has this role can read/write/create tables in a database. I executed the following code:

-----------------------------------------
create role developer;
grant developer to user user1;
create database db1;
alter database db1 set owner role developer;
grant all on database db1 to role developer;
-----------------------------------------

But with this:
- user1 cannot create a table in db1
- user1 cannot read tables in db1 unless I grant the user on the table

Is there a way to give all the privileges to a role so that every user who has this role can read/write/create tables in a database? Do I have to grant every user on each table? If the user has the grant at the database level, shouldn't he have the same grant by default on every table of the database?
04-23-2017 01:45 PM

Correct. I am actually testing 3 ways, and I want to know the best way to do it:

1. ExecuteSQL: memory problem.
2. QueryDatabaseTable: I can extract data, but 1 record = 1 flow file, and it is very slow. Is it possible to generate a flow file every 1000 records, for example, and then merge these flow files into a single one? (See the sketch below.)
3. GenerateTableFetch: failed to invoke @OnScheduled method due to java.lang.RuntimeException.
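On point 2, QueryDatabaseTable can batch records and MergeContent can recombine them. A sketch with assumed illustrative values (property names as in the NiFi documentation):

-----------------------------------------
QueryDatabaseTable
  Max Rows Per Flow File    : 1000   # 0 = all rows in a single flow file

MergeContent
  Merge Strategy            : Bin-Packing Algorithm
  Merge Format              : Avro   # ExecuteSQL/QueryDatabaseTable emit Avro
  Minimum Number of Entries : 100
-----------------------------------------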
04-21-2017 07:51 PM

QueryDatabaseTable is configured like this: (screenshot not shown)
GenerateTableFetch is configured like this: (screenshot not shown)
04-21-2017 07:23 PM

1 Kudo

Hi, I am using NiFi on a VM on Azure. I installed HDF and I am trying to use the GetTwitter processor. I created a Twitter app (which works, I checked) and configured the server with NTP. I set up SSL for NiFi, plus Ranger and Kerberos to handle the connection to the UI. But I get this error: Received error HTTP_ERROR: HTTP/1.1 401 Authorization Required. Any idea what is wrong?
04-21-2017 07:23 PM

1 Kudo

Hi, I would like to extract a big table (MySQL, more than 3 million rows) and write it as a file in HDFS. What would be the best way to do it? I tried the following processors:

- ExecuteSQL: memory problem
- QueryDatabaseTable: memory problem
- GenerateTableFetch: failed to invoke @OnScheduled method due to java.lang.RuntimeException

I have 20 GB of memory. Can I set up parameters so that I generate more than one dataflow, then merge in NiFi before loading to HDFS (see the sketch below)? Thank you.
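One pattern that avoids pulling the whole table into memory, sketched with assumed property values (the processors are standard NiFi; the partition size and HDFS path are illustrative):

-----------------------------------------
GenerateTableFetch        # emits SQL queries, one flow file per partition
  Partition Size : 10000
      |
ExecuteSQL                # runs each incoming query; one Avro flow file each
      |
MergeContent              # combines partitions into larger files
  Merge Format : Avro
      |
PutHDFS
  Directory : /data/mysql_export
-----------------------------------------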
04-20-2017 03:03 PM

No, only one node and one concurrent task. I changed the expression to 0 0 10 * * ? in order to specify minutes and seconds. It is working now!
04-20-2017 02:39 PM

I am using a GetHDFS processor with the CRON-driven strategy, scheduled to run every day at 10 AM. I have one input file to read, but when the dataflow starts it gets the source file multiple times instead of once (9 times in my case). Why? As a result, when I write the output dataflow, I get the following warning: file with same name already exists. Should I modify the Polling Interval parameter (set to 0 sec by default)?
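For reference, NiFi's CRON-driven scheduling uses Quartz cron syntax, in which the first field is seconds; a single daily run at 10:00 looks like this (the same expression that resolved it in the follow-up above):

-----------------------------------------
0 0 10 * * ?    # seconds minutes hours day-of-month month day-of-week
-----------------------------------------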