Member since: 04-11-2016
Posts: 471
Kudos Received: 325
Solutions: 118
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 2677 | 03-09-2018 05:31 PM |
|  | 3540 | 03-07-2018 09:45 AM |
|  | 3239 | 03-07-2018 09:31 AM |
|  | 5441 | 03-03-2018 01:37 PM |
|  | 2950 | 10-17-2017 02:15 PM |
02-07-2017
07:25 PM
Could you share a screenshot or logs showing which user is launching Beeline, what request you are executing, and what the result of the command is? Also, is the Hive plugin for Ranger enabled in Ambari?
02-07-2017
07:13 PM
Hi @Jacqualin jasmin, Why do you say the policy does not work? Please keep in mind that Ranger policies on Hive are only applied if you access Hive through HiveServer2 (JDBC/ODBC connections, the Beeline CLI, etc.). If you are using the Hive CLI, only the Ranger policies on HDFS are applied.
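To make the distinction concrete, here is a minimal sketch of accessing Hive through HiveServer2 via JDBC, which is the path where the Ranger Hive policies are evaluated (host, port, user, and query are placeholders, and it assumes the Hive JDBC driver is on the classpath):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveServer2Sketch {
    public static void main(String[] args) throws Exception {
        // Register the Hive JDBC driver (recent driver versions do this automatically).
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Hypothetical host/port: queries submitted this way go through
        // HiveServer2, so the Ranger Hive policies apply.
        String url = "jdbc:hive2://hiveserver2-host:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "myuser", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```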
02-07-2017
10:44 AM
Please provide the logs of both NameNodes so we can understand what is going on.
02-07-2017
09:34 AM
Hi @Rayudu c, If both NameNodes are down, your HDFS filesystem cannot be used by the platform, and there is not much you can do in the meantime since most of the applications rely on HDFS.
02-03-2017
12:48 PM
1 Kudo
Hi @Roger Young, The processor uses the 'filename' attribute of the flow file. You can use an UpdateAttribute processor to set this attribute and give your file a custom name. Hope this helps.
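As a hypothetical example (the prefix and date format are made up), you could set a property named filename in UpdateAttribute to a value like export_${now():format('yyyy-MM-dd-HHmmss')}.csv, using the NiFi Expression Language to stamp each file with the current time.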
02-02-2017
03:40 PM
1 Kudo
You can have a look at the unit tests of this processor: https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestListenHTTP.java
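For a starting point of your own, a skeleton along these lines using NiFi's nifi-mock framework (the "Listening Port" property name is from memory, so double-check it against the processor documentation):

```java
import org.apache.nifi.processors.standard.ListenHTTP;
import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;
import org.junit.Test;

public class ListenHTTPSketchTest {

    @Test
    public void testProcessorSetup() {
        // Wrap the processor in NiFi's mock test framework.
        TestRunner runner = TestRunners.newTestRunner(ListenHTTP.class);

        // Assumed property name; verify it in the processor usage docs.
        runner.setProperty("Listening Port", "9999");

        // ListenHTTP starts an embedded HTTP server when scheduled, which is
        // why the real tests linked above POST data to the running processor
        // and then assert on the flow files routed to the success relationship.
    }
}
```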
02-02-2017
02:54 PM
3 Kudos
Hi @Andy Liang, If you want to send data into NiFi through HTTP REST calls, you can use the ListenHTTP processor. The data sent to the endpoint you define in this processor will become the content of the generated flow files. If you want to manage what responses are sent to your client, you can also use the combination of HandleHttpRequest and HandleHttpResponse. The content of the flow file coming into HandleHttpResponse will be the body of the HTTP response sent to the client. This way you can, for example, create web services for external clients that need access to some data. Hope this helps.
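For example, a client could push data to the ListenHTTP endpoint along these lines (a sketch using Java 11's HttpClient; the host, port, and base path are placeholders matching whatever you configure on the processor):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PushToNiFi {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Hypothetical endpoint: host, port, and base path depend on the
        // ListenHTTP processor configuration.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://nifi-host:9999/contentListener"))
                .POST(HttpRequest.BodyPublishers.ofString("hello nifi"))
                .build();

        // The request body becomes the content of the generated flow file;
        // the status code tells you whether NiFi accepted the data.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Status: " + response.statusCode());
    }
}
```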
01-24-2017
09:45 AM
2 Kudos
Hi @Edgar Orendain, The best way to understand the REST API is to open the developer view of your browser while on the NiFi UI and do what you want through the UI, to see which REST API calls are performed in the background: the UI strictly uses the same API. Regarding your question, here is what I have. The request is sent to:
http://localhost:8080/nifi-api/process-groups/8f83b1c2-0159-1000-c091-37f5eb918bca/process-groups
where 8f83b1c2-0159-1000-c091-37f5eb918bca is the ID of my root process group. The content of my request is the following JSON:
{"revision":{"clientId":"cfc9c5fd-0159-1000-e150-054ac8339ef7","version":0},"component":{"name":"test","position":{"x":647.5,"y":-299.5}}}
Here "test" is the name of the process group I created, and "position" holds the coordinates of the process group on the canvas. Based on your needs, you can remove some of the data in the JSON, but the name is required if I recall correctly. Hope this helps.
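The same call can be made programmatically; here is a sketch using Java 11's HttpClient with the URL and JSON from my instance above (your IDs will differ):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CreateProcessGroup {
    public static void main(String[] args) throws Exception {
        // IDs below are from the example above; replace them with your root
        // process group ID and a client ID of your own.
        String url = "http://localhost:8080/nifi-api/process-groups/"
                + "8f83b1c2-0159-1000-c091-37f5eb918bca/process-groups";
        String json = "{\"revision\":{\"clientId\":\"cfc9c5fd-0159-1000-e150-054ac8339ef7\","
                + "\"version\":0},\"component\":{\"name\":\"test\","
                + "\"position\":{\"x\":647.5,\"y\":-299.5}}}";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + ": " + response.body());
    }
}
```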
01-24-2017
09:30 AM
6 Kudos
Hi @regie canada, If you really want to use Sqoop, then you would need to use something like the ExecuteStreamCommand / ExecuteProcess processors. However, this is not something I'd recommend unless you need features that only Sqoop provides. If you want a solution fully provided by NiFi, then depending on your source database, you can use the JDBC processors to get the data out of your table and then something like PutHDFS to send the data into HDFS. A common approach is GenerateTableFetch on the primary node feeding ExecuteSQL on all nodes: the first processor generates SQL queries that fetch the data by "page" of a specified size, and the second actually retrieves the data. This way, all nodes of your NiFi cluster can be used to pull the data from the database. You can have a look at the documentation here:
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.GenerateTableFetch/index.html
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.QueryDatabaseTable/index.html
There are additional SQL/JDBC processors depending on your needs. This article should get you started:
https://community.hortonworks.com/articles/51902/incremental-fetch-in-nifi-with-querydatabasetable.html
Hope this helps.
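As a rough illustration (assuming a table with a numeric key column id and a page size of 10000, both of which are hypothetical here), GenerateTableFetch emits flow files whose content is a series of paged queries along the lines of SELECT * FROM mytable ORDER BY id LIMIT 10000 OFFSET 0, then the same query with OFFSET 10000, and so on; the downstream nodes then execute these queries in parallel, each fetching its own page of the table.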
01-23-2017
09:16 PM
3 Kudos
Hi @Sean Murphy, You shouldn't have any issue with the logging features. You should be able to call getLogger() in your custom processor to get an instance of ComponentLog (see the sketch below). Another way to debug processors, if you are familiar with remote debugging, is to uncomment the following line in the ./conf/bootstrap.conf file:
#java.arg.debug=-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000
This way you can attach a Java debugger to the NiFi instance on port 8000 using your favorite IDE. NiFi needs to be restarted for this to take effect. Another option is to use the code provided here (https://github.com/olegz/nifi-ide-integration), which gives you the possibility to launch NiFi directly in your IDE. But if your problem only appears from time to time, that may not be the best approach. Hope this helps.
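For the logging part, here is a minimal sketch of what this looks like in a custom processor (the processor name, attribute, and message are placeholders):

```java
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.logging.ComponentLog;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.exception.ProcessException;

public class MyCustomProcessor extends AbstractProcessor {

    @Override
    public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
        // getLogger() is provided by AbstractProcessor and returns a ComponentLog;
        // messages go to logs/nifi-app.log, and WARN and above also show up
        // as bulletins in the UI by default.
        ComponentLog logger = getLogger();

        FlowFile flowFile = session.get();
        if (flowFile == null) {
            return;
        }

        logger.info("Processing flow file with filename={}",
                new Object[]{flowFile.getAttribute("filename")});

        // In a real processor you would transfer to one of your declared
        // relationships here; this sketch omits the relationship definitions.
        session.transfer(flowFile);
    }
}
```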