Member since: 04-11-2016
Posts: 471
Kudos Received: 325
Solutions: 118
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2677 | 03-09-2018 05:31 PM |
| | 3541 | 03-07-2018 09:45 AM |
| | 3239 | 03-07-2018 09:31 AM |
| | 5441 | 03-03-2018 01:37 PM |
| | 2950 | 10-17-2017 02:15 PM |
01-18-2017
07:54 PM
Hi @Roger Young, New dependencies have been added and the install procedure has not been updated accordingly. You have two options: build the new dependency (in this case, the schema registry, which you can find here: https://github.com/hortonworks/registry/tree/master/schema-registry), or check out a previous version of the iot-trucking app on GitHub. Hope this helps.
01-18-2017
02:52 PM
1 Kudo
Hi @Andy Liang, Kafka and Storm generally come into play when you need to perform complex operations on your data before pushing it into your HDP cluster (operations that cannot be performed by NiFi), such as window aggregations, complex joins, etc. If you don't need to perform such operations before your data lands in the HDP cluster, then you can use NiFi + PutHDFS. Hope this helps.
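To make the window-aggregation point concrete, here is a minimal sketch of a Storm windowed bolt, assuming the Storm 1.x API; the class name, the "speed" field, and the component names are made up for illustration and are not part of the original answer.

```java
import java.util.Map;

import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseWindowedBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;
import org.apache.storm.windowing.TupleWindow;

// Hypothetical windowed bolt: averages a numeric "speed" field over each window.
public class AverageSpeedBolt extends BaseWindowedBolt {

    private OutputCollector collector;

    @Override
    public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
    }

    @Override
    public void execute(TupleWindow window) {
        double sum = 0;
        int count = 0;
        for (Tuple tuple : window.get()) {
            sum += tuple.getDoubleByField("speed"); // "speed" is a made-up field name
            count++;
        }
        // Emit one aggregated value per window
        collector.emit(new Values(count == 0 ? 0.0 : sum / count));
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("avg_speed"));
    }
}
```

The bolt would then be wired after a Kafka spout with something like `builder.setBolt("avg-speed", new AverageSpeedBolt().withWindow(new BaseWindowedBolt.Duration(1, TimeUnit.MINUTES)), 1).shuffleGrouping("kafka-spout");` (topology wiring and names are illustrative only).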
01-18-2017
02:49 PM
2 Kudos
Hi @Chad Woodhedad, This is expected. Hive policies in Ranger apply only when accessing Hive through HiveServer2 (JDBC). When accessing Hive with the Hive CLI, only the HDFS policies in Ranger apply (so no masking).
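For reference, a minimal sketch of querying Hive through HiveServer2 over JDBC, which is the path where Ranger's Hive policies (including masking) are enforced; the host, port, user, and table name below are placeholders, not values from this thread.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Query Hive through HiveServer2 (JDBC) so that Ranger Hive policies apply.
public class HiveServer2Example {

    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://hiveserver2-host:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "some_user", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM customers LIMIT 10")) {
            while (rs.next()) {
                // Masked columns come back masked, per the Ranger policy
                System.out.println(rs.getString(1));
            }
        }
    }
}
```

The same applies to beeline, which also goes through HiveServer2, whereas the Hive CLI talks to the metastore and HDFS directly, which is why only the HDFS policies apply there.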
01-10-2017
03:47 PM
1 Kudo
Hi @MPH, The best practice for a production environment is to have a dedicated cluster for HDF (it is easier for high availability and resource management). However, if you are not looking for high availability and only have one HDF node, you could imagine running HDF on an edge node. Keep in mind that, at the moment, HDP and HDF are managed by two different Ambari instances. Hope this helps.
01-08-2017
08:02 PM
2 Kudos
Hi @Roger Young, Based on the stack trace, it seems that during the NiFi installation the password provided for the property "Encrypt Configuration Master Key Password" does not meet the requirement: "Cannot derive key from empty/short password -- password must be at least 12 characters".
01-03-2017
11:34 AM
Hey @Sebastian Carroll, At the moment, this is not configurable. But you could imagine a custom Reporting Task scheduled every minute to compute statistics (similar to the Ambari reporting task).
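A minimal sketch of what such a custom Reporting Task could look like, assuming the NiFi nifi-api dependency; the class name and the particular statistics logged here are illustrative, not part of the original answer.

```java
import org.apache.nifi.controller.status.ProcessGroupStatus;
import org.apache.nifi.reporting.AbstractReportingTask;
import org.apache.nifi.reporting.ReportingContext;

// Hypothetical reporting task: each time it is triggered (e.g. scheduled every
// minute under Controller Settings > Reporting Tasks), it reads the controller
// status and logs a few flow-level statistics.
public class FlowStatsReportingTask extends AbstractReportingTask {

    @Override
    public void onTrigger(final ReportingContext context) {
        // The root process group status aggregates the whole flow
        final ProcessGroupStatus status = context.getEventAccess().getControllerStatus();
        getLogger().info("Queued FlowFiles: {}, active threads: {}, bytes read: {}",
                new Object[] { status.getQueuedCount(), status.getActiveThreadCount(), status.getBytesRead() });
    }
}
```

One way to deploy it would be to package the task as a NAR and add it to the NiFi lib directory, then enable and schedule it from the UI.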
12-15-2016
08:49 PM
2 Kudos
Hi @srini, One way to do it is to use the MonitorActivity processor, which can emit a flow file if there has been no activity for X minutes. This generated flow file can then be routed to a processor sending emails. Hope this helps.
11-18-2016
08:29 AM
3 Kudos
Hi @Karthik Manchala, To achieve what you are looking for, I'd replace the GetFile processor with the combination of the ListFile and FetchFile processors. The first one lists files according to your conditions and emits an empty flow file for each listed file, with an attribute containing the path of the file to retrieve. The second one actually fetches the content of the file at the given path. The first processor keeps a "state" with information about already-processed files so that it won't consume the same file multiple times. This approach is also recommended for better load distribution when you have a NiFi cluster. Hope this helps.
11-17-2016
06:01 AM
1 Kudo
Hi Kumar, You need to set the processor with the following properties:

Command: /bin/ls
Arguments: -lrt

It is better to use an absolute path for the command. If you want to pass a directory to list (for example /my/directory), that would be:

Command: /bin/ls
Arguments: -lrt|/my/directory

because the argument delimiter is configured as the | character, based on your screenshot. Hope this helps.
11-15-2016
05:09 PM
Hi @bala krishnan, Not a solution, but just to let you know that with the next version of NiFi (coming soon) you will be able to use the ValidateCsv processor to achieve what you are looking for. In the meantime, I don't think splitting the file is going to help. You could try something custom with the ExecuteScript processor, but it's probably not ideal. Hope this helps.