Member since: 09-13-2015
Posts: 59
Kudos Received: 18
Solutions: 6
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1801 | 12-19-2017 08:14 PM
 | 1353 | 10-11-2017 02:21 PM
 | 1913 | 06-12-2017 09:26 PM
 | 3249 | 06-08-2017 01:36 PM
 | 1111 | 11-04-2016 08:35 PM
08-21-2018
06:50 PM
2 Kudos
Restarting NiFi nodes fails with an error message such as the one below:
```
  File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi.py", line 278, in <module>
    Master().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
    method(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 993, in restart
    self.start(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi.py", line 152, in start
    nifi_cli.create_or_update_reg_client(params.nifi_registry_host, params.nifi_registry_url)
  File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi_cli.py", line 175, in create_or_update_reg_client
    existing_clients = list_reg_clients()
  File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi_cli.py", line 144, in list_reg_clients
    outputType="json"
  File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi_cli.py", line 73, in nifi_cli
    raise Fail("Failed to execute nifi cli.sh command")
```
Cause: This occurs when the nifi.initial.admin.identity does not have permission to access the /controller API.

Workaround:
1) Create a local user in Ranger with the name of the initial admin identity.
2) Add a Ranger policy for NiFi granting that user Read/Write permissions on /controller.
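As a rough sketch, the policy described in the workaround can also be expressed as the JSON payload that Ranger's public REST API (POST /service/public/v2/api/policy) accepts. The service name `cluster_nifi` and user `nifi-admin` below are hypothetical placeholders; substitute your Ranger NiFi service name and your nifi.initial.admin.identity user.

```python
import json

# Hypothetical payload for Ranger's public v2 policy API.
# "cluster_nifi" and "nifi-admin" are placeholders -- adjust to your environment.
policy = {
    "service": "cluster_nifi",               # assumed Ranger NiFi service name
    "name": "nifi-initial-admin-controller",
    "isEnabled": True,
    "resources": {
        "nifi-resource": {"values": ["/controller"], "isRecursive": False}
    },
    "policyItems": [
        {
            # the user matching nifi.initial.admin.identity
            "users": ["nifi-admin"],
            "accesses": [
                {"type": "READ", "isAllowed": True},
                {"type": "WRITE", "isAllowed": True},
            ],
        }
    ],
}

# A real deployment would POST this (with admin credentials) to:
#   http://<ranger-host>:6080/service/public/v2/api/policy
print(json.dumps(policy, indent=2))
```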
07-27-2018
08:10 PM
Toad for Hadoop has been discontinued and is no longer available from Dell.
07-12-2018
04:46 PM
I believe this doc contains the information you're looking for: https://github.com/apache/ambari/blob/branch-2.1/ambari-server/docs/api/v1/alert-definitions.md#metric

METRIC source fields are used to define JMX endpoints that can be queried for values. The source/reporting and jmx/value fields are parameterized to match the property_list specified.
"source" : {
"jmx" : {
"property_list" : [
"java.lang:type=OperatingSystem/SystemCpuLoad",
"java.lang:type=OperatingSystem/AvailableProcessors"
],
"value" : "{0} * 100"
},
"reporting" : {
"ok" : {
"text" : "{1} CPU, load {0:.1%}"
},
"warning" : {
"text" : "{1} CPU, load {0:.1%}",
"value" : 200.0
},
"critical" : {
"text" : "{1} CPU, load {0:.1%}",
"value" : 250.0
},
"units" : "%"
},
"type" : "METRIC",
"uri" : {
"http" : "{{hdfs-site/dfs.namenode.http-address}}",
"https" : "{{hdfs-site/dfs.namenode.https-address}}",
"https_property" : "{{hdfs-site/dfs.http.policy}}",
"https_property_value" : "HTTPS_ONLY",
"default_port" : 0.0,
"high_availability" : {
"nameservice" : "{{hdfs-site/dfs.nameservices}}",
"alias_key" : "{{hdfs-site/dfs.ha.namenodes.{{ha-nameservice}}}}",
"http_pattern" : "{{hdfs-site/dfs.namenode.http-address.{{ha-nameservice}}.{{alias}}}}",
"https_pattern" : "{{hdfs-site/dfs.namenode.https-address.{{ha-nameservice}}.{{alias}}}}"
}
}
}
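To make the parameterization concrete, here is a small plain-Python sketch (not Ambari internals) of how the `jmx/value` expression and the reporting text template combine sample JMX readings; the sample load and processor values are made up.

```python
# Sample JMX readings (made-up values):
system_cpu_load = 0.45       # java.lang:type=OperatingSystem/SystemCpuLoad -> {0}
available_processors = 8     # java.lang:type=OperatingSystem/AvailableProcessors -> {1}

# "value" : "{0} * 100" -- the number the warning/critical thresholds compare against
alert_value = system_cpu_load * 100   # 45.0, below the warning threshold of 200.0

# "text" : "{1} CPU, load {0:.1%}" -- str.format-style reporting template
text = "{1} CPU, load {0:.1%}".format(system_cpu_load, available_processors)

print(alert_value, "->", text)   # 45.0 -> 8 CPU, load 45.0%
```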
02-28-2018
11:02 PM
Unfortunately the Jolt spec isn't working as intended. It is replacing the field names with their values. On to Plan B...
02-28-2018
10:51 PM
Thank you @Matt Burgess. I will give this a shot. My alternate strategy is to infer an Avro schema, use @Timothy Spann's attribute cleaner, and then convert back to JSON. https://github.com/tspannhw/nifi-attributecleaner-processor
02-28-2018
09:34 PM
I have JSON content streaming in, and I would like to rename the field names. Specifically, I need to remove "." because I am inserting into MongoDB, which does not support periods in key names.

Requirements:
- Only field names are modified, not content. So {"host.ip":"192.168.1.1"} -> {"host_ip":"192.168.1.1"}
- Field names are unknown, so I can't replace them explicitly.
- I would prefer not to split the JSON array into individual flow files.

Has anyone tackled an issue like this before?
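For what it's worth, a recursive key-rename like this is a few lines of Python; a sketch along these lines could run in something like a scripting processor, though how you wire it into a flow is up to you. The sample document is made up.

```python
import json

def rename_keys(node):
    """Recursively replace "." with "_" in dict keys, leaving values untouched.

    Handles nested objects and arrays, so only field names change.
    """
    if isinstance(node, dict):
        return {k.replace(".", "_"): rename_keys(v) for k, v in node.items()}
    if isinstance(node, list):
        return [rename_keys(item) for item in node]
    return node

doc = json.loads('{"host.ip": "192.168.1.1", "tags": [{"geo.city": "NYC"}]}')
print(json.dumps(rename_keys(doc)))
# {"host_ip": "192.168.1.1", "tags": [{"geo_city": "NYC"}]}
```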
Labels:
- Apache NiFi
01-02-2018
10:51 PM
It looks like SSH is working. This is usually DNS-related; do you have the hosts file set on all the nodes? Do you have the firewall/iptables disabled?
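As a quick sanity check for the DNS side, a small Python sketch like this can verify forward and reverse lookups for each node; the host name passed in is a placeholder for your node FQDNs.

```python
import socket

def check_host(hostname):
    """Return (ip, reverse_name) for a host; reverse_name is None if
    reverse DNS is not configured."""
    ip = socket.gethostbyname(hostname)          # forward lookup
    try:
        reverse = socket.gethostbyaddr(ip)[0]    # reverse lookup
    except socket.herror:
        reverse = None
    return ip, reverse

# Substitute your cluster node FQDNs here:
print(check_host("localhost"))
```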
01-02-2018
10:22 PM
The connection string looks correct to me. Do you have the license jar in the same directory as the jdbc jar?
12-19-2017
08:47 PM
So if I am understanding this correctly, you want to do the following: Windows file share ----> NiFi ----> Hadoop

Off the top of my head I can think of a couple of ways to do it:
1) Set up the Windows directory to share via FTP. This can be done using IIS on the Windows machine, or a 3rd-party FTP server.
2) Install MiNiFi or NiFi on the Windows machine to transmit data to the NiFi cluster using the site-to-site protocol.

Is there a particular reason you don't want to mount the share on the NiFi host?
12-19-2017
08:14 PM
1 Kudo
There is no native processor to open Access database files. You can handle it a couple of ways. Easiest would be to export the data from Access into CSV and ingest that. The other method would be to connect to the server running MS Access via JDBC; however, I believe this would require a 3rd-party driver, as Microsoft only provides ODBC drivers for Access (though this may no longer be the case).
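To illustrate the CSV route, here is a minimal Python sketch of parsing an Access CSV export into dict records, e.g. as a pre-ingest sanity check before handing files to a flow; the column names and rows are made-up sample data.

```python
import csv
import io

# Made-up stand-in for a CSV file exported from Access.
export = io.StringIO("id,name\n1,widget\n2,gadget\n")

# DictReader maps each row to {column_name: value}.
records = list(csv.DictReader(export))
print(records)
# [{'id': '1', 'name': 'widget'}, {'id': '2', 'name': 'gadget'}]
```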