Member since: 12-23-2016
Posts: 62
Kudos Received: 1
Solutions: 6
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1740 | 05-28-2018 04:42 AM |
 | 2425 | 05-25-2018 02:07 AM |
 | 1237 | 04-26-2018 09:50 AM |
 | 6420 | 04-13-2018 11:32 AM |
 | 1098 | 03-23-2018 04:27 AM |
03-19-2018 02:48 PM
@Geoffrey Shelton Okot Yes, I followed the installation guide via Ambari at that link, except that I installed Metron on an already existing cluster. But after I installed Metron and tried to start it, zk_load_configs.sh is not available. How do I solve this? Thanks.
03-19-2018 11:42 AM
Hi, I was installing Metron via Ambari and it succeeded, but when starting the service I got this error:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/enrichment_master.py", line 117, in <module>
Enrichment().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/enrichment_master.py", line 111, in restart
self.configure(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 120, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/enrichment_master.py", line 48, in configure
metron_service.init_zk_config(params)
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/metron_service.py", line 41, in init_zk_config
path=ambari_format("{java_home}/bin")
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hcp/1.4.1.0-18/metron/bin/zk_load_configs.sh --zk_quorum masternode-02:2181,masternode-01:2181,insight-svr:2181 --mode PUSH --input_dir /usr/hcp/1.4.1.0-18/metron/config/zookeeper' returned 127. /bin/bash: /usr/hcp/1.4.1.0-18/metron/bin/zk_load_configs.sh: No such file or directory Thanks.
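Exit code 127 from bash means the command itself could not be found, which matches the "No such file or directory" message: the script path that the mpack scripts expect does not exist on disk. A small diagnostic sketch (the `/usr/hcp` path comes from the error above; the `rpm` query is a generic package check, not Metron-specific):

```shell
# 127 is bash's "command not found" status -- demonstrated here with a
# deliberately nonexistent path standing in for the missing script:
bash -c '/no/such/dir/zk_load_configs.sh' 2>/dev/null
echo "exit=$?"    # prints: exit=127

# On the affected node, check what is actually installed:
#   ls /usr/hcp/*/metron/bin/zk_load_configs.sh
#   rpm -qa | grep -i metron
# A mismatch between the build in the path (1.4.1.0-18) and the
# installed packages points at a stale or wrong mpack version.
```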
Labels:
- Apache Ambari
- Apache Metron
03-19-2018 11:36 AM
The problem is solved. I upgraded the Ambari mpack to 1.4.1.0-18 and it works now.
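For reference, a minimal sketch of what that fix involves. `ambari-server upgrade-mpack` is the standard Ambari CLI subcommand for replacing an installed management pack; the tarball name and `/usr/hcp` layout below are assumptions based on the build mentioned above, not verbatim from this thread:

```shell
# The upgrade itself (run on the Ambari server host; tarball name is
# hypothetical):
#   ambari-server upgrade-mpack --mpack=/tmp/hcp-ambari-mpack-1.4.1.0-18.tar.gz
#   ambari-server restart
#
# Afterwards the Metron scripts should live under the matching build
# directory. Simulated below with a temp dir standing in for /usr/hcp:
hcp_root=$(mktemp -d)
mkdir -p "$hcp_root/1.4.1.0-18/metron/bin"
touch "$hcp_root/1.4.1.0-18/metron/bin/zk_load_configs.sh"
ls "$hcp_root"    # lists the new build directory: 1.4.1.0-18
```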
03-16-2018 10:53 AM
Hi, I tried to install Metron via Ambari and encountered a failure with the stderr and stdout below:
Stderr: Python script has been killed due to timeout after waiting 1800 secs
stdout:
2018-03-16 17:16:02,008 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-03-16 17:16:02,015 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-03-16 17:16:02,017 - Group['metron'] {}
2018-03-16 17:16:02,018 - Group['ranger'] {}
2018-03-16 17:16:02,018 - Group['hdfs'] {}
2018-03-16 17:16:02,019 - Group['zeppelin'] {}
2018-03-16 17:16:02,019 - Group['hadoop'] {}
2018-03-16 17:16:02,019 - Group['nifi'] {}
2018-03-16 17:16:02,020 - Group['users'] {}
2018-03-16 17:16:02,020 - Group['knox'] {}
2018-03-16 17:16:02,021 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,023 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,024 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,025 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,027 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger'], 'uid': None}
2018-03-16 17:16:02,028 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['nifi'], 'uid': None}
2018-03-16 17:16:02,030 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,031 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,033 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,034 - User['streamline'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,035 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,037 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,038 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-16 17:16:02,040 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-16 17:16:02,041 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2018-03-16 17:16:02,043 - User['metron'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,044 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,045 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,047 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-16 17:16:02,048 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,050 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-03-16 17:16:02,051 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,053 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,054 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,055 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,056 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-16 17:16:02,059 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-03-16 17:16:02,073 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-03-16 17:16:02,074 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-03-16 17:16:02,075 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-16 17:16:02,078 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-16 17:16:02,079 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-03-16 17:16:02,099 - call returned (0, '1053')
2018-03-16 17:16:02,101 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1053'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-03-16 17:16:02,115 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1053'] due to not_if
2018-03-16 17:16:02,117 - Group['hdfs'] {}
2018-03-16 17:16:02,118 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-03-16 17:16:02,119 - FS Type:
2018-03-16 17:16:02,120 - Directory['/etc/hadoop'] {'mode': 0755}
2018-03-16 17:16:02,156 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-03-16 17:16:02,157 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-03-16 17:16:02,197 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,216 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-16 17:16:02,218 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,219 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,226 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-16 17:16:02,227 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,228 - Repository['HDF-3.0-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,235 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0\n[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-16 17:16:02,236 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,237 - Repository['elasticsearch-5.x-repo-1'] {'append_to_file': True, 'base_url': 'https://artifacts.elastic.co/packages/5.x/yum', 'action': ['create'], 'components': [u'ELASTICSEARCH', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,244 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': ...}
2018-03-16 17:16:02,245 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,246 - Repository['ES-Curator-5.x-repo-1'] {'append_to_file': True, 'base_url': 'http://packages.elastic.co/curator/5/centos/7', 'action': ['create'], 'components': [u'CURATOR', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,254 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': ...}
2018-03-16 17:16:02,254 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,255 - Repository['kibana-5.x-repo-1'] {'append_to_file': True, 'base_url': 'https://artifacts.elastic.co/packages/5.x/yum', 'action': ['create'], 'components': [u'KIBANA', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,262 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': ...}
2018-03-16 17:16:02,263 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,264 - Repository['HCP-1.4.0.0-38-repo-1'] {'append_to_file': True, 'base_url': 'http://s3.amazonaws.com/dev.hortonworks.com/HCP/centos6/1.x/BUILDS/1.4.0.0-38', 'action': ['create'], 'components': [u'METRON', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,271 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': ...}
2018-03-16 17:16:02,272 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,273 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:02,467 - Skipping installation of existing package unzip
2018-03-16 17:16:02,467 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:02,488 - Skipping installation of existing package curl
2018-03-16 17:16:02,488 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:02,508 - Skipping installation of existing package hdp-select
2018-03-16 17:16:02,518 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-03-16 17:16:02,529 - Skipping stack-select on METRON because it does not exist in the stack-select package structure.
2018-03-16 17:16:03,009 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-03-16 17:16:03,013 - Package['metron-common'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:03,113 - Skipping installation of existing package metron-common
2018-03-16 17:16:03,114 - Package['metron-data-management'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:03,125 - Installing package metron-data-management ('/usr/bin/yum -d 0 -e 0 -y install metron-data-management')
Command failed after 1 tries
Thank you.
Labels:
- Apache Ambari
- Apache Metron
03-12-2018 11:32 AM
Hi, I tried to use PutHBaseJSON but I got an error like the attached picture. Thanks.
Labels:
- Apache HBase
- Apache NiFi
02-26-2018 08:59 AM
Hi, I want to install HAWQ on an Ambari-managed cluster, but when I follow the guide from here, the download repo section says "End of Availability". How am I supposed to download the HDB repo? Thanks.
Labels:
- Apache Ambari
02-06-2018 04:38 AM
Thanks, it works now.
02-05-2018 10:43 AM
mapred-mapred-historyserver-masternode-01.zip
I've installed HDP via Ambari and started the services, but the MapReduce2 History Server stops several seconds after it successfully starts. Attached is the log from /var/log/hadoop-mapreduce/mapred/mapred-mapred-historyserver-masternode-01.log. Thanks.
Labels:
- Apache Hadoop
02-02-2018 09:18 AM
I changed the permissions of the .jar files under /var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/files/nifi-toolkit-1.2.0.3.0.1.1-5/lib/ and it works now. Thanks.
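A "Could not find or load main class" error can occur when the JVM cannot read the jars on its classpath, which is what the permission change above addresses. A minimal demonstration of that fix, using a temp directory in place of the real nifi-toolkit lib path:

```shell
# Simulate a toolkit lib dir whose jar is unreadable to other users:
libdir=$(mktemp -d)
touch "$libdir/nifi-toolkit.jar"
chmod 600 "$libdir/nifi-toolkit.jar"   # owner-only: the nifi user cannot read it
chmod a+r "$libdir"/*.jar              # the fix: grant read access on the jars
stat -c '%a' "$libdir/nifi-toolkit.jar"   # prints: 644
```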
02-02-2018 07:09 AM
@Bryan Bende The following is written in nifi-ca.stderr:
Error: Could not find or load main class org.apache.nifi.toolkit.tls.TlsToolkitMain