Member since: 12-23-2016
Posts: 62
Kudos Received: 1
Solutions: 6
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 446 | 05-28-2018 04:42 AM |
 | 510 | 05-25-2018 02:07 AM |
 | 350 | 04-26-2018 09:50 AM |
 | 1418 | 04-13-2018 11:32 AM |
 | 272 | 03-23-2018 04:27 AM |
09-01-2019
07:31 PM
Hi, what up-to-date free threat intelligence feed can I use with Metron? I use hailataxii, but its last update was 25 May 2018. Thanks.
08-14-2019
04:44 AM
Hi, I cannot hide dismissed and resolved alerts in the Metron Alerts UI. Every time I enable the toggle, it gets disabled again. Please help. Thanks.
08-14-2019
04:38 AM
Hi, I have a problem with high-frequency alerts within a time-frame window. Example: a firewall alert for 2000x denied traffic in 5 minutes. I use the Profiler to create the alert, but once the alert is triggered, every subsequent event that matches the condition also becomes an alert. How do I handle this kind of alert with Metron? Thanks.
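For reference, a minimal sketch of the kind of Profiler definition described above, counting denied firewall traffic per source address in each profile window (the field names ip_src_addr and action, and the 'deny' value, are assumptions about the firewall telemetry, not taken from the original post):

{
  "profiles": [
    {
      "profile": "fw_denied_count",
      "foreach": "ip_src_addr",
      "onlyif": "action == 'deny'",
      "init":   { "count": "0" },
      "update": { "count": "count + 1" },
      "result": "count"
    }
  ]
}

A triage rule can then compare the windowed count against the 2000-in-5-minutes threshold (for example via PROFILE_GET in Stellar), so that only crossing the threshold raises an alert rather than every matching event.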
06-07-2018
08:02 AM
Hi all, I'm trying to create an alert in HCP: if a certain field has a certain value, it should trigger an alert. How can I do this? Thanks
- Tags:
- CyberSecurity
- Metron
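As a sketch of one common approach to the question above (not confirmed by the original thread), the sensor's parser config can set Metron's is_alert field with a Stellar field transformation; my_field and my_value below are hypothetical placeholders:

{
  "fieldTransformations": [
    {
      "transformation": "STELLAR",
      "output": ["is_alert"],
      "config": {
        "is_alert": "my_field == 'my_value'"
      }
    }
  ]
}

With is_alert set to true, the message should appear in the Alerts UI, and threat-triage rules in the enrichment config can then assign it a score.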
05-28-2018
08:39 AM
Hi all, I was trying to create a syslog sensor in HCP. The sensor works and the data is successfully parsed and stored in Elasticsearch. But the year is not available in syslog timestamps, so when I set the time from the log as 'timestamp' in the Grok parser's JSON raw config, the timestamp in the Elasticsearch syslog index ends up in the year 1970. Is there any way to solve this? Thank you.
05-28-2018
04:42 AM
Problem solved. I followed this link and added the following to the JSON raw config, after "patternLabel":
"timestampField": "timestamp",
"timeFields": ["timestamp"],
"dateFormat": "dd/MMM/yyyy:HH:mm:ss Z"
05-28-2018
04:31 AM
I have never used Storm with Logstash, but in my case I needed to specify which field of the log should be used as the timestamp. Example: there is a timestamp field in my log, and I named it 'timestamp'; I then had to specify that this 'timestamp' field should be used as the timestamp sent to Elasticsearch. After that my problem was solved. Hope it helps.
05-25-2018
02:07 AM
The problem is solved. I followed this link: https://community.hortonworks.com/questions/167633/metron-grok-parser-indexing-and-enrichment-not-wor.html and it helped.
05-24-2018
04:10 AM
Hi all, I'm trying to add a new http-access log telemetry with a date-formatted timestamp to HCP, but for the sensor to work I need to specify the timestampField in the sensor's JSON raw config, which only recognizes epoch format. How can I convert the log's date format (e.g. 20/May/2017:12:19:39 +0700) to epoch format with the Grok parser? Thanks
05-17-2018
03:13 AM
Addition: after I deliberately made the parser fail, the errors were successfully sent to Elasticsearch and HDFS under the error index. But when the parser is correct, the events fail to reach Elasticsearch and HDFS.
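A useful check in this situation (a diagnostic sketch, not part of the original post; the broker host is a placeholder taken from a hostname that appears elsewhere in this thread) is whether enriched messages actually arrive on the indexing Kafka topic before the indexing topology writes them out:

# Watch the "indexing" topic that the enrichment topology writes to;
# if messages appear here but no index is created, the problem is in
# the indexing topology or its Elasticsearch/HDFS writer config.
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh \
  --bootstrap-server workernode-01:6667 \
  --topic indexing \
  --from-beginning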
04-26-2018
09:54 AM
I tried to add a new telemetry sensor to Metron. The Metron Management UI and the Storm UI show the sensor running and transmitting, but Elasticsearch will not create an index for the sensor. Please help. Thanks
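For reference, a minimal sketch of the per-sensor indexing config that controls whether the Elasticsearch and HDFS writers emit documents for a sensor (the sensor name mysensor and the batch size are placeholders):

{
  "elasticsearch": { "index": "mysensor", "batchSize": 5, "enabled": true },
  "hdfs":          { "index": "mysensor", "batchSize": 5, "enabled": true }
}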
04-26-2018
09:50 AM
It worked after I reinstalled the Kafka broker on the node that previously had a Kafka broker on it.
04-16-2018
04:46 AM
Hi all, when I try to add or edit a Metron sensor and enter the Kafka topic, I run into two problems: 1. When I enter the topic, the notification below the textbox says "kafka topic exists. Not emitting", even though the topic is actually emitting messages when I check it with the console consumer (picture 1). 2. After I save the configuration and check it again, the Kafka section states "no kafka topic" (picture 2). Thank you.
04-13-2018
11:32 AM
It is actually running after I restarted the server.
04-03-2018
04:20 AM
@Constantin Stanca How do I list the application IDs? I tried the 'yarn application -list' command but the output is empty.
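As a side note (an assumption about the standard YARN CLI, not something stated in the thread), 'yarn application -list' only shows running applications by default; including all states also shows finished and failed ones:

# List YARN applications in every state, not just RUNNING
yarn application -list -appStates ALL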
03-28-2018
09:25 AM
@asubramanian Below is the urllib3 version:
[root@workernode-01 ~]# pip list | grep url
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
pycurl (7.19.0)
urlgrabber (3.10)
urllib3 (1.22)
03-28-2018
08:23 AM
@asubramanian After I installed requests 2.6.1 as you advised, this error still persists:
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/ambari_agent/PythonReflectiveExecutor.py", line 59, in run_file
imp.load_source('__main__', script)
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/indexing_master.py", line 18, in <module>
import requests
File "/usr/lib/python2.7/site-packages/requests/__init__.py", line 53, in <module>
from .packages.urllib3.contrib import pyopenssl
File "/usr/lib/python2.7/site-packages/requests/packages/__init__.py", line 3, in <module>
from . import urllib3
File "/usr/lib/python2.7/site-packages/requests/packages/__init__.py", line 61, in load_module
AttributeError: 'NoneType' object has no attribute 'modules'
03-28-2018
07:03 AM
@asubramanian Here is the result:
[root@workernode-01 ~]# pip list | grep requests
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
requests (2.18.4)
My cluster is built on VMs with RHIF bare metal, with Metron version 0.4.1.1.4.1.0.
03-28-2018
03:25 AM
@asubramanian I looked at ambari-agent.log and found this:
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/ambari_agent/PythonReflectiveExecutor.py", line 59, in run_file
imp.load_source('__main__', script)
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/indexing_master.py", line 18, in <module>
import requests
File "/usr/lib/python2.7/site-packages/requests/__init__.py", line 43, in <module>
import urllib3
File "/usr/lib/python2.7/site-packages/urllib3/__init__.py", line 8, in <module>
from .connectionpool import (
File "/usr/lib/python2.7/site-packages/urllib3/connectionpool.py", line 11, in <module>
from .exceptions import (
File "/usr/lib/python2.7/site-packages/urllib3/exceptions.py", line 2, in <module>
from .packages.six.moves.http_client import (
File "/usr/lib/python2.7/site-packages/urllib3/packages/six.py", line 198, in load_module
return sys.modules[fullname]
AttributeError: 'NoneType' object has no attribute 'modules'
I already updated python-requests, but it looks like the problem is python-urllib3 now.
03-27-2018
06:42 AM
When I start the Metron indexing service via Ambari, the service starts successfully but stops after a few seconds. I looked into the /var/log/metron/ directory but only found metron-rest.log. Where are the log files for the other Metron components? Thanks.
- Tags:
- CyberSecurity
- Metron
03-26-2018
12:10 PM
It works, thanks for the advice
03-23-2018
12:36 PM
Hi, I upgraded my HCP mpack from HCP-1.4.0.0-38 to HCP-1.4.1.0-18, but the ambari-hdp-1.repo repository file on each host still points to HCP-1.4.0.0-38:
[HCP-1.4.0.0-38-repo-1]
name=HCP-1.4.0.0-38-repo-1
baseurl=http://s3.amazonaws.com/dev.hortonworks.com/HCP/centos6/1.x/BUILDS/1.4.0.0-38
path=/
enabled=1
gpgcheck=0
When I install the package, it always fails with this error: "Python script has been killed due to timeout after waiting 1800 secs". Thanks.
03-23-2018
04:27 AM
Problem solved. I needed to create the nifi user folder on Hadoop.
03-21-2018
10:00 AM
@Geoffrey Shelton Okot I installed it on bare metal; I have one dedicated server for Ambari. If I want to uninstall Metron and reinstall it on the cluster again, how do I do it?
03-20-2018
07:11 AM
@asubramanian I tried to install Metron via Ambari with the HCP-1.4.0.0-38 mpack, but it failed, so I deleted the service via the Ambari UI. I then upgraded the mpack to HCP-1.4.1.0-18, but the repo in /etc/yum.repos.d/ambari-hdp-1.repo on each host still pointed to HCP-1.4.0.0-38, so I downloaded the repo for HCP-1.4.1.0-18 and placed it on each host. After that the installation succeeded. There are now two version folders under /usr/hcp (1.4.0.0-38 and 1.4.1.0-18), and zk_load_configs.sh is available in the 1.4.0.0-38 folder but not in the 1.4.1.0-18 folder. If I want to re-install Metron from the start, how do I remove Metron cleanly from the cluster? Thank you.
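For what it's worth, one common way to remove a service once it no longer cleans up from the Ambari UI is the Ambari REST API; a sketch assuming default admin credentials and the standard endpoint (the Ambari host and cluster name are placeholders):

# Stop the Metron service, then delete it from the cluster definition.
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop METRON"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://<ambari-host>:8080/api/v1/clusters/<cluster-name>/services/METRON
curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE \
  http://<ambari-host>:8080/api/v1/clusters/<cluster-name>/services/METRON

The installed packages and the /usr/hcp version folders on each host would still need to be removed separately.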
03-20-2018
02:24 AM
@Geoffrey Shelton Okot Here is my cluster: 2 master nodes, 3 worker nodes, 1 database node, 1 edge node. I installed the Metron components on workernode-01 and on the edge node (for the UIs), because the Metron components have co-location rules with other components (e.g. the Metron parser must be on the same node as a Kafka broker). How do I check whether ZooKeeper already has a quorum, since I installed and started it via Ambari?
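A quick way to check quorum (a sketch using ZooKeeper's standard four-letter-word commands; the hostnames below are taken from the quorum string that appears later in this thread):

# Ask each ZooKeeper server for its mode; in a healthy quorum one
# server reports "Mode: leader" and the others "Mode: follower".
for zk in masternode-01 masternode-02 insight-svr; do
  echo -n "$zk: "
  echo stat | nc "$zk" 2181 | grep Mode
done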
03-19-2018
02:48 PM
@Geoffrey Shelton Okot Yes, I followed the installation guide via Ambari at this link, except that I installed Metron on a cluster that already exists. But after I installed Metron and tried to start it, zk_load_configs.sh was not available. How do I solve this? Thanks
03-19-2018
11:42 AM
Hi, I was installing Metron via Ambari and it succeeded, but when starting the service I got this error:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/enrichment_master.py", line 117, in <module>
Enrichment().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/enrichment_master.py", line 111, in restart
self.configure(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 120, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/enrichment_master.py", line 48, in configure
metron_service.init_zk_config(params)
File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.1.1.4.1.0/package/scripts/metron_service.py", line 41, in init_zk_config
path=ambari_format("{java_home}/bin")
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hcp/1.4.1.0-18/metron/bin/zk_load_configs.sh --zk_quorum masternode-02:2181,masternode-01:2181,insight-svr:2181 --mode PUSH --input_dir /usr/hcp/1.4.1.0-18/metron/config/zookeeper' returned 127. /bin/bash: /usr/hcp/1.4.1.0-18/metron/bin/zk_load_configs.sh: No such file or directory
Thanks.
03-19-2018
11:36 AM
The problem is solved. I upgraded the Ambari mpack to 1.4.1.0-18 and it works now.
03-16-2018
10:53 AM
Hi, I tried to install Metron via Ambari and encountered a failure with the stderr and stdout below.
stderr:
Python script has been killed due to timeout after waiting 1800 secs
stdout:
2018-03-16 17:16:02,008 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-03-16 17:16:02,015 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-03-16 17:16:02,017 - Group['metron'] {}
2018-03-16 17:16:02,018 - Group['ranger'] {}
2018-03-16 17:16:02,018 - Group['hdfs'] {}
2018-03-16 17:16:02,019 - Group['zeppelin'] {}
2018-03-16 17:16:02,019 - Group['hadoop'] {}
2018-03-16 17:16:02,019 - Group['nifi'] {}
2018-03-16 17:16:02,020 - Group['users'] {}
2018-03-16 17:16:02,020 - Group['knox'] {}
2018-03-16 17:16:02,021 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,023 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,024 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,025 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,027 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger'], 'uid': None}
2018-03-16 17:16:02,028 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['nifi'], 'uid': None}
2018-03-16 17:16:02,030 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,031 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,033 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,034 - User['streamline'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,035 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,037 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,038 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-16 17:16:02,040 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-16 17:16:02,041 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2018-03-16 17:16:02,043 - User['metron'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,044 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,045 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,047 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-16 17:16:02,048 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,050 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-03-16 17:16:02,051 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,053 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,054 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,055 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-16 17:16:02,056 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-16 17:16:02,059 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-03-16 17:16:02,073 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-03-16 17:16:02,074 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-03-16 17:16:02,075 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-16 17:16:02,078 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-16 17:16:02,079 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-03-16 17:16:02,099 - call returned (0, '1053')
2018-03-16 17:16:02,101 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1053'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-03-16 17:16:02,115 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1053'] due to not_if
2018-03-16 17:16:02,117 - Group['hdfs'] {}
2018-03-16 17:16:02,118 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-03-16 17:16:02,119 - FS Type:
2018-03-16 17:16:02,120 - Directory['/etc/hadoop'] {'mode': 0755}
2018-03-16 17:16:02,156 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-03-16 17:16:02,157 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-03-16 17:16:02,197 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,216 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-16 17:16:02,218 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,219 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,226 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-16 17:16:02,227 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,228 - Repository['HDF-3.0-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,235 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0\n[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-16 17:16:02,236 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,237 - Repository['elasticsearch-5.x-repo-1'] {'append_to_file': True, 'base_url': 'https://artifacts.elastic.co/packages/5.x/yum', 'action': ['create'], 'components': [u'ELASTICSEARCH', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,244 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': ...}
2018-03-16 17:16:02,245 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,246 - Repository['ES-Curator-5.x-repo-1'] {'append_to_file': True, 'base_url': 'http://packages.elastic.co/curator/5/centos/7', 'action': ['create'], 'components': [u'CURATOR', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,254 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': ...}
2018-03-16 17:16:02,254 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,255 - Repository['kibana-5.x-repo-1'] {'append_to_file': True, 'base_url': 'https://artifacts.elastic.co/packages/5.x/yum', 'action': ['create'], 'components': [u'KIBANA', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,262 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': ...}
2018-03-16 17:16:02,263 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,264 - Repository['HCP-1.4.0.0-38-repo-1'] {'append_to_file': True, 'base_url': 'http://s3.amazonaws.com/dev.hortonworks.com/HCP/centos6/1.x/BUILDS/1.4.0.0-38', 'action': ['create'], 'components': [u'METRON', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-16 17:16:02,271 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': ...}
2018-03-16 17:16:02,272 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-16 17:16:02,273 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:02,467 - Skipping installation of existing package unzip
2018-03-16 17:16:02,467 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:02,488 - Skipping installation of existing package curl
2018-03-16 17:16:02,488 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:02,508 - Skipping installation of existing package hdp-select
2018-03-16 17:16:02,518 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-03-16 17:16:02,529 - Skipping stack-select on METRON because it does not exist in the stack-select package structure.
2018-03-16 17:16:03,009 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-03-16 17:16:03,013 - Package['metron-common'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:03,113 - Skipping installation of existing package metron-common
2018-03-16 17:16:03,114 - Package['metron-data-management'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-16 17:16:03,125 - Installing package metron-data-management ('/usr/bin/yum -d 0 -e 0 -y install metron-data-management')
Command failed after 1 tries
Thank you.