Created 12-29-2015 09:24 PM
Created 12-29-2015 09:55 PM
You can use Ambari VNC service: https://community.hortonworks.com/repos/8321/ambar...
This will install a desktop environment, VNC, and (optionally) Eclipse/IntelliJ and Maven on the sandbox so you can 'remote desktop' into it and start developing on it. It also provides instructions to import sample code for Storm, Spark, and NiFi.
Alternatively, you can use the mini-clusters project to bring up hadoop components in your local Eclipse/IntelliJ environment: http://hortonworks.com/partners/learn/#dev
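For reference, the usual pattern for deploying a custom Ambari service like the VNC one is to clone it into the stack's services directory and restart Ambari. This is only a sketch of the standard Ambari layout, not the article's exact steps; the real clone URL is in the linked article (shown here as a placeholder):

```shell
# Detect the installed HDP stack version (e.g. "2.3"); the sed pattern is an
# assumption about hdp-select's output format - verify on your sandbox.
VERSION=$(hdp-select status hadoop-client | sed 's/.*- \([0-9]*\.[0-9]*\).*/\1/')

# Clone the VNC service definition into the stack (replace the placeholder
# with the repo URL from the linked article).
sudo git clone <vnc-service-repo-url> \
  /var/lib/ambari-server/resources/stacks/HDP/$VERSION/services/VNCSERVER

# Restart Ambari so it picks up the new service definition.
sudo ambari-server restart
```

After the restart, the service should appear under Actions > Add Service in the Ambari UI.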
Created 09-29-2020 10:10 AM
Hi,
I have followed all the instructions given in your blog.
I used the below URL for the installation of Eclipse
rather than
as the Neon release of Eclipse was not available at the above URL.
The VNC server install failed, and I am getting the error below:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/VNCSERVER/package/scripts/master.py", line 132, in <module>
Master().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/VNCSERVER/package/scripts/master.py", line 31, in install
Execute('yum groupinstall -y Desktop >> '+params.log_location)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'yum groupinstall -y Desktop >> /var/log/vnc-stack.log' returned 1. There is no installed groups file.
Maybe run: yum groups mark convert (see man yum)
http://s3.amazonaws.com/dev.hortonworks.com/DAS/centos7/1.x/BUILDS/1.0.2.0-6/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
To address this issue please refer to the below wiki article
https://wiki.centos.org/yum-errors
If above article doesn't help to resolve this issue please use https://bugs.centos.org/.
http://ftp.acc.umu.se/mirror/ius.io/stable/CentOS/7/x86_64/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
To address this issue please refer to the below wiki article
https://wiki.centos.org/yum-errors
If above article doesn't help to resolve this issue please use https://bugs.centos.org/.
http://ius.mirror.constant.com/stable/CentOS/7/x86_64/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://mirrors.tuna.tsinghua.edu.cn/ius/stable/CentOS/7/x86_64/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://hkg.mirror.rackspace.com/ius/stable/CentOS/7/x86_64/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://ftp.rediris.es/mirror/IUS_Community/stable/CentOS/7/x86_64/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
https://download.postgresql.org/pub/repos/yum/9.6/redhat/rhel-7-x86_64/repodata/repomd.xml: [Errno 14] curl#60 - "Peer's Certificate issuer is not recognized."
Trying other mirror.
It was impossible to connect to the CentOS servers. This could mean a connectivity issue in your environment, such as the requirement to configure a proxy, or a transparent proxy that tampers with TLS security, or an incorrect system clock. You can try to solve this issue by using the instructions on https://wiki.centos.org/yum-errors
If above article doesn't help to resolve this issue please use https://bugs.centos.org/.
http://mirrors.up.pt/pub/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] curl#52 - "Empty reply from server"
Trying other mirror.
http://lon.mirror.rackspace.com/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://syd.mirror.rackspace.com/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://mirror.amsiohosting.net/iuscommunity.org/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
http://muug.ca/mirror/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://mirror.its.dal.ca/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://dfw.mirror.rackspace.com/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://ftp.upcnet.ro/mirrors/iuscommunity.org/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://ius.mirror.digitalpacific.com.au/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://mirror.ehv.weppel.nl/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://mirrors.ircam.fr/pub/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTP Error 404 - Not Found
Trying other mirror.
http://mirrors.kernel.org/ius/stable/CentOS/7/x86_64/repodata/primary.sqlite.bz2: [Errno 14] HTTPS Error 301 - Moved Permanently
Trying other mirror.
http://mirrors.viethosting.com/centos/7.8.2003/updates/x86_64/repodata/3ceff108624118bff78dd0800f6d36b716cce8ad5606653c446b834e76b3b915-primary.sqlite.bz2: [Errno 12] Timeout on http://mirrors.viethosting.com/centos/7.8.2003/updates/x86_64/repodata/3ceff108624118bff78dd0800f6d36b716cce8ad5606653c446b834e76b3b915-primary.sqlite.bz2: (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30 seconds')
Trying other mirror.
Warning: group Desktop does not exist.
Maybe run: yum groups mark install (see man yum)
Error: No packages in any requested group available to install or update
2020-09-29 16:55:19,254 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2020-09-29 16:55:19,341 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2020-09-29 16:55:19,362 - Group['livy'] {}
2020-09-29 16:55:19,382 - Group['spark'] {}
2020-09-29 16:55:19,385 - Group['ranger'] {}
2020-09-29 16:55:19,395 - Group['hdfs'] {}
2020-09-29 16:55:19,398 - Group['zeppelin'] {}
2020-09-29 16:55:19,400 - Group['hadoop'] {}
2020-09-29 16:55:19,402 - Group['users'] {}
2020-09-29 16:55:19,405 - Group['knox'] {}
2020-09-29 16:55:19,425 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,439 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,452 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,465 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,484 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,510 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,522 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2020-09-29 16:55:19,541 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,552 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2020-09-29 16:55:19,565 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2020-09-29 16:55:19,582 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2020-09-29 16:55:19,596 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2020-09-29 16:55:19,610 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,629 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2020-09-29 16:55:19,642 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2020-09-29 16:55:19,655 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,671 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2020-09-29 16:55:19,685 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,703 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,723 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,735 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-09-29 16:55:19,750 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2020-09-29 16:55:19,757 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-09-29 16:55:19,799 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2020-09-29 16:55:20,330 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2020-09-29 16:55:20,334 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2020-09-29 16:55:20,350 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-09-29 16:55:20,374 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-09-29 16:55:20,386 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2020-09-29 16:55:20,481 - call returned (0, '1015')
2020-09-29 16:55:20,487 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2020-09-29 16:55:20,557 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] due to not_if
2020-09-29 16:55:20,565 - Group['hdfs'] {}
2020-09-29 16:55:20,571 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2020-09-29 16:55:20,579 - FS Type: HDFS
2020-09-29 16:55:20,581 - Directory['/etc/hadoop'] {'mode': 0755}
2020-09-29 16:55:20,840 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2020-09-29 16:55:20,850 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2020-09-29 16:55:20,854 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2020-09-29 16:55:21,025 - Repository['DAS-1.0.2.0-6-repo-1'] {'base_url': 'http://s3.amazonaws.com/dev.hortonworks.com/DAS/centos7/1.x/BUILDS/1.0.2.0-6', 'action': ['prepare'], 'components': [u'dasbn-repo', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2020-09-29 16:55:21,122 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2020-09-29 16:55:21,151 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0 is not created due to its tags: set([u'GPL'])
2020-09-29 16:55:21,153 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2020-09-29 16:55:21,180 - Repository[None] {'action': ['create']}
2020-09-29 16:55:21,199 - File['/tmp/tmpVORN_U'] {'content': '[DAS-1.0.2.0-6-repo-1]\nname=DAS-1.0.2.0-6-repo-1\nbaseurl=http://s3.amazonaws.com/dev.hortonworks.com/DAS/centos7/1.x/BUILDS/1.0.2.0-6\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2020-09-29 16:55:21,204 - Writing File['/tmp/tmpVORN_U'] because contents don't match
2020-09-29 16:55:21,214 - File['/tmp/tmpoYQoZv'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-1.repo')}
2020-09-29 16:55:21,225 - Writing File['/tmp/tmpoYQoZv'] because contents don't match
2020-09-29 16:55:21,231 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-09-29 16:55:22,958 - Skipping installation of existing package unzip
2020-09-29 16:55:22,961 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-09-29 16:55:23,131 - Skipping installation of existing package curl
2020-09-29 16:55:23,133 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-09-29 16:55:23,271 - Skipping installation of existing package hdp-select
2020-09-29 16:55:23,363 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2020-09-29 16:55:23,451 - Skipping stack-select on VNC because it does not exist in the stack-select package structure.
2020-09-29 16:55:26,340 - Execute['echo "installing Desktop" >> /var/log/vnc-stack.log'] {}
2020-09-29 16:55:26,405 - Execute['yum groupinstall -y Desktop >> /var/log/vnc-stack.log'] {}
2020-09-29 16:59:04,098 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2020-09-29 16:59:04,212 - Skipping stack-select on VNC because it does not exist in the stack-select package structure.
Command failed after 1 tries
Please let me know what I am doing wrong.
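The stderr above actually shows two separate problems: several configured yum repos are dead or unreachable (the 403/404 mirror errors), and yum reports that the group "Desktop" does not exist. On CentOS 7 the desktop group is typically named "GNOME Desktop" rather than "Desktop", so one workaround worth trying is roughly the following (the group name is an assumption; confirm it on your host first):

```shell
# Rebuild the installed-groups metadata, as the error message itself suggests.
yum groups mark convert

# List the group names your repos actually provide; look for a desktop group.
yum grouplist | grep -i desktop

# Install the group under its CentOS 7 name (quoted because it contains a
# space), appending to the same log the VNC service script uses.
yum groupinstall -y "GNOME Desktop" >> /var/log/vnc-stack.log
```

If the group install then succeeds manually, the same change can be made where the service runs `yum groupinstall -y Desktop` (line 31 of the master.py shown in the traceback). The dead DAS/IUS repos may also need to be disabled or removed from /etc/yum.repos.d/ for yum to get that far.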
Created 06-16-2021 07:30 AM
Hi abajwa,
I tried to install VNC using the following link; however, I am getting the errors which were posted here before.
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/VNCSERVER/package/scripts/master.py", line 132, in <module>
Master().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/VNCSERVER/package/scripts/master.py", line 31, in install
Execute('yum groupinstall -y Desktop >> '+params.log_location)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'yum groupinstall -y Desktop >> /var/log/vnc-stack.log' returned 1. There is no installed groups file.
Maybe run: yum groups mark convert (see man yum)
http://s3.amazonaws.com/dev.hortonworks.com/DAS/centos7/1.x/BUILDS/1.0.2.0-6/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
To address this issue please refer to the below wiki article
https://wiki.centos.org/yum-errors
If above article doesn't help to resolve this issue please use https://bugs.centos.org/.
http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.7.1.0/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
Warning: group Desktop does not exist.
Maybe run: yum groups mark install (see man yum)
Error: No packages in any requested group available to install or update
stdout:
2021-06-16 14:15:21,973 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2021-06-16 14:15:21,982 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2021-06-16 14:15:21,986 - Group['livy'] {}
2021-06-16 14:15:21,991 - Group['spark'] {}
2021-06-16 14:15:21,991 - Group['ranger'] {}
2021-06-16 14:15:21,992 - Group['hdfs'] {}
2021-06-16 14:15:21,992 - Group['zeppelin'] {}
2021-06-16 14:15:21,992 - Group['hadoop'] {}
2021-06-16 14:15:21,993 - Group['users'] {}
2021-06-16 14:15:21,993 - Group['knox'] {}
2021-06-16 14:15:21,994 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:21,997 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:21,999 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,005 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,007 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,010 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,012 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-06-16 14:15:22,014 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,018 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2021-06-16 14:15:22,020 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-06-16 14:15:22,021 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2021-06-16 14:15:22,022 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2021-06-16 14:15:22,026 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,028 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2021-06-16 14:15:22,030 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-06-16 14:15:22,033 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,035 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2021-06-16 14:15:22,037 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,038 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,040 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,042 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-06-16 14:15:22,044 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2021-06-16 14:15:22,046 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-06-16 14:15:22,058 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2021-06-16 14:15:22,081 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2021-06-16 14:15:22,082 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2021-06-16 14:15:22,087 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-06-16 14:15:22,090 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-06-16 14:15:22,093 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2021-06-16 14:15:22,149 - call returned (0, '1015')
2021-06-16 14:15:22,151 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2021-06-16 14:15:22,169 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] due to not_if
2021-06-16 14:15:22,170 - Group['hdfs'] {}
2021-06-16 14:15:22,170 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2021-06-16 14:15:22,172 - FS Type: HDFS
2021-06-16 14:15:22,172 - Directory['/etc/hadoop'] {'mode': 0755}
2021-06-16 14:15:22,214 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-06-16 14:15:22,216 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2021-06-16 14:15:22,218 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2021-06-16 14:15:22,241 - Repository['DAS-1.0.2.0-6-repo-1'] {'base_url': 'http://s3.amazonaws.com/dev.hortonworks.com/DAS/centos7/1.x/BUILDS/1.0.2.0-6', 'action': ['prepare'], 'components': [u'dasbn-repo', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2021-06-16 14:15:22,258 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2021-06-16 14:15:22,262 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0 is not created due to its tags: set([u'GPL'])
2021-06-16 14:15:22,262 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2021-06-16 14:15:22,267 - Repository[None] {'action': ['create']}
2021-06-16 14:15:22,268 - File['/tmp/tmpxKCqje'] {'content': '[DAS-1.0.2.0-6-repo-1]\nname=DAS-1.0.2.0-6-repo-1\nbaseurl=http://s3.amazonaws.com/dev.hortonworks.com/DAS/centos7/1.x/BUILDS/1.0.2.0-6\n\npath=/\nenabled=1\ng...'}
2021-06-16 14:15:22,271 - Writing File['/tmp/tmpxKCqje'] because contents don't match
2021-06-16 14:15:22,273 - File['/tmp/tmpMxNzOX'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-1.repo')}
2021-06-16 14:15:22,281 - Writing File['/tmp/tmpMxNzOX'] because contents don't match
2021-06-16 14:15:22,283 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-06-16 14:15:22,665 - Skipping installation of existing package unzip
2021-06-16 14:15:22,665 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-06-16 14:15:22,680 - Skipping installation of existing package curl
2021-06-16 14:15:22,681 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-06-16 14:15:22,695 - Skipping installation of existing package hdp-select
2021-06-16 14:15:22,702 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2021-06-16 14:15:22,708 - Skipping stack-select on VNC because it does not exist in the stack-select package structure.
2021-06-16 14:15:22,966 - Execute['echo "installing Desktop" >> /var/log/vnc-stack.log'] {}
2021-06-16 14:15:22,972 - Execute['yum groupinstall -y Desktop >> /var/log/vnc-stack.log'] {}
2021-06-16 14:15:54,364 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
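Note that in this 2021 run every public-repo-1.hortonworks.com URL returns 403 Forbidden; Cloudera moved the HDP repositories behind authentication in 2021, so anonymous yum access to them now fails regardless of the VNC service. A small helper like the one below can make the mirror errors quicker to read (the code-to-advice mapping is illustrative, not an official error table):

```shell
# Classify a yum mirror's HTTP status code into a short diagnosis.
repo_status() {
  case "$1" in
    200) echo "usable" ;;
    403) echo "forbidden - repo likely requires Cloudera credentials" ;;
    404) echo "not found - mirror path retired" ;;
    *)   echo "unreachable (HTTP $1)" ;;
  esac
}
```

To probe a failing repo, feed it the status from curl, e.g. `repo_status "$(curl -s -o /dev/null -w '%{http_code}' http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0/repodata/repomd.xml)"` (requires network access).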