
Error while setting up HDP-2.6.0.3


Contributor

Hi Guys,

We are setting up an HDP-2.6.0.3 cluster on 4 nodes running CentOS 6.5. While installing through Ambari we get the error below. Please note that we are using a local repository for the setup. We urgently need your help.

Spark2 client install issue:

stderr:   /var/lib/ambari-agent/data/errors-450.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 64, in <module>
    MysqlServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 33, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 605, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-server' returned 1. Error: Nothing to do

stdout:   /var/lib/ambari-agent/data/output-450.txt
2017-06-13 14:16:23,830 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-06-13 14:16:23,830 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-06-13 14:16:23,831 - Group['livy'] {}
2017-06-13 14:16:23,832 - Group['spark'] {}
2017-06-13 14:16:23,832 - Group['hadoop'] {}
2017-06-13 14:16:23,833 - Group['users'] {}
2017-06-13 14:16:23,833 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,833 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,834 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,834 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,835 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-06-13 14:16:23,835 - User['logsearch'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,836 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,836 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,837 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-06-13 14:16:23,837 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,837 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,838 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,838 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,839 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-13 14:16:23,839 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-06-13 14:16:23,840 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-06-13 14:16:23,846 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-06-13 14:16:23,846 - Group['hdfs'] {}
2017-06-13 14:16:23,846 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-06-13 14:16:23,847 - FS Type: 
2017-06-13 14:16:23,847 - Directory['/etc/hadoop'] {'mode': 0755}
2017-06-13 14:16:23,865 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-06-13 14:16:23,865 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-06-13 14:16:23,878 - Initializing 2 repositories
2017-06-13 14:16:23,879 - Repository['HDP-2.6'] {'base_url': 'http://3.209.124.205/HDP/centos6', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-06-13 14:16:23,886 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.6]\nname=HDP-2.6\nbaseurl=http://3.209.124.205/HDP/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-06-13 14:16:23,887 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://3.209.124.205/HDP-UTILS-1.1.0.21', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-06-13 14:16:23,891 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://3.209.124.205/HDP-UTILS-1.1.0.21\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-06-13 14:16:23,892 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-13 14:16:23,936 - Skipping installation of existing package unzip
2017-06-13 14:16:23,936 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-13 14:16:23,946 - Skipping installation of existing package curl
2017-06-13 14:16:23,946 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-13 14:16:23,956 - Skipping installation of existing package hdp-select
2017-06-13 14:16:24,104 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-06-13 14:16:24,109 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2017-06-13 14:16:24,126 - call returned (0, 'hive-server2 - 2.6.0.3-8')
2017-06-13 14:16:24,127 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-06-13 14:16:24,130 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://nn.tcsgegdc.com:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2017-06-13 14:16:24,131 - Not downloading the file from http://nn.tcsgegdc.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2017-06-13 14:16:24,132 - checked_call[('/usr/java/jdk1.8.0_131/bin/java', '-cp', '/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'javax.jdo.option.ConnectionPassword', '-provider', 'jceks://file/var/lib/ambari-agent/cred/conf/hive/hive-site.jceks')] {}
2017-06-13 14:16:24,683 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nJun 13, 2017 2:16:24 PM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\nhive')
2017-06-13 14:16:24,688 - Package['mysql-server'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-13 14:16:24,739 - Installing package mysql-server ('/usr/bin/yum -d 0 -e 0 -y install mysql-server')
2017-06-13 14:16:24,992 - Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-server' returned 1. Error: Nothing to do
2017-06-13 14:16:24,993 - Failed to install package mysql-server. Executing '/usr/bin/yum clean metadata'
2017-06-13 14:16:25,093 - Retrying to install package mysql-server after 30 seconds

Command failed after 1 tries
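
For anyone hitting the same "Nothing to do" failure against a local repository, a minimal diagnostic sketch is below. It is not from the thread; the repo IDs (HDP-2.6, HDP-UTILS-1.1.0.21) are taken from the log above, and the commands assume a standard yum setup on the failing node.

```shell
# Sketch only: see why yum answers "Nothing to do" for mysql-server
# when installing from a local mirror.
yum clean metadata                  # drop stale metadata cached from the local repos
yum repolist enabled                # confirm HDP-2.6 and HDP-UTILS-1.1.0.21 resolve
yum list available mysql-server     # check the package is actually published in them
```

If the last command lists nothing, the package is missing from the local mirror and needs to be synced there before the Ambari retry can succeed.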

7 Replies

Re: Error while setting up HDP-2.6.0.3

Contributor

We are also getting the error below.

MySQL server install error:

stderr:   /var/lib/ambari-agent/data/errors-450.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 64, in <module>
    MysqlServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 33, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 605, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-server' returned 1. Error: Nothing to do

Re: Error while setting up HDP-2.6.0.3

Contributor

Sorry for the wrong log. The Spark2 client install error log is below.

stderr:   /var/lib/ambari-agent/data/errors-414.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/spark_client.py", line 60, in <module>
    SparkClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/spark_client.py", line 36, in install
    self.configure(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 117, in locking_configure
    original_configure(obj, *args, **kw)
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/spark_client.py", line 42, in configure
    setup_spark(env, 'client', upgrade_type=upgrade_type, action = 'config')
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/setup_spark.py", line 56, in setup_spark
    mode=0644
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/properties_file.py", line 54, in action_create
    mode = self.resource.mode
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 120, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/spark2-client/conf/spark-defaults.conf'] failed, parent directory /usr/hdp/current/spark2-client/conf doesn't exist
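
The failure above means the path /usr/hdp/current/spark2-client/conf does not resolve, which on HDP normally points through an hdp-select symlink. A sketch of how one might inspect this on the affected node (assuming the standard hdp-select layout, as seen for hive-server2 in the earlier log) is:

```shell
# Sketch only: verify the symlinks the Spark2 client scripts expect.
hdp-select status spark2-client             # should report a 2.6.0.3-* version, not "None"
ls -l /usr/hdp/current/spark2-client        # should be a symlink into /usr/hdp/2.6.0.3-*/spark2
ls -ld /usr/hdp/current/spark2-client/conf  # the directory the install step could not find
```

If `hdp-select status` reports no version, the spark2 packages themselves likely failed to install from the local repository, and the conf directory error is only a symptom.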


Re: Error while setting up HDP-2.6.0.3

Contributor

We really need your help, as we are stuck on this setup.

Re: Error while setting up HDP-2.6.0.3

Contributor

Hi Guys,

We are looking for your help here; we are stuck with the cluster setup.

Thanks and Regards,

Rajdip

Re: Error while setting up HDP-2.6.0.3

Contributor

We have now handled the MySQL server issue, but the Spark2 issue still persists. For the time being we are shifting to Spark, but we need your help in resolving the Spark2 issue.

Re: Error while setting up HDP-2.6.0.3

Contributor

Hi Team,

After addressing the two issues we were able to proceed with the cluster setup, but we are now facing the error below at the "Check YARN" step on one of the nodes. The detailed log follows. We would really appreciate your help, as we have not received any responses from the community so far.

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/service_check.py", line 181, in <module>
    ServiceCheck().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/service_check.py", line 117, in service_check
    user=params.smokeuser,
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'yarn org.apache.hadoop.yarn.applications.distributedshell.Client -shell_command ls -num_containers 1 -jar /usr/hdp/current/hadoop-yarn-client/hadoop-yarn-applications-distributedshell.jar -timeout 300000 --queue default' returned 2. 17/06/13 15:56:38 INFO distributedshell.Client: Initializing Client
17/06/13 15:56:38 INFO distributedshell.Client: Running Client
17/06/13 15:56:39 INFO client.RMProxy: Connecting to ResourceManager at nn.tcsgegdc.com/3.209.124.205:8050
17/06/13 15:56:39 INFO client.AHSProxy: Connecting to Application History server at nn.tcsgegdc.com/3.209.124.205:10200
17/06/13 15:56:39 INFO distributedshell.Client: Got Cluster metric info from ASM, numNodeManagers=3

.........................................................................

17/06/13 15:56:40 INFO impl.YarnClientImpl: Submitted application application_1497349747602_0006
17/06/13 15:56:41 INFO distributedshell.Client: Got application report from ASM for, appId=6, clientToAMToken=null, appDiagnostics=AM container is launched, waiting for AM container to Register with RM, appMasterHost=N/A, appQueue=default, appMasterRpcPort=-1, appStartTime=1497349916566, yarnAppState=ACCEPTED, distributedFinalState=UNDEFINED, appTrackingUrl=http://nn.tcsgegdc.com:8088/proxy/application_1497349747602_0006/, appUser=ambari-qa
17/06/13 15:56:42 INFO distributedshell.Client: Got application report from ASM for, appId=6, clientToAMToken=null, appDiagnostics=AM container is launched, waiting for AM container to Register with RM, appMasterHost=N/A, appQueue=default, appMasterRpcPort=-1, appStartTime=1497349916566, yarnAppState=ACCEPTED, distributedFinalState=UNDEFINED, appTrackingUrl=http://nn.tcsgegdc.com:8088/proxy/application_1497349747602_0006/, appUser=ambari-qa
17/06/13 15:56:43 INFO distributedshell.Client: Got application report from ASM for, appId=6, clientToAMToken=null, appDiagnostics=AM container is launched, waiting for AM container to Register with RM, appMasterHost=N/A, appQueue=default, appMasterRpcPort=-1, appStartTime=1497349916566, yarnAppState=ACCEPTED, distributedFinalState=UNDEFINED, appTrackingUrl=http://nn.tcsgegdc.com:8088/proxy/application_1497349747602_0006/, appUser=ambari-qa
17/06/13 15:56:44 INFO distributedshell.Client: Got application report from ASM for, appId=6, clientToAMToken=null, appDiagnostics=, appMasterHost=dn3.tcsgegdc.com/3.209.124.208, appQueue=default, appMasterRpcPort=-1, appStartTime=1497349916566, yarnAppState=RUNNING, distributedFinalState=UNDEFINED, appTrackingUrl=http://nn.tcsgegdc.com:8088/proxy/application_1497349747602_0006/, appUser=ambari-qa
17/06/13 15:56:45 INFO distributedshell.Client: Got application report from ASM for, appId=6, clientToAMToken=null, appDiagnostics=, appMasterHost=dn3.tcsgegdc.com/3.209.124.208, appQueue=default, appMasterRpcPort=-1, appStartTime=1497349916566, yarnAppState=RUNNING, distributedFinalState=UNDEFINED, appTrackingUrl=http://nn.tcsgegdc.com:8088/proxy/application_1497349747602_0006/, appUser=ambari-qa
17/06/13 15:56:46 INFO distributedshell.Client: Got application report from ASM for, appId=6, clientToAMToken=null, appDiagnostics=Diagnostics., total=1, completed=1, allocated=1, failed=1, appMasterHost=dn3.tcsgegdc.com/3.209.124.208, appQueue=default, appMasterRpcPort=-1, appStartTime=1497349916566, yarnAppState=FINISHED, distributedFinalState=FAILED, appTrackingUrl=http://nn.tcsgegdc.com:8088/proxy/application_1497349747602_0006/, appUser=ambari-qa
17/06/13 15:56:46 INFO distributedshell.Client: Application did finished unsuccessfully. YarnState=FINISHED, DSFinalStatus=FAILED. Breaking monitoring loop
17/06/13 15:56:46 ERROR distributedshell.Client: Application failed to complete successfully

Re: Error while setting up HDP-2.6.0.3

Expert Contributor

Check the application log for application_1497349747602_0006 in the ResourceManager UI, and paste the stdout/stderr from the job history.
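
The suggested check can also be done from the command line. A sketch, using the application id reported in the log above and assuming log aggregation is enabled:

```shell
# Sketch only: pull the aggregated container logs for the failed
# distributed-shell service check, run as a user that can read them.
yarn logs -applicationId application_1497349747602_0006
```

The stderr of the failed container usually shows why the shell command exited non-zero on that node.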
