HBase doesn't restart when I change some configuration files in HDP 3.0.1

New Contributor

Hi Team,

We are not able to restart HBase after changing some configuration properties. Please find the logs below.

Can someone help with this?

stderr: /var/lib/ambari-agent/data/errors-9321.txt

2019-01-31 09:50:55,850 - Could not determine stack version for component hbase-master by calling '/usr/bin/hdp-select status hbase-master > /tmp/tmpZhqa5e'. Return Code: 1, Output: .
2019-01-31 09:50:55,966 - Could not determine stack version for component hbase-master by calling '/usr/bin/hdp-select status hbase-master > /tmp/tmpxw6jiA'. Return Code: 1, Output: .
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 992, in restart
    self.status(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HBASE/package/scripts/hbase_master.py", line 106, in status
    check_process_status(status_params.hbase_master_pid_file)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/check_process_status.py", line 43, in check_process_status
    raise ComponentIsNotRunning()
ComponentIsNotRunning

The above exception was the cause of the following exception:

2019-01-31 09:50:59,471 - Could not determine stack version for component hbase-master by calling '/usr/bin/hdp-select status hbase-master > /tmp/tmpmmubZi'. Return Code: 1, Output: .
2019-01-31 09:50:59,503 - The 'hbase-master' component did not advertise a version. This may indicate a problem with the component packaging.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HBASE/package/scripts/hbase_master.py", line 170, in <module>
    HbaseMaster().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 1003, in restart
    self.start(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HBASE/package/scripts/hbase_master.py", line 87, in start
    self.configure(env) # for security
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HBASE/package/scripts/hbase_master.py", line 45, in configure
    hbase(name='master')
  File "/usr/lib/ambari-agent/lib/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HBASE/package/scripts/hbase.py", line 224, in hbase
    owner=params.hbase_user
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 672, in action_create_on_execute
    self.action_delayed("create")
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 669, in action_delayed
    self.get_hdfs_resource_executor().action_delayed(action_name, self)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 360, in action_delayed
    main_resource.kinit()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 701, in kinit
    user=user
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-hcdl_dev@DEVHCDLRIL.COM' returned 1. kinit: Pre-authentication failed: Permission denied while getting initial credentials

stdout: /var/lib/ambari-agent/data/output-9321.txt

2019-01-31 09:50:55,039 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2019-01-31 09:50:55,060 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-01-31 09:50:55,376 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2019-01-31 09:50:55,382 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-01-31 09:50:55,385 - Group['livy'] {}
2019-01-31 09:50:55,386 - Group['spark'] {}
2019-01-31 09:50:55,386 - Group['ranger'] {}
2019-01-31 09:50:55,386 - Group['hdfs'] {}
2019-01-31 09:50:55,387 - Group['zeppelin'] {}
2019-01-31 09:50:55,387 - Group['hadoop'] {}
2019-01-31 09:50:55,387 - Group['users'] {}
2019-01-31 09:50:55,387 - Group['knox'] {}
2019-01-31 09:50:55,388 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,390 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,392 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,393 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,395 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,396 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-01-31 09:50:55,398 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,399 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,400 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2019-01-31 09:50:55,402 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-01-31 09:50:55,403 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2019-01-31 09:50:55,405 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2019-01-31 09:50:55,406 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,408 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2019-01-31 09:50:55,409 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-01-31 09:50:55,410 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,412 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2019-01-31 09:50:55,413 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,415 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,416 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,418 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-31 09:50:55,419 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2019-01-31 09:50:55,420 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-01-31 09:50:55,422 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-01-31 09:50:55,428 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-01-31 09:50:55,429 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2019-01-31 09:50:55,430 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-01-31 09:50:55,431 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-01-31 09:50:55,432 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2019-01-31 09:50:55,441 - call returned (0, '1016')
2019-01-31 09:50:55,442 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1016'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2019-01-31 09:50:55,448 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1016'] due to not_if
2019-01-31 09:50:55,449 - Group['hdfs'] {}
2019-01-31 09:50:55,449 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2019-01-31 09:50:55,450 - FS Type: HDFS
2019-01-31 09:50:55,450 - Directory['/etc/hadoop'] {'mode': 0755}
2019-01-31 09:50:55,466 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2019-01-31 09:50:55,467 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-01-31 09:50:55,490 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2019-01-31 09:50:55,499 - Skipping Execute[('setenforce', '0')] due to not_if
2019-01-31 09:50:55,499 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2019-01-31 09:50:55,502 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2019-01-31 09:50:55,502 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2019-01-31 09:50:55,503 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2019-01-31 09:50:55,507 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
2019-01-31 09:50:55,509 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'root'}
2019-01-31 09:50:55,515 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2019-01-31 09:50:55,527 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
2019-01-31 09:50:55,527 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2019-01-31 09:50:55,528 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2019-01-31 09:50:55,533 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2019-01-31 09:50:55,537 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2019-01-31 09:50:55,542 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
2019-01-31 09:50:55,773 - The unlimited key JCE policy is required, and appears to have been installed.
2019-01-31 09:50:55,850 - Could not determine stack version for component hbase-master by calling '/usr/bin/hdp-select status hbase-master > /tmp/tmpZhqa5e'. Return Code: 1, Output: .
2019-01-31 09:50:55,851 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2019-01-31 09:50:55,876 - call returned (0, '2.6.4.0-91\n3.0.1.0-187')
2019-01-31 09:50:55,966 - Could not determine stack version for component hbase-master by calling '/usr/bin/hdp-select status hbase-master > /tmp/tmpxw6jiA'. Return Code: 1, Output: .
2019-01-31 09:50:55,967 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2019-01-31 09:50:55,993 - call returned (0, '2.6.4.0-91\n3.0.1.0-187')
2019-01-31 09:50:56,355 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2019-01-31 09:50:56,369 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-01-31 09:50:56,374 - checked_call['hostid'] {}
2019-01-31 09:50:56,379 - checked_call returned (0, '1a0a8925')
2019-01-31 09:50:56,388 - Execute['/usr/hdp/current/hbase-master/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-master/conf stop master'] {'only_if': 'ambari-sudo.sh  -H -E test -f /var/run/hbase/hbase-hbase-master.pid && ps -p `ambari-sudo.sh  -H -E cat /var/run/hbase/hbase-hbase-master.pid` >/dev/null 2>&1', 'on_timeout': '! ( ambari-sudo.sh  -H -E test -f /var/run/hbase/hbase-hbase-master.pid && ps -p `ambari-sudo.sh  -H -E cat /var/run/hbase/hbase-hbase-master.pid` >/dev/null 2>&1 ) || ambari-sudo.sh -H -E kill -9 `ambari-sudo.sh  -H -E cat /var/run/hbase/hbase-hbase-master.pid`', 'timeout': 30, 'user': 'hbase'}
2019-01-31 09:50:58,398 - File['/var/run/hbase/hbase-hbase-master.pid'] {'action': ['delete']}
2019-01-31 09:50:58,399 - Pid file /var/run/hbase/hbase-hbase-master.pid is empty or does not exist
2019-01-31 09:50:58,406 - Directory['/etc/hbase'] {'mode': 0755}
2019-01-31 09:50:58,406 - Directory['/usr/hdp/current/hbase-master/conf'] {'owner': 'hbase', 'group': 'hadoop', 'create_parents': True}
2019-01-31 09:50:58,407 - Directory['/tmp'] {'create_parents': True, 'mode': 0777}
2019-01-31 09:50:58,407 - Changing permission for /tmp from 1777 to 777
2019-01-31 09:50:58,408 - Directory['/tmp'] {'create_parents': True, 'cd_access': 'a'}
2019-01-31 09:50:58,409 - Execute[('chmod', '1777', u'/tmp')] {'sudo': True}
2019-01-31 09:50:58,419 - XmlConfig['hbase-site.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-master/conf', 'configuration_attributes': {}, 'configurations': ...}
2019-01-31 09:50:58,436 - Generating config: /usr/hdp/current/hbase-master/conf/hbase-site.xml
2019-01-31 09:50:58,436 - File['/usr/hdp/current/hbase-master/conf/hbase-site.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2019-01-31 09:50:58,489 - File['/usr/hdp/current/hbase-master/conf/hdfs-site.xml'] {'action': ['delete']}
2019-01-31 09:50:58,489 - File['/usr/hdp/current/hbase-master/conf/core-site.xml'] {'action': ['delete']}
2019-01-31 09:50:58,489 - XmlConfig['hbase-policy.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-master/conf', 'configuration_attributes': {}, 'configurations': {u'security.admin.protocol.acl': u'*', u'security.masterregion.protocol.acl': u'*', u'security.client.protocol.acl': u'*'}}
2019-01-31 09:50:58,498 - Generating config: /usr/hdp/current/hbase-master/conf/hbase-policy.xml
2019-01-31 09:50:58,498 - File['/usr/hdp/current/hbase-master/conf/hbase-policy.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2019-01-31 09:50:58,508 - File['/usr/hdp/current/hbase-master/conf/hbase-env.sh'] {'content': InlineTemplate(...), 'owner': 'hbase', 'group': 'hadoop'}
2019-01-31 09:50:58,509 - Writing File['/usr/hdp/current/hbase-master/conf/hbase-env.sh'] because contents don't match
2019-01-31 09:50:58,509 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2019-01-31 09:50:58,512 - File['/etc/security/limits.d/hbase.conf'] {'content': Template('hbase.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2019-01-31 09:50:58,513 - TemplateConfig['/usr/hdp/current/hbase-master/conf/hadoop-metrics2-hbase.properties'] {'owner': 'hbase', 'template_tag': 'GANGLIA-MASTER'}
2019-01-31 09:50:58,521 - File['/usr/hdp/current/hbase-master/conf/hadoop-metrics2-hbase.properties'] {'content': Template('hadoop-metrics2-hbase.properties-GANGLIA-MASTER.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2019-01-31 09:50:58,522 - Writing File['/usr/hdp/current/hbase-master/conf/hadoop-metrics2-hbase.properties'] because contents don't match
2019-01-31 09:50:58,522 - TemplateConfig['/usr/hdp/current/hbase-master/conf/regionservers'] {'owner': 'hbase', 'template_tag': None}
2019-01-31 09:50:58,524 - File['/usr/hdp/current/hbase-master/conf/regionservers'] {'content': Template('regionservers.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2019-01-31 09:50:58,525 - TemplateConfig['/usr/hdp/current/hbase-master/conf/hbase_master_jaas.conf'] {'owner': 'hbase', 'template_tag': None}
2019-01-31 09:50:58,527 - File['/usr/hdp/current/hbase-master/conf/hbase_master_jaas.conf'] {'content': Template('hbase_master_jaas.conf.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2019-01-31 09:50:58,528 - Directory['/var/run/hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2019-01-31 09:50:58,528 - Directory['/var/log/hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2019-01-31 09:50:58,532 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2019-01-31 09:50:58,533 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-hbase.json
2019-01-31 09:50:58,533 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-hbase.json'] {'content': Template('input.config-hbase.json.j2'), 'mode': 0644}
2019-01-31 09:50:58,537 - File['/usr/hdp/current/hbase-master/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2019-01-31 09:50:58,537 - HdfsResource['/apps/hbase/data'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://devcluster', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-hcdl_dev@DEVHCDLRIL.COM', 'user': 'hdfs', 'owner': 'hbase', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2019-01-31 09:50:58,539 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-hcdl_dev@DEVHCDLRIL.COM'] {'user': 'hdfs'}
2019-01-31 09:50:59,471 - Could not determine stack version for component hbase-master by calling '/usr/bin/hdp-select status hbase-master > /tmp/tmpmmubZi'. Return Code: 1, Output: .
2019-01-31 09:50:59,472 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2019-01-31 09:50:59,502 - call returned (0, '2.6.4.0-91\n3.0.1.0-187')
2019-01-31 09:50:59,503 - The 'hbase-master' component did not advertise a version. This may indicate a problem with the component packaging.
 

Command failed after 1 tries

1 REPLY

Re: HBase doesn't restart when I change some configuration files in HDP 3.0.1

Mentor

@Mohammad Layeeq

Without details of what exactly you changed, it is hard to guess. From the logs I can see a Kerberos problem (the kinit pre-authentication failure) and possibly a stack version issue: hdp-select reports both 2.6.4.0-91 and 3.0.1.0-187, and the hbase-master component does not advertise a version.
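
In the meantime, one quick check is whether the headless keytab still works on that host. Something along these lines (keytab path and principal taken from your log; adjust if they differ on your cluster) should show whether the pre-authentication failure is reproducible outside Ambari:

# On the HBase Master host, as root:
# confirm the keytab exists, is readable, and list its entries
ls -l /etc/security/keytabs/hdfs.headless.keytab
klist -kt /etc/security/keytabs/hdfs.headless.keytab
# run the exact kinit that Ambari runs, as the hdfs user
sudo -u hdfs /usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-hcdl_dev@DEVHCDLRIL.COM
sudo -u hdfs klist

If the manual kinit fails with the same "Pre-authentication failed" error, the keytab is likely unreadable by the hdfs user or no longer matches the principal's keys on the KDC (for example after the keys were regenerated); regenerating the keytabs from Ambari usually fixes that.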

Can you share exactly what you did?
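
While you gather that, it is also worth checking what hdp-select thinks hbase-master points at, since the agent could not determine its stack version. Roughly (as root):

hdp-select status hbase-master
hdp-select versions
# If hbase-master is missing or still pointing at the old 2.6.4.0-91 release,
# repointing it at the current stack may clear the "Could not determine stack
# version" warnings (verify the target version on your cluster first):
hdp-select set hbase-master 3.0.1.0-187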