
knox server won't start...[Errno 111] Connection refused / Certificate is Expired. Server will not start.


Hi, I installed Docker Desktop on Windows 10 and logged in to the Ambari server. While I have no issues with any other service, Knox just won't start. I've tried everything, and any help would be greatly appreciated. Note that I am new to all of this.


Here is the output:

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/KNOX/0.5.0.2.2/package/scripts/knox_gateway.py", line 211, in <module>
KnoxGateway().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/KNOX/0.5.0.2.2/package/scripts/knox_gateway.py", line 145, in start
not_if=no_op_test
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hdp/current/knox-server/bin/gateway.sh start' returned 1. Starting Gateway failed.
stdout: /var/lib/ambari-agent/data/output-206.txt

2021-10-07 17:04:52,678 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.0-292 -> 2.6.5.0-292
2021-10-07 17:04:52,696 - Using hadoop conf dir: /usr/hdp/2.6.5.0-292/hadoop/conf
2021-10-07 17:04:52,926 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.0-292 -> 2.6.5.0-292
2021-10-07 17:04:52,932 - Using hadoop conf dir: /usr/hdp/2.6.5.0-292/hadoop/conf
2021-10-07 17:04:52,933 - Group['livy'] {}
2021-10-07 17:04:52,935 - Group['spark'] {}
2021-10-07 17:04:52,935 - Group['ranger'] {}
2021-10-07 17:04:52,936 - Group['hdfs'] {}
2021-10-07 17:04:52,936 - Group['zeppelin'] {}
2021-10-07 17:04:52,937 - Group['hadoop'] {}
2021-10-07 17:04:52,937 - Group['users'] {}
2021-10-07 17:04:52,937 - Group['knox'] {}
2021-10-07 17:04:52,938 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,939 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,940 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,941 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,943 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,944 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2021-10-07 17:04:52,945 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,947 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2021-10-07 17:04:52,948 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger'], 'uid': None}
2021-10-07 17:04:52,950 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2021-10-07 17:04:52,951 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2021-10-07 17:04:52,952 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,953 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,955 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,956 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2021-10-07 17:04:52,958 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,959 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,960 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2021-10-07 17:04:52,962 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,963 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,965 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,966 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,967 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,968 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2021-10-07 17:04:52,969 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-10-07 17:04:52,972 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2021-10-07 17:04:52,978 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2021-10-07 17:04:52,978 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2021-10-07 17:04:52,982 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-10-07 17:04:52,984 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-10-07 17:04:52,986 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2021-10-07 17:04:52,994 - call returned (0, '1014')
2021-10-07 17:04:52,994 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2021-10-07 17:04:53,000 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if
2021-10-07 17:04:53,000 - Group['hdfs'] {}
2021-10-07 17:04:53,001 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2021-10-07 17:04:53,001 - FS Type:
2021-10-07 17:04:53,001 - Directory['/etc/hadoop'] {'mode': 0755}
2021-10-07 17:04:53,017 - File['/usr/hdp/2.6.5.0-292/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-10-07 17:04:53,018 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2021-10-07 17:04:53,035 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2021-10-07 17:04:53,040 - Skipping Execute[('setenforce', '0')] due to not_if
2021-10-07 17:04:53,040 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2021-10-07 17:04:53,044 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2021-10-07 17:04:53,044 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2021-10-07 17:04:53,050 - File['/usr/hdp/2.6.5.0-292/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2021-10-07 17:04:53,052 - File['/usr/hdp/2.6.5.0-292/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2021-10-07 17:04:53,061 - File['/usr/hdp/2.6.5.0-292/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2021-10-07 17:04:53,075 - File['/usr/hdp/2.6.5.0-292/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-10-07 17:04:53,077 - File['/usr/hdp/2.6.5.0-292/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2021-10-07 17:04:53,078 - File['/usr/hdp/2.6.5.0-292/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2021-10-07 17:04:53,083 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2021-10-07 17:04:53,087 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2021-10-07 17:04:53,481 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.0-292 -> 2.6.5.0-292
2021-10-07 17:04:53,495 - Stack version to use is 2.6.5.0
2021-10-07 17:04:53,502 - Detected stack with version 2.6.5.0-292, will use knox_data_dir = /usr/hdp/2.6.5.0-292/knox/data
2021-10-07 17:04:53,521 - Using hadoop conf dir: /usr/hdp/2.6.5.0-292/hadoop/conf
2021-10-07 17:04:53,526 - Directory['/usr/hdp/current/knox-server/data/'] {'group': 'knox', 'cd_access': 'a', 'create_parents': True, 'mode': 0755, 'owner': 'knox', 'recursive_ownership': True}
2021-10-07 17:04:56,081 - Directory['/var/log/knox'] {'group': 'knox', 'cd_access': 'a', 'create_parents': True, 'mode': 0755, 'owner': 'knox', 'recursive_ownership': True}
2021-10-07 17:04:56,131 - Directory['/var/run/knox'] {'group': 'knox', 'cd_access': 'a', 'create_parents': True, 'mode': 0755, 'owner': 'knox', 'recursive_ownership': True}
2021-10-07 17:04:56,131 - Creating directory Directory['/var/run/knox'] since it doesn't exist.
2021-10-07 17:04:56,131 - Changing owner for /var/run/knox from 0 to knox
2021-10-07 17:04:56,131 - Changing group for /var/run/knox from 0 to knox
2021-10-07 17:04:56,131 - Directory['/usr/hdp/current/knox-server/conf'] {'group': 'knox', 'cd_access': 'a', 'create_parents': True, 'mode': 0755, 'owner': 'knox', 'recursive_ownership': True}
2021-10-07 17:04:56,332 - Directory['/usr/hdp/current/knox-server/conf/topologies'] {'group': 'knox', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'knox', 'mode': 0755}
2021-10-07 17:04:56,334 - XmlConfig['gateway-site.xml'] {'owner': 'knox', 'group': 'knox', 'conf_dir': '/usr/hdp/current/knox-server/conf', 'configuration_attributes': {}, 'configurations': ...}
2021-10-07 17:04:56,353 - Generating config: /usr/hdp/current/knox-server/conf/gateway-site.xml
2021-10-07 17:04:56,353 - File['/usr/hdp/current/knox-server/conf/gateway-site.xml'] {'owner': 'knox', 'content': InlineTemplate(...), 'group': 'knox', 'mode': None, 'encoding': 'UTF-8'}
2021-10-07 17:04:56,366 - File['/usr/hdp/current/knox-server/conf/gateway-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'knox', 'group': 'knox', 'mode': 0644}
2021-10-07 17:04:56,372 - File['/usr/hdp/current/knox-server/conf/topologies/default.xml'] {'content': InlineTemplate(...), 'owner': 'knox', 'group': 'knox'}
2021-10-07 17:04:56,377 - File['/usr/hdp/current/knox-server/conf/topologies/admin.xml'] {'content': InlineTemplate(...), 'owner': 'knox', 'group': 'knox'}
2021-10-07 17:04:56,383 - File['/usr/hdp/current/knox-server/conf/topologies/knoxsso.xml'] {'content': InlineTemplate(...), 'owner': 'knox', 'group': 'knox'}
2021-10-07 17:04:56,384 - Execute['/usr/hdp/current/knox-server/bin/knoxcli.sh create-master --master [PROTECTED]'] {'environment': {'JAVA_HOME': u'/usr/lib/jvm/java'}, 'not_if': "ambari-sudo.sh su knox -l -s /bin/bash -c 'test -f /usr/hdp/current/knox-server/data/security/master'", 'user': 'knox'}
2021-10-07 17:04:56,427 - Skipping Execute['/usr/hdp/current/knox-server/bin/knoxcli.sh create-master --master [PROTECTED]'] due to not_if
2021-10-07 17:04:56,427 - Execute['/usr/hdp/current/knox-server/bin/knoxcli.sh create-cert --hostname sandbox-hdp.hortonworks.com'] {'environment': {'JAVA_HOME': u'/usr/lib/jvm/java'}, 'not_if': "ambari-sudo.sh su knox -l -s /bin/bash -c 'test -f /usr/hdp/current/knox-server/data/security/keystores/gateway.jks'", 'user': 'knox'}
2021-10-07 17:04:56,471 - Skipping Execute['/usr/hdp/current/knox-server/bin/knoxcli.sh create-cert --hostname sandbox-hdp.hortonworks.com'] due to not_if
2021-10-07 17:04:56,475 - File['/usr/hdp/current/knox-server/conf/ldap-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'knox', 'group': 'knox', 'mode': 0644}
2021-10-07 17:04:56,476 - File['/usr/hdp/current/knox-server/conf/users.ldif'] {'content': ..., 'owner': 'knox', 'group': 'knox', 'mode': 0644}
2021-10-07 17:04:56,479 - Knox: Setup ranger: command retry not enabled thus skipping if ranger admin is down !
2021-10-07 17:04:56,480 - HdfsResource['/ranger/audit'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.5.0-292/hadoop/bin', 'keytab': [EMPTY], 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'user': 'hdfs', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'recursive_chmod': True, 'owner': 'hdfs', 'group': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/2.6.5.0-292/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/falcon', u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0755}
2021-10-07 17:04:56,483 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/ranger/audit?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp5eH3X_ 2>/tmp/tmpQsVXua''] {'logoutput': None, 'quiet': False}
2021-10-07 17:04:56,533 - call returned (0, '')
2021-10-07 17:04:56,534 - HdfsResource['/ranger/audit/knox'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.5.0-292/hadoop/bin', 'keytab': [EMPTY], 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'user': 'hdfs', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'recursive_chmod': True, 'owner': 'knox', 'group': 'knox', 'hadoop_conf_dir': '/usr/hdp/2.6.5.0-292/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/falcon', u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0700}
2021-10-07 17:04:56,535 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/ranger/audit/knox?op=GETFILESTATUS&user.name=hdf...'"'"' 1>/tmp/tmpKjJDLc 2>/tmp/tmpjgly6N''] {'logoutput': None, 'quiet': False}
2021-10-07 17:04:56,587 - call returned (0, '')
2021-10-07 17:04:56,590 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.5.0-292/hadoop/bin', 'keytab': [EMPTY], 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/2.6.5.0-292/hadoop/conf', 'immutable_paths': [u'/apps/falcon', u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2021-10-07 17:04:56,591 - File['/usr/hdp/current/knox-server/conf/hdfs-site.xml'] {'action': ['delete']}
2021-10-07 17:04:56,593 - call['ambari-python-wrap /usr/bin/hdp-select status knox-server'] {'timeout': 20}
2021-10-07 17:04:56,620 - call returned (0, 'knox-server - 2.6.5.0-292')
2021-10-07 17:04:56,622 - RangeradminV2: Skip ranger admin if it's down !
2021-10-07 17:04:57,051 - amb_ranger_admin user already exists.
2021-10-07 17:04:57,271 - Knox Repository Sandbox_knox exist
2021-10-07 17:04:57,273 - File['/usr/hdp/current/knox-server/conf/ranger-security.xml'] {'content': InlineTemplate(...), 'owner': 'knox', 'group': 'knox', 'mode': 0644}
2021-10-07 17:04:57,274 - Writing File['/usr/hdp/current/knox-server/conf/ranger-security.xml'] because contents don't match
2021-10-07 17:04:57,275 - Directory['/etc/ranger/Sandbox_knox'] {'owner': 'knox', 'create_parents': True, 'group': 'knox', 'mode': 0775, 'cd_access': 'a'}
2021-10-07 17:04:57,276 - Directory['/etc/ranger/Sandbox_knox/policycache'] {'owner': 'knox', 'group': 'knox', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2021-10-07 17:04:57,277 - File['/etc/ranger/Sandbox_knox/policycache/knox_Sandbox_knox.json'] {'owner': 'knox', 'group': 'knox', 'mode': 0644}
2021-10-07 17:04:57,279 - XmlConfig['ranger-knox-audit.xml'] {'group': 'knox', 'conf_dir': '/usr/hdp/current/knox-server/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'knox', 'configurations': ...}
2021-10-07 17:04:57,302 - Generating config: /usr/hdp/current/knox-server/conf/ranger-knox-audit.xml
2021-10-07 17:04:57,302 - File['/usr/hdp/current/knox-server/conf/ranger-knox-audit.xml'] {'owner': 'knox', 'content': InlineTemplate(...), 'group': 'knox', 'mode': 0744, 'encoding': 'UTF-8'}
2021-10-07 17:04:57,316 - XmlConfig['ranger-knox-security.xml'] {'group': 'knox', 'conf_dir': '/usr/hdp/current/knox-server/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'knox', 'configurations': ...}
2021-10-07 17:04:57,330 - Generating config: /usr/hdp/current/knox-server/conf/ranger-knox-security.xml
2021-10-07 17:04:57,330 - File['/usr/hdp/current/knox-server/conf/ranger-knox-security.xml'] {'owner': 'knox', 'content': InlineTemplate(...), 'group': 'knox', 'mode': 0744, 'encoding': 'UTF-8'}
2021-10-07 17:04:57,340 - XmlConfig['ranger-policymgr-ssl.xml'] {'group': 'knox', 'conf_dir': '/usr/hdp/current/knox-server/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'knox', 'configurations': ...}
2021-10-07 17:04:57,353 - Generating config: /usr/hdp/current/knox-server/conf/ranger-policymgr-ssl.xml
2021-10-07 17:04:57,353 - File['/usr/hdp/current/knox-server/conf/ranger-policymgr-ssl.xml'] {'owner': 'knox', 'content': InlineTemplate(...), 'group': 'knox', 'mode': 0744, 'encoding': 'UTF-8'}
2021-10-07 17:04:57,362 - Execute[(u'/usr/hdp/2.6.5.0-292/ranger-knox-plugin/ranger_credential_helper.py', '-l', u'/usr/hdp/2.6.5.0-292/ranger-knox-plugin/install/lib/*', '-f', '/etc/ranger/Sandbox_knox/cred.jceks', '-k', 'sslKeyStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': u'/usr/lib/jvm/java'}, 'sudo': True}
Using Java:/usr/lib/jvm/java/bin/java
Alias sslKeyStore created successfully!
2021-10-07 17:04:59,740 - Execute[(u'/usr/hdp/2.6.5.0-292/ranger-knox-plugin/ranger_credential_helper.py', '-l', u'/usr/hdp/2.6.5.0-292/ranger-knox-plugin/install/lib/*', '-f', '/etc/ranger/Sandbox_knox/cred.jceks', '-k', 'sslTrustStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': u'/usr/lib/jvm/java'}, 'sudo': True}
Using Java:/usr/lib/jvm/java/bin/java
Alias sslTrustStore created successfully!
2021-10-07 17:05:01,187 - File['/etc/ranger/Sandbox_knox/cred.jceks'] {'owner': 'knox', 'group': 'knox', 'mode': 0640}
2021-10-07 17:05:01,188 - Stack does not support core-site.xml creation for Ranger plugin, skipping core-site.xml configurations
2021-10-07 17:05:01,189 - Link['/usr/hdp/current/knox-server/pids'] {'to': '/var/run/knox'}
2021-10-07 17:05:01,190 - Link['/usr/hdp/current/knox-server/pids'] replacing old symlink to /run/knox
2021-10-07 17:05:01,190 - Creating symbolic Link['/usr/hdp/current/knox-server/pids'] to /var/run/knox
2021-10-07 17:05:01,192 - Directory['/var/log/knox'] {'group': 'knox', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'knox', 'mode': 0755}
2021-10-07 17:05:01,194 - Execute['/usr/hdp/current/knox-server/bin/gateway.sh start'] {'environment': {'JAVA_HOME': u'/usr/lib/jvm/java'}, 'not_if': 'ls /var/run/knox/gateway.pid >/dev/null 2>&1 && ps -p `cat /var/run/knox/gateway.pid` >/dev/null 2>&1', 'user': 'knox'}
2021-10-07 17:05:03,596 - Execute['find /var/log/knox -maxdepth 1 -type f -name '*' -exec echo '==> {} <==' \; -exec tail -n 40 {} \;'] {'logoutput': True, 'ignore_failures': True, 'user': 'knox'}
==> /var/log/knox/gateway.out <==
==> /var/log/knox/gateway.err <==
log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.DailyRollingFileAppender.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.DailyRollingFileAppender.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
==> /var/log/knox/gateway.log <==
2021-10-07 17:05:02,298 INFO hadoop.gateway (GatewayServer.java:logSysProp(206)) - System Property: user.name=knox
2021-10-07 17:05:02,302 INFO hadoop.gateway (GatewayServer.java:logSysProp(206)) - System Property: user.dir=/home/knox
2021-10-07 17:05:02,302 INFO hadoop.gateway (GatewayServer.java:logSysProp(206)) - System Property: java.runtime.name=OpenJDK Runtime Environment
2021-10-07 17:05:02,302 INFO hadoop.gateway (GatewayServer.java:logSysProp(206)) - System Property: java.runtime.version=1.8.0_171-b10
2021-10-07 17:05:02,302 INFO hadoop.gateway (GatewayServer.java:logSysProp(206)) - System Property: java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.171-8.b10.el7_5.x86_64/jre
2021-10-07 17:05:02,603 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigResource(388)) - Loading configuration resource jar:file:/usr/hdp/2.6.5.0-292/knox/bin/../lib/gateway-server-0.12.0.2.6.5.0-292.jar!/conf/gateway-default.xml
2021-10-07 17:05:03,057 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigFile(376)) - Loading configuration file /usr/hdp/2.6.5.0-292/knox/bin/../conf/gateway-site.xml
2021-10-07 17:05:03,094 INFO hadoop.gateway (GatewayConfigImpl.java:initGatewayHomeDir(320)) - Using /usr/hdp/2.6.5.0-292/knox/bin/.. as GATEWAY_HOME via system property.
2021-10-07 17:05:03,525 INFO hadoop.gateway (JettySSLService.java:init(96)) - Credential store for the gateway instance found - no need to create one.
2021-10-07 17:05:03,540 INFO hadoop.gateway (JettySSLService.java:init(118)) - Keystore for the gateway instance found - no need to create one.
2021-10-07 17:05:03,545 INFO hadoop.gateway (JettySSLService.java:logAndValidateCertificate(148)) - The Gateway SSL certificate is issued to hostname: sandbox-hdp.hortonworks.com.
2021-10-07 17:05:03,547 INFO hadoop.gateway (JettySSLService.java:logAndValidateCertificate(151)) - The Gateway SSL certificate is valid between: 6/18/18 4:01 PM and 6/18/19 4:01 PM.
2021-10-07 17:05:03,553 FATAL hadoop.gateway (GatewayServer.java:main(164)) - Failed to start gateway: org.apache.hadoop.gateway.services.ServiceLifecycleException: Gateway SSL Certificate is Expired. Server will not start.
==> /var/log/knox/gateway-audit.log <==
18/06/18 16:01:25 |||audit||||||deploy|topology|admin|unavailable|
18/06/18 16:01:29 |||audit||||||deploy|topology|default|unavailable|
18/06/18 16:01:31 |||audit||||||deploy|topology|manager|unavailable|
18/06/18 16:01:33 |||audit||||||deploy|topology|knoxsso|unavailable|
==> /var/log/knox/knoxcli.log <==
2018-06-18 16:01:11,688 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigResource(388)) - Loading configuration resource jar:file:/usr/hdp/2.6.5.0-292/knox/bin/../lib/gateway-server-0.12.0.2.6.5.0-292.jar!/conf/gateway-default.xml
2018-06-18 16:01:12,498 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigFile(376)) - Loading configuration file /usr/hdp/2.6.5.0-292/knox/bin/../conf/gateway-site.xml
2018-06-18 16:01:12,562 INFO hadoop.gateway (GatewayConfigImpl.java:initGatewayHomeDir(320)) - Using /usr/hdp/2.6.5.0-292/knox/bin/.. as GATEWAY_HOME via system property.
2018-06-18 16:01:14,945 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigResource(388)) - Loading configuration resource jar:file:/usr/hdp/2.6.5.0-292/knox/bin/../lib/gateway-server-0.12.0.2.6.5.0-292.jar!/conf/gateway-default.xml
2018-06-18 16:01:15,709 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigFile(376)) - Loading configuration file /usr/hdp/2.6.5.0-292/knox/bin/../conf/gateway-site.xml
2018-06-18 16:01:15,759 INFO hadoop.gateway (GatewayConfigImpl.java:initGatewayHomeDir(320)) - Using /usr/hdp/2.6.5.0-292/knox/bin/.. as GATEWAY_HOME via system property.
==> /var/log/knox/gateway.log.2018-06-18 <==
2018-06-18 16:01:30,188 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,189 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,194 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,194 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,211 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,233 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,234 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,234 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,234 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,235 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,235 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,235 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,236 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,236 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,236 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,237 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:30,237 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:31,803 INFO hadoop.gateway (GatewayServer.java:internalActivateTopology(784)) - Activating topology default
2018-06-18 16:01:31,803 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(794)) - Activating topology default archive %2F
2018-06-18 16:01:31,805 INFO hadoop.gateway (GatewayServer.java:handleCreateDeployment(898)) - Deploying topology manager to /usr/hdp/2.6.5.0-292/knox/bin/../data/deployments/manager.topo.1634e2eb550
2018-06-18 16:01:31,805 INFO hadoop.gateway (GatewayServer.java:internalDeactivateTopology(816)) - Deactivating topology manager
2018-06-18 16:01:32,072 INFO hadoop.gateway (DefaultGatewayServices.java:initializeContribution(197)) - Creating credential store for the cluster: manager
2018-06-18 16:01:32,647 INFO hadoop.gateway (DefaultGatewayServices.java:initializeContribution(201)) - Credential store found for the cluster: manager - no need to create one.
2018-06-18 16:01:33,017 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:33,018 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:33,094 INFO hadoop.gateway (GatewayServer.java:internalActivateTopology(784)) - Activating topology manager
2018-06-18 16:01:33,095 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(794)) - Activating topology manager archive %2Fadmin-ui
2018-06-18 16:01:33,096 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(794)) - Activating topology manager archive %2F
2018-06-18 16:01:33,098 INFO hadoop.gateway (GatewayServer.java:handleCreateDeployment(898)) - Deploying topology knoxsso to /usr/hdp/2.6.5.0-292/knox/bin/../data/deployments/knoxsso.topo.16413a0e970
2018-06-18 16:01:33,098 INFO hadoop.gateway (GatewayServer.java:internalDeactivateTopology(816)) - Deactivating topology knoxsso
2018-06-18 16:01:33,498 INFO hadoop.gateway (DefaultGatewayServices.java:initializeContribution(197)) - Creating credential store for the cluster: knoxsso
2018-06-18 16:01:34,043 INFO hadoop.gateway (DefaultGatewayServices.java:initializeContribution(201)) - Credential store found for the cluster: knoxsso - no need to create one.
2018-06-18 16:01:34,398 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:34,399 INFO hadoop.gateway (GatewayConfigImpl.java:isCookieScopingToPathEnabled(879)) - Cookie scoping feature enabled: false
2018-06-18 16:01:34,472 INFO hadoop.gateway (GatewayServer.java:internalActivateTopology(784)) - Activating topology knoxsso
2018-06-18 16:01:34,472 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(794)) - Activating topology knoxsso archive %2Fknoxauth
2018-06-18 16:01:34,473 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(794)) - Activating topology knoxsso archive %2F
2018-06-18 16:01:34,639 INFO hadoop.gateway (GatewayServer.java:start(582)) - Topology port mapping feature enabled: true
2018-06-18 16:01:38,295 INFO hadoop.gateway (GatewayServer.java:start(607)) - Monitoring topologies in directory: /usr/hdp/2.6.5.0-292/knox/bin/../conf/topologies
2018-06-18 16:01:38,297 INFO hadoop.gateway (GatewayServer.java:startGateway(321)) - Started gateway on port 8,443.

Command failed after 1 tries

1 Reply

Hi Soheer,

From the logs, the gateway SSL certificate has expired — gateway.log shows it was only valid between 6/18/18 and 6/18/19, and the gateway refuses to start with an expired certificate.
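If you want to confirm this yourself, you can inspect the certificate inside the keystore with `keytool` (a sketch — the keystore path below is the HDP default from your log, and the keystore password is the Knox master secret set at install time):

```shell
# List the gateway's identity certificate and show its validity window.
# You will be prompted for the keystore password (the Knox master secret).
keytool -list -v \
  -keystore /usr/hdp/current/knox-server/data/security/keystores/gateway.jks \
  -alias gateway-identity | grep -A1 "Valid from"
```

If the "until" date is in the past, the certificate is expired and Knox will not start.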


Can you please locate the gateway.jks file on the Knox host? By default it is at:

/var/lib/knox/data*/security/keystores/gateway.jks


Move the gateway.jks file to a backup location or rename it, then restart the Knox service from Ambari. During the restart it will create a new gateway.jks with a fresh self-signed certificate, and the service should start up.
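The steps above can be sketched roughly as follows (paths and hostname are taken from the log output in your post — adjust for your host). Note that your log shows Ambari skips `create-cert` only because gateway.jks already exists, so once the file is moved aside, restarting Knox from Ambari will regenerate it; alternatively you can recreate it manually first:

```shell
# Back up the expired keystore so the restart generates a fresh one.
KEYSTORE=/usr/hdp/current/knox-server/data/security/keystores/gateway.jks
mv "$KEYSTORE" "$KEYSTORE.bak.$(date +%F)"

# Option A: restart Knox from the Ambari UI (it will re-run create-cert).
# Option B: recreate the self-signed certificate manually, then start Knox:
sudo -u knox /usr/hdp/current/knox-server/bin/knoxcli.sh \
    create-cert --hostname sandbox-hdp.hortonworks.com
```

After the restart, check /var/log/knox/gateway.log again — the "SSL certificate is valid between" line should now show a current validity window.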
