Services (History Server, Falcon, Spark HistoryServer, HiveServer 2) failed to start (Ambari 2.5 / HDP 2.6)

Explorer

The failures of these services all look similar: a "Bad Gateway" error while transferring files to HDFS.

I am able to put the tars/files (as shown in the failure logs) into HDFS with "hdfs dfs -put src dest", but not through curl.

I am able to access the NameNode at port 50070 and can also see DataNode stats through the browser.
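One way to narrow this down is to re-run the failing WebHDFS call with the proxy explicitly bypassed. The command below is a sketch assembled from the first failure log; xxx.net and the tarball path are the placeholders from that log, and --noproxy '*' simply tells curl to ignore any proxy environment variables:

```shell
# Re-run the failing WebHDFS CREATE with all proxies bypassed.
# If this succeeds while Ambari's curl returns 502, the proxy is intercepting the call.
curl -sS -L -w '%{http_code}' --noproxy '*' -X PUT \
  --data-binary @/usr/hdp/2.6.0.3-8/hadoop/mapreduce.tar.gz \
  -H 'Content-Type: application/octet-stream' \
  'http://xxx.net:50070/webhdfs/v1/hdp/apps/2.6.0.3-8/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'
```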

History Server:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/historyserver.py", line 190, in <module>
    HistoryServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/historyserver.py", line 101, in start
    skip=params.sysprep_skip_copy_tarballs_hdfs)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/copy_tarball.py", line 267, in copy_to_hdfs
    replace_existing_files=replace_existing_files,
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 555, in action_create_on_execute
    self.action_delayed("create")
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 552, in action_delayed
    self.get_hdfs_resource_executor().action_delayed(action_name, self)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 287, in action_delayed
    self._create_resource()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 303, in _create_resource
    self._create_file(self.main_resource.resource.target, source=self.main_resource.resource.source, mode=self.mode)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 418, in _create_file
    self.util.run_command(target, 'CREATE', method='PUT', overwrite=True, assertable_result=False, file_to_put=source, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 199, in run_command
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT --data-binary @/usr/hdp/2.6.0.3-8/hadoop/mapreduce.tar.gz -H 'Content-Type: application/octet-stream' 'http://xxx.net:50070/webhdfs/v1/hdp/apps/2.6.0.3-8/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'' returned status_code=502. 
<html><head><title>502 Bad Gateway</title></head>
<body><h1>DNS error</h1>
<p>DNS error (the host name of the page you are looking for does not exist)<br><br>Please check that the host name has been spelled correctly.<br></p>
<!--Zscaler/5.3--></body></html>

Falcon:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/FALCON/0.5.0.2.1/package/scripts/falcon_server.py", line 177, in <module>
    FalconServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/FALCON/0.5.0.2.1/package/scripts/falcon_server.py", line 49, in start
    self.configure(env, upgrade_type=upgrade_type)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 117, in locking_configure
    original_configure(obj, *args, **kw)
  File "/var/lib/ambari-agent/cache/common-services/FALCON/0.5.0.2.1/package/scripts/falcon_server.py", line 44, in configure
    falcon('server', action='config', upgrade_type=upgrade_type)
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/FALCON/0.5.0.2.1/package/scripts/falcon.py", line 185, in falcon
    source = params.falcon_extensions_source_dir)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 555, in action_create_on_execute
    self.action_delayed("create")
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 552, in action_delayed
    self.get_hdfs_resource_executor().action_delayed(action_name, self)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 287, in action_delayed
    self._create_resource()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 306, in _create_resource
    self._copy_from_local_directory(self.main_resource.resource.target, self.main_resource.resource.source)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 315, in _copy_from_local_directory
    self._copy_from_local_directory(new_target, new_source)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 315, in _copy_from_local_directory
    self._copy_from_local_directory(new_target, new_source)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 317, in _copy_from_local_directory
    self._create_file(new_target, new_source)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 418, in _create_file
    self.util.run_command(target, 'CREATE', method='PUT', overwrite=True, assertable_result=False, file_to_put=source, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 199, in run_command
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT --data-binary @/usr/hdp/current/falcon-server/extensions/hdfs-mirroring/META/hdfs-mirroring-properties.json -H 'Content-Type: application/octet-stream' 'http://xxx.net:50070/webhdfs/v1/apps/falcon/extensions/hdfs-mirroring/META/hdfs-mirroring-properties.json?op=CREATE&user.name=hdfs&overwrite=True'' returned status_code=502. 
<html><head><title>502 Bad Gateway</title></head>
<body><h1>DNS error</h1>
<p>DNS error (the host name of the page you are looking for does not exist)<br><br>Please check that the host name has been spelled correctly.<br></p>
<!--Zscaler/5.3--></body></html>

HiveServer 2:

File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 199, in run_command
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT --data-binary @/usr/hdp/2.6.0.3-8/tez/lib/tez.tar.gz -H 'Content-Type: application/octet-stream' 'http://xxx.net:50070/webhdfs/v1/hdp/apps/2.6.0.3-8/tez/tez.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'' returned status_code=502. 
<html><head><title>502 Bad Gateway</title></head>
<body><h1>DNS error</h1>
<p>DNS error (the host name of the page you are looking for does not exist)<br><br>Please check that the host name has been spelled correctly.<br></p>
<!--Zscaler/5.3--></body></html>

Spark HistoryServer:

  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 199, in run_command
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT --data-binary @/usr/hdp/2.6.0.3-8/spark/lib/spark-hdp-assembly.jar -H 'Content-Type: application/octet-stream' 'http://xxx.net:50070/webhdfs/v1/hdp/apps/2.6.0.3-8/spark/spark-hdp-assembly.jar?op=CREATE&user.name=hdfs&overwrite=True&permission=444'' returned status_code=502. 
<html><head><title>502 Bad Gateway</title></head>
<body><h1>DNS error</h1>
<p>DNS error (the host name of the page you are looking for does not exist)<br><br>Please check that the host name has been spelled correctly.<br></p>
<!--Zscaler/5.3--></body></html>

Super Mentor

This looks like a proxy setting issue. To resolve it, set the "http_proxy" and "https_proxy" environment variables to a valid URL.
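As a first check, it may help to confirm which proxy variables are already in effect for the user that runs the Ambari agent (this is a generic shell check, not anything Ambari-specific):

```shell
# List any proxy-related environment variables currently set;
# print a fallback message if none are found.
env | grep -i proxy || echo "no proxy variables set"
```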

The http_proxy environment variable needs to be set correctly in the global UNIX profile.

For Ambari please see: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-reference/content/ch_setting_up_a... .

The proxy settings likewise need to be configured on any other hosts that make WebHDFS calls.


Super Mentor

To set up the proxy environment variables as global variables on the cluster hosts and on the Ambari server, open /etc/profile (for a specific user, the file is "$HOME/.bash_profile") and then add the proxy settings:

# vi /etc/profile
export http_proxy=http://YOUR_PROXY_HOST:PORT/
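The line above sets only http_proxy. A fuller sketch, under the assumption that the 502 comes from the proxy (Zscaler, per the error pages) failing to resolve cluster-internal hostnames, also sets https_proxy and a no_proxy exclusion list. The proxy host, port, and domain suffix below are placeholders to replace with your own values:

```shell
# Placeholders: replace YOUR_PROXY_HOST:PORT and the .xxx.net domain suffix.
export http_proxy=http://YOUR_PROXY_HOST:PORT/
export https_proxy=http://YOUR_PROXY_HOST:PORT/
# Exclude cluster-internal hosts so WebHDFS calls to the NameNode (port 50070)
# go direct instead of through the proxy, which cannot resolve internal names.
export no_proxy="localhost,127.0.0.1,.xxx.net"
```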

