HiveServer2 and MapReduce History Server won't start after Ambari installation

I'm a complete newbie to both Hadoop and HDP. Today I installed HDP 2.6 on a five-node cluster with Ambari. After installation, the HiveServer2 and MapReduce2 History Server services won't start.

My hosts are running Ubuntu Server 16.04 LTS.

I distributed the services as follows:

HDPNAME001: NameNode, App Timeline, ResourceManager, History Server, Zookeeper Server.

HDPINFRA001: SNameNode, Hive Metastore, WebHCat Server, HiveServer2, Grafana, Metrics Collector/Analyzer/Explorer, HST Server, and this machine is running my MySQL instance for Ambari and Hive.

HDPDATA001, DATA002, DATA003: DataNode, NodeManager

Now... I'm just a Guy Who Read a Book, so it may be that this distribution of services doesn't really make sense, but it seemed sensible at the time, when I was facing all of those menus in the installer, so that's where I put things.

As I said, after installation these two services wouldn't come up.

When I try to start the HiveServer2 service, I get:

resource_management.core.exceptions.ExecutionFailed:
Execution of 'curl -sS -L -w '%{http_code}' -X GET 'http://hdpname001.vssc.local:50070/webhdfs/v1/user/hcat?op=GETFILESTATUS&user.name=hdfs'
1>/tmp/tmp_27OZV 2>/tmp/tmpM61ApT' returned 7. curl: (7) Failed to
connect to hdpname001.vssc.local port 50070: Connection refused
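
If I'm reading that right, the connection to hdpname001 on port 50070 (which I believe is the NameNode web/WebHDFS port in HDP 2.6) is being refused outright. My guess at a first sanity check, assuming I have the right port, is to see whether anything is even listening there on the name node:

# on HDPName001: is anything listening on the NameNode web/WebHDFS port?
sudo ss -tlnp | grep 50070

# and does WebHDFS answer locally?
curl -sS 'http://localhost:50070/webhdfs/v1/?op=LISTSTATUS'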

When I try to start the MR2 History Server, I get:

resource_management.core.exceptions.Fail:
Execution of 'curl -sS -L -w '%{http_code}' -X PUT --data-binary
@/usr/hdp/2.6.0.3-8/hadoop/mapreduce.tar.gz -H 'Content-Type:
application/octet-stream' 'http://hdpname001.vssc.local:50070/webhdfs/v1/hdp/apps/2.6.0.3-8/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444''
returned status_code=403. 
{
"RemoteException": {
"exception": "IOException", 
"javaClassName": "java.io.IOException", 
"message": "Failed to find datanode, suggest to check cluster health. excludeDatanodes=null"
}
}
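
That "Failed to find datanode" message makes me suspect the NameNode doesn't see any live DataNodes at all. From what I've read (newbie disclaimer applies), the way to check from the name node is something like:

# on HDPName001, as the hdfs user: how many DataNodes does the NameNode see?
sudo -u hdfs hdfs dfsadmin -report | grep -i datanodes

# and confirm HDFS isn't stuck in safe mode
sudo -u hdfs hdfs dfsadmin -safemode get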

Based on Random Things I Read on the Internet, I ran 'hadoop namenode -format' and was asked if I wanted to Re-format the filesystem. Since it prompted me, I assumed the filesystem already exists, and I aborted the format.
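
To double-check that aborting was the right call, I figure the NameNode's own log should say whether it started cleanly and which addresses it bound to (the log path is my guess at the HDP default layout):

# on HDPName001: look for startup/bind messages or errors in the NameNode log
grep -iE 'exception|bind|started' /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log | tail -n 20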

When I do "hdfs dfs -ls /" (at a command prompt on the name node) I get:

myname@HDPName001:~$ hdfs dfs -ls /
Found 7 items
drwxrwxrwx  - yarn   hadoop  0 2017-06-15 11:33 /app-logs
drwxr-xr-x  - yarn   hadoop  0 2017-06-15 11:33 /ats
drwxr-xr-x  - hdfs   hdfs    0 2017-06-15 11:33 /hdp
drwxr-xr-x  - mapred hdfs    0 2017-06-15 11:33 /mapred
drwxrwxrwx  - mapred hadoop  0 2017-06-15 11:33 /mr-history
drwxrwxrwx  - hdfs   hdfs    0 2017-06-15 11:33 /tmp
drwxr-xr-x  - hdfs   hdfs    0 2017-06-15 11:33 /user

When I run the same command from the Infra001 server, I get:

myname@HDPInfra001:~$ hdfs dfs -ls /
ls: Call From HDPInfra001.vssc.local/127.0.1.1 to hdpname001.vssc.local:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
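
One thing that jumps out even to me: the error shows HDPInfra001 identifying itself as 127.0.1.1. I've read that Ubuntu puts a "127.0.1.1 <hostname>" line in /etc/hosts by default, and that if the name node has the same mapping for its own name, the NameNode can end up listening only on loopback, which would explain why local commands work but remote ones get connection refused. If someone can confirm that's the likely cause, I'd check along these lines:

# on each host: is the machine's own hostname mapped to 127.0.1.1? (Ubuntu default)
grep '127.0.1.1' /etc/hosts

# from HDPInfra001: can I reach the NameNode RPC port at all?
nc -zv hdpname001.vssc.local 8020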

Now remember, folks, I'm a Newbie who Read a Book. (The most dangerous citizen of the Internet!) While I sincerely appreciate any help the community can provide, you might have to break it down into baby steps.

Can anyone help me figure out how to get these two services up?

Re: HiveServer2 and MapReduce History Server won't start after Ambari installation

Hi,

Did you solve the above issue? I'm facing the same problem.
