
unable to start hiveserver2

Contributor

Hi all,

I had enabled HiveServer2 Interactive on HDP 2.6.2, later disabled it, and tried to restart Hive, but HiveServer2 will not start.

Screenshot attached: 43443-hive-error.png

The full error is in the attached file: hive2-error.txt

Could anyone advise how to fix it?

thanks


Re: unable to start hiveserver2

Super Mentor

@forest lin


Your HDFS seems to be running very slowly. Please check the HDFS logs (especially the NameNode and DataNode logs).
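For example, a quick scan of the NameNode and DataNode logs for recent warnings (a sketch; the log directory below is the usual HDP default and may differ on your nodes):

```shell
# Scan NameNode/DataNode logs for recent WARN/ERROR/FATAL entries.
# /var/log/hadoop/hdfs is the typical HDP location; adjust as needed.
LOG_DIR="${LOG_DIR:-/var/log/hadoop/hdfs}"
for f in "$LOG_DIR"/hadoop-hdfs-namenode-*.log \
         "$LOG_DIR"/hadoop-hdfs-datanode-*.log; do
    [ -f "$f" ] || continue                 # skip if the glob matched nothing
    echo "== $f =="
    grep -E ' (WARN|ERROR|FATAL) ' "$f" | tail -5
done
```

Messages about slow block writes or long GC pauses in those logs would confirm the slowness seen below.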



While starting the Hive service, Ambari copies a few resources to HDFS, and the log shows that each copy step is taking far too long:

# grep 'copy' /Users/jsensharma/Downloads/43441-hive2-error.txt 

2017-11-06 16:15:16,149 - Called copy_to_hdfs tarball: mapreduce
2017-11-06 16:16:56,458 - DFS file /hdp/apps/2.6.2.0-205/mapreduce/mapreduce.tar.gz is identical to /usr/hdp/2.6.2.0-205/hadoop/mapreduce.tar.gz, skipping the copying
2017-11-06 16:16:56,458 - Will attempt to copy mapreduce tarball from /usr/hdp/2.6.2.0-205/hadoop/mapreduce.tar.gz to DFS at /hdp/apps/2.6.2.0-205/mapreduce/mapreduce.tar.gz.
2017-11-06 16:16:56,458 - Called copy_to_hdfs tarball: tez
2017-11-06 16:18:36,736 - DFS file /hdp/apps/2.6.2.0-205/tez/tez.tar.gz is identical to /usr/hdp/2.6.2.0-205/tez/lib/tez.tar.gz, skipping the copying
2017-11-06 16:18:36,737 - Will attempt to copy tez tarball from /usr/hdp/2.6.2.0-205/tez/lib/tez.tar.gz to DFS at /hdp/apps/2.6.2.0-205/tez/tez.tar.gz.
2017-11-06 16:18:36,737 - Called copy_to_hdfs tarball: pig
2017-11-06 16:20:16,902 - DFS file /hdp/apps/2.6.2.0-205/pig/pig.tar.gz is identical to /usr/hdp/2.6.2.0-205/pig/pig.tar.gz, skipping the copying
2017-11-06 16:20:16,903 - Will attempt to copy pig tarball from /usr/hdp/2.6.2.0-205/pig/pig.tar.gz to DFS at /hdp/apps/2.6.2.0-205/pig/pig.tar.gz.
2017-11-06 16:20:16,903 - Called copy_to_hdfs tarball: hive
2017-11-06 16:21:57,226 - DFS file /hdp/apps/2.6.2.0-205/hive/hive.tar.gz is identical to /usr/hdp/2.6.2.0-205/hive/hive.tar.gz, skipping the copying
2017-11-06 16:21:57,227 - Will attempt to copy hive tarball from /usr/hdp/2.6.2.0-205/hive/hive.tar.gz to DFS at /hdp/apps/2.6.2.0-205/hive/hive.tar.gz.
2017-11-06 16:21:57,227 - Called copy_to_hdfs tarball: sqoop
2017-11-06 16:23:37,471 - DFS file /hdp/apps/2.6.2.0-205/sqoop/sqoop.tar.gz is identical to /usr/hdp/2.6.2.0-205/sqoop/sqoop.tar.gz, skipping the copying
2017-11-06 16:23:37,471 - Will attempt to copy sqoop tarball from /usr/hdp/2.6.2.0-205/sqoop/sqoop.tar.gz to DFS at /hdp/apps/2.6.2.0-205/sqoop/sqoop.tar.gz.
2017-11-06 16:23:37,471 - Called copy_to_hdfs tarball: hadoop_streaming
2017-11-06 16:25:17,696 - DFS file /hdp/apps/2.6.2.0-205/mapreduce/hadoop-streaming.jar is identical to /usr/hdp/2.6.2.0-205/hadoop-mapreduce/hadoop-streaming.jar, skipping the copying
2017-11-06 16:25:17,697 - Will attempt to copy hadoop_streaming tarball from /usr/hdp/2.6.2.0-205/hadoop-mapreduce/hadoop-streaming.jar to DFS at /hdp/apps/2.6.2.0-205/mapreduce/hadoop-streaming.jar.  
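The gaps between consecutive steps can be read off the timestamps above (roughly 100 seconds apiece). A quick way to compute the deltas from the log (a sketch; the `printf` stands in for a few sample lines and would be replaced by the `grep` over the full attached file):

```shell
# Print the delay between consecutive copy_to_hdfs steps in the Ambari log.
# Replace the printf with:  grep 'Called copy_to_hdfs' hive2-error.txt
printf '%s\n' \
  '2017-11-06 16:15:16,149 - Called copy_to_hdfs tarball: mapreduce' \
  '2017-11-06 16:16:56,458 - Called copy_to_hdfs tarball: tez' \
  '2017-11-06 16:18:36,737 - Called copy_to_hdfs tarball: pig' |
awk -F'[ ,]' '
{
    split($2, t, ":")                       # time field HH:MM:SS -> seconds
    secs = t[1]*3600 + t[2]*60 + t[3]
    if (NR > 1) printf "%s -> %s: %d s\n", prev_t, $2, secs - prev
    prev = secs; prev_t = $2
}'
```

At ~100 seconds per tarball, the six copy steps alone consume most of the 900-second budget.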




Python script has been killed due to timeout after waiting 900 secs


Hence, after 900 seconds Ambari times out the task.
Besides addressing the HDFS slowness itself, another approach is to increase the timeout in "/etc/ambari-server/conf/ambari.properties" to a higher value such as 1800, then restart ambari-server:

# grep 900 /etc/ambari-server/conf/ambari.properties
agent.task.timeout=900
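The change can be scripted; a minimal sketch (it assumes `agent.task.timeout=900` is already present in the file, as the grep above shows, and `CONF` can be pointed at a copy for a dry run):

```shell
# Raise the Ambari agent task timeout from 900 to 1800 seconds.
CONF="${CONF:-/etc/ambari-server/conf/ambari.properties}"
if [ -f "$CONF" ]; then
    sed -i.bak 's/^agent\.task\.timeout=900$/agent.task.timeout=1800/' "$CONF"
    grep '^agent\.task\.timeout=' "$CONF"   # confirm the new value
fi
# Restart the server so the new timeout takes effect:
#   ambari-server restart
```

Note that raising the timeout only hides the symptom; the slow HDFS still needs to be investigated.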
