
Connection Refused for Hive/MapReduce2/Spark


New Contributor

So I'm trying to set up an HDP 2.6.1 cluster, and I have these last four errors I'm trying to clear up. I'm getting "connection failed" on the Hive, MapReduce2, and Spark services and I don't know what the cause could be. I tried restarting the components, but no luck. In addition, I noticed that my DataNodes are all dead, and when I go into HDFS and look at capacity I see "Disk Remaining: 0 Bytes / 0 Bytes (0%)", which doesn't make sense since I still have plenty of space on the DataNode VMs. I hope someone here can help me fix these errors... thanks in advance! (See the screenshots below for more detail.)

[Screenshot: 16515-hdp1.jpg]

[Screenshot: 16516-hdp2.jpg]

[Screenshot: 16517-hdp3.jpg]
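
For anyone triaging the same symptoms: a quick way to see what HDFS itself reports for capacity and live/dead DataNodes (a sketch; run it on the NameNode, typically as the hdfs user):

    sudo -u hdfs hdfs dfsadmin -report

If every DataNode is listed as dead, the 0-byte capacity follows from that: HDFS only counts space on DataNodes that have successfully registered with the NameNode.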

8 Replies

Re: Connection Refused for Hive/MapReduce2/Spark

Guru

You need to take a look at the logs to see why the services have not started. First check the DataNode log in /var/log/hadoop, then the HiveServer2 log in /var/log/hive on the node running HiveServer2. If you see errors there, paste the relevant snippets here.
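
For example, to pull recent errors out of both logs (a sketch assuming default HDP log locations; the exact file names include the service user and hostname):

    # On each DataNode:
    grep -i -B 2 -A 8 "ERROR" /var/log/hadoop/hdfs/hadoop-hdfs-datanode-*.log | tail -n 60

    # On the HiveServer2 node:
    grep -i -B 2 -A 8 "ERROR" /var/log/hive/hiveserver2.log | tail -n 60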

Re: Connection Refused for Hive/MapReduce2/Spark

New Contributor

Here is the HiveServer2 error from the Ambari UI:

[Screenshot: 16518-hiveerror.jpg]

Re: Connection Refused for Hive/MapReduce2/Spark

New Contributor

jdbc:hive2//fqdn/10000 ....

That URL is coming back as 'Invalid URL'. Check the metastore URL setting in Ambari > Hive.
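
For reference, the standard HiveServer2 JDBC URL needs :// after the scheme and a colon (not a slash) before the port. A sketch, with the hostname taken from the logs in this thread purely as an illustration:

    jdbc:hive2://qa-hdp-name-1.dev.ussd.verimatrix.com:10000/default

    # quick connectivity test with Beeline:
    beeline -u "jdbc:hive2://qa-hdp-name-1.dev.ussd.verimatrix.com:10000/default"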

Re: Connection Refused for Hive/MapReduce2/Spark

New Contributor

Should the transportMode be binary?
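
(For context: if hive.server2.transport.mode is binary, the default, clients connect straight to the Thrift port; if it is http, the URL needs extra parameters and a different default port. A sketch of the two URL forms, with a placeholder host:)

    # binary (default, port 10000):
    jdbc:hive2://<hs2-host>:10000/default

    # http (default port 10001):
    jdbc:hive2://<hs2-host>:10001/default;transportMode=http;httpPath=cliservice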

Re: Connection Refused for Hive/MapReduce2/Spark

New Contributor

Also, how do I know what the DB password should be? It says "connection refused", so I don't know if it's a port problem or a permissions problem...

[Screenshot: 16527-hiveconfig.jpg]
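
One way to tell the two apart (a sketch; the DB host is a placeholder, and 3306 assumes the MySQL backend that the metastore stack trace below points to):

    # "connection refused" usually means nothing is listening on the port:
    nc -vz <metastore-db-host> 3306

    # if the port is open, test the credentials themselves:
    mysql -h <metastore-db-host> -u hive -p

If nc fails, it is a network or service problem; if nc succeeds but the login is rejected, it is a credentials problem.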

Re: Connection Refused for Hive/MapReduce2/Spark

New Contributor

Fixed it! I had to edit /etc/hosts and add entries for all of the hosts' FQDNs...
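
For anyone hitting the same thing, the entries look roughly like this, and the same lines need to be present on every node in the cluster (the IPs and the NameNode name are from the DataNode log below; the DataNode's FQDN is made up for illustration):

    # /etc/hosts
    10.13.243.84   qa-hdp-name-1.dev.ussd.verimatrix.com   qa-hdp-name-1
    10.13.243.85   qa-hdp-data-1.dev.ussd.verimatrix.com   qa-hdp-data-1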

Re: Connection Refused for Hive/MapReduce2/Spark

New Contributor
2017-06-20 16:48:31,107 INFO  datanode.DataNode (BPServiceActor.java:register(713)) - Block pool BP-778938402-10.13.243.84-1497912510734 (Datanode Uuid e9e5ce82-2410-4550-944b-9545ff503fe1) service to qa-hdp-name-1.dev.ussd.verimatrix.com/10.13.243.84:8020 beginning handshake with NN
2017-06-20 16:48:31,109 ERROR datanode.DataNode (BPServiceActor.java:run(773)) - Initialization failed for Block pool BP-778938402-10.13.243.84-1497912510734 (Datanode Uuid e9e5ce82-2410-4550-944b-9545ff503fe1) service to qa-hdp-name-1.dev.ussd.verimatrix.com/10.13.243.84:8020 Datanode denied communication with namenode because hostname cannot be resolved (ip=10.13.243.85, hostname=10.13.243.85): DatanodeRegistration(0.0.0.0:50010, datanodeUuid=e9e5ce82-2410-4550-944b-9545ff503fe1, infoPort=50075, infoSecurePort=0, ipcPort=8010, storageInfo=lv=-56;cid=CID-664d216a-5c58-4258-841f-bd9579e4a86c;nsid=557746791;c=0)
        at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:938)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:4823)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:1424)
        at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:100)
        at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:31226)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)


(The same handshake and registration failure repeats every few seconds as the DataNode retries.)

Above is the error log from the DataNode.


The last packet successfully received from the server was 0 milliseconds ago.  The last packet sent successfully to the server was 0 milliseconds ago.
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543)
        at org.datanucleus.api.jdo.JDOPersistenceManager.getDataStoreConnection(JDOPersistenceManager.java:2275)
        at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getProductName(MetaStoreDirectSql.java:168)
        at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.determineDbType(MetaStoreDirectSql.java:151)
        at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:121)
        at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:383)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:316)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:277)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:60)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:69)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:702)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:681)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:675)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_all_databases(HiveMetaStore.java:1309)
        at sun.reflect.GeneratedMethodAccessor26.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
        at com.sun.proxy.$Proxy17.get_all_databases(Unknown Source)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_all_databases.getResult(ThriftHiveMetastore.java:9169)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_all_databases.getResult(ThriftHiveMetastore.java:9153)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure


Above is all I could find in my hivemetastore.log file (not sure if that's the correct log).

It was under the /var/log/hive directory on my DataNode.

Re: Connection Refused for Hive/MapReduce2/Spark

New Contributor

FYI, the post above isn't an answer; someone asked for logs and I posted them there. Thanks!
