<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>cant connect to hive in cloudera quickstart - Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cant-connect-to-hive-in-cloudera-quickstart/m-p/59441#M67446</link>
    <description>&lt;P&gt;Forum thread: the Hive CLI fails to start on the Cloudera QuickStart VM because the HDFS NameNode is in safe mode ("Cannot create directory /tmp/hive. Name node is in safe mode").&lt;/P&gt;</description>
    <pubDate>Fri, 16 Sep 2022 12:10:47 GMT</pubDate>
    <dc:creator>Jonaae</dc:creator>
    <dc:date>2022-09-16T12:10:47Z</dc:date>
    <item>
      <title>cant connect to hive in cloudera quickstart</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cant-connect-to-hive-in-cloudera-quickstart/m-p/59441#M67446</link>
      <description>&lt;P&gt;I need some help. I want to practice, but I can't connect to Hive in the Cloudera QuickStart VM.&lt;BR /&gt;Here's the output I get when I type "hive" in the terminal:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[cloudera@quickstart /]$ sudo su&lt;BR /&gt;[root@quickstart /]# service hive start&lt;BR /&gt;hive: unrecognized service&lt;BR /&gt;[root@quickstart /]# service hive-metastore start&lt;BR /&gt;Starting Hive Metastore (hive-metastore): [ OK ]&lt;BR /&gt;Hive Metastore is running [ OK ]&lt;BR /&gt;[root@quickstart /]# hive&lt;BR /&gt;2017-08-30 07:43:36,417 WARN [main] mapreduce.TableMapReduceUtil: The hbase-prefix-tree module jar containing PrefixTreeCodec is not present. Continuing without it.&lt;/P&gt;&lt;P&gt;Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties&lt;BR /&gt;Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /tmp/hive. Name node is in safe mode.&lt;BR /&gt;The reported blocks 908 needs additional 4 blocks to reach the threshold 0.9990 of total blocks 912.&lt;BR /&gt;The number of live datanodes 1 has reached the minimum number 0. 
Safe mode will be turned off automatically once the thresholds have been reached.&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1446)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4318)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4293)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:869)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:323)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:608)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)&lt;BR /&gt;at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:415)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)&lt;/P&gt;&lt;P&gt;at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:540)&lt;BR /&gt;at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)&lt;BR /&gt;at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.run(RunJar.java:221)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.main(RunJar.java:136)&lt;BR /&gt;Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /tmp/hive. Name node is in safe mode.&lt;BR /&gt;The reported blocks 908 needs additional 4 blocks to reach the threshold 0.9990 of total blocks 912.&lt;BR /&gt;The number of live datanodes 1 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1446)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4318)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4293)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:869)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:323)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:608)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)&lt;BR /&gt;at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)&lt;BR /&gt;at 
org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:415)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)&lt;/P&gt;&lt;P&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1471)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1408)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)&lt;BR /&gt;at com.sun.proxy.$Proxy16.mkdirs(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:549)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)&lt;BR /&gt;at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3082)&lt;BR /&gt;at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:3049)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:957)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:953)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:953)&lt;BR /&gt;at 
org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:946)&lt;BR /&gt;at org.apache.hadoop.hive.ql.exec.Utilities.createDirsWithPermission(Utilities.java:3653)&lt;BR /&gt;at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:617)&lt;BR /&gt;at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:574)&lt;BR /&gt;at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:518)&lt;BR /&gt;... 8 more&lt;/P&gt;&lt;P&gt;By the way, I'm a newbie.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 12:10:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/cant-connect-to-hive-in-cloudera-quickstart/m-p/59441#M67446</guid>
      <dc:creator>Jonaae</dc:creator>
      <dc:date>2022-09-16T12:10:47Z</dc:date>
    </item>
    <item>
      <title>Re: cant connect to hive in cloudera quickstart</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cant-connect-to-hive-in-cloudera-quickstart/m-p/59444#M67447</link>
      <description>&lt;P&gt;Already solved. I think I didn't import the VM correctly, so I deleted it and re-imported it, and then it worked perfectly.&lt;/P&gt;</description>
      <pubDate>Wed, 30 Aug 2017 15:23:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/cant-connect-to-hive-in-cloudera-quickstart/m-p/59444#M67447</guid>
      <dc:creator>Jonaae</dc:creator>
      <dc:date>2017-08-30T15:23:36Z</dc:date>
    </item>
    <item>
      <title>Re: cant connect to hive in cloudera quickstart</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cant-connect-to-hive-in-cloudera-quickstart/m-p/373497#M67448</link>
      <description>&lt;P&gt;Hi Jonaae, instead of re-installing the VM, you can run this command in the shell:&lt;/P&gt;&lt;P&gt;sudo -u hdfs hdfs dfsadmin -safemode leave&lt;/P&gt;&lt;P&gt;This makes the NameNode leave safe mode so you can start Hive. You can also check the current state first with "sudo -u hdfs hdfs dfsadmin -safemode get".&lt;/P&gt;</description>
      <pubDate>Sun, 02 Jul 2023 16:31:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/cant-connect-to-hive-in-cloudera-quickstart/m-p/373497#M67448</guid>
      <dc:creator>yoddha</dc:creator>
      <dc:date>2023-07-02T16:31:38Z</dc:date>
    </item>
  </channel>
</rss>

