
Create HBase Table using Beeline

Explorer

Hello, we are trying to create an HBase table from Beeline and are getting an error.

 

We have implemented Kerberos security...

 

This is how we are logging in to Beeline -

 

beeline> !connect jdbc:hive2://<Servername>:10000/default;principal=hive/<Kerberos Realm> org.apache.hive.jdbc.HiveDriver                                                                              

scan complete in 3ms

Connecting to jdbc:hive2://<Servername>:10000/default;principal=hive/<kerberos Realm>

Enter password for jdbc:hive2://<Servername>:10000/default;principal=hive/<kerberos Realm>:

Connected to: Apache Hive (version 0.10.0)

Driver: Hive (version 0.10.0-cdh4.6.0)

Transaction isolation: TRANSACTION_REPEATABLE_READ
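
For completeness, this is roughly how the Kerberos ticket gets obtained before starting Beeline (a sketch only; placeholders as above, and the user principal is just an illustration based on the account name that appears in the error below):

# authenticate as the end user before starting Beeline (illustrative placeholder principal)
kinit mv66708@<Kerberos Realm>
# verify the ticket is in the credential cache
klist
# then start beeline and run the !connect shown above
beeline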

 

Here is the command we are running to create the HBase table -

 

0: jdbc:hive2://<Servername>:10000/> CREATE EXTERNAL TABLE hbase_supercells_01(key string, ssurgo map<string,binary>, counties map<string,binary>, mpe map<string,binary>)

. . . . . . . . . . . . . . . . . . . . . . .> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'

. . . . . . . . . . . . . . . . . . . . . . .> WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,ssurgo:,counties:,mpe:")

. . . . . . . . . . . . . . . . . . . . . . .> TBLPROPERTIES ("hbase.table.name" = "supercells.01");
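
Since this is an EXTERNAL table, the HBase table named in hbase.table.name has to exist already with the mapped column families. For reference, a minimal sketch of creating it from the hbase shell would look something like this (illustrative only, not our exact table definition):

# open the HBase shell (needs a valid Kerberos ticket for an authorized HBase user)
hbase shell
# inside the shell, create the table with the three mapped column families
create 'supercells.01', 'ssurgo', 'counties', 'mpe'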

 

 

Error –

 

2014-04-03 16:01:06,707 INFO org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: getMaster attempt 4 of 14 failed; retrying after sleep of 2008

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive/<kerberos Realm> is not allowed to impersonate mv66708

 

We have updated the property in Cloudera Manager --> Services --> Hive1 --> HiveServer2 (Default)

 

HiveServer2 Enable Impersonation - this is checked

 

We have updated the property in Cloudera Manager --> Services --> Hive1 --> HiveServer2 --> Advanced

 

HiveServer2 Configuration Safety Valve for hive-site.xml

<property>
<name>hive.server2.authentication</name>
<value>KERBEROS</value>
</property>
<property>
<name>hive.aux.jars.path</name>
<value>file:///opt/cloudera/parcels/CDH/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.6.0.jar,file:///opt/cloudera/parcels/CDH/lib/hbase/hbase.jar,file:///opt/cloudera/parcels/CDH/lib/zookeeper/zookeeper.jar,file:///opt/cloudera/parcels/CDH/lib/hive/lib/guava-11.0.2.jar,file:///opt/cloudera/parcels/CDH/lib/hadoop/lib/hive-serdes-1.0-SNAPSHOT.jar,file:///opt/cloudera/parcels/CDH/lib/hadoop/lib/TwitterUtil.jar</value>
</property>
<property>
<name>hbase.security.authentication</name>
<value>kerberos</value>
</property>
<property>
<name>mapred.input.pathFilter.class</name>
<value>com.twitter.util.FileFilterExcludeTmpFiles</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>ldxhdfsm3.dx.deere.com,ldxhdfsm2.dx.deere.com,ldxhdfsm1.dx.deere.com</value>
</property>
<property>
<name>hbase.master.kerberos.principal</name>
<value>hbase/_HOST@HADOOP.DEV.DEERE.COM</value>
</property>
<property>
<name>hbase.regionserver.kerberos.principal</name>
<value>hbase/_HOST@HADOOP.DEV.DEERE.COM</value>
</property>
<property>
<name>hbase.rpc.engine</name>
<value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
</property>
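
As a sanity check, the jars referenced in hive.aux.jars.path can be verified on the HiveServer2 host with a simple listing (paths copied from the config above):

# on the HiveServer2 host, confirm the aux jars referenced above are actually present
ls -l /opt/cloudera/parcels/CDH/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.6.0.jar \
      /opt/cloudera/parcels/CDH/lib/hbase/hbase.jar \
      /opt/cloudera/parcels/CDH/lib/zookeeper/zookeeper.jar \
      /opt/cloudera/parcels/CDH/lib/hive/lib/guava-11.0.2.jar
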

1 ACCEPTED SOLUTION

Super Collaborator

Hey Murthy,

 

I think the issue is that HBase does not allow Hive to impersonate users, so you'll need to set up Hive as a proxy user in HBase. Can you try the following:

 

- Go to the HDFS service configuration in CM.

- Go to Service-Wide->Advanced and add the following to "Cluster-wide Configuration Safety Valve for core-site.xml":

 

  <property>
    <name>hadoop.proxyuser.hive.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hive.groups</name>
    <value>*</value>
  </property>

 

- Then restart HBase.

 

By default, CM does not add the Hive proxyuser config to HBase; that's why you are seeing these errors.
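
Once the new core-site.xml has been deployed and HBase restarted, a quick sanity check (just a sketch, reusing the placeholders from your post) is to reconnect and re-run your DDL:

# reconnect with the same JDBC URL and re-run the CREATE EXTERNAL TABLE from your post
beeline -u "jdbc:hive2://<Servername>:10000/default;principal=hive/<Kerberos Realm>"
# the "is not allowed to impersonate" error should no longer appear when the statement runs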

 

Let me know if you have any questions.

 

Thanks

Chris

 

 

 



Explorer

Thank you very much, Chris - that fixed the issue.

 

Murthy

Super Collaborator

Glad it worked!

Super Collaborator
Glad to hear it!!