Member since: 10-04-2017
Posts: 7
Kudos Received: 0
Solutions: 0
07-01-2019
02:13 PM
In an HBase table T01, I have 3 column families: F1, F2, and F3. F2 contains many columns: C20, C21, C22, ... I need to either drop column F2:C20 or clear the data saved in the cells of this column, for all rowkeys. Is there any way to do this in HBase 1.12? Thanks
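For what it's worth, HBase's `alter` can only drop whole column families, not a single qualifier, so clearing the cells row by row is the usual route. A minimal HBase shell sketch, where 'row1' is a hypothetical rowkey standing in for each rowkey returned by the scan:

```
# List the rows that actually carry the column:
scan 'T01', { COLUMNS => 'F2:C20' }
# Remove every version of the cell for a given rowkey ('row1' is hypothetical):
deleteall 'T01', 'row1', 'F2:C20'
```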
09-26-2018
10:04 AM
Is there a way to revoke administration rights from a superadmin and limit them to the owner of an HBase table/namespace? I don't want to use Kerberos 🙂
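For reference, HBase's own ACL layer (which requires the AccessController coprocessor to be enabled) is driven from the shell; a hedged sketch with hypothetical user and namespace names:

```
# Remove the user's rights on the namespace, then grant full rights to the owner:
revoke 'admin_user', '@ns1'
grant 'owner_user', 'RWXCA', '@ns1'
```

Note that without an authentication layer such as Kerberos, HBase cannot reliably tell users apart, so such ACLs are advisory at best.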
04-24-2018
10:31 AM
Hi Gerdan, this article has never been a complete solution for me, as several small details in it are not very clear. For example, you use the label "ALL_FQDN" in your jaas file. What should this ALL_FQDN be? It also seems that you are using a sandbox; your configuration is not applicable in a distributed environment. Versions are also missing: we are not sure which version of Solr, HBase-indexer, or HBase you are using. Thanks.
04-17-2018
03:14 PM
Hello, I'm trying to set up hbase-indexer (NGDATA) in a kerberized HDP 2.5 environment:
- Solr and HBase-Indexer are on the same machine (host15)
- HMaster is in a separate VM (host12)
- 2 region servers are on two other machines (host13 and host14)
- all the VMs are on the same domain
- I'm using a separate AD / Kerberos
- the ZK quorum is host12 + host13 + host14
I followed the Hortonworks tutorial to set it up, but this tutorial contains some "fuzzy" details, such as the ALL_FQDN SPN in the jaas file. Note that the kerberized cluster is working fine and I'm able to connect with my own code. Now, when it comes to HBase-indexer, I'm getting this exception (Kerberos principal name does NOT have the expected hostname part: hbase):
2018-04-17 16:41:51,032 DEBUG [ReplicationExecutor-0.replicationSource,Indexer_hbaseindexer2-host13.secure.com,16020,1522936844806-host14.secure.com,16020,1523965763579-host14.secure.com,16020,1523966798823] security.HBaseSaslRpcClient: Creating SASL GSSAPI client. Server's Kerberos principal name is hbase/host15.secure.com@IBMDEV.LOCAL
2018-04-17 16:41:51,033 DEBUG [ReplicationExecutor-0.replicationSource,Indexer_hbaseindexer2-host13.secure.com,16020,1522936844806-host14.secure.com,16020,1523965763579-host14.secure.com,16020,1523966798823] security.HBaseSaslRpcClient: Have sent token of size 1472 from initSASLContext.
2018-04-17 16:41:51,033 DEBUG [ReplicationExecutor-0.replicationSource,Indexer_hbaseindexer2-host13.secure.com,16020,1522936844806-host14.secure.com,16020,1523965763579-host14.secure.com,16020,1523966798823] ipc.AbstractRpcClient: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hbase.security.AccessDeniedException): Kerberos principal name does NOT have the expected hostname part: hbase
2018-04-17 16:41:55,780 DEBUG [ReplicationExecutor-0.replicationSource,Indexer_hbaseindexer2-host13.secure.com,16020,1522936844806-host14.secure.com,16020,1523965763579-host14.secure.com,16020,1523966798823] security.HBaseSaslRpcClient: Creating SASL GSSAPI client. Server's Kerberos principal name is hbase/host15.secure.com@IBMDEV.LOCAL
2018-04-17 16:41:55,781 DEBUG [ReplicationExecutor-0.replicationSource,Indexer_hbaseindexer2-host13.secure.com,16020,1522936844806-host14.secure.com,16020,1523965763579-host14.secure.com,16020,1523966798823] security.HBaseSaslRpcClient: Have sent token of size 1473 from initSASLContext.
2018-04-17 16:41:55,781 WARN [ReplicationExecutor-0.replicationSource,Indexer_hbaseindexer2-host13.secure.com,16020,1522936844806-host14.secure.com,16020,1523965763579-host14.secure.com,16020,1523966798823] ipc.AbstractRpcClient: Couldn't setup connection for hbase/host13.secure.com@IBMDEV.LOCAL to hbase/host15.secure.com@IBMDEV.LOCAL
2018-04-17 16:41:55,781 WARN [ReplicationExecutor-0.replicationSource,Indexer_hbaseindexer2-host13.secure.com,16020,1522936844806-host14.secure.com,16020,1523965763579-host14.secure.com,16020,1523966798823] regionserver.HBaseInterClusterReplicationEndpoint: Can't replicate because of a local or network error:
java.io.IOException: Couldn't setup connection for hbase/host13.secure.com@IBMDEV.LOCAL to hbase/host15.secure.com@IBMDEV.LOCAL
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:665)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1865)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:637)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:745)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:887)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1199)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
    at org.apache.hadoop.hbase.protobuf.generated.AdminProtos$AdminService$BlockingStub.replicateWALEntry(AdminProtos.java:23707)
    at org.apache.hadoop.hbase.protobuf.ReplicationProtbufUtil.replicateWALEntry(ReplicationProtbufUtil.java:71)
    at org.apache.hadoop.hbase.replication.regionserver.HBaseInterClusterReplicationEndpoint.replicate(HBaseInterClusterReplicationEndpoint.java:179)
    at org.apache.hadoop.hbase.replication.regionserver.ReplicationSource.shipEdits(ReplicationSource.java:825)
    at org.apache.hadoop.hbase.replication.regionserver.ReplicationSource.run(ReplicationSource.java:444)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hbase.security.AccessDeniedException): Kerberos principal name does NOT have the expected hostname part: hbase
    at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.readStatus(HBaseSaslRpcClient.java:153)
    at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:189)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1865)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
    ... 10 more
Here's my HBase-indexer jaas:
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  useTicketCache=false
  keyTab="/opt/lucidworks-hdpsearch/hbase-indexer/conf/hbasePlusHTTP.service.keytab"
  principal="hbase/host15.secure.com@IBMDEV.LOCAL"
  debug=true;
};
In my zkcli.sh I added the following line:
ZK_KRB="-Djava.security.auth.login.config=/opt/lucidworks-hdpsearch/hbase-indexer/conf/hbase_indexer.jaas"
and then inserted this option into the launch line inside the zkcli.sh of the HDP ZooKeeper, not the Solr one:
PATH=$JAVA_HOME/bin:$PATH $JVM $SOLR_ZK_CREDS_AND_ACLS $ZK_KRB -Dlog4j.configuration=$log4j_config
In the HBase Indexer logs, every time I restart HBase I see this error:
org.apache.zookeeper.KeeperException$NoAuthException: KeeperErrorCode = NoAuth for /hbase-secure/tokenauth/keys/26
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:113)
To overcome it, I'm just adding "world:anyone:cdrwa" to the new ZNodes, but I'm sure this is not the right solution. I mention this problem because I'm not sure whether it is related to the main problem. I tried many solutions before asking, but it seems that I'm missing something. HBase is working: I'm inserting data into HBase, but the changes are not replicated to Solr. Before kerberization, everything worked perfectly. Any hint/help/solution is welcome. Thanks
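The ACL workaround mentioned above corresponds to something like the following zookeeper-client (zkCli) session; the znode path is taken from the error message, and the key number changes as new token keys are created:

```
# Inspect the znode's current ACL, then open it to everyone.
# This removes all protection on the znode, which is why it is only a
# diagnostic workaround, not a fix.
getAcl /hbase-secure/tokenauth/keys/26
setAcl /hbase-secure/tokenauth/keys/26 world:anyone:cdrwa
```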
04-06-2018
08:48 AM
I need to script the HDP cluster kerberization against a remote Active Directory. Can anybody tell me how Ambari creates SPNs and accounts on a remote Active Directory? Where can I find this class/script/code? There is also something about SPN creation that I can't do manually in AD, but Ambari can. A valid SPN format is something like SERVICE/FQDN@REALM (e.g. HTTP/server1.com@MYAD.COM). But the Ambari QA SPN, for instance, does not have the "SERVICE/" part (e.g. ambari-qa@MYCOM.FR). When I try to attach similar SPNs manually in AD, Windows complains about the format! Thanks for pointing me to where I can look for these details and how Ambari does it.
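One possibly relevant detail: in AD, a principal with no "SERVICE/" part is simply an account's userPrincipalName, while SERVICE/FQDN principals are servicePrincipalName values attached to an account. A hedged sketch with the native Windows tooling (the account name svc-http is hypothetical):

```
rem Attach a service SPN to an existing account:
setspn -S HTTP/server1.com svc-http
rem A principal like ambari-qa@MYCOM.FR needs no setspn call at all;
rem it is just the UPN of the ambari-qa account, which may be why
rem Windows rejects it when entered as an SPN.
```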
10-23-2017
08:43 AM
In HDP 2.5, users will have to uncomment two lines in bin/hbase-indexer to make it work:
HBASE_INDEXER_OPTS="$HBASE_INDEXER_OPTS -Dlww.jaas.file=/opt/lucidworks-hdpsearch/hbase-indexer/conf/hbase_indexer.jaas"
HBASE_INDEXER_OPTS="$HBASE_INDEXER_OPTS -Dlww.jaas.appname=Client" # Client is the name given to the jaas section in this tutorial
10-04-2017
02:35 PM
I'm trying to use Apache Spark to feed a complex Java bean. Say it is something like:
class A {
    Integer notAnId;
    List<B> bs;
}
class B {
    float amount;
    C c;
}
class C {
    String val;
}
My input is a fixed-length text file looking like:
AAA11
BBB0.1
CCCitem1
BBB0.1
CCCitem2
What is the best way to profit from Spark's power while doing this transformation? Shall I stick to the Spring Batch-like technique and reuse its fixed-length tokenizer to build a strongly typed JavaRDD<A>? Thanks in advance. PS: The bean and the input are much more complex than the illustrated example.
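Whatever runs the distribution (a Spark `mapPartitions` over the file, or a Spring Batch tokenizer), the grouping step itself can be sketched in plain Java. This is only a sketch under the assumption that every BBB/CCC record belongs to the most recent AAA/BBB record above it; the class name `FixedLengthParser` is hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Bean hierarchy as in the question.
class C { String val; }
class B { float amount; C c; }
class A { Integer notAnId; List<B> bs = new ArrayList<>(); }

class FixedLengthParser {
    // The record type is the first three characters; the payload is the rest
    // of the line. "AAA" starts a new A, "BBB" opens a new B under the current
    // A, and "CCC" attaches a C to the current B. Assumes well-ordered input
    // (no BBB before an AAA, no CCC before a BBB).
    static List<A> parse(List<String> lines) {
        List<A> result = new ArrayList<>();
        A currentA = null;
        B currentB = null;
        for (String line : lines) {
            String tag = line.substring(0, 3);
            String payload = line.substring(3);
            switch (tag) {
                case "AAA":
                    currentA = new A();
                    currentA.notAnId = Integer.valueOf(payload);
                    result.add(currentA);
                    break;
                case "BBB":
                    currentB = new B();
                    currentB.amount = Float.parseFloat(payload);
                    currentA.bs.add(currentB);
                    break;
                case "CCC":
                    C c = new C();
                    c.val = payload;
                    currentB.c = c;
                    break;
            }
        }
        return result;
    }
}
```

Because a whole A spans several input lines, the one real constraint for Spark is that each A's records must land in the same partition before this grouping runs.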