Member since: 02-13-2019
Posts: 13
Kudos Received: 0
Solutions: 0
02-15-2019
03:19 PM
Yes, I did. I got: drwxr-xr-x - hbaseGeomesa hdfs 0 2019-02-13 15:19 /user/hbaseGeomesa, along with all the other users...
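For reference, a minimal sketch of that ownership check, plus the fix if ownership were wrong (the chown assumes access to the hdfs superuser):

  hdfs dfs -ls /user                    # shows owner and group for each user directory
  sudo -u hdfs hdfs dfs -chown -R hbaseGeomesa:hdfs /user/hbaseGeomesa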
02-14-2019
09:47 PM
1) sudo su - hbaseGeomesa
2) klist -kt /etc/security/keytabs/hbase.geomesa.keytab
2 02/12/19 14:58:35 hbaseGeomesa/edge_fqdn@realm
2 02/12/19 14:58:35 hbaseGeomesa/edge_fqdn@realm
2 02/12/19 14:58:35 hbaseGeomesa/edge_fqdn@realm
2 02/12/19 14:58:35 hbaseGeomesa/edge_fqdn@realm
2 02/12/19 14:58:35 hbaseGeomesa/edge_fqdn@realm
2 02/12/19 14:58:35 hbaseGeomesa/edge_fqdn@realm
3) kinit -kt /etc/security/keytabs/hbase.geomesa.keytab hbaseGeomesa/edge_fqdn@realm
4) Ingesting from local into the geomesa-hbase datastore:
./geomesa-hbase ingest -c geo-csv -s /home/hbaseGeomesa/geo.sft -C /home/hbaseGeomesa/geo.convert /home/hbaseGeomesa/geo.csv
Status: success
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 18 4d 0c cd 01 01 00 00 da de b1 bf b6 0a f5 60 ec 6d 17 58 ]
[============================================================] 100% complete 9 ingested 0 failed in 00:00:01
2019-02-14 19:01:35,082 INFO [main] tools.user: Local ingestion complete in 00:00:01
2019-02-14 19:01:35,082 INFO [main] tools.user: Ingested 9 features with no failures.
2019-02-14 19:01:35,087 INFO [Thread-7] client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x168e345e3294ba7
2019-02-14 19:01:35,089 INFO [Thread-7] zookeeper.ZooKeeper: Session: 0x168e345e3294ba7 closed
2019-02-14 19:01:35,089 INFO [main-EventThread] zookeeper.ClientCnxn: EventThread shut down
---------------------------
5) Ingesting from HDFS into the geomesa-hbase datastore:
./geomesa-hbase ingest -c geo-csv -s /home/hbaseGeomesa/geo.sft -C /home/hbaseGeomesa/geo.convert hdfs://janusgraph/user/hbaseGeomesa/geo.csv
Errors (from the terminal):
[============================================================] 100% complete 0 ingested 0 failed in 00:01:00
2019-02-14 19:08:05,310 ERROR [main] tools.user: Job failed with state FAILED due to: Task failed task_1549479078340_0362_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
2019-02-14 19:08:05,313 INFO [main] tools.user: Distributed ingestion complete in 00:01:00
2019-02-14 19:08:05,313 INFO [main] tools.user: Ingested 0 features with no failures.
2019-02-14 19:08:05,314 INFO [Thread-7] client.ConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
2019-02-14 19:08:05,314 INFO [Thread-7] client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x368c8ba632929f2
2019-02-14 19:08:05,315 INFO [Thread-7] zookeeper.ZooKeeper: Session: 0x368c8ba632929f2 closed
2019-02-14 19:08:05,328 INFO [main-EventThread] zookeeper.ClientCnxn: EventThread shut down
---------------------------
When I look at the logs:
Caused by: com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to <node>:16000
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:228)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:292)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
... 24 more
Caused by: java.io.IOException: Could not set up IO Streams to <node>:16000
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:779)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:889)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1201)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:218)
... 29 more
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:679)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:637)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:745)
... 33 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
... 33 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
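The "Failed to find any Kerberos tgt" above comes from the map task side, which suggests the credentials that worked for the local ingest are not visible to the YARN containers. A minimal sketch for checking that the keytab exists and is readable on each worker node (hostnames are placeholders; assumes the keytab lives at the same path cluster-wide):

  for host in node1 node2 node3; do
    ssh "$host" 'ls -l /etc/security/keytabs/hbase.geomesa.keytab && klist -kt /etc/security/keytabs/hbase.geomesa.keytab'
  done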
02-14-2019
06:49 PM
If I upgrade GeoMesa to a later version, it won't support my HBase version. I am using HBase 1.1.2 with HDP 2.6.5.
02-14-2019
06:45 PM
This covers the installation of geomesa-hbase: https://www.geomesa.org/documentation/1.3.5/user/hbase/install.html
This explains the setup for a Kerberos environment: https://www.geomesa.org/documentation/1.3.5/user/hbase/kerberos.html
02-14-2019
06:32 PM
I am adding the principal and keytab in hbase-site.xml because the GeoMesa HBase documentation says to add them there. The link below explains it: https://www.geomesa.org/documentation/1.3.5/user/hbase/kerberos.html Let me try deleting those, and I will post all the logs.
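For context, this is roughly what those entries look like in hbase-site.xml; the principal and keytab path are the ones from my setup, and hbase.client.kerberos.principal / hbase.client.keytab.file are, as far as I can tell, the property names the GeoMesa Kerberos page refers to:

  <property>
    <name>hbase.client.kerberos.principal</name>
    <value>hbaseGeomesa/edge_fqdn@realm</value>
  </property>
  <property>
    <name>hbase.client.keytab.file</name>
    <value>/etc/security/keytabs/hbase.geomesa.keytab</value>
  </property>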
02-14-2019
06:02 PM
@Emilio Lahr-Vivaz and @Geoffrey Shelton Okot Yes, Emilio: when I try to run from HDFS, it launches a map/reduce job to ingest. Right now I have created a principal for the edge node as hbaseGeomesa/<edgenode_fqdn>@realm and a keytab as hbase.geomesa.keytab. Do I need to create a principal for every node, e.g. hbaseGeomesa/<node1_fqdn>@realm, hbaseGeomesa/<node2_fqdn>@realm, ... and so on? Or do I just need to put the hbaseGeomesa/<edgenode_fqdn>@realm keytab on all the nodes? Another question: right now I added hbaseGeomesa/<edgenode_fqdn>@realm and hbase.geomesa.keytab in hbase-site.xml. Is that fine, or do I need to add the principals for all the nodes in hbase-site.xml?
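For comparison, a headless (non-host-qualified) principal is the usual way to share one identity across nodes; a minimal MIT Kerberos sketch, with the admin principal, realm, and keytab path all placeholders:

  kadmin -p admin/admin@REALM -q 'addprinc -randkey hbaseGeomesa@REALM'
  kadmin -p admin/admin@REALM -q 'xst -k /etc/security/keytabs/hbaseGeomesa.headless.keytab hbaseGeomesa@REALM'
  kinit -kt /etc/security/keytabs/hbaseGeomesa.headless.keytab hbaseGeomesa@REALM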
02-14-2019
03:21 PM
Without the hdfs://clustername/ prefix it won't look in HDFS. If I give /user/hbaseGeomesa/geo-csv ............. it searches for the directory locally and says no file exists.
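For reference, a quick way to confirm the two path forms resolve to the same place when fs.defaultFS points at the cluster's nameservice (clustername is a placeholder):

  hdfs dfs -ls /user/hbaseGeomesa                    # resolved against fs.defaultFS
  hdfs dfs -ls hdfs://clustername/user/hbaseGeomesa  # fully qualified URI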
02-14-2019
01:00 AM
@Geoffrey Shelton Okot
As you explained:
I have the user in HDFS with RWCA permissions, and I copied the CSV file to /user/hbaseGeomesa/file_.csv
I did kinit -kt /home/{user}/hbaseGeomesa.keytab hbaseGeomesa/<edgenode>@{REALM}
The user owns the directory: hbaseGeomesa:hdfs /user/hbaseGeomesa
As I told you before, I can ingest CSV data from local into the geomesa-hbase datastore (it works fine):
./bin/geomesa-hbase ingest -c geo-csv -s /home/hbaseGeomesa/geo.sft -C /home/hbaseGeomesa/geo.convert /home/hbaseGeomesa/geo.csv
When I try to ingest from hdfs://clustername/user/hbaseGeomesa/file_.csv, I get the above error.
I am using the below command to run:
./bin/geomesa-hbase ingest -c geo-csv -s /home/hbaseGeomesa/geo.sft -C /home/hbaseGeomesa/geo.convert hdfs://clustername/user/hbaseGeomesa/*
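As a sanity check after that kinit on the edge node, klist should show an unexpired ticket for the principal (the output shape below is indicative, not verbatim):

  klist
  # Default principal: hbaseGeomesa/<edgenode>@{REALM}
  # followed by a krbtgt/REALM@REALM entry whose expiry is still in the future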
02-13-2019
07:18 PM
2019-02-13 11:49:09,058 FATAL [IPC Server handler 3 on 45782] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1549479078340_0328_m_000000_0 - exited : org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to fqdn:16000
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1560)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1731)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.isMasterRunning(ConnectionManager.java:946)
at org.apache.hadoop.hbase.client.HBaseAdmin.checkHBaseAvailable(HBaseAdmin.java:3255)
at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory.checkClusterAvailability(HBaseDataStoreFactory.scala:99)
at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory.org$locationtech$geomesa$hbase$data$HBaseDataStoreFactory$$globalConnection$lzycompute(HBaseDataStoreFactory.scala:44)
at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory.org$locationtech$geomesa$hbase$data$HBaseDataStoreFactory$$globalConnection(HBaseDataStoreFactory.scala:41)
at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory$$anonfun$2.apply(HBaseDataStoreFactory.scala:61)
at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory$$anonfun$2.apply(HBaseDataStoreFactory.scala:61)
at scala.Option.getOrElse(Option.scala:121)
at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory.createDataStore(HBaseDataStoreFactory.scala:61)
at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory.createDataStore(HBaseDataStoreFactory.scala:36)
at org.geotools.data.DataAccessFinder.getDataStore(DataAccessFinder.java:130)
at org.geotools.data.DataStoreFinder.getDataStore(DataStoreFinder.java:89)
at org.locationtech.geomesa.jobs.mapreduce.GeoMesaRecordWriter.<init>(GeoMesaOutputFormat.scala:83)
at org.locationtech.geomesa.jobs.mapreduce.GeoMesaOutputFormat.getRecordWriter(GeoMesaOutputFormat.scala:60)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:647)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:679)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:637)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:745)
... 33 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
... 33 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
02-13-2019
07:17 PM
I have geomesa-hbase on the edge node, and I am running as the hbaseGeomesa user; I created the same user on all the nodes. So I created the principal hbaseGeomesa/<edgenode_IP>@realm and its keytab.
2019-02-13 17:48:25,814 INFO [main] tools.user: Tracking available at http://<hostname>:8088/proxy/application_1549479078340_0328/
[============================================================] 100% complete 0 ingested 0 failed in 00:01:26
2019-02-13 17:49:46,718 ERROR [main] tools.user: Job failed with state FAILED due to: Task failed task_1549479078340_0328_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
2019-02-13 17:49:46,721 INFO [main] tools.user: Distributed ingestion complete in 00:01:26
2019-02-13 17:49:46,722 INFO [main] tools.user: Ingested 0 features with no failures.
2019-02-13 17:49:46,723 INFO [Thread-7] client.ConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
2019-02-13 17:49:46,723 INFO [Thread-7] client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x168e345e329263c
2019-02-13 17:49:46,724 INFO [Thread-7] zookeeper.ZooKeeper: Session: 0x168e345e329263c closed
2019-02-13 17:49:46,724 INFO [main-EventThread] zookeeper.ClientCnxn: EventThread shut down
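To get the root cause out of the failed map task, the container logs can be pulled with the application ID from the tracking URL above (run as the submitting user):

  yarn logs -applicationId application_1549479078340_0328 | grep -B 2 -A 20 'Caused by'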