
Spark Streaming + Phoenix + Kerberos + HDP 2.4.2


Has anybody done Spark Streaming 1.6.1 (Scala) with Phoenix with Kerberos enabled? Please share some sample code. I am using HDP 2.4.2.

6 REPLIES

Re: Spark Streaming + Phoenix + Kerberos + HDP 2.4.2

@Nilesh Pandey

I think this test case in the Phoenix code will help you write the Scala code:

https://github.com/apache/phoenix/blob/master/phoenix-spark/src/it/scala/org/apache/phoenix/spark/Ph...

Make sure you suffix the principal and keytab file to the URL. For example:

jdbc:phoenix:<Zookeeper_host_name>:<port_number>:<secured_Zookeeper_node>:<principal_name>:<HBase_headless_keytab_file>

You can get more info here:

http://phoenix.apache.org/phoenix_spark.html
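For example, here is a minimal Spark 1.6 (Scala) sketch of reading a Phoenix table on a secured cluster this way; the ZooKeeper host, znode, principal, keytab path, and table name are all placeholders:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object PhoenixKerberosRead {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("phoenix-kerberos"))
    val sqlContext = new SQLContext(sc)

    // ZK quorum suffixed with znode, principal, and keytab,
    // matching the URL format shown above
    val zkUrl = "zk-host:2181:/hbase-secure:" +
      "hbase-user@EXAMPLE.COM:/etc/security/keytabs/hbase.headless.keytab"

    val df = sqlContext.read
      .format("org.apache.phoenix.spark")
      .option("table", "MY_TABLE") // hypothetical table name
      .option("zkUrl", zkUrl)
      .load()

    df.show()
  }
}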

You can validate the Phoenix connection with Kerberos as described here:

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_installing_manually_book/content/validat...
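As a quick smoke test from the command line, sqlline.py accepts the same secured URL suffix; the host, principal, and keytab path below are placeholders:

/usr/hdp/current/phoenix-client/bin/sqlline.py "zk-host:2181:/hbase-secure:hbase-user@EXAMPLE.COM:/etc/security/keytabs/hbase.headless.keytab"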

Hope it helps.


Re: Spark Streaming + Phoenix + Kerberos + HDP 2.4.2

Thank you @Rajeshbabu. One more question: where can I find the Phoenix driver to run this JDBC code on Hortonworks 2.4.2?


Re: Spark Streaming + Phoenix + Kerberos + HDP 2.4.2

In the phoenix-client JAR you can find the Phoenix driver class:

org.apache.phoenix.jdbc.PhoenixDriver
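On HDP the JAR typically lives under /usr/hdp/current/phoenix-client/. Here is a minimal sketch of loading that class and opening a connection with the secured URL from earlier in the thread; the host, principal, and keytab path are placeholders:

import java.sql.DriverManager

object PhoenixJdbcCheck {
  def main(args: Array[String]): Unit = {
    // Register the Phoenix JDBC driver from phoenix-client.jar
    Class.forName("org.apache.phoenix.jdbc.PhoenixDriver")

    // Secured URL: zk quorum : port : znode : principal : keytab
    val url = "jdbc:phoenix:zk-host:2181:/hbase-secure:" +
      "hbase-user@EXAMPLE.COM:/etc/security/keytabs/hbase.headless.keytab"

    val conn = DriverManager.getConnection(url)
    try {
      // List a few tables from the Phoenix system catalog as a smoke test
      val rs = conn.createStatement()
        .executeQuery("SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 5")
      while (rs.next()) println(rs.getString(1))
    } finally {
      conn.close()
    }
  }
}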

Re: Spark Streaming + Phoenix + Kerberos + HDP 2.4.2

One more doubt I have: we specify the keytab location in the JDBC URL, so when YARN allocates executors for Spark, will this JDBC URL look locally for the keytab file on each executor?


Re: Spark Streaming + Phoenix + Kerberos + HDP 2.4.2

@Nilesh Pandey

Instead of specifying the keytab location in the JDBC URL, you can try this; it might help you:

https://community.hortonworks.com/questions/56848/spark-cant-connect-to-secure-phoenix.html
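The usual pattern from that thread is to ship a JAAS file and the headless keytab to every YARN container with --files, then point both the driver and executor JVMs at the JAAS file. A sketch, assuming a JAAS file named key.conf, a keytab named hbase.headless.keytab, and a principal hbase-user@EXAMPLE.COM (all placeholders):

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="./hbase.headless.keytab"
  storeKey=true
  useTicketCache=false
  principal="hbase-user@EXAMPLE.COM";
};

spark-submit \
  --master yarn-cluster \
  --files key.conf,hbase.headless.keytab \
  --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=./key.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./key.conf" \
  your-app.jar

The 'Client' section name matters: the ZooKeeper client looks for exactly that section in the JAAS file.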


Re: Spark Streaming + Phoenix + Kerberos + HDP 2.4.2

I am getting the exception below now:

client-ip/192.168.212.100:16000 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to client-ip/192.168.212.100:16000 is closing. Call id=33, waitTime=2
Fri Nov 04 12:48:00 CET 2016, RpcRetryingCaller{globalStartTime=1478259550578, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to client-ip/192.168.212.100:16000 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to client-ip/192.168.212.100:16000 is closing. Call id=34, waitTime=3
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4083)
    at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:528)
    at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:550)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:810)
    ... 36 more
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to client-ip/192.168.212.100:16000 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to client-ip/192.168.212.100:16000 is closing. Call id=34, waitTime=3
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1540)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1560)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1711)
    at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
    ... 40 more
Caused by: com.google.protobuf.ServiceException: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to client-ip/192.168.212.100:16000 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to client-ip/192.168.212.100:16000 is closing. Call id=34, waitTime=3
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:223)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58152)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1571)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1509)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1531)
    ... 44 more
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to client-ip/192.168.212.100:16000 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to client-ip/192.168.212.100:16000 is closing. Call id=34, waitTime=3
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.wrapException(RpcClientImpl.java:1259)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1230)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
    ... 49 more
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to client-ip/192.168.212.100:16000 is closing. Call id=34, waitTime=3
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.cleanupCalls(RpcClientImpl.java:1047)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.close(RpcClientImpl.java:846)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.run(RpcClientImpl.java:574)
16/11/04 12:48:01 INFO CoarseGrainedExecutorBackend: Got assigned task 76
16/11/04 12:48:01 INFO Executor: Running task 0.1 in stage 7.0 (TID 76)
16/11/04 12:48:01 INFO BlockManager: Found block input-0-1478259544400 locally
16/11/04 12:48:01 INFO deprecation: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
16/11/04 12:48:01 INFO deprecation: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
16/11/04 12:48:01 INFO RecoverableZooKeeper: Process identifier=hconnection-0x45e8f7b3 connecting to ZooKeeper ensemble=zk-ip:2181
16/11/04 12:48:01 INFO ZooKeeper: Initiating client connection, connectString=zk-ip:2181 sessionTimeout=90000 watcher=hconnection-0x45e8f7b30x0, quorum=zk-ip:2181, baseZNode=/hbase-secure
16/11/04 12:48:01 WARN ClientCnxn: SASL configuration failed: javax.security.auth.login.LoginException: No JAAS configuration section named 'Client' was found in specified JAAS configuration file: './key.conf'. Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it.
16/11/04 12:48:01 INFO ClientCnxn: Opening socket connection to server client-ip/192.168.212.104:2181
16/11/04 12:48:01 INFO ClientCnxn: Socket connection established to client-ip/192.168.212.104:2181, initiating session
16/11/04 12:48:01 INFO ClientCnxn: Session establishment complete on server client-ip/192.168.212.104:2181, sessionid = 0x3574704ea4103d4, negotiated timeout = 40000
16/11/04 12:48:01 INFO RecoverableZooKeeper: Process identifier=hconnection-0x27ab4ba3 connecting to ZooKeeper ensemble=zk-ip:2181
16/11/04 12:48:01 INFO ZooKeeper: Initiating client connection, connectString=zk-ip:2181 sessionTimeout=90000 watcher=hconnection-0x27ab4ba30x0, quorum=zk-ip:2181, baseZNode=/hbase-secure
16/11/04 12:48:01 WARN ClientCnxn: SASL configuration failed: javax.security.auth.login.LoginException: No JAAS configuration section named 'Client' was found in specified JAAS configuration file: './key.conf'. Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it.
16/11/04 12:48:01 INFO ClientCnxn: Opening socket connection to server client-ip/192.168.212.104:2181
16/11/04 12:48:01 INFO ClientCnxn: Socket connection established to client-ip/192.168.212.104:2181, initiating session
16/11/04 12:48:01 INFO ClientCnxn: Session establishment complete on server client-ip/192.168.212.104:2181, sessionid = 0x3574704ea4103d5, negotiated timeout = 40000
