Member since: 03-21-2016
Posts: 13
Kudos Received: 8
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 4555 | 09-20-2016 10:01 AM |
09-20-2016
10:01 AM
2 Kudos
After adding the HBase jars to spark.driver.extraClassPath, my job is working fine:

```
spark-submit \
  --jars /usr/hdp/2.4.2.0-258/phoenix/phoenix-4.4.0.2.4.2.0-258-client.jar \
  --conf "spark.driver.extraClassPath=/usr/hdp/2.4.2.0-258/hbase/lib/hbase-common-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-client-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-server-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-protocol-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hbase/lib/guava-12.0.1.jar" \
  --master yarn-client \
  --principal ctadmin@EXAMPLE.COM \
  --keytab /etc/security/keytabs/ctadmin.keytab \
  --class com.dq.DataQualityApplicationHandler \
  tr-dq-16.7.0.0.0.jar org QUALITY
```

Root Cause:
When spark-submit detects the YARN deployment mode, org.apache.spark.deploy.yarn.Client is used for application submission. While obtaining the HBASE_DELEGATION_TOKEN, YARN was not finding the HBase jars. Hence SPARK_CLASSPATH should be configured with the required HBase jars, so that YARN can load them while obtaining the delegation token.
Note: Starting from Spark 1.5, spark.driver.extraClassPath needs to be set instead of exporting SPARK_CLASSPATH.
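Digging into the Spark source, this is roughly what happens during submission (a simplified sketch of YarnSparkHadoopUtil.obtainTokenForHBaseInner from Spark 1.x, not a verbatim copy):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.token.{Token, TokenIdentifier}

// Simplified sketch: Spark loads the HBase classes reflectively from the
// *submitter's* JVM classpath. If the jars are not on SPARK_CLASSPATH /
// spark.driver.extraClassPath, loadClass throws ClassNotFoundException and
// Spark silently skips fetching the HBASE_DELEGATION_TOKEN.
def obtainTokenForHBase(conf: Configuration): Token[TokenIdentifier] = {
  val loader = getClass.getClassLoader
  val confCreate = loader
    .loadClass("org.apache.hadoop.hbase.HBaseConfiguration")
    .getMethod("create", classOf[Configuration])
  val obtainToken = loader
    .loadClass("org.apache.hadoop.hbase.security.token.TokenUtil")
    .getMethod("obtainToken", classOf[Configuration])
  val hbaseConf = confCreate.invoke(null, conf).asInstanceOf[Configuration]
  obtainToken.invoke(null, hbaseConf).asInstanceOf[Token[TokenIdentifier]]
}
```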
09-16-2016
01:13 PM
I also tried upgrading my Phoenix to 4.8, but it didn't work.
09-16-2016
08:03 AM
@Ankit Singhal I added hbase-site.xml to the Spark conf directory on all nodes and restarted the Spark service, but it didn't work. Also, hbase-site.xml is already present in my classpath.
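For reference, this is a minimal sketch of how to check which hbase-site.xml the driver JVM actually picks up (run in a spark-shell launched with the same classpath options; the property names are standard HBase keys):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration

val hbaseConf = HBaseConfiguration.create()
// Which quorum/znode did we get, and from which resource on the classpath?
println(hbaseConf.get("hbase.zookeeper.quorum"))
println(hbaseConf.get("zookeeper.znode.parent"))
println(hbaseConf.getResource("hbase-site.xml")) // URL of the hbase-site.xml that was loaded
```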
09-16-2016
07:36 AM
Yes, I have also gone through this JIRA, but I am unable to understand the workaround provided. What I understood from the workaround is that instead of providing the full ZooKeeper URL, one should provide only a comma-separated list of ZooKeeper IPs/hosts.
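To make sure I am reading the workaround correctly, here are the two zkUrl forms side by side (a sketch; the 2181 port and /hbase-secure znode are assumptions based on typical secured HDP defaults, and MY_TABLE is a hypothetical table name; check hbase-site.xml for the real values):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setAppName("zkurl-test"))
val sqlContext = new SQLContext(sc)

// Full URL form: hosts, client port, and znode parent.
val dfFull = sqlContext.read.format("org.apache.phoenix.spark")
  .option("table", "MY_TABLE")
  .option("zkUrl", "demo-qa2-dn01,demo-qa2-dn02,demo-qa2-dn03:2181:/hbase-secure")
  .load()

// Workaround form as I understand it: comma-separated hosts only, no port or znode.
val dfHostsOnly = sqlContext.read.format("org.apache.phoenix.spark")
  .option("table", "MY_TABLE")
  .option("zkUrl", "demo-qa2-dn01,demo-qa2-dn02,demo-qa2-dn03")
  .load()
```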
09-16-2016
07:02 AM
1 Kudo
Hi, I am running a Spark program on a secured cluster which creates a SQLContext for building a DataFrame over a Phoenix table. When I run my program in local mode with the --master option set to local[2], it works completely fine; however, when I try to run the same program with the master option set to yarn-client, I get the exception below:

```
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=5, exceptions:
Fri Sep 16 12:14:10 IST 2016, RpcRetryingCaller{globalStartTime=1474008247898, pause=100, retries=5}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to demo-qa2-nn/10.60.2.15:16000
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4083)
at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:528)
at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:550)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:810)
... 50 more
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to demo-qa2-nn/10.60.2.15:16000
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1540)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1560)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1711)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
... 54 more
Caused by: com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to demo-qa2-nn/10.60.2.15:16000
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:223)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58152)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1571)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1509)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1531)
... 58 more
Caused by: java.io.IOException: Could not set up IO Streams to demo-qa2-nn/10.60.2.15:16000
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:779)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:887)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1200)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
... 63 more
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:679)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:637)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:745)
... 67 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
... 67 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 76 more
```

Please find below the program and command I am using:

```scala
import org.apache.spark.{SparkConf, SparkContext}

val sparkConf = new SparkConf().setAppName(appName)
  .set("spark.kryo.registrationRequired", "true") // always use Kryo
CustomKryoRegistrator.register(sparkConf)
val sc = new SparkContext(sparkConf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
sqlContext.setConf("spark.sql.parquet.binaryAsString", "true")
val df = sqlContext.read.format("org.apache.phoenix.spark")
  .option("table", table_name)
  .option("zkUrl", "demo-qa2-dn03,demo-qa2-dn01,demo-qa2-dn02")
  .load()
df.show()
```

Command:

```
spark-submit \
  --jars $(echo ./lib/*.jar | tr ' ' ','),$(echo ./conf/*.* | tr ' ' ','),/usr/hdp/2.4.2.0-258/hbase/lib/hbase-client-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/hbase-common-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/hbase-server-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/hbase-hadoop-compat-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/hbase-protocol-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/phoenix/phoenix-4.4.0.2.4.2.0-258-thin-client.jar,/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-core-4.4.0.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-spark-4.4.0.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/phoenix-4.4.0.2.4.2.0-258-client.jar \
  --driver-class-path $(echo ./lib/*.jar | tr ' ' ','),$(echo ./conf/*.* | tr ' ' ','),/usr/hdp/2.4.2.0-258/hbase/lib/hbase-client-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/hbase-common-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/hbase-protocol-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/hbase-server-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/hbase/lib/hbase-hadoop-compat-1.1.2.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-spark-4.4.0.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-core-4.4.0.2.4.2.0-258.jar,/usr/hdp/2.4.2.0-258/phoenix/phoenix-4.4.0.2.4.2.0-258-thin-client.jar,/usr/hdp/2.4.2.0-258/hbase/lib/phoenix-4.4.0.2.4.2.0-258-client.jar \
  --master yarn-client \
  --class com.xyz.demo.dq.DataQualityApplicationHandler \
  tr-dq-16.7.0.0.0.jar org ss1 Phoenix tr-dq-job.properties QUALITY
```
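Since the trace ends in "Failed to find any Kerberos tgt", one thing worth mentioning is explicit keytab login in the driver before the SparkContext is created. This is a minimal sketch of that common suggestion, not necessarily the fix (the principal and keytab path are my cluster's values; substitute your own):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

// Log the driver in from the keytab so the HBase RPC layer can find a TGT.
// Note: this alone does not hand delegation tokens to the YARN executors.
val conf = new Configuration()
conf.set("hadoop.security.authentication", "kerberos")
UserGroupInformation.setConfiguration(conf)
UserGroupInformation.loginUserFromKeytab(
  "ctadmin@EXAMPLE.COM",                  // principal
  "/etc/security/keytabs/ctadmin.keytab") // keytab path
```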
Labels:
- Apache Phoenix
- Apache Spark
09-15-2016
10:52 AM
@Josh Elser You are correct. My issue was different; it was related to the classpath. I resolved that, and now, while connecting to the secure cluster with the above solution, I am getting the error below. Could you please help me out?

```
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: org.apache.hadoop.hbase.ipc.FailedServerException: This server is in the failed servers list: demo-dev1-nn/10.60.70.10:16000
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1540)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1560)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1711)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
... 54 more
Caused by: com.google.protobuf.ServiceException: org.apache.hadoop.hbase.ipc.FailedServerException: This server is in the failed servers list: demo-dev1-nn/10.60.70.10:16000
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:223)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58152)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1571)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1509)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1531)
... 58 more
Caused by: org.apache.hadoop.hbase.ipc.FailedServerException: This server is in the failed servers list: demo-dev1-nn/10.60.70.10:16000
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:701)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:887)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1200)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
... 63 more
```
09-14-2016
12:35 PM
The above solution is not working for me. However, I found the error below in the debug log. I do have the HBase libs present in my --driver-class-path and --jars.

```
16/09/14 16:56:26 INFO YarnSparkHadoopUtil: HBase class not found
16/09/14 16:56:26 DEBUG YarnSparkHadoopUtil: HBase class not found
java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.obtainTokenForHBaseInner(YarnSparkHadoopUtil.scala:381)
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.obtainTokenForHBase(YarnSparkHadoopUtil.scala:362)
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.obtainTokenForHBase(YarnSparkHadoopUtil.scala:165)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:349)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:733)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:143)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
at com.xyz.demo.dq.util.ContextBuilder$.getSparkContext(DQUtils.scala:118)
at com.xyz.demo.dq.DataQualityApplicationHandler$delayedInit$body.apply(DataQualityApplicationHandler.scala:62)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$anonfun$main$1.apply(App.scala:71)
at scala.App$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.xyz.demo.dq.DataQualityApplicationHandler$.main(DataQualityApplicationHandler.scala:52)
at com.xyz.demo.dq.DataQualityApplicationHandler.main(DataQualityApplicationHandler.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/09/14 16:56:26 WARN Token: Cannot find class for token kind HIVE_DELEGATION_TOKEN
16/09/14 16:56:26 WARN Token: Cannot find class for token kind HIVE_DELEGATION_TOKEN
16/09/14 16:56:26 DEBUG Client: Kind: HDFS_DELEGATION_TOKEN, Service: 10.60.70.10:8020, Ident: (HDFS_DELEGATION_TOKEN token 9045 for ctadmin); HDFS_DELEGATION_TOKEN token 9045 for ctadmin; Renewer: yarn; Issued: 9/14/16 4:56 PM; Max Date: 9/21/16 4:56 PM
Kind: HIVE_DELEGATION_TOKEN, Service: , Ident: 00 12 63 74 61 64 6d 69 6e 40 48 53 43 41 4c 45 2e 43 4f 4d 04 68 69 76 65 00 8a 01 57 28 72 aa ff 8a 01 57 4c 7f 2e ff 2a 40; null
```
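A quick sanity check that reproduces the failure outside of the token code (a sketch; run in a spark-shell launched with the same --driver-class-path and --jars options):

```scala
// If this prints "NOT found", YarnSparkHadoopUtil.obtainTokenForHBase will hit the
// same ClassNotFoundException and silently skip the HBASE_DELEGATION_TOKEN.
try {
  Class.forName("org.apache.hadoop.hbase.HBaseConfiguration")
  println("HBaseConfiguration found on the driver classpath")
} catch {
  case _: ClassNotFoundException =>
    println("HBaseConfiguration NOT found on the driver classpath")
}
```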
03-21-2016
10:11 AM
I am also trying to execute a Hive query through an Oozie Java action in a Kerberized environment (https://community.hortonworks.com/questions/23857/executing-hive-queries-through-oozie-java-action-o.html). I tried the above solution, but I am still facing the issue.