Getting Phoenix Query Server client class not found exception


I am importing jpype, but I am getting: jpype._jexception.RuntimeExceptionPyRaisable: java.lang.RuntimeException: class org.apache.phoenix.queryserver.client.Driver not found.

Can someone please help with this?

Thanks in advance.
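
For context, the class is resolved at run time from the JVM classpath, so this error usually means the Phoenix thin-client JAR (the one that ships org.apache.phoenix.queryserver.client.Driver) was not supplied to the JVM that jpype starts, for example via a "-Djava.class.path=..." argument to jpype.startJVM. Below is a minimal plain-Java sketch of the lookup that is failing; the JAR path in the comment is hypothetical and must be adjusted to your installation, and the host placeholder "iphost" is taken from later in this thread:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class PhoenixThinDriverCheck {
        public static void main(String[] args) throws Exception {
            // Throws ClassNotFoundException unless the Phoenix thin-client JAR is on
            // the classpath, e.g. (path hypothetical):
            //   java -cp /usr/hdp/current/phoenix-client/phoenix-thin-client.jar:. PhoenixThinDriverCheck
            Class.forName("org.apache.phoenix.queryserver.client.Driver");

            // Thin-client JDBC URL format used by the Phoenix Query Server.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:phoenix:thin:url=http://iphost:8765;serialization=PROTOBUF")) {
                System.out.println("Driver loaded, connected: " + !conn.isClosed());
            }
        }
    }

If this sketch works from plain Java but the jpype call still fails, the JAR is most likely not reaching the classpath of the JVM that jpype launches.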

2 REPLIES

Re: Getting Phoenix Query Server client class not found exception

Super Mentor

@Anurag Mishra

Can you please share a little more information about the error, along with your sample script, so that we can see where you have placed the JAR containing "org.apache.phoenix.queryserver.client.Driver"?

Re: Getting Phoenix Query Server client class not found exception

New Contributor

I have the same problem. My Spark application JAR does contain the "org.apache.phoenix.queryserver.client.Driver" class, but the following error is raised at run time:

User class threw exception: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 2.0 failed 4 times, most recent failure: Lost task 1.3 in stage 2.0 (TID 14, testdmp6.fengdai.org, executor 3): java.lang.RuntimeException: Failed to get driver instance for jdbcUrl=jdbc:phoenix:thin:url=http://iphost:8765;serialization=PROTOBUF
    at com.zaxxer.hikari.util.DriverDataSource.<init>(DriverDataSource.java:105)
    at com.zaxxer.hikari.pool.PoolBase.initializeDataSource(PoolBase.java:318)
    at com.zaxxer.hikari.pool.PoolBase.<init>(PoolBase.java:108)
    at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:105)
    at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:97)
    at com.tairanchina.csp.dmp.template.spark.jdbc.JDBCTemplate$.getConnection(JDBCTemplate.scala:61)
    at com.tairanchina.csp.dmp.template.spark.jdbc.JDBCTemplate$.prepareExecuteBatchUpdate$default$2(JDBCTemplate.scala:92)
    at com.tairanchina.csp.dmp.components.storage.ods.ODSProcessor$anonfun$processTopic$3$anonfun$apply$6.apply(ODSProcessor.scala:141)
    at com.tairanchina.csp.dmp.components.storage.ods.ODSProcessor$anonfun$processTopic$3$anonfun$apply$6.apply(ODSProcessor.scala:140)
    at scala.collection.TraversableLike$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.immutable.Map$Map1.foreach(Map.scala:116)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at com.tairanchina.csp.dmp.components.storage.ods.ODSProcessor$anonfun$processTopic$3.apply(ODSProcessor.scala:140)
    at com.tairanchina.csp.dmp.components.storage.ods.ODSProcessor$anonfun$processTopic$3.apply(ODSProcessor.scala:138)
    at com.tairanchina.csp.dmp.template.spark.BasicStreamingTemplate$ImplicitsStream$anonfun$each$1$anonfun$1.apply(BasicStreamingTemplate.scala:38)
    at com.tairanchina.csp.dmp.template.spark.BasicStreamingTemplate$ImplicitsStream$anonfun$each$1$anonfun$1.apply(BasicStreamingTemplate.scala:34)
    at org.apache.spark.rdd.RDD$anonfun$mapPartitions$1$anonfun$apply$23.apply(RDD.scala:800)
    at org.apache.spark.rdd.RDD$anonfun$mapPartitions$1$anonfun$apply$23.apply(RDD.scala:800)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: No suitable driver
    at java.sql.DriverManager.getDriver(DriverManager.java:315)
    at com.zaxxer.hikari.util.DriverDataSource.<init>(DriverDataSource.java:98)
    ... 28 more
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$failJobAndIndependentStages(DAGScheduler.scala:1602)
    at org.apache.spark.scheduler.DAGScheduler$anonfun$abortStage$1.apply(DAGScheduler.scala:1590)
    at org.apache.spark.scheduler.DAGScheduler$anonfun$abortStage$1.apply(DAGScheduler.scala:1589)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1589)
    at org.apache.spark.scheduler.DAGScheduler$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
    at org.apache.spark.scheduler.DAGScheduler$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:831)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1823)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1772)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1761)
    at org.apache.spark.util.EventLoop$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:642)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2034)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2055)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2074)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2099)
    at org.apache.spark.rdd.RDD$anonfun$collect$1.apply(RDD.scala:939)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
    at com.tairanchina.csp.dmp.template.spark.BasicStreamingTemplate$ImplicitsStream$anonfun$each$1.apply(BasicStreamingTemplate.scala:61)
    at com.tairanchina.csp.dmp.template.spark.BasicStreamingTemplate$ImplicitsStream$anonfun$each$1.apply(BasicStreamingTemplate.scala:31)
    at org.apache.spark.streaming.dstream.DStream$anonfun$foreachRDD$1$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
    at org.apache.spark.streaming.dstream.DStream$anonfun$foreachRDD$1$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
    at org.apache.spark.streaming.dstream.ForEachDStream$anonfun$1$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$anonfun$1$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$anonfun$1$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Failed to get driver instance for jdbcUrl=jdbc:phoenix:thin:url=http://iphost:8765;serialization=PROTOBUF
    at com.zaxxer.hikari.util.DriverDataSource.<init>(DriverDataSource.java:105)
    at com.zaxxer.hikari.pool.PoolBase.initializeDataSource(PoolBase.java:318)
    at com.zaxxer.hikari.pool.PoolBase.<init>(PoolBase.java:108)
    at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:105)
    at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:97)
    at com.tairanchina.csp.dmp.template.spark.jdbc.JDBCTemplate$.getConnection(JDBCTemplate.scala:61)
    at com.tairanchina.csp.dmp.template.spark.jdbc.JDBCTemplate$.prepareExecuteBatchUpdate$default$2(JDBCTemplate.scala:92)
    at com.tairanchina.csp.dmp.components.storage.ods.ODSProcessor$anonfun$processTopic$3$anonfun$apply$6.apply(ODSProcessor.scala:141)
    at com.tairanchina.csp.dmp.components.storage.ods.ODSProcessor$anonfun$processTopic$3$anonfun$apply$6.apply(ODSProcessor.scala:140)
    at scala.collection.TraversableLike$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.immutable.Map$Map1.foreach(Map.scala:116)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at com.tairanchina.csp.dmp.components.storage.ods.ODSProcessor$anonfun$processTopic$3.apply(ODSProcessor.scala:140)
    at com.tairanchina.csp.dmp.components.storage.ods.ODSProcessor$anonfun$processTopic$3.apply(ODSProcessor.scala:138)
    at com.tairanchina.csp.dmp.template.spark.BasicStreamingTemplate$ImplicitsStream$anonfun$each$1$anonfun$1.apply(BasicStreamingTemplate.scala:38)
    at com.tairanchina.csp.dmp.template.spark.BasicStreamingTemplate$ImplicitsStream$anonfun$each$1$anonfun$1.apply(BasicStreamingTemplate.scala:34)
    at org.apache.spark.rdd.RDD$anonfun$mapPartitions$1$anonfun$apply$23.apply(RDD.scala:800)
    at org.apache.spark.rdd.RDD$anonfun$mapPartitions$1$anonfun$apply$23.apply(RDD.scala:800)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    ... 3 more
Caused by: java.sql.SQLException: No suitable driver
    at java.sql.DriverManager.getDriver(DriverManager.java:315)
    at com.zaxxer.hikari.util.DriverDataSource.<init>(DriverDataSource.java:98)
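
The root cause in this trace is java.sql.SQLException: No suitable driver from DriverManager.getDriver: having the Driver class inside the application JAR is not enough if it is never registered with DriverManager in the executor JVM. One common workaround, sketched below on the assumption that you construct the HikariCP pool yourself (the driver class name and URL are taken from the stack trace; everything else is hypothetical), is to name the driver class explicitly so HikariCP loads it directly instead of relying on DriverManager discovery:

    import java.sql.Connection;

    import com.zaxxer.hikari.HikariConfig;
    import com.zaxxer.hikari.HikariDataSource;

    public class PhoenixThinPoolSketch {
        public static void main(String[] args) throws Exception {
            HikariConfig config = new HikariConfig();
            // Naming the driver class makes HikariCP load and instantiate it directly,
            // bypassing the DriverManager.getDriver() lookup that throws "No suitable driver".
            config.setDriverClassName("org.apache.phoenix.queryserver.client.Driver");
            config.setJdbcUrl("jdbc:phoenix:thin:url=http://iphost:8765;serialization=PROTOBUF");

            try (HikariDataSource ds = new HikariDataSource(config);
                 Connection conn = ds.getConnection()) {
                System.out.println("Connected: " + !conn.isClosed());
            }
        }
    }

Alternatively, calling Class.forName("org.apache.phoenix.queryserver.client.Driver") once in the executor before the pool is created should have the same effect, since a JDBC driver registers itself with DriverManager in its static initializer when the class is loaded.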