
[ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH5.7

Re: [ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH

Explorer

Resolved....

OK, so I did the pure parcel-based (recommended) installation on CDH 5.8.1 (the latest as of now), and Phoenix works properly with HBase 1.2.x.
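In case it helps anyone else confirm a parcel install, here is a minimal JDBC smoke test I would run from a client machine with the phoenix-4.7.0-clabs-phoenix1.3.0-client.jar on the classpath. The ZooKeeper host below is a placeholder, not a real value from this thread:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PhoenixSmokeTest {
    public static void main(String[] args) throws Exception {
        // Not strictly required with JDBC 4 auto-loading, but explicit is clearer.
        Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        // zk1.example.com:2181 is a placeholder for your ZooKeeper quorum.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1.example.com:2181");
             Statement stmt = conn.createStatement();
             // SYSTEM.CATALOG is Phoenix's own metadata table, so it should
             // exist on any working install.
             ResultSet rs = stmt.executeQuery(
                     "SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 5")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If this connects and returns rows, the parcel's client jar and the HBase/ZooKeeper side are wired up correctly.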

 

Thanks!!!!

Re: [ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH

New Contributor

In a Java program, I'm using the JDBC driver from the phoenix-4.7.0-clabs-phoenix1.3.0-client.jar in this latest Phoenix parcel, and I'm seeing a memory issue as I read an increasing number of records. My program looks up single rows by primary key from a Phoenix table with about 90 columns, and I'm testing the performance of those selects. I noticed that memory kept growing over time until I had read a total of 50,000 records (a select is executed for each record), at which point I get:

Exception in thread "Thread-12" java.lang.OutOfMemoryError: GC overhead limit exceeded

I shouldn't have a leak in my own code, since I reuse a single prepared statement and only change the primary-key value each time I execute it. Has this growing-memory behavior been observed elsewhere, and if so, will it be addressed?
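For what it's worth, one common cause of this pattern with JDBC is ResultSet objects that are never closed, which keeps their row buffers reachable. A minimal sketch of the loop I would expect to stay flat in memory, assuming a hypothetical table MY_TABLE with a single PK column and a placeholder ZooKeeper quorum:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class SingleRowLookupTest {
    public static void main(String[] args) throws Exception {
        // Hypothetical quorum and table; replace with your own.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1.example.com:2181");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT * FROM MY_TABLE WHERE PK = ?")) {
            for (long key = 0; key < 50_000; key++) {
                ps.setLong(1, key);
                // Close each ResultSet explicitly (try-with-resources);
                // leaving them open keeps per-query state reachable and memory grows.
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        rs.getString(2); // consume a sample column
                    }
                }
            }
        }
    }
}

This is only a sketch of the reuse pattern described above, not a statement about where the leak actually is; if every ResultSet is already closed, a heap dump would be the next step.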

Re: [ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH

New Contributor

Hi mkanchwala,

I am trying to install this on a CDH 5.7.5 cluster using the Apache tar files, but I am getting the error below:

 

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;

 

Can you point me to any documentation for installing via parcels on CDH 5.7?

 

Appreciate your help on this.

 

Thanks

 

Re: [ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH

New Contributor

Does this new Apache Phoenix 4.7.0 packaging also work on CDH 5.7.1 or CDH 5.7.2, or does it work only on CDH 5.7.0?

Re: [ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH

Expert Contributor

It is expected to work with CDH 5.7.0+, barring internal changes in CDH that break the assumptions Phoenix makes about HBase. At the moment, that should mean CDH 5.7 and CDH 5.8. The testing for this release was done against CDH 5.7.0 specifically.

Re: [ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH

Contributor

Hi - We find that CLABS Phoenix 4.7.0+phoenix1.3.0+0 doesn't seem to fully support the CDH 5.7.1 components: it appears to have been built against Hadoop 2.5.1 and Spark 1.5, whereas CDH 5.7.1 ships Hadoop 2.6.0 and Spark 1.6.0.

 

One of the errors encountered by my colleague:

 

Exception in thread "main" java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.Module$SetupContext.setClassIntrospector(Lcom/fasterxml/jackson/databind/introspect/ClassIntrospector;)V
    at com.fasterxml.jackson.module.scala.introspect.ScalaClassIntrospectorModule$$anonfun$1.apply(ScalaClassIntrospector.scala:32)
    at com.fasterxml.jackson.module.scala.introspect.ScalaClassIntrospectorModule$$anonfun$1.apply(ScalaClassIntrospector.scala:32)
    at com.fasterxml.jackson.module.scala.JacksonModule$$anonfun$setupModule$1.apply(JacksonModule.scala:47)
    at com.fasterxml.jackson.module.scala.JacksonModule$$anonfun$setupModule$1.apply(JacksonModule.scala:47)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:47)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:18)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:525)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:81)
    at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
    at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
    at org.apache.spark.SparkContext.newAPIHadoopRDD(SparkContext.scala:1140)
    at org.apache.spark.api.java.JavaSparkContext.newAPIHadoopRDD(JavaSparkContext.scala:507)
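In case it helps others who hit the same NoSuchMethodError: a quick, hypothetical way to see which jar each conflicting Jackson class is actually loaded from on the driver classpath (a mismatch between the Phoenix client jar's bundled Jackson and the Jackson shipped with CDH Spark would show up here):

public class JacksonClasspathCheck {
    public static void main(String[] args) throws ClassNotFoundException {
        // Print the jar each class was loaded from; class names are taken
        // from the stack trace above.
        String[] names = {
            "com.fasterxml.jackson.databind.Module",
            "com.fasterxml.jackson.module.scala.DefaultScalaModule"
        };
        for (String name : names) {
            Class<?> c = Class.forName(name);
            System.out.println(name + " -> "
                    + c.getProtectionDomain().getCodeSource().getLocation());
        }
    }
}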

 

Documentation on the main Phoenix site seems partially out of date and inconsistent (release notes are still at 4.5.0). On the other hand, the CLABS Phoenix GitHub has not been updated in over a year. Can you provide more technical details and a roadmap for Phoenix at Cloudera, especially its relationship with current and future HBase and Spark releases?

 

Thanks,

Miles


Re: [ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH

Contributor

Updates on the CDH-compatibility effort, in both Apache and Cloudera Labs, are tracked in this thread.

 

Re: [ANNOUNCE] Third installment of Cloudera Labs packaging of Apache Phoenix - Phoenix 4.7.0 on CDH

New Contributor

I installed Phoenix 4.7.0 on our CDH 5.7.5 cluster using the parcel. Everything works fine except that I can't populate the secondary index using the HBase IndexTool. This looks like a known issue, and it is reported as fixed in version 4.8.0. I am wondering when Cloudera will release a parcel for Phoenix 4.8.0?

 

Thanks so much!

Shumin