
hbase issue.

Error Message:

com.augmentiq.maxiq.Configuration.Exceptions.SystemException: ERROR 2007 (INT09): Outdated jars. The following servers require an updated phoenix.jar to be put in the classpath of HBase: region=SYSTEM.CATALOG,,1505281672560.d132c49cb37767be58d6cc193ac95aa5., hostname=ip-192-168-181-203.ca-central-1.compute.internal,16020,1505281217525, seqNum=2
    at com.augmentiq.maxiq.hbase.HbasePhoenixTableCreator.hbaseCreateTableFromDataSource(HbasePhoenixTableCreator.java:56)
    at com.augmentiq.maxiq.essential.sparkEtl.SparkETLRun.saveDataAtRestForDsWithDataFrame(SparkETLRun.java:222)
    at com.augmentiq.maxiq.essential.sparkEtl.SparkETLRun.startETLforBasicDataSource(SparkETLRun.java:197)
    at com.augmentiq.maxiq.spark.ETLSpark.SparkETLDataInjection.runDataIgestionInDataSource(SparkETLDataInjection.java:77)
    at com.augmentiq.maxiq.spark.handler.SparkHandler.main(SparkHandler.java:73)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$anon$2.run(ApplicationMaster.scala:627)
Caused by: java.sql.SQLException: ERROR 2007 (INT09): Outdated jars. The following servers require an updated phoenix.jar to be put in the classpath of HBase: region=SYSTEM.CATALOG,,1505281672560.d132c49cb37767be58d6cc193ac95aa5., hostname=ip-192-168-181-203.ca-central-1.compute.internal,16020,1505281217525, seqNum=2
    at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:422)
    at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1166)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1014)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1369)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2116)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:828)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:338)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:326)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:324)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1326)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2279)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2248)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2248)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:135)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:270)
    at com.augmentiq.maxiq.hbase.HbasePhoenixTableCreator.hbaseCreateTableFromDataSource(HbasePhoenixTableCreator.java:39)
    ... 9 more


Error Details:

java.lang.Exception: com.augmentiq.maxiq.Configuration.Exceptions.SystemException: ERROR 2007 (INT09): Outdated jars. The following servers require an updated phoenix.jar to be put in the classpath of HBase: region=SYSTEM.CATALOG,,1505281672560.d132c49cb37767be58d6cc193ac95aa5., hostname=ip-192-168-181-203.ca-central-1.compute.internal,16020,1505281217525, seqNum=2
    at com.augmentiq.maxiq.hbase.HbasePhoenixTableCreator.hbaseCreateTableFromDataSource(HbasePhoenixTableCreator.java:56)
    at com.augmentiq.maxiq.essential.sparkEtl.SparkETLRun.saveDataAtRestForDsWithDataFrame(SparkETLRun.java:222)
    at com.augmentiq.maxiq.essential.sparkEtl.SparkETLRun.startETLforBasicDataSource(SparkETLRun.java:197)
    at com.augmentiq.maxiq.spark.ETLSpark.SparkETLDataInjection.runDataIgestionInDataSource(SparkETLDataInjection.java:77)
    at com.augmentiq.maxiq.spark.handler.SparkHandler.main(SparkHandler.java:73)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$anon$2.run(ApplicationMaster.scala:627)
Caused by: java.sql.SQLException: ERROR 2007 (INT09): Outdated jars. The following servers require an updated phoenix.jar to be put in the classpath of HBase: region=SYSTEM.CATALOG,,1505281672560.d132c49cb37767be58d6cc193ac95aa5., hostname=ip-192-168-181-203.ca-central-1.compute.internal,16020,1505281217525, seqNum=2
    at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:422)
    at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1166)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1014)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1369)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2116)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:828)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:338)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:326)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:324)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1326)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2279)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2248)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2248)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:135)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:270)
    at com.augmentiq.maxiq.hbase.HbasePhoenixTableCreator.hbaseCreateTableFromDataSource(HbasePhoenixTableCreator.java:39)
    ... 9 more
    at com.augmentiq.maxiq.spark.ETLSpark.SparkETLDataInjection.runDataIgestionInDataSource(SparkETLDataInjection.java:111)
    at com.augmentiq.maxiq.spark.handler.SparkHandler.main(SparkHandler.java:73)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$anon$2.run(ApplicationMaster.scala:627)

Thanks in advance for the help!

4 Replies

Re: hbase issue.

It seems the phoenix-client/spark jar you are using for your job is newer than the phoenix-server.jar deployed on ip-192-168-181-203.ca-central-1.compute.internal. Please update the phoenix-server.jar on all region servers to the version your client is using, or switch the client jar to the version of the deployed phoenix-server.jar.

This is per our backward-compatibility contract:

https://phoenix.apache.org/upgrading.html
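One way to compare the client and server jars is to read the Implementation-Version from each jar's manifest. A minimal sketch using the standard `java.util.jar` API (the default jar path below is only an example; point it at your actual phoenix-client and phoenix-server jars):

```java
import java.io.IOException;
import java.util.jar.Attributes;
import java.util.jar.JarFile;

// Prints the Implementation-Version recorded in a jar's manifest, so the
// Phoenix jar on the client can be compared against the one deployed on
// each region server.
public class PhoenixJarVersion {
    static String readVersion(String jarPath) throws IOException {
        try (JarFile jar = new JarFile(jarPath)) {
            Attributes attrs = jar.getManifest().getMainAttributes();
            return attrs.getValue("Implementation-Version");
        }
    }

    public static void main(String[] args) throws IOException {
        // Example path, an assumption; pass your real jar path as an argument.
        String path = args.length > 0 ? args[0]
                : "/usr/hdp/current/phoenix-client/phoenix-client.jar";
        System.out.println(path + " -> Implementation-Version: " + readVersion(path));
    }
}
```

Run this against the client jar bundled with your job and against the jar in the HBase lib directory on each region server; the versions printed must be compatible per the page linked above.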

Re: hbase issue.

@Ankit Singhal

I have verified that the jars are the same on all region servers.


Re: hbase issue.

Have you verified the client jar you are using in your application? It should not be a newer version than the server jar.
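When several copies of the Phoenix jar may be on the application classpath, it can help to ask the JVM where the driver class was actually loaded from. A small sketch (it assumes the standard Phoenix driver class name, org.apache.phoenix.jdbc.PhoenixDriver, is on the classpath when run for real):

```java
import java.security.CodeSource;

// Reports which jar (or directory) a class was loaded from. Run it with the
// Phoenix driver class name to confirm which phoenix-client jar your job uses.
public class WhichJar {
    static String locationOf(String className) throws ClassNotFoundException {
        Class<?> clazz = Class.forName(className);
        CodeSource src = clazz.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap class loader have no code source.
        return src == null ? "(bootstrap class path)" : src.getLocation().toString();
    }

    public static void main(String[] args) throws ClassNotFoundException {
        String name = args.length > 0 ? args[0] : "org.apache.phoenix.jdbc.PhoenixDriver";
        System.out.println(name + " loaded from " + locationOf(name));
    }
}
```

If the printed location is not the jar you expect, an older phoenix-client jar is shadowing the one you intended to ship with the job.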

Re: hbase issue.

@Ankit

Both jars are the same version.