
HMaster and RegionServers are not starting after upgrading the HBase version to 2.1.1 in HDP-3.0.1.0-187.

Contributor

I am getting the below error while starting the HBase service:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-5.0.0.3.0.1.0-187-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-5.0.0.3.0.1.0-187-hive.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-5.0.0.3.0.1.0-187-pig.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-5.0.0.3.0.1.0-187-thin-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-hive.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-thin-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" java.lang.NoSuchMethodError: com.ctc.wstx.io.StreamBootstrapper.getInstance(Ljava/lang/String;Lcom/ctc/wstx/io/SystemId;Ljava/io/InputStream;)Lcom/ctc/wstx/io/StreamBootstrapper;
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2918)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2901)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2953)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2926)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2806)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1254)
    at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1660)
    at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:66)
    at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:80)
    at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:94)
    at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:149)
    at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:3126)

Please help me with this. Thanks in advance.


Master Mentor

@Rohit Khose

Can you please check the following:

1. Do the following JARs exist, and were they upgraded properly? (See the verification sketch after this list.)

/usr/hdp/3.0.1.0-187/hadoop/client/woodstox-core-5.0.3.jar
/usr/hdp/3.0.1.0-187/hadoop/client/woodstox-core.jar
/usr/hdp/3.0.1.0-187/hadoop/lib/woodstox-core-5.0.3.jar
/usr/hdp/3.0.1.0-187/hadoop/lib/ranger-hdfs-plugin-impl/woodstox-core-5.0.3.jar
/usr/hdp/3.0.1.0-187/hadoop/lib/ranger-yarn-plugin-impl/woodstox-core-5.0.3.jar
/usr/hdp/3.0.1.0-187/hadoop-hdfs/lib/woodstox-core-5.0.3.jar
/usr/hdp/3.0.1.0-187/hbase/lib/woodstox-core-5.0.3.jar
/usr/hdp/3.0.1.0-187/hbase/lib/ranger-hbase-plugin-impl/woodstox-core-5.0.3.jar
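
A quick way to verify (a sketch; adjust the version directory if yours differs) is to list the expected locations and confirm each JAR is present:

# Check the stock woodstox JARs shipped with HDP 3.0.1.0-187
ls -l /usr/hdp/3.0.1.0-187/hadoop/client/woodstox-core*.jar \
      /usr/hdp/3.0.1.0-187/hadoop/lib/woodstox-core*.jar \
      /usr/hdp/3.0.1.0-187/hadoop-hdfs/lib/woodstox-core*.jar \
      /usr/hdp/3.0.1.0-187/hbase/lib/woodstox-core*.jar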


2. Make sure that you do not have a different version of the "woodstox" JAR on your HBase classpath.

Search for all woodstox JARs on your system to find out whether any of them belongs to a different version; one way to do both checks is sketched below.
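
For example (a sketch, not an exact procedure), find every copy on disk, print the version recorded in its manifest, and confirm which copy actually lands on the HBase classpath:

# Locate every woodstox JAR and show its bundled version
find / -name "woodstox-core*.jar" 2>/dev/null | while read -r jar; do
  echo "== $jar"
  unzip -p "$jar" META-INF/MANIFEST.MF | grep -i 'version'
done

# Show which woodstox JAR the HBase scripts actually pick up
hbase classpath | tr ':' '\n' | grep -i woodstox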

3. Also check that you have not set variables such as HADOOP_HOME, HADOOP_CLASSPATH, or HBASE_CLASSPATH anywhere on your system pointing to a different version of the Hadoop/HBase lib binaries; an example check follows.
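
For example (a sketch), check both the environment and hbase-env.sh for overrides; the config path below assumes a standard HDP layout:

# Environment overrides that could pull in foreign Hadoop/HBase JARs
env | grep -E 'HADOOP_HOME|HADOOP_CLASSPATH|HBASE_CLASSPATH|HBASE_HOME'

# Hard-coded exports in the HBase env script (path may differ on your setup)
grep -E 'HADOOP|CLASSPATH' /etc/hbase/conf/hbase-env.sh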

Contributor

@Jay Kumar SenSharma

Thank you for your reply.

I tried all of the above, but I am still getting the same error. Do you have any other suggestion or solution for this?

Master Mentor

@Rohit Khose

How did you upgrade only HBase in your HDP 3.0.1 installation?

By default, HBase 2.0.0 is shipped with HDP 3.0.1:

https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.1/release-notes/content/comp_versions.html


HDP 3.1 provides Apache HBase 2.0.2:

https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.1.0/release-notes/content/comp_versions.html
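
To confirm which versions your stack is actually running, something like the following should work on an HDP node (a sketch; hdp-select ships with HDP):

# Which stack build the HBase master currently points at
hdp-select status hbase-master

# The HBase version as reported by the client scripts
hbase version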

Contributor

@Jay Kumar SenSharma

I did it by simply downloading HBase 2.1.1 and replacing the existing HBase 2.0.0 under /usr/hdp/3.0.1.0-187/. Is that possible or not? Now the masters start, but the RegionServers do not, showing the following error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
1 [regionserver/ubuntu20:60020] ERROR org.apache.hadoop.hbase.regionserver.HRegionServer - ***** ABORTING region server ubuntu20.mcloud.com,60020,1546085700471: Unhandled: Found interface org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected *****
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected
    at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:768)
    at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.access$400(FanOutOneBlockAsyncDFSOutputHelper.java:118)
    at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$16.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:848)
    at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$16.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:843)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:856)
    at org.apache.hadoop.hbase.io.asyncfs.AsyncFSOutputHelper.createOutput(AsyncFSOutputHelper.java:51)
    at org.apache.hadoop.hbase.regionserver.wal.AsyncProtobufLogWriter.initOutput(AsyncProtobufLogWriter.java:167)
    at org.apache.hadoop.hbase.regionserver.wal.AbstractProtobufLogWriter.init(AbstractProtobufLogWriter.java:165)
    at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createAsyncWriter(AsyncFSWALProvider.java:113)
    at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:612)
    at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:124)
    at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:756)
    at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:486)
    at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.<init>(AsyncFSWAL.java:251)
    at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createWAL(AsyncFSWALProvider.java:73)
    at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createWAL(AsyncFSWALProvider.java:48)
    at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:138)
    at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:57)
    at org.apache.hadoop.hbase.wal.WALFactory.getWAL(WALFactory.java:276)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getWAL(HRegionServer.java:2100)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.buildServerLoad(HRegionServer.java:1311)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.tryRegionServerReport(HRegionServer.java:1193)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:1013)
    at java.lang.Thread.run(Thread.java:745)
4 [regionserver/ubuntu20:60020] ERROR org.apache.hadoop.hbase.regionserver.HRegionServer - RegionServer abort: loaded coprocessors are: []
101 [main] ERROR org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine - Region server exiting
java.lang.RuntimeException: HRegionServer Aborted
    at org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine.start(HRegionServerCommandLine.java:67)
    at org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine.run(HRegionServerCommandLine.java:87)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:149)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.main(HRegionServer.java:3021)

Master Mentor

@Rohit Khose

Ambari provides a Patch Upgrade feature for individual component upgrades; however, that is only possible when you get a tested and certified VDF from Hortonworks support. NOTE: Before performing a patch upgrade, you must obtain the specific VDF file associated with the patch release from Hortonworks Customer Support.

To know more about Patch Upgrade please refer to: https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.2.2/bk_ambari-upgrade/content/performing_a_patc...

Otherwise, if you just try to install a community release of a higher HBase version, it is not going to work that easily, because many of its dependencies have changed.
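
The IncompatibleClassChangeError above is a typical symptom of exactly that: HBase built against the Hadoop 2 client (where HdfsFileStatus was a class) running on the Hadoop 3.1 libraries in HDP 3.0.1 (where it became an interface). A rough way to spot such a mismatch (a sketch):

# Compare the Hadoop client JARs bundled inside the replaced HBase ...
ls /usr/hdp/3.0.1.0-187/hbase/lib/ | grep -i '^hadoop-'

# ... against the Hadoop version the cluster actually runs
hadoop version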