
Problem connecting to HBase from Scala code in Cloudera QuickStart VM CDH 5.8.0

Contributor

I am trying to connect to HBase from Scala code but I am getting the error below:

 

[info] Set current project to play-sbt-project (in build file:/home/cloudera/Desktop/play-sbt-project/)
[info] Running Hi 
Hi!
17/03/28 07:48:00 WARN hbase.HBaseConfiguration: instantiating HBaseConfiguration() is deprecated. Please use HBaseConfiguration#create() to construct a plain Configuration
17/03/28 07:48:02 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x71979f1 connecting to ZooKeeper ensemble=localhost:2181
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:host.name=quickstart.cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.7.0_67-cloudera/jre
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/cloudera/sbt-0.13.13/bin/sbt-launch.jar
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-573.el6.x86_64
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.name=cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/cloudera/Desktop/play-sbt-project
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=90000 watcher=hconnection-0x71979f10x0, quorum=localhost:2181, baseZNode=/hbase
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Opening socket connection to server quickstart.cloudera/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Socket connection established to quickstart.cloudera/127.0.0.1:2181, initiating session
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Session establishment complete on server quickstart.cloudera/127.0.0.1:2181, sessionid = 0x15b14fd3ae901b6, negotiated timeout = 60000
17/03/28 07:48:52 INFO client.RpcRetryingCaller: Call exception, tries=10, retries=35, started=49123 ms ago, cancelled=false, msg=
[... 23 similar RpcRetryingCaller retry messages (tries=11 through tries=33) omitted ...]
17/03/28 07:56:55 INFO client.RpcRetryingCaller: Call exception, tries=34, retries=35, started=531569 ms ago, cancelled=false, msg=
[error] (run-main-0) org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
[error] Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] [... 33 similar exception entries omitted; every retry failed with the same MasterNotRunningException caused by java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper ...]
[error] Tue Mar 28 07:56:55 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[... 33 similar exception entries omitted (same list as above) ...]
Tue Mar 28 07:56:55 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper

	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1560)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:239)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:138)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1316)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1224)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.SocketInputWrapper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:138)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1316)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1224)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run last compile:run for the full output.
17/03/28 07:56:55 ERROR zookeeper.ClientCnxn: Event thread exiting due to interruption
java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2017)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2052)
	at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
	at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:494)
17/03/28 07:56:55 INFO zookeeper.ClientCnxn: EventThread shut down
java.lang.RuntimeException: Nonzero exit code: 1
	at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 544 s, completed Mar 28, 2017 7:56:56 AM

 

 

  • I referred to this URL: https://hbase.apache.org/book.html#scala
  • Setting the CLASSPATH: I did not include "/path/to/scala-library.jar" in the CLASSPATH as mentioned in the link.
  • $ export CLASSPATH=$CLASSPATH:/usr/lib/hadoop/lib/native:/usr/lib/hbase/lib/native/Linux-amd64-64
  • Project root directory = /home/cloudera/Desktop/play-sbt-project
  • My /home/cloudera/Desktop/play-sbt-project/build.sbt looks like this. I changed the dependent library versions to match my environment and added a few more dependencies: "hbase-client", "hbase-common" and "hbase-server".

 

name := "play-sbt-project"
version := "1.0"
scalaVersion := "2.10.2"
resolvers += "Apache HBase" at "https://repository.apache.org/content/repositories/releases"
resolvers += "Thrift" at "http://people.apache.org/~rawson/repo/"
libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-core" % "1.2.1",
"org.apache.hbase" % "hbase" % "1.2.0",
"org.apache.hbase" % "hbase-client" % "1.2.0",
"org.apache.hbase" % "hbase-common" % "1.2.0",
"org.apache.hbase" % "hbase-server" % "1.2.0"
)

 

  • My main code for HBase connectivity, /home/cloudera/Desktop/play-sbt-project/src/main/scala/pw.scala, looks like this:

 

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory,HBaseAdmin,HTable,Put,Get}
import org.apache.hadoop.hbase.util.Bytes

object Hi {
  def main(args: Array[String]) = {
    println("Hi!")
    val conf = new HBaseConfiguration()
    val connection = ConnectionFactory.createConnection(conf)
    val admin = connection.getAdmin()

    // list the tables
    val listtables = admin.listTables()
    listtables.foreach(println)
  }
}

 

  • My /etc/hbase/conf/hbase-site.xml looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<!--Autogenerated by Cloudera Manager-->
<configuration>
  <property><name>hbase.rootdir</name><value>hdfs://quickstart.cloudera:8020/hbase</value></property>
  <property><name>hbase.replication</name><value>true</value></property>
  <property><name>hbase.client.write.buffer</name><value>2097152</value></property>
  <property><name>hbase.client.pause</name><value>100</value></property>
  <property><name>hbase.client.retries.number</name><value>35</value></property>
  <property><name>hbase.client.scanner.caching</name><value>100</value></property>
  <property><name>hbase.client.keyvalue.maxsize</name><value>10485760</value></property>
  <property><name>hbase.ipc.client.allowsInterrupt</name><value>true</value></property>
  <property><name>hbase.client.primaryCallTimeout.get</name><value>10</value></property>
  <property><name>hbase.client.primaryCallTimeout.multiget</name><value>10</value></property>
  <property><name>hbase.coprocessor.region.classes</name><value>org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint</value></property>
  <property><name>hbase.regionserver.thrift.http</name><value>false</value></property>
  <property><name>hbase.thrift.support.proxyuser</name><value>false</value></property>
  <property><name>hbase.rpc.timeout</name><value>60000</value></property>
  <property><name>hbase.snapshot.enabled</name><value>true</value></property>
  <property><name>hbase.snapshot.master.timeoutMillis</name><value>60000</value></property>
  <property><name>hbase.snapshot.region.timeout</name><value>60000</value></property>
  <property><name>hbase.snapshot.master.timeout.millis</name><value>60000</value></property>
  <property><name>hbase.security.authentication</name><value>simple</value></property>
  <property><name>hbase.rpc.protection</name><value>authentication</value></property>
  <property><name>zookeeper.session.timeout</name><value>60000</value></property>
  <property><name>zookeeper.znode.parent</name><value>/hbase</value></property>
  <property><name>zookeeper.znode.rootserver</name><value>root-region-server</value></property>
  <property><name>hbase.zookeeper.quorum</name><!-- <value>quickstart.cloudera</value> --><value>127.0.0.1</value></property>
  <property><name>hbase.zookeeper.property.clientPort</name><value>2181</value></property>
  <property><name>hbase.rest.ssl.enabled</name><value>false</value></property>
</configuration>

 

I googled a lot to solve this issue but did not find success. In the process of solving this issue I made the following changes:

  • Changed the dependent library versions in the build.sbt file to match my environment.
  • Added a few more dependent libraries: "hbase-client", "hbase-common" and "hbase-server".
  • Changed the "hbase.zookeeper.quorum" value from "quickstart.cloudera" to "127.0.0.1" in the "hbase-site.xml" file.

Please help me solve this issue. Thank you.


8 REPLIES

Champion

@hkumar449

 

This may help you!

 

Make sure hbase-site.xml is available under the Spark configuration.

So either copy hbase-site.xml to /etc/spark/conf (or create a softlink) and try again.

Contributor
Thank you @saranvisa for your comment. But I have one doubt: I am not using Spark anywhere in my code, and my Spark services are also stopped in Cloudera Manager, so how would putting hbase-site.xml in /etc/spark/conf help me with HBase to Scala connectivity?

Champion

@hkumar449

 

OK, I think I overlooked that. Since you mentioned Scala, I thought you were using Scala from Spark. If you are not using Spark, then you can ignore my comment.

Contributor (accepted solution)

Resolved the issue. The following changes need to be made:

  1. Change "hadoop-core" to "hadoop-common" in the build.sbt file, since in the latest CDH versions 'hadoop-core' is only supported for code running on MapReduce 1.
  2. Change all the dependency versions for Cloudera 5.8.0 compatibility in build.sbt. The updated build.sbt looks like this:

 

name := "play-sbt-project"
version := "1.0"
scalaVersion := "2.10.2"
resolvers += "Thrift" at "http://people.apache.org/~rawson/repo/"
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-common" % "2.6.0-cdh5.8.0",
"org.apache.hbase" % "hbase" % "1.2.0-cdh5.8.0",
"org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.8.0",
"org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.8.0",
"org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.8.0"
)

3. The HBaseConfiguration() class is deprecated; use the HBaseConfiguration.create() method instead. I also changed some logic in the main code. Earlier I was listing the tables present in HBase (since this was giving me some issues I dropped it, but I will try it again next time). Since my goal is to establish Scala to HBase connectivity, I am now inserting a new row into an already existing HBase table. The new code looks like this:

 

package main.scala

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory,HTable,Put}
import org.apache.hadoop.hbase.util.Bytes

object Hi {

  def main(args: Array[String]) = {
    println("Hi!")
    val conf: Configuration = HBaseConfiguration.create()
    val table: HTable = new HTable(conf, "emp1")
    val put1: Put = new Put(Bytes.toBytes("row1"))
    put1.add(Bytes.toBytes("personal_data"), Bytes.toBytes("qual1"), Bytes.toBytes("val1"))
    table.put(put1)
    println("Success")
  }
}
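
As a side note, the HTable(conf, tableName) constructor used above is itself deprecated in HBase 1.2 in favour of the Connection API. Below is a minimal sketch of the same insert going through ConnectionFactory (assuming the same "emp1" table and cell values as above; the object name HiWithConnection is just for illustration):

package main.scala

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes

object HiWithConnection {

  def main(args: Array[String]): Unit = {
    val conf = HBaseConfiguration.create()
    // Connections are heavyweight; create one and close it when done
    val connection = ConnectionFactory.createConnection(conf)
    try {
      val table = connection.getTable(TableName.valueOf("emp1"))
      val put1 = new Put(Bytes.toBytes("row1"))
      // addColumn is the non-deprecated replacement for Put.add
      put1.addColumn(Bytes.toBytes("personal_data"), Bytes.toBytes("qual1"), Bytes.toBytes("val1"))
      table.put(put1)
      table.close()
    } finally {
      connection.close()
    }
  }
}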

 

Mentor
While this would work, note that it is recommended not to rely on "hadoop-core" or "hadoop-common", but to use the meta/blanket-wrapper "hadoop-client" dependency instead. This dependency covers everything typically required on the Hadoop side and will cause fewer missing-library surprises in the future.

See also
https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_hadoop_api_dependencie...
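
For reference, a minimal sketch of the dependency change being suggested here (a sketch only; the artifact versions are assumed to match the CDH 5.8.0 versions used in the accepted solution above):

libraryDependencies ++= Seq(
  // hadoop-client is a wrapper artifact that pulls in the Hadoop-side
  // dependencies a client application typically needs
  "org.apache.hadoop" % "hadoop-client" % "2.6.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.8.0"
)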

Contributor

Hi Harsh, thanks for the comment. Yes, you are correct. I tried using "hadoop-client" instead of "hadoop-common", and my code is still working fine.

Contributor

Hi All,

 

My code above works fine when my sbt project folder is inside my Cloudera VM.

Now I am trying to connect to the same HBase table from my Windows machine, on which the Cloudera VM is running.

I made the following changes in my project code:

  1. I copied my project root directory structure "/play-sbt-project/*" to my Windows D: drive.
  2. Added the following lines of code inside the /play-sbt-project/src/main/scala/pw.scala file:
    conf.set("hbase.zookeeper.quorum","xxx.xxx.xxx.xxx") // xxx.xxx.xxx.xxx IP address of my Cloudera virtual machine.
    conf.set("hbase.zookeeper.property.clientPort", "2181")
  3. My new "pw.scala" looks like this:

 

package main.scala

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory,HTable,Put}
import org.apache.hadoop.hbase.util.Bytes

object Hi {

  def main(args: Array[String]) = {
    println("Hi!")
    val conf: Configuration = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", "xxx.xxx.xxx.xxx") // IP address of my Cloudera virtual machine
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    val table: HTable = new HTable(conf, "emp1")
    val put1: Put = new Put(Bytes.toBytes("row1"))
    put1.add(Bytes.toBytes("personal_data"), Bytes.toBytes("qual1"), Bytes.toBytes("val1"))
    table.put(put1)
    println("Success")
  }
}

 

  1. I haven't done anything related to my CLASSPATH variable on my Windows machine. If I need to make any changes related to my CLASSPATH variable, how and where should I make them?
  2. And finally, I am running the "sbt run" command from my project root directory on my Windows machine.

I am getting the error below:

D:\scala-hbase>sbt run
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Set current project to scala-hbase (in build file:/D:/scala-hbase/)
[info] Running Hi
Hi!
log4j:WARN No appenders could be found for logger (org.apache.hadoop.security.Groups).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[error] (run-main-0) org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time,
org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time,
        at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:247)
        at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$1800(AsyncProcess.java:227)
        at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1766)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:240)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:190)
        at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1495)
        at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1086)
        at Hi$.main(hw.scala:16)
        at Hi.main(hw.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
        at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 533 s, completed Mar 31, 2017 1:50:22 AM

D:\scala-hbase>

From the error log I can see it is saying:

org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time

 

But I am not able to rectify this error.
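
One thing I plan to verify (a sketch, based on my assumption that HBase hands back region server locations by hostname, e.g. quickstart.cloudera, which my Windows host would then also need to resolve, not just the raw IP I set in hbase.zookeeper.quorum):

import java.net.InetAddress

// Standalone check: can this machine resolve the VM's hostname at all?
// The hostname below is an assumption; substitute whatever your VM reports.
object ResolveCheck {
  def main(args: Array[String]): Unit = {
    val host = "quickstart.cloudera"
    try println(host + " -> " + InetAddress.getByName(host).getHostAddress)
    catch { case e: java.net.UnknownHostException => println("Cannot resolve " + host + ": " + e) }
  }
}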

Any help related to this is highly appreciated. Thanks!

 

New Contributor

Hey Harsha, I am facing a similar problem with CDH 5.13. I have shared the details here:

http://community.cloudera.com/t5/Data-Ingestion-Integration/Problem-in-connecting-to-Hbase-from-scal...

 

Please let me know if there is something wrong that I am doing. Thanks.