<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Problem connecting to HBase from Scala code in Cloudera QuickStart VM CDH 5.8.0 in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52876#M58302</link>
    <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/11985"&gt;@hkumar449&lt;/a&gt;&lt;/P&gt;&lt;P&gt;OK, I think I overlooked something. Since you mentioned Scala, I assumed you were using Scala from Spark. If you are not using Spark, you can ignore my comment.&lt;/P&gt;</description>
    <pubDate>Wed, 29 Mar 2017 18:45:27 GMT</pubDate>
    <dc:creator>saranvisa</dc:creator>
    <dc:date>2017-03-29T18:45:27Z</dc:date>
    <item>
      <title>Problem connecting to HBase from Scala code in Cloudera QuickStart VM CDH 5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52773#M58299</link>
      <description>&lt;P&gt;I am trying to connect to HBase from Scala code, but I am getting the error below:&lt;/P&gt;&lt;PRE&gt;[info] Set current project to play-sbt-project (in build file:/home/cloudera/Desktop/play-sbt-project/)
[info] Running Hi 
Hi!
17/03/28 07:48:00 WARN hbase.HBaseConfiguration: instantiating HBaseConfiguration() is deprecated. Please use HBaseConfiguration#create() to construct a plain Configuration
17/03/28 07:48:02 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x71979f1 connecting to ZooKeeper ensemble=localhost:2181
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:host.name=quickstart.cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.7.0_67-cloudera/jre
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/cloudera/sbt-0.13.13/bin/sbt-launch.jar
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.compiler=&amp;lt;NA&amp;gt;
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-573.el6.x86_64
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.name=cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/cloudera/Desktop/play-sbt-project
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=90000 watcher=hconnection-0x71979f10x0, quorum=localhost:2181, baseZNode=/hbase
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Opening socket connection to server quickstart.cloudera/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Socket connection established to quickstart.cloudera/127.0.0.1:2181, initiating session
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Session establishment complete on server quickstart.cloudera/127.0.0.1:2181, sessionid = 0x15b14fd3ae901b6, negotiated timeout = 60000
17/03/28 07:48:52 INFO client.RpcRetryingCaller: Call exception, tries=10, retries=35, started=49123 ms ago, cancelled=false, msg=
[... 23 similar retry log lines (tries=11 through tries=33) omitted ...]
17/03/28 07:56:55 INFO client.RpcRetryingCaller: Call exception, tries=34, retries=35, started=531569 ms ago, cancelled=false, msg=
[error] (run-main-0) org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
[error] Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] [... 33 identical retry-attempt exceptions (07:48:04 through 07:56:34) omitted ...]
[error] Tue Mar 28 07:56:55 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[... 33 identical retry-attempt exceptions (07:48:04 through 07:56:34) omitted ...]
Tue Mar 28 07:56:55 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper

	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1560)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:239)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:138)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1316)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1224)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.SocketInputWrapper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:138)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1316)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1224)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
	at Hi$.main(hw.scala:12)
	at Hi.main(hw.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run last compile:run for the full output.
17/03/28 07:56:55 ERROR zookeeper.ClientCnxn: Event thread exiting due to interruption
java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2017)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2052)
	at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
	at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:494)
17/03/28 07:56:55 INFO zookeeper.ClientCnxn: EventThread shut down
java.lang.RuntimeException: Nonzero exit code: 1
	at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 544 s, completed Mar 28, 2017 7:56:56 AM&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;I referred to this URL:&amp;nbsp;&lt;A href="https://hbase.apache.org/book.html#scala" target="_blank"&gt;https://hbase.apache.org/book.html#scala&lt;/A&gt;&amp;nbsp;&lt;/LI&gt;&lt;LI&gt;Setting the CLASSPATH: I didn't include the "/path/to/scala-library.jar" in the CLASSPATH as mentioned in the link.&lt;/LI&gt;&lt;LI&gt;&lt;PRE&gt;$ export CLASSPATH=$CLASSPATH:/usr/lib/hadoop/lib/native:/usr/lib/hbase/lib/native/Linux-amd64-64&lt;/PRE&gt;&lt;/LI&gt;&lt;LI&gt;Project root directory = /home/cloudera/Desktop/play-sbt-project&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;My /&lt;/SPAN&gt;&lt;SPAN&gt;home/cloudera/Desktop/play-sbt-project&lt;/SPAN&gt;/build.sbt looks like this. I changed the dependent library versions as per my environment and added a few more dependencies: "hbase-client", "hbase-common" &amp;amp; "hbase-server".&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;name := "play-sbt-project"
version := "1.0"
scalaVersion := "2.10.2"
resolvers += "Apache HBase" at "https://repository.apache.org/content/repositories/releases"
resolvers += "Thrift" at "http://people.apache.org/~rawson/repo/"
libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-core" % "1.2.1",
"org.apache.hbase" % "hbase" % "1.2.0",
"org.apache.hbase" % "hbase-client" % "1.2.0",
"org.apache.hbase" % "hbase-common" % "1.2.0",
"org.apache.hbase" % "hbase-server" % "1.2.0"
)&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;My main code for Hbase connectivity /home/cloudera/Desktop/play-sbt-project/src/main/scala/pw.scala looks like this&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory,HBaseAdmin,HTable,Put,Get}
import org.apache.hadoop.hbase.util.Bytes

object Hi {
def main(args: Array[String]) = {
println("Hi!")
val conf = new HBaseConfiguration()
val connection = ConnectionFactory.createConnection(conf);
val admin = connection.getAdmin();

// list the tables
val listtables=admin.listTables()
listtables.foreach(println)
}
}&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;My&amp;nbsp;/etc/hbase/conf/hbase-site.xml looks like this:&lt;/LI&gt;&lt;/UL&gt;&lt;PRE&gt;&amp;lt;!--&lt;BR /&gt;&amp;lt;?xml version="1.0" encoding="UTF-8"?&amp;gt;

&amp;lt;!--Autogenerated by Cloudera Manager--&amp;gt;
&amp;lt;configuration&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.rootdir&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;hdfs://quickstart.cloudera:8020/hbase&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.replication&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;true&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.client.write.buffer&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;2097152&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.client.pause&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;100&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.client.retries.number&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;35&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.client.scanner.caching&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;100&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.client.keyvalue.maxsize&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;10485760&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.ipc.client.allowsInterrupt&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;true&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.client.primaryCallTimeout.get&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;10&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.client.primaryCallTimeout.multiget&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;10&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.coprocessor.region.classes&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.regionserver.thrift.http&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;false&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.thrift.support.proxyuser&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;false&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.rpc.timeout&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;60000&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.snapshot.enabled&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;true&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.snapshot.master.timeoutMillis&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;60000&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.snapshot.region.timeout&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;60000&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.snapshot.master.timeout.millis&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;60000&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.security.authentication&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;simple&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.rpc.protection&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;authentication&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;zookeeper.session.timeout&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;60000&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;zookeeper.znode.parent&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;/hbase&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;zookeeper.znode.rootserver&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;root-region-server&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.zookeeper.quorum&amp;lt;/name&amp;gt;
    &amp;lt;!-- &amp;lt;value&amp;gt;quickstart.cloudera&amp;lt;/value&amp;gt; --&amp;gt;
    &amp;lt;value&amp;gt;127.0.0.1&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.zookeeper.property.clientPort&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;2181&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hbase.rest.ssl.enabled&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;false&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
&amp;lt;/configuration&amp;gt;&lt;BR /&gt;--&amp;gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I googled a lot to solve this issue but didn't have any success. While trying to resolve it, I made the below changes:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Changed the dependent library versions in the build.sbt file as per my environment&lt;/LI&gt;&lt;LI&gt;Added a few more dependent libraries:&amp;nbsp;"hbase-client", "hbase-common" &amp;amp; "hbase-server".&lt;/LI&gt;&lt;LI&gt;Changed the "hbase.zookeeper.quorum" value from "quickstart.cloudera" to "127.0.0.1" in the "hbase-site.xml" file.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Please help me solve this issue. Thank you.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 11:21:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52773#M58299</guid>
      <dc:creator>hkumar449</dc:creator>
      <dc:date>2022-09-16T11:21:18Z</dc:date>
    </item>
    <item>
      <title>Re: Problem in connecting Hbase from Scala code in Cloudera Quick start VM CDH5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52779#M58300</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/11985"&gt;@hkumar449&lt;/a&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This may help you!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Make sure&amp;nbsp;&lt;SPAN&gt;hbase-site.xml is available under the Spark configuration.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So either copy hbase-site.xml to /etc/spark/conf (or) create a softlink, and try again.&lt;/P&gt;</description>
      <pubDate>Tue, 28 Mar 2017 20:37:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52779#M58300</guid>
      <dc:creator>saranvisa</dc:creator>
      <dc:date>2017-03-28T20:37:02Z</dc:date>
    </item>
    <item>
      <title>Re: Problem in connecting Hbase from Scala code in Cloudera Quick start VM CDH5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52781#M58301</link>
      <description>Thank you &lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/18441"&gt;@saranvisa&lt;/a&gt; for your comment. But I have one doubt: I am not using Spark anywhere in my code, and my Spark services are also stopped from Cloudera Manager, so how will putting hbase-site.xml in /etc/spark/conf help me with HBase-to-Scala connectivity?</description>
      <pubDate>Tue, 28 Mar 2017 21:05:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52781#M58301</guid>
      <dc:creator>hkumar449</dc:creator>
      <dc:date>2017-03-28T21:05:25Z</dc:date>
    </item>
    <item>
      <title>Re: Problem in connecting Hbase from Scala code in Cloudera Quick start VM CDH5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52876#M58302</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/11985"&gt;@hkumar449&lt;/a&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;OK, I think I overlooked that. Since you mentioned Scala, I thought you were using Scala from Spark. If you are not using Spark then you can ignore my comment.&lt;/P&gt;</description>
      <pubDate>Wed, 29 Mar 2017 18:45:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52876#M58302</guid>
      <dc:creator>saranvisa</dc:creator>
      <dc:date>2017-03-29T18:45:27Z</dc:date>
    </item>
    <item>
      <title>Re: Problem in connecting Hbase from Scala code in Cloudera Quick start VM CDH5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52915#M58303</link>
      <description>&lt;P&gt;Resolved the issue. The following changes need to be made:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Change "hadoop-core" to "hadoop-common" inside the build.sbt file, since in the latest CDH versions 'hadoop-core' is only supported for code running on MapReduce 1.&lt;/LI&gt;&lt;LI&gt;Change all the dependency versions as per Cloudera 5.8.0 compatibility in build.sbt. The updated&amp;nbsp;build.sbt looks like this:&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;name := "play-sbt-project"
version := "1.0"
scalaVersion := "2.10.2"
resolvers += "Thrift" at "http://people.apache.org/~rawson/repo/"
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-common" % "2.6.0-cdh5.8.0",
  "org.apache.hbase" % "hbase" % "1.2.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.8.0"
)&lt;/PRE&gt;&lt;P&gt;3. &lt;SPAN&gt;The HBaseConfiguration() class is deprecated; use the create() method instead. I also changed some logic in the main code. Earlier I was listing the tables present in HBase (since this was giving some issues I dropped it, but I will try it next time). Since my goal is to establish Scala-to-HBase connectivity, I am now trying to insert a new row into an already existing HBase table. The new code looks like this&lt;/SPAN&gt;:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;package main.scala

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory,HTable,Put}
import org.apache.hadoop.hbase.util.Bytes

object Hi {

  def main(args: Array[String]) = {
    println("Hi!")
    val conf:Configuration = HBaseConfiguration.create()
    val table:HTable = new HTable(conf, "emp1")
    val put1:Put = new Put(Bytes.toBytes("row1"))
    put1.add(Bytes.toBytes("personal_data"),Bytes.toBytes("qual1"),Bytes.toBytes("val1"))
    table.put(put1)
    println("Success")
  }
}&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 30 Mar 2017 12:51:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52915#M58303</guid>
      <dc:creator>hkumar449</dc:creator>
      <dc:date>2017-03-30T12:51:24Z</dc:date>
    </item>
    <item>
      <title>Re: Problem in connecting Hbase from Scala code in Cloudera Quick start VM CDH5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52921#M58304</link>
      <description>While this would work, note that it's recommended not to rely on "hadoop-core" or "hadoop-common", but to use the meta/blanket-wrapper "hadoop-client" dependency instead. This dependency covers everything typically required on the Hadoop side and will cause fewer missing-library surprises in the future.&lt;BR /&gt;&lt;BR /&gt;See also&lt;BR /&gt;&lt;A href="https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_hadoop_api_dependencies.html" target="_blank"&gt;https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_hadoop_api_dependencies.html&lt;/A&gt;&lt;BR /&gt;</description>
      <pubDate>Thu, 30 Mar 2017 13:00:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52921#M58304</guid>
      <dc:creator>Harsh J</dc:creator>
      <dc:date>2017-03-30T13:00:57Z</dc:date>
    </item>
    <item>
      <title>Re: Problem in connecting Hbase from Scala code in Cloudera Quick start VM CDH5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52949#M58305</link>
      <description>&lt;P&gt;Hi Harsh, thanks for the comment. Yes, you are correct. I tried using "hadoop-client" instead of "hadoop-common", and my code is still working fine.&lt;/P&gt;</description>
      <pubDate>Thu, 30 Mar 2017 20:07:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52949#M58305</guid>
      <dc:creator>hkumar449</dc:creator>
      <dc:date>2017-03-30T20:07:25Z</dc:date>
    </item>
    <item>
      <title>Re: Problem in connecting Hbase from Scala code in Cloudera Quick start VM CDH5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52952#M58306</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My above code is working fine when my sbt project folder is inside my Cloudera VM.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now I am trying to connect to the same HBase table from my Windows machine on which the Cloudera VM is running.&lt;/P&gt;&lt;P&gt;I just made the following changes in my project code:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;I copied my project root directory structure "/&lt;SPAN&gt;play-sbt-project&lt;/SPAN&gt;&lt;SPAN&gt;/*" to my Windows D: drive.&lt;/SPAN&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;&lt;SPAN&gt;Added the below lines of code inside the&amp;nbsp;/play-sbt-project/src/main/scala/pw.scala file:&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;PRE&gt;conf.set("hbase.zookeeper.quorum","xxx.xxx.xxx.xxx") // xxx.xxx.xxx.xxx is the IP address of my Cloudera virtual machine.
conf.set("hbase.zookeeper.property.clientPort", "2181")&lt;/PRE&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;My new "pw.scala" looks like this:&lt;/SPAN&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;package main.scala

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory,HTable,Put}
import org.apache.hadoop.hbase.util.Bytes

object Hi {

 def main(args: Array[String]) = {
 println("Hi!")
 val conf:Configuration = HBaseConfiguration.create()
 conf.set("hbase.zookeeper.quorum", "xxx.xxx.xxx.xxx") // IP address of my Cloudera virtual machine
 conf.set("hbase.zookeeper.property.clientPort", "2181")
 val table:HTable = new HTable(conf, "emp1")
 val put1:Put = new Put(Bytes.toBytes("row1"))
 put1.add(Bytes.toBytes("personal_data"),Bytes.toBytes("qual1"),Bytes.toBytes("val1"))
 table.put(put1)
 println("Success")
 }
}&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;&lt;SPAN&gt;I haven't done anything related to my CLASSPATH variable on my Windows machine. If I need to make any changes to my CLASSPATH variable, how and where should I make them?&lt;/SPAN&gt;&lt;/LI&gt;&lt;LI&gt;And finally, running the "sbt run" command from my project root directory on my Windows machine.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;I am getting the below error:&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;D:\scala-hbase&amp;gt;sbt run
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Set current project to scala-hbase (in build file:/D:/scala-hbase/)
[info] Running Hi
Hi!
log4j:WARN No appenders could be found for logger (org.apache.hadoop.security.Groups).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[error] (run-main-0) org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time,
org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time,
        at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:247)
        at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$1800(AsyncProcess.java:227)
        at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1766)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:240)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:190)
        at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1495)
        at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1086)
        at Hi$.main(hw.scala:16)
        at Hi.main(hw.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
        at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 533 s, completed Mar 31, 2017 1:50:22 AM

D:\scala-hbase&amp;gt;&lt;/PRE&gt;&lt;P&gt;From the error log I can see it is saying&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;But I am not able to rectify this error.&lt;/P&gt;&lt;P&gt;Any help related to this is highly appreciated. Thanks!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 30 Mar 2017 20:34:04 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/52952#M58306</guid>
      <dc:creator>hkumar449</dc:creator>
      <dc:date>2017-03-30T20:34:04Z</dc:date>
    </item>
    <item>
      <title>Re: Problem in connecting Hbase from Scala code in Cloudera Quick start VM CDH5.8.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/62524#M58307</link>
      <description>&lt;P&gt;Hey Harsh, I am facing a similar problem with the CDH 5.13 version. I have shared the details here:&lt;/P&gt;&lt;P&gt;&lt;A href="http://community.cloudera.com/t5/Data-Ingestion-Integration/Problem-in-connecting-to-Hbase-from-scala-code-in-Cloudera/m-p/62519#M2779" target="_blank"&gt;http://community.cloudera.com/t5/Data-Ingestion-Integration/Problem-in-connecting-to-Hbase-from-scala-code-in-Cloudera/m-p/62519#M2779&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please let me know if there is something wrong that I am doing. Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 06 Dec 2017 09:17:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Problem-in-connecting-Hbase-from-Scala-code-in-Cloudera/m-p/62524#M58307</guid>
      <dc:creator>manick</dc:creator>
      <dc:date>2017-12-06T09:17:12Z</dc:date>
    </item>
  </channel>
</rss>

