Member since: 09-01-2015
Posts: 18
Kudos Received: 1
Solutions: 2
My Accepted Solutions

Views | Posted
---|---
15765 | 03-30-2017 05:16 AM
1610 | 11-27-2015 01:18 AM
03-30-2017
01:34 PM
Hi All, my above code works fine when my sbt project folder is inside the Cloudera VM. Now I am trying to connect to the same HBase table from the Windows machine on which the Cloudera VM is running. I made the following changes in my project code: I copied my project root directory "/play-sbt-project/*" to my Windows D: drive, then added the lines below inside /play-sbt-project/src/main/scala/pw.scala:

conf.set("hbase.zookeeper.quorum", "xxx.xxx.xxx.xxx") // xxx.xxx.xxx.xxx is the IP address of my Cloudera virtual machine
conf.set("hbase.zookeeper.property.clientPort", "2181")

My new pw.scala looks like this:

package main.scala

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory, HTable, Put}
import org.apache.hadoop.hbase.util.Bytes

object Hi {
  def main(args: Array[String]) = {
    println("Hi!")
    val conf: Configuration = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", "xxx.xxx.xxx.xxx") // IP address of my Cloudera virtual machine
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    val table: HTable = new HTable(conf, "emp1")
    val put1: Put = new Put(Bytes.toBytes("row1"))
    put1.add(Bytes.toBytes("personal_data"), Bytes.toBytes("qual1"), Bytes.toBytes("val1"))
    table.put(put1)
    println("Success")
  }
}

I haven't changed anything related to the CLASSPATH variable on my Windows machine. If I need to make any CLASSPATH changes, how and where should I do that? Finally, running the "sbt run" command from my project root directory on my Windows machine, I am getting the error below:

D:\scala-hbase>sbt run
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Set current project to scala-hbase (in build file:/D:/scala-hbase/)
[info] Running Hi
Hi!
log4j:WARN No appenders could be found for logger (org.apache.hadoop.security.Groups).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[error] (run-main-0) org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time,
org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time,
at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:247)
at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$1800(AsyncProcess.java:227)
at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1766)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:240)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:190)
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1495)
at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1086)
at Hi$.main(hw.scala:16)
at Hi.main(hw.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 533 s, completed Mar 31, 2017 1:50:22 AM
D:\scala-hbase>

From the error log I can see it is saying "org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: UnknownHostException: 1 time", but I am not able to rectify this error. Any help related to this is highly appreciated. Thanks!
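For reference, here is a minimal sketch of the client-side configuration this post describes, with one assumption made explicit: the HBase client reaches ZooKeeper by IP but is then handed region server hostnames (quickstart.cloudera on the QuickStart VM), so that name must also resolve from the Windows client. The RemoteConf object and its build helper are hypothetical names, not from the original code.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration

object RemoteConf {
  // Hypothetical helper: builds a client Configuration pointing at the VM.
  def build(vmIp: String): Configuration = {
    val conf = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", vmIp)                 // VM IP, as above
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    // ZooKeeper is reached by IP, but HBase then returns region server
    // *hostnames*; if "quickstart.cloudera" is not resolvable on the client
    // (e.g. via an entry in C:\Windows\System32\drivers\etc\hosts), puts can
    // fail with UnknownHostException even though the ZooKeeper session
    // itself succeeded.
    conf
  }
}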
03-30-2017
01:07 PM
Hi Harsh, thanks for the comment. Yes, you are correct. I tried using "hadoop-client" instead of "hadoop-common", and my code still works fine.
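For anyone following along, the swap amounts to a one-line dependency change in build.sbt; the version shown is an assumption, matching the CDH 5.8.0 artifacts used in the accepted solution below:

// build.sbt -- hadoop-client aggregates the client-side Hadoop jars
// (hadoop-common among them), so it can stand in for the direct dependency.
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.6.0-cdh5.8.0"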
03-30-2017
05:16 AM
Resolved the issue. The following changes need to be made:

1. Change "hadoop-core" to "hadoop-common" in the build.sbt file, since in the latest CDH versions "hadoop-core" is only shipped for code running on MapReduce 1.
2. Change all the dependency versions in build.sbt as per Cloudera 5.8.0 compatibility. The updated build.sbt looks like this:

name := "play-sbt-project"

version := "1.0"

scalaVersion := "2.10.2"

resolvers += "Thrift" at "http://people.apache.org/~rawson/repo/"
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-common" % "2.6.0-cdh5.8.0",
  "org.apache.hbase" % "hbase" % "1.2.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.8.0",
  "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.8.0"
)

3. The HBaseConfiguration() constructor is deprecated; use the HBaseConfiguration.create() method instead.

I also changed some logic in the main code. Earlier I was listing the tables present in HBase (that was giving me some issues, so I dropped it for now and will try it again next time). Since my goal is to establish Scala-to-HBase connectivity, I now insert a new row into an already existing HBase table. The new code looks like this:

package main.scala

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory, HTable, Put}
import org.apache.hadoop.hbase.util.Bytes

object Hi {
  def main(args: Array[String]) = {
    println("Hi!")
    val conf: Configuration = HBaseConfiguration.create()
    val table: HTable = new HTable(conf, "emp1")
    val put1: Put = new Put(Bytes.toBytes("row1"))
    put1.add(Bytes.toBytes("personal_data"), Bytes.toBytes("qual1"), Bytes.toBytes("val1"))
    table.put(put1)
    println("Success")
  }
}
03-28-2017
02:05 PM
Thank you @saranvisa for your comment. But I have one doubt: nowhere am I using Spark in my code, and my Spark services are also stopped from Cloudera Manager, so how would putting hbase-site.xml in /etc/spark/conf help me with HBase-to-Scala connectivity?
03-28-2017
12:29 PM
I am trying to connect to HBase from Scala code but am getting the error below:

[info] Set current project to play-sbt-project (in build file:/home/cloudera/Desktop/play-sbt-project/)
[info] Running Hi
Hi!
17/03/28 07:48:00 WARN hbase.HBaseConfiguration: instantiating HBaseConfiguration() is deprecated. Please use HBaseConfiguration#create() to construct a plain Configuration
17/03/28 07:48:02 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x71979f1 connecting to ZooKeeper ensemble=localhost:2181
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:host.name=quickstart.cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.7.0_67-cloudera/jre
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/cloudera/sbt-0.13.13/bin/sbt-launch.jar
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-573.el6.x86_64
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.name=cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/cloudera
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/cloudera/Desktop/play-sbt-project
17/03/28 07:48:02 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=90000 watcher=hconnection-0x71979f10x0, quorum=localhost:2181, baseZNode=/hbase
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Opening socket connection to server quickstart.cloudera/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Socket connection established to quickstart.cloudera/127.0.0.1:2181, initiating session
17/03/28 07:48:02 INFO zookeeper.ClientCnxn: Session establishment complete on server quickstart.cloudera/127.0.0.1:2181, sessionid = 0x15b14fd3ae901b6, negotiated timeout = 60000
17/03/28 07:48:52 INFO client.RpcRetryingCaller: Call exception, tries=10, retries=35, started=49123 ms ago, cancelled=false, msg=
17/03/28 07:49:12 INFO client.RpcRetryingCaller: Call exception, tries=11, retries=35, started=69304 ms ago, cancelled=false, msg=
17/03/28 07:49:32 INFO client.RpcRetryingCaller: Call exception, tries=12, retries=35, started=89341 ms ago, cancelled=false, msg=
17/03/28 07:49:52 INFO client.RpcRetryingCaller: Call exception, tries=13, retries=35, started=109483 ms ago, cancelled=false, msg=
17/03/28 07:50:13 INFO client.RpcRetryingCaller: Call exception, tries=14, retries=35, started=129550 ms ago, cancelled=false, msg=
17/03/28 07:50:33 INFO client.RpcRetryingCaller: Call exception, tries=15, retries=35, started=149684 ms ago, cancelled=false, msg=
17/03/28 07:50:53 INFO client.RpcRetryingCaller: Call exception, tries=16, retries=35, started=169798 ms ago, cancelled=false, msg=
17/03/28 07:51:13 INFO client.RpcRetryingCaller: Call exception, tries=17, retries=35, started=189818 ms ago, cancelled=false, msg=
17/03/28 07:51:33 INFO client.RpcRetryingCaller: Call exception, tries=18, retries=35, started=209998 ms ago, cancelled=false, msg=
17/03/28 07:51:53 INFO client.RpcRetryingCaller: Call exception, tries=19, retries=35, started=230015 ms ago, cancelled=false, msg=
17/03/28 07:52:13 INFO client.RpcRetryingCaller: Call exception, tries=20, retries=35, started=250027 ms ago, cancelled=false, msg=
17/03/28 07:52:33 INFO client.RpcRetryingCaller: Call exception, tries=21, retries=35, started=270138 ms ago, cancelled=false, msg=
17/03/28 07:52:53 INFO client.RpcRetryingCaller: Call exception, tries=22, retries=35, started=290293 ms ago, cancelled=false, msg=
17/03/28 07:53:13 INFO client.RpcRetryingCaller: Call exception, tries=23, retries=35, started=310314 ms ago, cancelled=false, msg=
17/03/28 07:53:33 INFO client.RpcRetryingCaller: Call exception, tries=24, retries=35, started=330365 ms ago, cancelled=false, msg=
17/03/28 07:53:54 INFO client.RpcRetryingCaller: Call exception, tries=25, retries=35, started=350544 ms ago, cancelled=false, msg=
17/03/28 07:54:14 INFO client.RpcRetryingCaller: Call exception, tries=26, retries=35, started=370677 ms ago, cancelled=false, msg=
17/03/28 07:54:34 INFO client.RpcRetryingCaller: Call exception, tries=27, retries=35, started=390778 ms ago, cancelled=false, msg=
17/03/28 07:54:54 INFO client.RpcRetryingCaller: Call exception, tries=28, retries=35, started=410887 ms ago, cancelled=false, msg=
17/03/28 07:55:14 INFO client.RpcRetryingCaller: Call exception, tries=29, retries=35, started=431015 ms ago, cancelled=false, msg=
17/03/28 07:55:34 INFO client.RpcRetryingCaller: Call exception, tries=30, retries=35, started=451155 ms ago, cancelled=false, msg=
17/03/28 07:55:54 INFO client.RpcRetryingCaller: Call exception, tries=31, retries=35, started=471212 ms ago, cancelled=false, msg=
17/03/28 07:56:14 INFO client.RpcRetryingCaller: Call exception, tries=32, retries=35, started=491329 ms ago, cancelled=false, msg=
17/03/28 07:56:34 INFO client.RpcRetryingCaller: Call exception, tries=33, retries=35, started=511392 ms ago, cancelled=false, msg=
17/03/28 07:56:55 INFO client.RpcRetryingCaller: Call exception, tries=34, retries=35, started=531569 ms ago, cancelled=false, msg=
[error] (run-main-0) org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
[error] Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:05 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:06 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:08 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:12 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:22 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:32 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:42 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:48:52 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:49:12 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:49:32 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:49:52 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:50:13 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:50:33 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:50:53 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:51:13 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:51:33 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:51:53 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:52:13 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:52:33 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:52:53 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:53:13 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:53:33 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:53:54 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:54:14 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:54:34 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:54:54 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:55:14 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:55:34 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:55:54 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:56:14 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:56:34 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
[error] Tue Mar 28 07:56:55 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:04 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:05 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:06 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:08 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:12 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:22 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:32 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:42 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:48:52 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:49:12 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:49:32 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:49:52 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:50:13 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:50:33 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:50:53 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:51:13 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:51:33 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:51:53 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:52:13 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:52:33 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:52:53 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:53:13 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:53:33 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:53:54 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:54:14 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:54:34 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:54:54 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:55:14 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:55:34 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:55:54 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:56:14 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:56:34 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
Tue Mar 28 07:56:55 PDT 2017, RpcRetryingCaller{globalStartTime=1490712483497, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
at Hi$.main(hw.scala:12)
at Hi.main(hw.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1560)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
at Hi$.main(hw.scala:12)
at Hi.main(hw.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:239)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
at Hi$.main(hw.scala:12)
at Hi.main(hw.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:138)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1316)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1224)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
at Hi$.main(hw.scala:12)
at Hi.main(hw.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.SocketInputWrapper
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:138)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1316)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1224)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411)
at Hi$.main(hw.scala:12)
at Hi.main(hw.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run last compile:run for the full output.
17/03/28 07:56:55 ERROR zookeeper.ClientCnxn: Event thread exiting due to interruption
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2017)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2052)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:494)
17/03/28 07:56:55 INFO zookeeper.ClientCnxn: EventThread shut down
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 544 s, completed Mar 28, 2017 7:56:56 AM

I referred to the "Setting the CLASSPATH" part of this URL: https://hbase.apache.org/book.html#scala. I didn't include the "/path/to/scala-library.jar" part in the CLASSPATH as mentioned in the link:

$ export CLASSPATH=$CLASSPATH:/usr/lib/hadoop/lib/native:/usr/lib/hbase/lib/native/Linux-amd64-64

Project root directory = /home/cloudera/Desktop/play-sbt-project

My /home/cloudera/Desktop/play-sbt-project/build.sbt looks like this (I changed the dependent library versions as per my environment and added a few more dependencies: "hbase-client", "hbase-common" and "hbase-server"):

name := "play-sbt-project"
version := "1.0"
scalaVersion := "2.10.2"
resolvers += "Apache HBase" at "https://repository.apache.org/content/repositories/releases"
resolvers += "Thrift" at "http://people.apache.org/~rawson/repo/"
libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-core" % "1.2.1",
"org.apache.hbase" % "hbase" % "1.2.0",
"org.apache.hbase" % "hbase-client" % "1.2.0",
"org.apache.hbase" % "hbase-common" % "1.2.0",
"org.apache.hbase" % "hbase-server" % "1.2.0"
)

My main code for HBase connectivity, /home/cloudera/Desktop/play-sbt-project/src/main/scala/pw.scala, looks like this:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{ConnectionFactory,HBaseAdmin,HTable,Put,Get}
import org.apache.hadoop.hbase.util.Bytes
object Hi {
def main(args: Array[String]) = {
println("Hi!")
val conf = new HBaseConfiguration()
val connection = ConnectionFactory.createConnection(conf);
val admin = connection.getAdmin();
// list the tables
val listtables=admin.listTables()
listtables.foreach(println)
}
}

My /etc/hbase/conf/hbase-site.xml looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<!--Autogenerated by Cloudera Manager-->
<configuration>
<property>
<name>hbase.rootdir</name>
<value>hdfs://quickstart.cloudera:8020/hbase</value>
</property>
<property>
<name>hbase.replication</name>
<value>true</value>
</property>
<property>
<name>hbase.client.write.buffer</name>
<value>2097152</value>
</property>
<property>
<name>hbase.client.pause</name>
<value>100</value>
</property>
<property>
<name>hbase.client.retries.number</name>
<value>35</value>
</property>
<property>
<name>hbase.client.scanner.caching</name>
<value>100</value>
</property>
<property>
<name>hbase.client.keyvalue.maxsize</name>
<value>10485760</value>
</property>
<property>
<name>hbase.ipc.client.allowsInterrupt</name>
<value>true</value>
</property>
<property>
<name>hbase.client.primaryCallTimeout.get</name>
<value>10</value>
</property>
<property>
<name>hbase.client.primaryCallTimeout.multiget</name>
<value>10</value>
</property>
<property>
<name>hbase.coprocessor.region.classes</name>
<value>org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint</value>
</property>
<property>
<name>hbase.regionserver.thrift.http</name>
<value>false</value>
</property>
<property>
<name>hbase.thrift.support.proxyuser</name>
<value>false</value>
</property>
<property>
<name>hbase.rpc.timeout</name>
<value>60000</value>
</property>
<property>
<name>hbase.snapshot.enabled</name>
<value>true</value>
</property>
<property>
<name>hbase.snapshot.master.timeoutMillis</name>
<value>60000</value>
</property>
<property>
<name>hbase.snapshot.region.timeout</name>
<value>60000</value>
</property>
<property>
<name>hbase.snapshot.master.timeout.millis</name>
<value>60000</value>
</property>
<property>
<name>hbase.security.authentication</name>
<value>simple</value>
</property>
<property>
<name>hbase.rpc.protection</name>
<value>authentication</value>
</property>
<property>
<name>zookeeper.session.timeout</name>
<value>60000</value>
</property>
<property>
<name>zookeeper.znode.parent</name>
<value>/hbase</value>
</property>
<property>
<name>zookeeper.znode.rootserver</name>
<value>root-region-server</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<!-- <value>quickstart.cloudera</value> -->
<value>127.0.0.1</value>
</property>
<property>
<name>hbase.zookeeper.property.clientPort</name>
<value>2181</value>
</property>
<property>
<name>hbase.rest.ssl.enabled</name>
<value>false</value>
</property>
</configuration> --> I googled alot to solve this issue but didn't got sucess. In the process of solving this issue I have done below changes: Changed the Dependent libraries version in build.sbt file as per my environment Added few more dependent libraries "hbase-client", "hbase-common" & "hbase-server". Chaned the "hbase.zookeeper.quorum" value from "quickstart.cloudera" to "127.0.0.1" in "hbase-site.xml" file. Please help me solving this issue. Thank you.
11-27-2015
01:18 AM
Resolved the issue. You need to add the Hive conf directory as an external class folder in your build path. Right-click on your project in Eclipse -- Build Path -- Configure Build Path -- Java Build Path -- Libraries -- Add External Class Folder -- select the conf folder (/usr/lib/hive/conf) -- OK -- OK. All done. Now I am able to get my Hive databases and tables.
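If the conf folder cannot be added to the build path, the same effect can in principle be achieved programmatically by loading hive-site.xml as an explicit resource, so HiveConf picks up the MySQL metastore settings instead of its embedded-Derby defaults. A minimal sketch, assuming the /usr/lib/hive/conf/hive-site.xml path from this thread; the HiveDbList object name is hypothetical:

import org.apache.hadoop.fs.Path
import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient
import scala.collection.JavaConverters._

object HiveDbList {
  def main(args: Array[String]): Unit = {
    // Load the cluster's hive-site.xml explicitly so the metastore URI and
    // JDBC settings are not silently defaulted to embedded Derby.
    val hconf = new HiveConf()
    hconf.addResource(new Path("/usr/lib/hive/conf/hive-site.xml"))

    val msClient = new HiveMetaStoreClient(hconf)
    try {
      println("All Databases: " + msClient.getAllDatabases.asScala)
    } finally {
      msClient.close()
    }
  }
}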
11-26-2015
10:32 PM
I am creating a Java Swing UI which will connect to Hive databases and tables and show the details on the UI itself. I am able to generate the UI, but the details of the Hive databases and tables are not coming back, except for the "default" database. And even inside the "default" database I am not able to get the tables present in it.

Here is the code to retrieve the Hive databases:

HiveConf hconf = new HiveConf();
HiveMetaStoreClient msClient = new HiveMetaStoreClient(hconf);
List<String> dbs = msClient.getAllDatabases();
System.out.println("All Databases: " + dbs); // All Databases: [default]
List<String> tbls = msClient.getAllTables(dbName);
System.out.println("All Tables: " + tbls); // All Tables: []

System details:
Hadoop distribution: Cloudera Quickstart VM on Windows
Version: CDH 5.4.2

My hive-site.xml looks like this:

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://127.0.0.1/metastore?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>cloudera</value>
</property>
<property>
  <name>hive.hwi.war.file</name>
  <value>/usr/lib/hive/lib/hive-hwi-0.8.1-cdh4.0.0.jar</value>
  <description>This is the WAR file with the jsp content for Hive Web Interface</description>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://127.0.0.1:9083</value>
  <description>IP address (or fully-qualified domain name) and port of the metastore host</description>
</property>
</configuration>

My thought: in the process of trying to solve this issue, I printed out the Hive conf properties:

System.out.println("All Hive conf property: " + hconf.getAllProperties().toString());

I found some properties where it uses the Derby database, and I don't know where the Derby database is coming from. Here is some of the output of the Hive conf properties:

hive.stats.dbconnectionstring=jdbc:derby:;databaseName=TempStatsStore;
hive.stats.jdbcdriver=org.apache.derby.jdbc.EmbeddedDriver
javax.jdo.option.ConnectionURL=jdbc:derby:;databaseName=metastore_db;create=true,
javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver,

From the output it looks like it is connecting to a Derby database, but I don't understand why, since my hive-site.xml specifies the MySQL database. Please help me out with this.
Labels:
- Apache Hadoop
- Apache Hive
- Quickstart VM