Member since: 03-09-2018 | Posts: 7 | Kudos Received: 1 | Solutions: 0
04-17-2018
06:28 AM
I am new to Scala-based sbt builds, and I am not sure whether this build.sbt is correct for the given Scala code, so please correct that as well.
I am trying to connect to HBase from Spark using Scala code, but I am unable to do so because of an error in the sbt file.
I am stuck at the sbt step only. The aim is to add data programmatically using Scala code (for CDH 5.13).
Versions:
HBase - 1.2.0-cdh5.13.0
Cloudera - CDH 5.13.0
Error I am getting:
The assemblyMergeStrategy portion is unresolvable, and so is PathList (see the note after the build.sbt below).
build.sbt
name := "ScalaHBase"
version := "1.0"
scalaVersion := "2.11.8"
resolvers ++= Seq(
  "Hadoop Releases" at "https://repository.cloudera.com/content/repositories/releases/"
)

libraryDependencies ++= Seq(
  "com.google.guava" % "guava" % "15.0",
  "org.apache.hadoop" % "hadoop-common" % "2.6.0",
  "org.apache.hadoop" % "hadoop-mapred" % "0.22.0",
  "org.apache.hbase" % "hbase-common" % "1.0.0",
  "org.apache.hbase" % "hbase-client" % "1.0.0"
)

dependencyOverrides += "com.google.guava" % "guava" % "15.0"

assemblyMergeStrategy in assembly := {
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.first
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
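A note on the unresolved symbols above: assemblyMergeStrategy, PathList and MergeStrategy are provided by the sbt-assembly plugin, so they can only resolve if the plugin is declared in the build. A minimal project/plugins.sbt sketch is shown below; the plugin version is an assumption and not taken from the post.

// project/plugins.sbt (hypothetical sketch; the version is an assumption)
// Brings assemblyMergeStrategy, PathList and MergeStrategy into scope for build.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

Once the plugin is on the build classpath, reloading the project should let the merge-strategy block compile.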
Scala code
import org.apache.hadoop.hbase.client._
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.{CellUtil, HBaseConfiguration, TableName}
import org.apache.hadoop.conf.Configuration
import scala.collection.JavaConverters._

object ScalaHBaseExample extends App {

  // Print the row key followed by (qualifier, value) pairs for every cell in the result
  def printRow(result: Result) = {
    val cells = result.rawCells()
    print(Bytes.toString(result.getRow) + " : ")
    for (cell <- cells) {
      val col_name = Bytes.toString(CellUtil.cloneQualifier(cell))
      val col_value = Bytes.toString(CellUtil.cloneValue(cell))
      print("(%s,%s) ".format(col_name, col_value))
    }
    println()
  }

  val conf: Configuration = HBaseConfiguration.create()
  val ZOOKEEPER_QUORUM = "WRITE THE ZOOKEEPER CLUSTER THAT HBASE SHOULD USE"
  conf.set("hbase.zookeeper.quorum", ZOOKEEPER_QUORUM)

  val connection = ConnectionFactory.createConnection(conf)
  val table = connection.getTable(TableName.valueOf(Bytes.toBytes("emostafa:test_table")))

  // Put example
  var put = new Put(Bytes.toBytes("row1"))
  put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("test_column_name"), Bytes.toBytes("test_value"))
  put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("test_column_name2"), Bytes.toBytes("test_value2"))
  table.put(put)

  // Get example
  println("Get Example:")
  var get = new Get(Bytes.toBytes("row1"))
  var result = table.get(get)
  printRow(result)

  // Scan example
  println("\nScan Example:")
  var scan = table.getScanner(new Scan())
  scan.asScala.foreach(result => {
    printRow(result)
  })

  table.close()
  connection.close()
}
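As a hedged aside, the code above assumes that the namespace emostafa and the table emostafa:test_table with column family d already exist. A minimal Scala sketch that would create them through the HBase Admin API is shown below; the object name CreateTestTable is hypothetical, and the ZooKeeper quorum placeholder is the same assumption as in the example above.

import org.apache.hadoop.hbase.{HBaseConfiguration, HColumnDescriptor, HTableDescriptor, NamespaceDescriptor, TableName}
import org.apache.hadoop.hbase.client.ConnectionFactory

// Hypothetical one-time setup: creates the namespace "emostafa" and the table
// "emostafa:test_table" with column family "d" that ScalaHBaseExample expects.
object CreateTestTable extends App {
  val conf = HBaseConfiguration.create()
  conf.set("hbase.zookeeper.quorum", "WRITE THE ZOOKEEPER CLUSTER THAT HBASE SHOULD USE")
  val connection = ConnectionFactory.createConnection(conf)
  val admin = connection.getAdmin

  admin.createNamespace(NamespaceDescriptor.create("emostafa").build())
  val tableDesc = new HTableDescriptor(TableName.valueOf("emostafa:test_table"))
  tableDesc.addFamily(new HColumnDescriptor("d"))
  admin.createTable(tableDesc)

  admin.close()
  connection.close()
}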
I also tried this build.sbt file:
name := "HbaseExample" version := "0.1" scalaVersion := "2.12.5" libraryDependencies ++= Seq ( //"org.apache.hbase" % "hbase" % "0.98.8-hadoop2", "org.apache.hbase" % "hbase" % "1.2.0-cdh5.13.0" pomOnly(), //"org.apache.hbase" % "hbase-client" % "0.98.8-hadoop2", "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.13.0" , // "org.apache.hbase" % "hbase-common" % "0.98.8-hadoop2", "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.13.0" , //"org.apache.hbase" % "hbase-server" % "0.98.8-hadoop2", "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.13.0" ) mergeStrategy in assembly := { case m if m.toLowerCase.endsWith( "manifest.mf" ) => MergeStrategy.discard case m if m.toLowerCase.matches( "meta-inf.* \\ .sf$" ) => MergeStrategy.discard case "META-INF/jersey-module-version" => MergeStrategy.first case _ => MergeStrategy.first }
When I try to import this build.sbt, the mergeStrategy portion again says "cannot resolve symbol".
Any help on it would be much appreciated.
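Two hedged observations on this second build.sbt, neither confirmed by the original post: the 1.2.0-cdh5.13.0 artifacts are published in Cloudera's own repository, so a resolver is normally required, and newer sbt-assembly releases expose the key as assemblyMergeStrategy rather than mergeStrategy, which could explain the "cannot resolve symbol" message. A minimal sketch of both changes (the repository URL is the commonly documented Cloudera one and is assumed here):

// Hypothetical additions, assuming sbt-assembly is declared in project/plugins.sbt
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

assemblyMergeStrategy in assembly := {
  case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
  case m if m.toLowerCase.matches("meta-inf.*\\.sf$") => MergeStrategy.discard
  case "META-INF/jersey-module-version" => MergeStrategy.first
  case _ => MergeStrategy.first
}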
- Tags:
- HBase
04-02-2018
02:13 AM
This is the complete output I am getting when trying to send messages like "hi" and "howdy".
04-02-2018
02:12 AM
[cloudera@quickstart bin]$ kafka-console-producer --broker-list hostname:9092 --topic planedate
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/kafka/libs/slf4j-log4j12-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/kafka/libs/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/04/02 02:09:46 INFO producer.ProducerConfig: ProducerConfig values:
acks = 1
batch.size = 16384
block.on.buffer.full = false
bootstrap.servers = [172.27.54.65:9092]
buffer.memory = 33554432
client.id = console-producer
compression.type = none
connections.max.idle.ms = 540000
interceptor.classes = null
key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
linger.ms = 1000
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.fetch.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.ms = 50
request.timeout.ms = 1500
retries = 3
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 102400
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
timeout.ms = 30000
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
18/04/02 02:09:46 INFO utils.AppInfoParser: Kafka version : 0.10.2-kafka-2.2.0
18/04/02 02:09:46 INFO utils.AppInfoParser: Kafka commitId : unknown
>hi
howdy
04-02-2018
02:02 AM
Is the command I am using the wrong one, or is something missing from it? Please let me know.
04-02-2018
02:00 AM
I am not able to send data into the topic using this command:
kafka-console-producer --broker-list hostname:9092 --topic <topic name>
04-02-2018
01:17 AM
1 Kudo
When I try to send data to a topic using this command:
kafka-console-producer --broker-list hostname:9092 --topic <topic name>
I get only these lines as the output of the command:
18/04/02 01:09:49 INFO utils.AppInfoParser: Kafka version : 0.10.2-kafka-2.2.0
18/04/02 01:09:49 INFO utils.AppInfoParser: Kafka commitId : unknown
Environment: CDH 5.13, package install.
Any help would be much appreciated.
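A hedged aside that is not part of the original post: after the INFO lines the console producer waits for input at a ">" prompt, so one common way to check whether typed messages actually reach the topic is to run a console consumer from another terminal. The broker host and the topic name below are placeholders reused from the earlier posts.

# Hypothetical verification command; broker host and topic name are placeholders
kafka-console-consumer --bootstrap-server hostname:9092 --topic planedate --from-beginning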