Member since: 04-18-2017
Posts: 39
Kudos Received: 2
Solutions: 1

My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 2772 | 03-06-2018 05:40 PM |
07-17-2020 06:53 PM
Did you resolve this? Thanks.
04-09-2020 01:26 PM
From your logs I see there are no healthy datanodes available to replace the bad ones. In addition, I see several slow sync errors; you will have to tune your memstore's lower and upper limit configuration to reduce how frequently data is flushed, so you get the best out of the available heap.
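As a minimal sketch, assuming the HBase 1.0+ property names in hbase-site.xml (the values shown are the defaults; the right ones depend on your heap size and write workload):
<property>
  <!-- upper limit: fraction of RegionServer heap all memstores together may use before flushes are forced -->
  <name>hbase.regionserver.global.memstore.size</name>
  <value>0.4</value>
</property>
<property>
  <!-- lower limit: once forced flushing starts, it continues until usage drops to this fraction of the upper limit -->
  <name>hbase.regionserver.global.memstore.size.lower.limit</name>
  <value>0.95</value>
</property>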
08-24-2018 09:03 AM
@Mathi Murugan I see that currently ALTER (DATABASE|SCHEMA) database_name SET OWNER [USER|ROLE] user_or_role; does not give a fine-grained option for a single table within a schema/database. So I think the best option would be to use Ranger to grant a SELECT privilege on the particular database or underlying table to the new user, who can then issue a Create Table As Select (CTAS); the resulting table is automatically owned by the issuer of the CTAS. HTH
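As a sketch (the database, table, and user names here are hypothetical): once Ranger grants the new user SELECT on old_db.sales, that user runs a CTAS and ends up owning the copy:
-- run in Hive as the new user, after the Ranger SELECT grant on old_db.sales
CREATE TABLE new_db.sales AS SELECT * FROM old_db.sales;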
03-07-2018 05:09 AM
@Aymen Rahal The issue is a 'Connection refused' on the default ssh port. Verify the following: 1. Check the ssh port in /etc/ssh/sshd_config; if it is not set, try setting it to 22. 2. Try running ssh to the host from a terminal.
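A quick way to check both from a terminal (the user and hostname below are placeholders):
# show the port sshd is configured to listen on (defaults to 22 when unset)
grep -i '^Port' /etc/ssh/sshd_config
# test the connection to the host on that port
ssh -p 22 user@target-host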
10-20-2017 03:13 PM (1 Kudo)
@Mathi Murugan, you can use this one-line command:
echo "scan 'test_1',{FILTER =>\"(PrefixFilter ('r1'))\"}" | hbase shell -n | grep "column=" | hdfs dfs -appendToFile - /tmp/hbaseresults.txt
Here 'test_1' is the table name, 'r1' is the row-key prefix (PrefixFilter matches rows whose keys start with it), and /tmp/hbaseresults.txt is the HDFS file path. You can replace these with your own values. Thanks, Aditya
07-07-2017 03:22 PM
There are several Scala-native JSON libraries available (json4s, play-json, circe, as well as others), so there are lots of ways to do this. With play-json, your solution might look like:
import play.api.libs.json._
import play.api.libs.json.Reads._
import play.api.libs.functional.syntax._
val jsonStr = """{"Id":1,"Name":"Babu","Address":"dsfjskkjfs"}"""
case class Person(id: Int, name: String, address: String)
implicit val personReads: Reads[Person] = (
(JsPath \ "Id").read[Int] and
(JsPath \ "Name").read[String] and
(JsPath \ "Address").read[String]
)(Person.apply _)
Json.parse(jsonStr).validate[Person] match {
case JsSuccess(person, _) => {
// do something
}
case JsError(e) => {
// do something
}
}
With circe, your solution might look like:
import io.circe._
import io.circe.generic.auto._
import io.circe.parser._
import io.circe.syntax._
val jsonStr = """{"Id":1,"Name":"Babu","Address":"dsfjskkjfs"}"""
case class Person(id: Int, name: String, address: String)
decode[Person](jsonStr) match {
case Right(person) => {
// do something with Person(1, "Babu", "dsfjskkjfs")
}
case Left(e) => {
// handle error
}
}
In your question you asked for a case class but wrote it out as a tuple, so just to close that loop, if you want to subsequently convert the data to a tuple you can do `Person.unapply(person).get`.
05-07-2017 05:59 AM
Something does not look right about this one DN, identified with an unusual IP address: 1..1..1..:50010
2017-05-03 21:22:38,729 WARN [DataStreamer for file /apps/hbase/data/WALs/aps-hadoop5,16020,1493618413009/aps-hadoop5%2C16020%2C1493618413009.default.1493846432867 block BP-1810172115-10.64.228.157-1478343078462:blk_1079562185_5838908] hdfs.DFSClient: Error Recovery for block BP-1810172115-10.64.228.157-1478343078462:blk_1079562185_5838908 in pipeline DatanodeInfoWithStorage[1..1..1..:50010,DS-751946a0-5a6f-4485-ad27-61f061359410,DISK], DatanodeInfoWithStorage[10.64.228.140:50010,DS-8ab76f9c-ee05-4ec0-897a-8718ab89635f,DISK], DatanodeInfoWithStorage[10.64.228.150:50010,DS-57010fb6-92c0-4c3e-8b9e-11233ceb7bfa,DISK]: bad datanode DatanodeInfoWithStorage[1..1..1..:50010,DS-751946a0-5a6f-4485-ad27-61f061359410,DISK]