Member since
08-08-2018
49
Posts
2
Kudos Received
2
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 13981 | 08-11-2018 12:05 AM
 | 10781 | 08-10-2018 07:29 PM
08-16-2018
04:40 AM
@Vinicius Higa Murakami I don't mind providing the jars, but is there a way to add all of them to Zeppelin at once? I'm also having trouble figuring out which Phoenix jar this class comes from. Do you know?
08-15-2018
11:38 PM
1 Kudo
Hello- I have HDP 3.0 with Sqoop v1.4.7. What is the best way to migrate my data from an external RDBMS into something query-able from Phoenix? I want to make sure I import it in a way that will give very fast queries. Do I need to Sqoop it into HDFS first, or can I go directly into HBase? It looks like the Sqoop-Phoenix integration is not yet complete, so I believe I will need to Sqoop the data into HDFS or HBase and then connect to Phoenix. Can someone show (or point) me to how to do that? This post makes me think that I will need to go RDBMS > HDFS > CSV > Phoenix; please tell me that is not true... Thanks!
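For what it's worth, the usual two-step route is to Sqoop the table into HDFS as delimited text and then bulk-load that into a pre-created Phoenix table with Phoenix's CsvBulkLoadTool. A sketch only: the connect string, credentials, table name, and paths below are all placeholders, not from this thread.

```shell
# Placeholders throughout: rdbms-host, mydb, MY_TABLE, /tmp/my_table_csv.
# 1) Sqoop the RDBMS table into HDFS as comma-separated text.
sqoop import \
  --connect jdbc:mysql://rdbms-host/mydb \
  --username etl --password-file /user/etl/.password \
  --table MY_TABLE \
  --target-dir /tmp/my_table_csv \
  --fields-terminated-by ','

# 2) After creating the matching table in Phoenix (e.g. via sqlline.py),
#    bulk-load the CSV files into it.
hadoop jar /usr/hdp/current/phoenix-client/phoenix-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  --table MY_TABLE \
  --input /tmp/my_table_csv
```

So RDBMS > HDFS (CSV) > Phoenix really is a supported path; the CSV step is just the Sqoop output format, not an extra manual export.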
Labels:
- Apache HBase
- Apache Phoenix
- Apache Sqoop
08-15-2018
11:18 PM
I've heard that Sqoop 1.4.7 enables import through Phoenix directly. Can you please add a section on how to do that with HDP 3.0 or a new post?
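As an aside, Sqoop can at least write straight into the underlying HBase table, which Phoenix can then map a table or view onto. A sketch with placeholder names (whether Phoenix's own type encoding matches Sqoop's string serialization for your columns is a separate question worth checking):

```shell
# Placeholder connect string, table, column family, and row key.
sqoop import \
  --connect jdbc:mysql://rdbms-host/mydb \
  --username etl -P \
  --table MY_TABLE \
  --hbase-table MY_TABLE \
  --column-family cf \
  --hbase-row-key id \
  --hbase-create-table
```

Sqoop stores all values as strings in HBase, so a Phoenix view over such a table generally needs VARCHAR columns unless you re-encode.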
08-15-2018
06:03 AM
I'm trying to use the Phoenix-Spark2 connector in Zeppelin as described here, and I'm confused about the dependencies. Here is the code I'm running:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import org.apache.phoenix.spark._

val sc = new SparkContext("local", "phoenix-test")
val sqlContext = new SQLContext(sc)
val df = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "dz_places",
      "zkUrl" -> "driver_node:2181"))

I keep getting ClassNotFoundExceptions, so I go find the related jar and add it to the Zeppelin Spark2 interpreter dependencies. So far I've added these jars:

/usr/hdp/current/phoenix-server/lib/phoenix-spark-5.0.0.3.0.0.0-1634.jar
/usr/hdp/current/phoenix-server/lib/phoenix-core-5.0.0.3.0.0.0-1634.jar
/usr/hdp/current/hbase-master/lib/hbase-common-2.0.0.3.0.0.0-1634.jar
/usr/hdp/current/hbase-client/lib/hbase-client-2.0.0.3.0.0.0-1634.jar
/usr/hdp/current/hbase-client/lib/htrace-core-3.2.0-incubating.jar

Now I'm seeing this error:

java.lang.NoClassDefFoundError: org/apache/tephra/TransactionSystemClient

I'm not seeing this jar in any of the HBase or Phoenix lib folders. What's going on? Why do I need to add all of these? Where is this particular class housed? Is there a better way to specify these? Using /usr/hdp/current/hbase-client/lib/*.jar threw an error.
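The class is from Apache Tephra, which Phoenix depends on for transactions. A generic way to hunt down which jar houses any class: jar entry names are stored uncompressed inside the zip, so a plain binary grep can see them. The lib directories in the usage comment are just the usual HDP locations; adjust to your install.

```shell
# List every jar under the given directories that contains the class entry.
# Zip (jar) entry names are stored verbatim, so grep -rl finds them even
# though the file is binary.
find_class_jar() {
  local cls="$1"; shift
  grep -rl "$cls" "$@"
}

# Usage (typical HDP lib dirs; adjust paths to your install):
#   find_class_jar 'org/apache/tephra/TransactionSystemClient' \
#     /usr/hdp/current/phoenix-server/lib /usr/hdp/current/hbase-client/lib
```

Whichever jar this prints is the one to add to the interpreter dependencies.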
08-15-2018
05:03 AM
I had to do a lot of finagling, but this command ultimately solved my issues. Thanks! I did inadvertently install hbase-master on one of the datanodes and it is not registered in Ambari. Should I delete it? If so, `yum remove hbase-master` did not work. How would I do so?
08-14-2018
11:45 PM
@Jay Kumar SenSharma I think the issue is with a bad hbase-client install. What is the yum install arg to reinstall that?
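A sketch of the usual pattern on HDP, not an answer from this thread: HDP RPM names carry a version suffix, so look up the exact installed package name first. The package name in the comment is illustrative only.

```shell
# Find the exact versioned package name (HDP suffixes them with the build).
yum list installed | grep -i hbase

# Then reinstall that specific package, e.g. (illustrative name only):
#   yum reinstall hbase_3_0_0_0_1634
```

`yum reinstall` re-lays the package's files, which is what you want for a corrupted client install.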
08-14-2018
11:34 PM
@Jay Kumar SenSharma I didn't see an entry about the RegionServer. I tried deleting and installing again in Ambari and this did not result in anything different. Only the conf dir is present.
08-14-2018
11:33 PM
I tried another time and am still seeing only the conf directory present.
08-14-2018
11:28 PM
I have already removed and installed again once and it did not solve the issue.
08-14-2018
11:24 PM
@amarnath reddy pappu Yep, I don't doubt that the message is correct, but what do I do next? Only the conf directory is present:

[dzafar@MYSERVER03 ~]$ ls /usr/hdp/current/hbase-regionserver
conf