Member since: 02-08-2017
Posts: 8
Kudos Received: 1
Solutions: 0
10-10-2017 03:59 PM
Could you share how you generated a blueprint to install Spark on top of a running cluster with other existing services? It's not obvious how to do it.
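For reference, in case it helps: Ambari can export a blueprint from a running cluster (GET /api/v1/clusters/NAME?format=blueprint), and a single extra service is usually added to an existing cluster through the services REST API rather than through a new blueprint. Below is a rough Python sketch along those lines; the hostnames, cluster name, credentials, and host mappings are placeholders, and the Spark configuration types the service would also need are omitted.

```python
import requests

# Rough sketch only: hostnames, credentials, cluster name, and host mappings
# below are placeholders; Spark's config types (spark-defaults, etc.) would
# also need to be created before install, which this sketch omits.
AMBARI = "http://ambari-host.example.com:8080/api/v1"
AUTH = ("admin", "admin")
HEADERS = {"X-Requested-By": "ambari"}  # Ambari requires this header on write calls
CLUSTER = "mycluster"

def api(method, path, body=None):
    r = requests.request(method, AMBARI + path, auth=AUTH, headers=HEADERS, json=body)
    r.raise_for_status()
    return r

# 1. Export the running cluster's current layout as a blueprint (read-only).
blueprint = api("GET", f"/clusters/{CLUSTER}?format=blueprint").json()

# 2. Register the new service and its components on the existing cluster.
api("POST", f"/clusters/{CLUSTER}/services",
    body={"ServiceInfo": {"service_name": "SPARK"}})
for component in ("SPARK_JOBHISTORYSERVER", "SPARK_CLIENT"):
    api("POST", f"/clusters/{CLUSTER}/services/SPARK/components/{component}")

# 3. Map the components onto hosts (host names are placeholders).
for host, component in [("node1.example.com", "SPARK_JOBHISTORYSERVER"),
                        ("node1.example.com", "SPARK_CLIENT")]:
    api("POST", f"/clusters/{CLUSTER}/hosts/{host}/host_components/{component}")

# 4. Install, then start, the new service.
api("PUT", f"/clusters/{CLUSTER}/services/SPARK",
    body={"ServiceInfo": {"state": "INSTALLED"}})
api("PUT", f"/clusters/{CLUSTER}/services/SPARK",
    body={"ServiceInfo": {"state": "STARTED"}})
```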
02-08-2017 03:57 PM
That's what I was thinking. I was just wondering if there was a way to define a custom package somehow. Thanks!
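For what it's worth, the closest thing Ambari has to a custom package is a custom service definition dropped under the stack's services directory: a metainfo.xml that declares the service and its packages, plus Python lifecycle scripts run by the agent. A minimal sketch of such a script, with every path and name below made up:

```python
# Hypothetical Ambari custom-service lifecycle script, e.g. saved as
# /var/lib/ambari-server/resources/stacks/HDP/<version>/services/MYSERVICE/package/scripts/master.py
# (paired with a metainfo.xml that declares the service and its packages).
# The resource_management module is supplied by the Ambari agent at run time.
from resource_management import Script, Execute

class Master(Script):
    def install(self, env):
        # Installs whatever OS packages metainfo.xml declares (e.g. a custom RPM).
        self.install_packages(env)

    def configure(self, env):
        pass  # render config files here if the service needs any

    def start(self, env):
        self.configure(env)
        Execute("/usr/local/myservice/bin/start.sh")  # placeholder command

    def stop(self, env):
        Execute("/usr/local/myservice/bin/stop.sh")   # placeholder command

    def status(self, env):
        # Ambari polls this to show service state in the UI.
        from resource_management.libraries.functions.check_process_status import check_process_status
        check_process_status("/var/run/myservice/myservice.pid")

if __name__ == "__main__":
    Master().execute()
```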
02-08-2017 01:54 PM
@Artem We do need to rebuild HBase, since we need to run custom co-processors from Thrift. We want to do something like what is described here: https://issues.apache.org/jira/browse/HBASE-5600: "We could create a thrift method to take the name of the class, method, and an array of params and then call coprocessorExec".
If there's a better way to run a custom co-processor and access it from Thrift, I'm all ears. But that brings me back to needing a good way to run a custom build of HBase with Ambari.
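For context on the Thrift side: the stock HBase Thrift gateway only exposes plain data operations, so reaching an endpoint co-processor today means either the Java client or a custom Thrift method along the lines quoted above. A small happybase sketch of that stock surface (host and table names are made up, and it assumes the HBase Thrift server is running on its default port):

```python
import happybase

# Host and table names are placeholders; assumes the HBase Thrift server
# ("hbase thrift start") is listening on its default port 9090.
conn = happybase.Connection("hbase-thrift.example.com", port=9090)
table = conn.table("mytable")

# The stock Thrift API covers gets, puts, and scans...
table.put(b"row1", {b"cf:col": b"value"})
print(table.row(b"row1"))

# ...but nothing here can invoke an endpoint co-processor, hence the
# custom-build route / the coprocessorExec idea in HBASE-5600.
conn.close()
```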
02-08-2017 11:25 AM
1 Kudo
I'd like to use Ambari but still be able to build HBase from source, as we need a custom co-processor. Is there a documented way of doing this, or do I have to turn this into a fully custom install?
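One possible approach, hedged since it depends on the Ambari and stack versions in play: build the custom HBase into RPMs that keep the package names the stack expects, host them in a local yum repository, and point the stack's repository base URL at that repo through Ambari's REST API, so the normal Ambari-driven install pulls the custom build. A rough Python sketch; the stack version, OS family, repo id, and URLs are placeholders, and the exact API path varies between Ambari versions:

```python
import requests

# Placeholders throughout: Ambari host, credentials, stack version, OS family,
# repo id, and the local repo URL. The exact path also varies by Ambari version.
AMBARI = "http://ambari-host.example.com:8080/api/v1"
AUTH = ("admin", "admin")
HEADERS = {"X-Requested-By": "ambari"}

# Point the HDP repository for this stack/OS at a local repo that hosts the
# custom-built HBase RPMs (package names must match what the stack expects).
resp = requests.put(
    AMBARI + "/stacks/HDP/versions/2.6/operating_systems/redhat7/repositories/HDP-2.6",
    auth=AUTH,
    headers=HEADERS,
    json={"Repositories": {
        "base_url": "http://repo.example.com/custom-hdp/",
        "verify_base_url": True,
    }},
)
resp.raise_for_status()
```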
Labels:
- Apache Ambari
- Apache HBase