Member since: 05-26-2016
Posts: 5
Kudos Received: 0
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3173 | 05-30-2016 07:32 PM
05-30-2016 07:32 PM
I've discovered that if I omit the dependencies the Ambari Web UI was forcing me to install from an Ambari blueprint, and then submit the blueprint manually to the Ambari REST API, I can install Spark without Hive (and its dependencies) with no problem. I've created an unattended install by using the Ambari REST API and submitting a blueprint and a cluster hostmapping file. https://cwiki.apache.org/confluence/display/AMBARI/Blueprints#Blueprints-Step1:CreateBlueprint
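To illustrate, here is a minimal sketch of what such a blueprint could look like. The blueprint name, host-group layout, and exact component list are assumptions for illustration, not the poster's actual file; the point is simply that the Hive/Pig/Tez services the wizard would have forced are omitted, leaving only Spark and its real requirements.

```python
import json

# Illustrative blueprint (names and layout are assumptions): a Spark-on-YARN
# cluster on the HDP 2.4 stack with no HIVE_* components anywhere.
blueprint = {
    "Blueprints": {
        "blueprint_name": "spark-no-hive",   # hypothetical name
        "stack_name": "HDP",
        "stack_version": "2.4",
    },
    "host_groups": [
        {
            "name": "master",
            "cardinality": "1",
            "components": [
                {"name": "NAMENODE"},
                {"name": "SECONDARY_NAMENODE"},
                {"name": "RESOURCEMANAGER"},
                {"name": "SPARK_JOBHISTORYSERVER"},
                {"name": "ZOOKEEPER_SERVER"},
            ],
        },
        {
            "name": "workers",
            "cardinality": "1+",
            "components": [
                {"name": "DATANODE"},
                {"name": "NODEMANAGER"},
                {"name": "SPARK_CLIENT"},
            ],
        },
    ],
}

# This JSON body would be registered with a POST to
# /api/v1/blueprints/spark-no-hive, then instantiated by posting the
# hostmapping file to /api/v1/clusters/<cluster-name>.
payload = json.dumps(blueprint)
```

Because the REST API only validates the dependencies declared in the stack definition, a blueprint like this goes through even though the install wizard would have refused it.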
05-27-2016 06:51 PM
Sorry, maybe the question wasn't clear. I know Hive isn't a requirement of Spark, but Ambari makes it a requirement as part of installing the HDP platform. I'm asking how I can work around Ambari forcing me to install Hive, without having to install Spark manually on my HDP platform.
05-27-2016 03:30 PM
Yes, but then when I install Hive I also need to install MySQL (for the metastore), Pig, and Tez, more dependencies I don't need. When not using Hive with Spark, there needs to be an option to leave it out of the install.
05-27-2016 03:24 PM
I'm setting up an HDP 2.4 cluster, and when I attempt to install Spark without Hive I get a dependency issue and cannot continue. I don't wish to use Hive with Spark. If I use a blueprint for the install, can I force a Spark install without Hive, and if so, would there be any consequences?
Labels:
- Apache Ambari
- Apache Hive
- Apache Spark
05-26-2016 06:06 PM
I wrote a Python script that does the same thing. I noticed that if I check the request immediately after submitting the cluster hostmapping that kicks off the install, it returns a `progress_percent` of 100%. I had to introduce a delay (I set it to 5 seconds) before starting to poll the request API.
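The polling pattern described above can be sketched roughly as follows. This is not the poster's actual script; `fetch_status` is a hypothetical callable standing in for a GET against the Ambari requests resource (e.g. `/api/v1/clusters/<name>/requests/<id>`), and the response shape assumed here (`Requests.request_status`) follows the Ambari REST API convention.

```python
import time

def wait_for_request(fetch_status, initial_delay=5, poll_interval=10, timeout=600):
    """Poll an Ambari install request until it reaches a terminal state.

    fetch_status: callable returning the request resource as a dict
    (the parsed JSON from the Ambari requests endpoint; an assumption).
    The initial delay works around the behavior described above: right
    after the hostmapping is submitted, Ambari can briefly report
    progress_percent == 100 before the install has really started.
    """
    time.sleep(initial_delay)  # give Ambari time to register the request
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        body = fetch_status()["Requests"]
        if body["request_status"] in ("COMPLETED", "FAILED", "ABORTED"):
            return body
        time.sleep(poll_interval)
    raise TimeoutError("Ambari request did not finish before the timeout")
```

Waiting a few seconds before the first poll, rather than trusting the first `progress_percent`, avoids treating the spurious 100% as a finished install.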