Member since
02-16-2016
20
Posts
10
Kudos Received
0
Solutions
11-09-2017
09:59 AM
If that's how it is, I wonder what the criteria for publishing are. I would prefer that all official releases also be published, as happens with all the other artifacts. So if I'm using 2.5.1.0.159 in prod, I would like to link to exactly those build libraries, not a vanilla 2.5.1 built by myself and published to my own repository so my build server can reach it. Thanks again for your assistance.
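For what it's worth, pinning the exact build in a Maven dependency would look roughly like this (a minimal sketch; it assumes the 2.5.1.0.159 build were published to the repository, which is exactly what is currently missing):

```xml
<!-- pom.xml fragment: pin the exact HDP build of the ambari-views artifact.
     2.5.1.0.159 is the build mentioned in this thread; as of this post it
     is NOT available on repo.hortonworks.com. -->
<dependency>
  <groupId>org.apache.ambari</groupId>
  <artifactId>ambari-views</artifactId>
  <version>2.5.1.0.159</version>
  <!-- provided: the Ambari server supplies the views API at runtime -->
  <scope>provided</scope>
</dependency>
```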
11-09-2017
08:59 AM
I appreciate your help, but the question is not fully answered. 2.5.1, 2.5.2 and 2.6.0 have been released but are not there. Is this a mistake, or will they no longer be published? Hopefully somebody from Hortonworks can answer.
11-08-2017
03:40 PM
I wasn't aware of that repo. For some reason the latest version they have is Ambari 2.5.0, but it's better than nothing. Thanks.
11-08-2017
02:35 PM
I have been using this repository when building my own Ambari views: http://repo.hortonworks.com/content/repositories/releases/org/apache/ambari/ambari-views/ But now I'm trying to upgrade to Ambari 2.5 or later, and I see those versions have not been published there. Does anybody know where I can find the Hortonworks Ambari artifacts published as a repository (not as tars)?
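For context, consuming that repository from a Maven build looks roughly like this (a sketch; the repository URL is the one linked above, while the version shown is a placeholder to be replaced by one actually listed in the repo index):

```xml
<!-- pom.xml fragment: resolve ambari-views from the Hortonworks releases repo -->
<repositories>
  <repository>
    <id>hortonworks-releases</id>
    <url>http://repo.hortonworks.com/content/repositories/releases/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.ambari</groupId>
    <artifactId>ambari-views</artifactId>
    <!-- placeholder version: check the repository index for published builds -->
    <version>2.4.0.0.0</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```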
Labels:
- Apache Ambari
11-08-2017
02:30 PM
In Ambari you cannot pick and choose: either the whole cluster uses Kerberos or none of it does, and enabling it triggers a number of configuration changes and keytab creations done automatically. The only non-hacky solution I see is to have a separate Ambari cluster for Kafka, but that's not ideal since each node can only belong to one cluster.
07-18-2017
07:59 AM
Now in 2.6, I understand Hive LLAP is production ready, but it still allows only a single HiveServer2 Interactive (HSI) instance, which becomes a single point of failure.
There was a patch contributed, but it was never accepted: https://issues.apache.org/jira/browse/AMBARI-18917
Do you have any information about why something so important, yet apparently trivial, hasn't been addressed?
03-08-2016
11:06 AM
Thanks. That's really disappointing.
Ambari knows the packages installed by the stack, so it should be easy to implement that feature. Even better if it moved away from packages altogether, as Cloudera did.
Somebody mentioned the Ambari cleanup script in a proposed idea. I will check it out, but I believe that script removes everything, not just old versions.
03-08-2016
09:11 AM
3 Kudos
How are you supposed to remove an old version of HDP once you have successfully upgraded to a new version? The old version is still listed as installed, and the "Deregister" button is disabled because "it is installed". It would be easy to delete the /usr/hdp/[old version] folder, but all the packages would still be considered installed by the OS. On the other hand, trying to remove the old packages manually on each node is cumbersome and risky. Leaving behind GBs of data and lots of packages easily builds up as you upgrade to new versions over the years.
Labels:
- Apache Ambari