Member since: 04-21-2016
Posts: 12
Kudos Received: 0
Solutions: 0
06-15-2017 05:30 PM
I have installed Spark 2.1 manually on HDP 2.3.4, while another version, Spark 1.5, is already installed via HDP. When I try to run jobs in YARN cluster mode, Spark 2.1 resolves to the HDP 2.3.4 Spark libraries, resulting in bad substitution errors. Any ideas how you were able to resolve this when using two Spark versions?
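For anyone hitting the same thing: the "bad substitution" usually means the ${hdp.version} placeholder from the cluster's mapred-site.xml/yarn-site.xml is not being resolved by the manually installed Spark. A minimal sketch of the common workaround, assuming a hypothetical install path and that the build string matches what `hdp-select versions` reports on your cluster (e.g. 2.3.4.0-3485):

export SPARK_HOME=/opt/spark-2.1            # hypothetical install location
export HADOOP_CONF_DIR=/etc/hadoop/conf
$SPARK_HOME/bin/spark-submit \
  --master yarn --deploy-mode cluster \
  --conf "spark.driver.extraJavaOptions=-Dhdp.version=2.3.4.0-3485" \
  --conf "spark.yarn.am.extraJavaOptions=-Dhdp.version=2.3.4.0-3485" \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.1.0.jar 100

The same -Dhdp.version options can instead be set once in $SPARK_HOME/conf/spark-defaults.conf so every job picks them up.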
10-20-2016 03:38 PM
Thank you, it did help; the bug-fix JIRA threw me off. I am now running into another issue with Kafka, where I hit the following:
https://issues.apache.org/jira/browse/AMBARI-14147
Is there any way to fix this without applying the patch? I also see it stems from my not being able to reset the version with the command below. We have upgraded and finalized the upgrade, but the Ambari Stacks and Versions page continues to say an upgrade is in progress. When I run the following:

ambari-server set-current --cluster-name=*** --version-display-name=HDP-2.3.4.0

ERROR: Exiting with exit code 1.
REASON: Error during setting current version. Http status code - 500.
{
  "status" : 500,
  "message" : "org.apache.ambari.server.controller.spi.SystemException: Finalization failed. More details: \nSTDOUT: Begin finalizing the upgrade of cluster 001 to version 2.3.0.0-2557\nThe following 1 host(s) have not been upgraded to version 2.3.0.0-2557. Please install and upgrade the Stack Version on those hosts and try again.\nHosts: ns1.hadoop.com\n\nSTDERR: The following 1 host(s) have not been upgraded to version 2.3.0.0-2557. Please install and upgrade the Stack Version on those hosts and try again.\nHosts: ns1.hadoop.com\n"
}

Any thoughts? Thanks for checking.
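The 500 above says finalization is blocked because ns1.hadoop.com never reported the 2.3.0.0-2557 bits. One thing worth checking on that host, a rough sketch assuming the stack was laid down with hdp-select and using the build string from the error:

# run on ns1.hadoop.com
hdp-select versions              # should list 2.3.0.0-2557
hdp-select status                # shows which components still point at the old build
# if the 2.3.0.0-2557 packages are installed but components are not switched over:
hdp-select set all 2.3.0.0-2557
ambari-agent restart             # so the agent re-reports its version to Ambari

Once Ambari sees that host on 2.3.0.0-2557, retrying set-current and finalization should get further.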
10-19-2016 11:45 PM
@Alejandro Fernandez We upgraded from Ambari 1.7 to Ambari 2.1.2 last week, and today we upgraded from HDP 2.2 to HDP 2.3 (manual upgrade). The final goal from here is to do a rolling upgrade to HDP 2.3.4. The whole upgrade process went fine and all the services were running fine until I issued

ambari-server set-current --cluster-name=*** --version-display-name=HDP-2.3.4.0

and restarted the cluster, since the Stacks and Versions page was still showing a pending upgrade. Now every time I start a service I get the following error:

  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 70, in setup_users
    create_tez_am_view_acls()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 94, in create_tez_am_view_acls
    if not params.tez_am_view_acls.startswith("*"):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 81, in __getattr__
    raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'tez.am.view-acls' was not found in configurations dictionary!
Error: Error: Unable to run the custom hook script ['/usr/bin/python2.6', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-5003.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-5003.json', 'INFO', '/var/lib/ambari-agent/tmp']

I did see a JIRA with a fix, but we did not encounter this when we upgraded our dev environment: https://issues.apache.org/jira/browse/AMBARI-13835. Moving to a new Ambari version is not that easy in our organization, and I am trying to see if there is any workaround for this issue. Is there a way to do a rollback, or to do a rolling upgrade to 2.3.4? I know this is specific to Ambari, and I am trying to figure out how to fix it. I am able to shut down the services but not able to start any. Thanks for looking into this. Prasad
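A possible workaround short of upgrading Ambari is simply to define the property the before-ANY hook is looking for, so the lookup no longer fails. A hedged sketch using the configs.sh helper that ships with Ambari 2.1.x; adjust the Ambari host, cluster name, and credentials for your environment:

cd /var/lib/ambari-server/resources/scripts
./configs.sh -u admin -p admin set <ambari-host> <cluster-name> tez-site "tez.am.view-acls" "*"
# then try starting the services again so the hook sees the new tez-site value

This only papers over the missing parameter; the proper fix is the AMBARI-13835 patch or a later Ambari release.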
Labels:
- Apache Ambari
08-01-2016 04:27 AM
I have tried the above settings and restarted the Ambari server, but they seem to have no effect on the alert. Is there anything I am missing? The following are my current settings:

# defaults in case no script parameters are passed
MIN_FREE_SPACE_DEFAULT = 1000000000L
PERCENT_USED_WARNING_DEFAULT = 80
PERCENT_USED_CRITICAL_DEFAULT = 90

And the error below:

Ambari / xxx
WARN for 5 minutes
1
Capacity Used: [60.47%, 6.3 GB], Capacity Total: [10.4 GB], path=/usr/hdp

Thanks for looking. Rao
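One thing to rule out: depending on the Ambari version, the Host Disk Usage alert may take its thresholds from parameters stored on the alert definition (server side) rather than from the defaults hard-coded in alert_disk_space.py, and the agents cache the script itself, so editing the file and restarting only ambari-server may not change anything. A rough way to see what the server actually holds for that alert, assuming admin credentials and the default port 8080:

curl -u admin:admin "http://<ambari-host>:8080/api/v1/clusters/<cluster>/alert_definitions?fields=*" | grep -i -B2 -A6 disk
# note the matching definition id, then fetch it and inspect its "source" section
curl -u admin:admin "http://<ambari-host>:8080/api/v1/clusters/<cluster>/alert_definitions/<id>"

Restarting the ambari-agent on the affected host also forces it to re-download the alert script and definition.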