Member since: 11-10-2016
Posts: 26
Kudos Received: 3
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1608 | 12-21-2016 08:26 AM |
| | 1516 | 12-14-2016 04:45 PM |
| | 1820 | 12-14-2016 01:24 PM |
| | 5875 | 11-21-2016 12:04 PM |
12-21-2016
08:26 AM
Raised a customer message with SAP, and the resolution was: "Known issue for Spark Controller 1.6.2, so please upgrade to Spark Controller 2.0." After upgrading to Spark Controller 2.0, the installation was successful. Hence closing this thread.
12-21-2016
05:45 AM
HDP 2.3.0.0
12-21-2016
05:42 AM
Yes, this is an SAP-specific install for data management.
Ambari: 2.4.2.0
Spark: 1.5.2.2.3
Spark Controller: 1.6.1
12-15-2016
01:55 PM
1 Kudo
When we try to install Spark Controller via Ambari, it fails with an error. All pre-installation checks pass. Below is the error we are getting:

stderr: /var/lib/ambari-agent/data/errors-403.txt
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/SparkController/package/scripts/controller_conf.py", line 10, in controller_conf
recursive = True
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 147, in __init__
raise Fail("%s received unsupported argument %s" % (self, key))
resource_management.core.exceptions.Fail: Directory['/usr/sap/spark/controller/conf'] received unsupported argument recursive

stdout: /var/lib/ambari-agent/data/output-403.txt
2016-12-15 08:44:36,441 - Skipping installation of existing package curl
2016-12-15 08:44:36,441 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-12-15 08:44:36,496 - Skipping installation of existing package hdp-select
Start installing
2016-12-15 08:44:36,668 - Execute['cp -r /var/lib/ambari-agent/cache/stacks/HDP/2.3/services/SparkController/package/files/sap/spark /usr/sap'] {}
2016-12-15 08:44:36,685 - Execute['chown hanaes:sapsys /var/log/hanaes'] {}
Configuring...
Command failed after 1 tries
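The failure mode in the traceback can be sketched in miniature. In Ambari 2.4, the `resource_management` library's `Directory` resource replaced the old `recursive` keyword with `create_parents`, while the Spark Controller 1.6.x install script still passes `recursive=True`. The sketch below is illustrative, not Ambari's actual source; the simplified `Directory` class and its `_supported` set are assumptions made for the example:

```python
# Sketch (not Ambari's real code) of how keyword validation in the
# Directory resource rejects the removed `recursive` argument.

class Fail(Exception):
    pass

class Directory:
    # Hypothetical subset of supported keywords; Ambari 2.4 accepts
    # `create_parents` where older versions accepted `recursive`.
    _supported = {"owner", "group", "mode", "create_parents"}

    def __init__(self, path, **kwargs):
        for key in kwargs:
            if key not in self._supported:
                raise Fail("Directory['%s'] received unsupported argument %s"
                           % (path, key))
        self.path = path

# The 1.6.x script's call fails, reproducing the logged error:
try:
    Directory("/usr/sap/spark/controller/conf", recursive=True)
    failed = False
except Fail as exc:
    failed = True
    print(exc)

# Scripts written for Ambari 2.4 use the supported keyword and succeed:
conf_dir = Directory("/usr/sap/spark/controller/conf", create_parents=True)
```

This is consistent with the resolution later in the thread: upgrading to Spark Controller 2.0, whose scripts no longer pass the removed keyword.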
Labels:
- Apache Ambari
- Apache Spark
12-14-2016
04:45 PM
Creating the soft link with `ln -s /usr/hdp/2.3.0.0-2557/spark spark-thriftserver` in the directory /usr/hdp/current did the trick.
12-14-2016
04:35 PM
We have installed HDP 2.3.0.0 (2.3.0.0-2557) on our cluster, managed by Ambari 2.4.2.0. When we start the Spark Thrift Server, it errors out. On checking, we found that the package directory "/usr/hdp/current/spark-thriftserver/" doesn't exist. This directory should have been created during installation, but it was not. Any suggestions on how to deal with this?
Labels:
- Apache Hadoop
- Apache Spark
12-14-2016
01:24 PM
1 Kudo
It's available with HDP 2.3.4.7. Installation completed successfully.
12-14-2016
12:54 PM
HDP-2.3.6.0 (HDP-2.3.6.0-3796) ships Spark 1.5.1, but we need Spark 1.5.2.
12-14-2016
11:53 AM
1 Kudo
Hi all, I have a business requirement to install Spark 1.5.2 in our cluster, as this version is compatible with HANA Spark Controller 1.6.1. I want to know the exact Ambari repository and HDP version I should use during the installation.
Labels:
- Apache Hadoop
- Apache Spark
11-21-2016
12:04 PM
Hi, changing the value of hive.security.authorization.manager to org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider worked. I changed the hive-site.xml at the Spark Controller; the hive-site config at the Hive client already has the proper authorizations. Issue resolved.
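For reference, a sketch of how that property would look in the Spark Controller's hive-site.xml (the class name comes from the post above; the exact file location varies by install):

```xml
<!-- Sketch: authorization manager setting in the Spark Controller's hive-site.xml -->
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider</value>
</property>
```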