Member since: 03-17-2016
Posts: 132
Kudos Received: 106
Solutions: 13
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3446 | 03-28-2019 11:16 AM |
| | 4915 | 03-28-2019 09:19 AM |
03-29-2019 04:35 AM
In /etc/yum.repos.d, remove all .repo files pointing to the Internet and copy in only the .repo files from other servers that are already using your local repo. For HDP nodes you initially need only two .repo files: one for the OS and ambari.repo. When Ambari adds a new node to the cluster, it will copy HDP.repo and HDP-UTILS.repo to it. Also, have you set your repository URLs in Ambari -> Admin -> Stack and Versions -> Versions -> Manage Versions -> [click on your current version]?
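For reference, a minimal sketch of that cleanup on a new node, assuming a CentOS/RHEL host and that ambari.repo can be copied from an existing cluster node (the hostname below is a placeholder):

```bash
# Sketch only: move Internet-facing repo definitions aside and pull ambari.repo
# from a node that already points at the local repository.
cd /etc/yum.repos.d
mkdir -p backup
mv CentOS-*.repo epel*.repo backup/ 2>/dev/null   # keep only the OS repo that points to your local mirror
scp existing-node:/etc/yum.repos.d/ambari.repo .
yum clean all
yum repolist
```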
03-28-2019 11:16 AM
1 Kudo
@Ruslan Fialkovsky You need to write custom code that blocks the -skipTrash option. The place to put it is the hadoop wrapper script, /usr/hdp/current/hadoop-client/bin/hadoop, which looks like this:

```bash
#!/bin/bash
export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/2.6.5.0-292/hadoop}
export HADOOP_MAPRED_HOME=${HADOOP_MAPRED_HOME:-/usr/hdp/2.6.5.0-292/hadoop-mapreduce}
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-/usr/hdp/2.6.5.0-292/hadoop-yarn}
export HADOOP_LIBEXEC_DIR=${HADOOP_HOME}/libexec
export HDP_VERSION=${HDP_VERSION:-2.6.5.0-292}
export HADOOP_OPTS="${HADOOP_OPTS} -Dhdp.version=${HDP_VERSION}"
# here you need to add the code that restricts -skipTrash (see the sketch below)
exec /usr/hdp/2.6.5.0-292/hadoop/bin/hadoop.distro "$@"
```
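For example, a minimal sketch of such a check, placed just before the final exec line of the wrapper above (the error message and exit code are illustrative choices, not part of HDP):

```bash
# Sketch only: scan the arguments passed to the wrapper and refuse -skipTrash.
for arg in "$@"; do
  if [ "$arg" = "-skipTrash" ]; then
    echo "ERROR: -skipTrash is disabled on this cluster; deletes must go through the trash." >&2
    exit 1
  fi
done
```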
03-28-2019 09:19 AM
There's an API to remove older versions from the hosts. Take a look at https://issues.apache.org/jira/browse/AMBARI-18435. For example:

```bash
curl 'http://c6401.ambari.apache.org:8080/api/v1/clusters/cl1/requests' \
  -u admin:admin -H "X-Requested-By: ambari" -X POST \
  -d '{"RequestInfo": {"context": "remove_previous_stacks",
                       "action": "remove_previous_stacks",
                       "parameters": {"version": "2.5.0.0-1245"}},
       "Requests/resource_filters": [{"hosts": "c6403.ambari.apache.org, c6402.ambari.apache.org"}]}'
```
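To see how the removal is going, you can poll the request that the POST creates through Ambari's standard requests endpoint (the request id 42 below is a placeholder; take the real id from the POST response):

```bash
# Sketch: fetch the status of the request created by the POST above.
curl -u admin:admin -H "X-Requested-By: ambari" \
  'http://c6401.ambari.apache.org:8080/api/v1/clusters/cl1/requests/42'
```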
09-14-2017 11:50 AM
Hi Sindhu, we are facing the same issue with INSERT OVERWRITE, but it is not a local directory. We hit this after upgrading from 2.5.3 to 2.6.1. We tried running with a different destination; it created the folder but still fails with the error below:
Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [xyz] does not have [WRITE] privilege on [/tmp/*] (state=42000,code=40000)
Closing: 0: jdbc:hive2://host:2181,host:2181,host:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [xyz] does not have [WRITE] privilege on [/user/*] (state=42000,code=40000)
Closing: 0: jdbc:hive2://host:2181,host:2181,host:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2