Member since: 09-18-2015
216 Posts
208 Kudos Received
49 Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1044 | 09-13-2017 06:04 AM
 | 2108 | 06-27-2017 06:31 PM
 | 2028 | 06-27-2017 06:27 PM
 | 8885 | 11-04-2016 08:02 PM
 | 9191 | 05-25-2016 03:42 PM
05-27-2017
09:28 AM
If my Hive table is an external table located on HDFS, could this solution work? Thanks.
05-13-2016
06:28 PM
Yes, you can, @Hemant Kumar Dindi. You can do this by registering the service, its service components, and the hosts using the Ambari REST APIs.
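As a rough sketch, the calls look like the following. The Ambari host, cluster, service, component, and worker host names are all hypothetical placeholders, and the curl calls need a live Ambari server, so they are shown commented out:

```shell
# Hypothetical placeholders -- substitute your own values.
AMBARI="http://ambari-host.example.com:8080"
CLUSTER="mycluster"
SERVICE="KAFKA"

SERVICE_URL="$AMBARI/api/v1/clusters/$CLUSTER/services/$SERVICE"

# Register the service, then a component of it, then map the component to a host:
# curl -u admin:admin -H "X-Requested-By: ambari" -X POST "$SERVICE_URL"
# curl -u admin:admin -H "X-Requested-By: ambari" -X POST "$SERVICE_URL/components/KAFKA_BROKER"
# curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
#   "$AMBARI/api/v1/clusters/$CLUSTER/hosts/worker1.example.com/host_components/KAFKA_BROKER"

echo "$SERVICE_URL"
```

After registering, you would install and start the components via PUT requests on their state, and Ambari picks them up in the UI.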
05-04-2016
10:54 AM
This did the trick:
curl -u admin:admin -H "X-Requested-By: ambari" -i -k -X DELETE http://<host>/api/v1/clusters/<cluster>/services/ZEPPELIN
Thanks!
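For anyone who gets an error on that DELETE: Ambari will not delete a service that is still running, so stopping it first (setting its desired state to INSTALLED) usually clears it. A sketch, with placeholder host and cluster names and the live calls commented out:

```shell
# Placeholders -- substitute your Ambari host and cluster name.
BASE="http://ambari-host.example.com:8080/api/v1/clusters/mycluster"

# 1. Stop the service by setting its desired state to INSTALLED:
STOP_BODY='{"RequestInfo":{"context":"Stop ZEPPELIN"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}'
# curl -u admin:admin -H "X-Requested-By: ambari" -X PUT -d "$STOP_BODY" "$BASE/services/ZEPPELIN"

# 2. Then the DELETE above should succeed:
# curl -u admin:admin -H "X-Requested-By: ambari" -i -k -X DELETE "$BASE/services/ZEPPELIN"

echo "$BASE/services/ZEPPELIN"
```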
05-03-2016
10:32 PM
2 Kudos
@Anandha L Ranganathan Every distribution of HDP is created and tested so that its components are compatible with each other. So, while it is definitely possible to upgrade all components in Ambari other than Kafka, we don't recommend it for supportability and compatibility reasons. Essentially, if you go this route you are creating a distribution of your own, which won't be worth the pain due to possible unknown bugs, support issues, and other challenges. Better to take your time: upgrade the dev cluster, rewrite the consumer jobs, and then upgrade prod once you have tested everything. That ensures a smooth upgrade; the time and effort saved by skipping those steps can turn into pain in other areas down the road. Hope it helps!
05-04-2016
11:09 AM
Glad that the issue is resolved.
05-03-2016
10:40 PM
@Neeraj Sabharwal This was a bug in Ambari 2.1.2.1 and is fixed in Ambari 2.2.0 onwards.
01-13-2016
04:26 AM
2 Kudos
Got the solution after working with support. This is a bug in the current Ambari and HDP versions: https://hortonworks.jira.com/browse/BUG-38390. Below is the workaround: change the log directory for the Ranger Admin logs. On the Ranger Admin nodes:
ADMIN LOGS (change the symbolic link in the ews directory):
mkdir /opt/hadoop/log/ranger/admin
chmod 775 /opt/hadoop/log/ranger/admin
chown ranger_qa:ranger_qa_grp /opt/hadoop/log/ranger/admin
cd /usr/hdp/current/ranger-admin/ews
unlink logs
ln -s /opt/hadoop/log/ranger/admin logs
From the Ambari Web UI, change the Log directory for Ranger Admin and restart Ranger. Note: here ranger_qa and ranger_qa_grp are the Ranger user and group respectively, which are customized in my case.
USERSYNC LOGS (change the log directory):
On the Ranger Usersync node:
mkdir /opt/hadoop/log/ranger/usersync
chmod 775 /opt/hadoop/log/ranger/usersync
chown ranger_qa:ranger_qa_grp /opt/hadoop/log/ranger/usersync
1. Make a backup of the file /usr/hdp/current/ranger-usersync/ranger-usersync-services.sh.
2. Change the logdir variable in that file to your new log location (the default is logdir=/var/log/ranger/usersync):
cd /usr/hdp/current/ranger-usersync/
cp ranger-usersync-services.sh{,.backup01122015}
vi ranger-usersync-services.sh
#logdir=/var/log/ranger/usersync
logdir=/opt/hadoop/log/ranger/usersync
From the Ambari Web UI, change the Log directory for Ranger Usersync and restart Ranger.
06-06-2018
01:00 PM
Hi, I had a similar issue when I tried to install Ranger on HDP 2.5.0.3. The Ambari app runs under the ambari (non-root) account. I also moved the hdp directory to another disk and created a symlink. The error I got:
2018-06-05 10:25:43,096 - Execute[(u'/opt/java/jdk1.8.0/bin/java', '-cp', u'/usr/hdp/current/ranger-admin/cred/lib/*', 'org.apache.ranger.credentialapi.buildks', 'create', u'rangeradmin', '-value', [PROTECTED], '-provider', u'jceks://file/etc/ranger/admin/rangeradmin.jceks')] {'logoutput': True, 'environment': {'JAVA_HOME': u'/opt/java/jdk1.8.0'}, 'sudo': True}
Sorry, user ambari is not allowed to execute '/opt/java/jdk1.8.0/bin/java -cp /usr/hdp/current/ranger-admin/cred/lib/* org.apache.ranger.credentialapi.buildks create rangeradmin -value O6gZHKsVNcEMVAZSvjZo
Solution: I found that when logged in as the ambari user I could run java, but running sudo java gave a similar error to the one above. The fix was to add an entry to /etc/sudoers:
ambari ALL=(ALL) NOPASSWD:SETENV: /bin/java *
12-29-2015
10:32 AM
1 Kudo
You can't have High Availability for any service (e.g. the ResourceManager in your case) in a single-machine sandbox environment.
12-28-2016
05:50 AM
@Vetrivel S I think the path "/user/plasne" in HDFS doesn't have write permission. Try granting write permission on that path; for example, on Ubuntu run hadoop fs -chmod g+w /user/plasne (without quotes) in a terminal, then start Hive.
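A sketch of the check-then-fix sequence (the path comes from the question; the hadoop commands need a running HDFS, so they are shown commented out):

```shell
TARGET="/user/plasne"

# Inspect the current owner, group, and permission bits on the directory:
# hadoop fs -ls -d "$TARGET"

# Grant group write permission so Hive can write there:
# hadoop fs -chmod g+w "$TARGET"

echo "$TARGET"
```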