Member since: 11-07-2016
Posts: 637
Kudos Received: 253
Solutions: 144
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2224 | 12-06-2018 12:25 PM |
| | 2281 | 11-27-2018 06:00 PM |
| | 1775 | 11-22-2018 03:42 PM |
| | 2831 | 11-20-2018 02:00 PM |
| | 5132 | 11-19-2018 03:24 PM |
10-30-2019
03:30 AM
Hi, to understand what the YARN application is doing, check the application logs of that particular YARN application. If the job has not completed, also check the ResourceManager logs to see whether it was stuck with any errors. Thanks, Arun
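As a hedged sketch of the checks described above (the application ID and the ResourceManager log path are placeholders, not taken from the original post):

```shell
# Placeholder application ID -- substitute the real one from
# `yarn application -list` or the ResourceManager UI.
APP_ID="application_1540000000000_0001"

# Aggregated container logs for the application (requires a live cluster):
# yarn logs -applicationId "$APP_ID"

# ResourceManager log location on a typical HDP node (path is an assumption):
# less /var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-*.log

echo "yarn logs -applicationId $APP_ID"
```

The `yarn logs` command only returns aggregated logs once log aggregation has run; for a still-running job, the container logs are served from the NodeManager web UI instead.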
10-24-2018
05:07 PM
Can you try the solution mentioned in this question: https://community.hortonworks.com/questions/61415/ranger-audit-to-solr-problem.html
10-19-2018
01:12 AM
@Felix Albani, this worked like a charm. Thanks a lot for your help, really appreciate it 🙂 However, in the latest version of Ambari this should have been handled by Ambari itself; I do not see the manual step in this doc. It must be a doc bug or an Ambari issue in my cluster. https://docs.hortonworks.com/HDPDocuments/Ambari-2.7.1.0/managing-high-availability/content/amb_enable_namenode_high_availability.html
10-18-2018
04:21 AM
I see the Spark2 History Server available in my cluster. Can you check whether the Spark2 History Server is already installed on that node? Please refer to the screenshot. For clients, the Move option is not required; you can install clients on the new node in a similar way. You will see "Spark client" in the dropdown when you click +ADD.
08-22-2018
04:21 PM
1 Kudo
@yadir Aguilar This is a permission issue. You need to give the admin user permissions on the folder, or create the directory as the hdfs user:

su hdfs
hdfs dfs -mkdir /user/d

(or)

su hdfs
hdfs dfs -chown admin:admin /user/d

If you are using a kerberized environment, you should first do a kinit with the hdfs keytab and then run the above commands:

kinit -kt /etc/security/keytabs/hdfs.headless.keytab {user-principal}

Please "Accept" the answer if this helps.
08-04-2018
02:41 AM
3 Kudos
Note: This feature is available from HDP 3.0 (Ambari 2.7).
Ambari 2.7 has a cool new feature: it is integrated with Swagger, so you can try out and explore all the REST APIs.
Steps to use Swagger
Login to Ambari
Open this URL: http://{ambari-host}:8080/api-docs
This page takes you to the API explorer, where you can try different APIs. Here are some of the screenshots.
You can get all the supported endpoints from http://{ambari-host}:8080/api-docs/swagger.json
Hope this helps 🙂
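A quick way to pull the endpoint list out of swagger.json from the command line. This is a hedged sketch: the host name and the admin:admin credentials are placeholders, and the actual fetch requires network access to a live Ambari server, so it is left commented out.

```shell
AMBARI_HOST="ambari.example.com"   # placeholder host
SWAGGER_URL="http://${AMBARI_HOST}:8080/api-docs/swagger.json"
echo "$SWAGGER_URL"

# Against a live server (credentials are an assumption):
# curl -s -u admin:admin "$SWAGGER_URL" | python -c \
#   'import json,sys; print("\n".join(json.load(sys.stdin)["paths"]))'
```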
07-11-2018
12:13 PM
@Anjali Shevadkar You are right; that's why I asked you to check the Hive CLI. So it seems to be some configuration issue in your Ranger. Did you try to connect using the ZK hosts in your connection string?

I suggest you check the following document, and also check the permissions on HDFS; let me know if this works for you. Make sure the user you configure is the same as the Unix user (or LDAP user, whichever you use). Try configuring another user to test. https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_security/content/configure_ranger_authentication.html

Another important thing: check the permissions on your HDFS, because when you are using Ranger you need to change the owner/group and permissions. https://br.hortonworks.com/blog/best-practices-in-hdfs-authorization-with-apache-ranger/
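For the ZooKeeper-based connection string mentioned above, a hedged example (the ZooKeeper hosts are placeholders, and the `hiveserver2` namespace must match the `hive.server2.zookeeper.namespace` setting in your cluster):

```shell
# Placeholder ZooKeeper quorum; replace with your cluster's ZK hosts.
ZK_QUORUM="zk1.example.com:2181,zk2.example.com:2181,zk3.example.com:2181"
JDBC_URL="jdbc:hive2://${ZK_QUORUM}/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2"
echo "$JDBC_URL"

# On a cluster node with beeline and a running HiveServer2:
# beeline -u "$JDBC_URL"
```

With service discovery enabled, beeline asks ZooKeeper for a live HiveServer2 instance instead of being pinned to one host, which is why connecting via the ZK hosts is a useful test here.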
06-29-2018
06:11 PM
Oh, the following syntax worked:

[root@hadoop1 ~]# curl --negotiate -i -u : -X GET -H "Accept: text" http://$(hostname):17001/
HTTP/1.1 401 Authentication required
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; HttpOnly
Content-Type: text/html; charset=iso-8859-1
Cache-Control: must-revalidate,no-cache,no-store
Content-Length: 1393
HTTP/1.1 200 OK
Set-Cookie: hadoop.auth="u=hbase&p=hbase/hadoop1.xxx.com@XXX.US&t=kerberos&e=1530331783162&s=Ypuvww45JSzCbQwTbc5ysWmaSfI="; Path=/; HttpOnly
Content-Type: text/plain
Cache-Control: no-cache
Content-Length: 18
UFM
WZ
state_code
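For reference on the flags in that call: `--negotiate` enables SPNEGO/Kerberos authentication, and `-u :` tells curl to take the identity from the Kerberos ticket cache (so a prior `kinit` is required), which is why the 401 challenge above is followed by a 200. A hedged sketch that saves the signed `hadoop.auth` cookie for reuse (the cookie-jar path is an assumption, and the requests need a live kerberized endpoint, so they are commented out):

```shell
COOKIE_JAR="/tmp/hadoop.auth.cookie"   # assumed path
URL="http://$(hostname):17001/"

# First request negotiates via Kerberos and stores the signed cookie:
# curl --negotiate -u : -c "$COOKIE_JAR" "$URL"

# Later requests can present the cookie instead of re-negotiating:
# curl -b "$COOKIE_JAR" "$URL"

echo "$COOKIE_JAR"
```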