Member since: 07-18-2017
Posts: 10
Kudos Received: 0
Solutions: 0
07-08-2019
02:35 PM
Thanks @Akhil S Naik, I'm able to pass the parameters now. However, there is a problem I'm facing when I submit the request with "Requests/resource_filters". If I do the following, the custom script is executed, but it runs on all hosts instead of only server1 and server2:

-d '{"RequestInfo":{"context":"Execute an action", "action" : "stream_file", "parameters" : {"file" : "/hadoop_mount/hdfs.tar.gz"},"service_name" : "", "component_name":"", "hosts":"server1,server2"}}' http://ambari-server:8080/api/v1/clusters/<clustername>/requests

To resolve this, I changed it to the following, as per your sample above:

-d '{"RequestInfo":{"context":"Execute an action", "action" : "stream_file", "parameters" : {"file" : "/hadoop01/hdfs.tar.gz"}},"Requests/resource_filters":[{"service_name" : "", "component_name":"", "hosts":"server1,server2"}]}' http://ambari-server:8080/api/v1/clusters/<clustername>/requests

But when I submit this, the command does go only to the required hosts, yet it just hangs (grey gears; execution doesn't start at all). Any idea what might be blocking the execution?

With Regards,
Narendra

To the users who come here in the future: the parameter you pass is available from the configs inside your code (config = Script.get_config()), at config['roleParams']['my_input'] to be precise.
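For anyone wiring this up, a minimal sketch of the script side based on the note above (the class name StreamFile and the parameter key my_input are placeholders from my setup; the point is that whatever you send under "parameters" lands in config['roleParams']):

from resource_management import Script
from resource_management.core.resources.system import Execute

class StreamFile(Script):
    # custom-action scripts expose an actionexecute entry point
    def actionexecute(self, env):
        config = Script.get_config()
        # the request's "parameters" : {"my_input": "..."} shows up here
        my_input = config['roleParams']['my_input']
        # do whatever the action needs with it, e.g. echo it for a quick test
        Execute("echo received {0}".format(my_input))

if __name__ == "__main__":
    StreamFile().execute()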
07-08-2019
10:02 AM
My previous comment is still under moderation, but the solution is: apparently, it's mandatory to pass a value for service_name and component_name.
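For reference, a rough Python equivalent of a request with those fields filled in (the HDFS/NAMENODE pair is only a placeholder for illustration; use whichever service and component own your target hosts, and your own credentials):

import json
import requests

url = "http://ambari-server:8080/api/v1/clusters/<clustername>/requests"
payload = {
    "RequestInfo": {
        "context": "Execute an action",
        "action": "stream_file",
        "parameters": {"file": "/hadoop_mount/hdfs.tar.gz"}
    },
    "Requests/resource_filters": [{
        "service_name": "HDFS",        # placeholder; empty strings left the request hanging
        "component_name": "NAMENODE",  # placeholder
        "hosts": "server1,server2"
    }]
}
resp = requests.post(url,
                     auth=("admin", "admin"),
                     headers={"X-Requested-By": "ambari"},
                     data=json.dumps(payload))
print(resp.status_code, resp.text)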
07-05-2019
06:31 AM
I'm working on a POC to take a NameNode metadata backup. For this, I'm compressing the NameNode's metadata using a custom script that I call through the Ambari API (https://community.hortonworks.com/articles/139788/running-custom-scripts-through-ambari.html). If I simply hardcode the data directory name, it works. However, I'm looking to pass it as an argument while submitting the request. Is anyone aware of a way to do this? A sketch of the hardcoded version follows below.
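For context, this is roughly what the script does today with everything hardcoded (the paths are illustrative only; the real location comes from dfs.namenode.name.dir). What I want is to supply that directory at request-submission time instead:

import tarfile

# illustrative hardcoded values; the real metadata dir differs per cluster
NN_METADATA_DIR = "/hadoop/hdfs/namenode"
ARCHIVE = "/hadoop_mount/hdfs.tar.gz"

with tarfile.open(ARCHIVE, "w:gz") as tar:
    # compress the NameNode metadata directory into a single archive
    tar.add(NN_METADATA_DIR, arcname="namenode")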
Labels:
- Apache Hadoop
06-10-2019
08:21 AM
I'm trying to execute the "Refresh YARN Queues" command through the Ambari API, but I'm unable to find a solution for this. If anybody has any idea, please let me know.
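To frame the question, this is the kind of request body I was hoping to submit. I'm assuming the UI action maps to a REFRESHQUEUES custom command on the RESOURCEMANAGER component, and that command name is exactly the part I'd like confirmed:

import json

# Assumption on my part: the refresh-queues UI action maps to a
# REFRESHQUEUES custom command on the RESOURCEMANAGER component.
payload = json.dumps({
    "RequestInfo": {
        "context": "Refresh YARN queues",
        "command": "REFRESHQUEUES"
    },
    "Requests/resource_filters": [{
        "service_name": "YARN",
        "component_name": "RESOURCEMANAGER",
        "hosts": "rm-host.example.com"   # placeholder ResourceManager host
    }]
})

The idea would be to POST that body to http://ambari-server:8080/api/v1/clusters/<clustername>/requests with the usual X-Requested-By: ambari header.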
Labels:
- Apache Ambari
- Apache YARN
- Cloudera Manager
11-12-2018
07:04 AM
Got it. It started working fine once I passed core-site.xml properly to the tool. It seems it wasn't picking up the rules because it never read the core-site.xml file. Thank you @Robert Levas for helping out.
11-10-2018
08:34 PM
YCSB is a standalone tool (a benchmarking tool for databases); we pass it the hbase-site.xml and then run the tests. I'm trying to figure out whether any specific auth-to-local rules need to be configured in Ambari. Since I'm triggering it with my user ID after authenticating against the AD realm (and the AD realm is covered by the auth-to-local rules), I can't understand why I still get the error.

As far as I understand, the error is not originating from the tool, because I can run the YCSB benchmark if I authenticate using the local realm (I added my user principal to the local MIT KDC and authenticated with that, getting a ticket as user@LOCALREALM.EXAMPLE.COM instead of user@EXAMPLE.COM).

When I kinit as user@EXAMPLE.COM and run it, I get the following responses, as in the stack trace above:

Caused by: java.io.IOException: failure to login
Caused by: javax.security.auth.login.LoginException: java.lang.IllegalArgumentException: Illegal principal name user@EXAMPLE.COM
Caused by: org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: No rules applied to user@EXAMPLE.COM
11-09-2018
12:44 PM
I have a Kerberized cluster (MIT KDC setup) where the local realm trusts the AD realm.

AD realm: EXAMPLE.COM
Local realm: LOCALREALM.EXAMPLE.COM

After doing kinit as user@EXAMPLE.COM, I'm able to perform all the regular tasks through the command line, like creating HBase tables, running MapReduce jobs, etc. But when I try to connect to HBase to run a benchmark through the YCSB tool, it throws an "unable to login" exception. If I authenticate using the local realm, i.e. user@LOCALREALM.EXAMPLE.COM, it works like a charm. I have a rule added in auth-to-local to trust the AD realm too:

RULE:[1:$1@$0](.*@EXAMPLE.COM)s/@.*//

I don't understand what else I might be missing. Can someone please help? Below is a part of the stack trace:

Caused by: java.io.IOException: failure to login
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:782)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
    at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:285)
    at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:281)
    at org.apache.hadoop.hbase.security.User.getCurrent(User.java:185)
    at org.apache.hadoop.hbase.security.UserProvider.getCurrent(UserProvider.java:88)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:215)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
    at com.yahoo.ycsb.db.HBaseClient10.init(HBaseClient10.java:149)
    ... 3 more
Caused by: javax.security.auth.login.LoginException: java.lang.IllegalArgumentException: Illegal principal name user@EXAMPLE.COM
    at org.apache.hadoop.security.User.<init>(User.java:50)
    at org.apache.hadoop.security.User.<init>(User.java:43)
    at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:179)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
    at javax.security.auth.login.LoginContext.login(LoginContext.java:588)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:757)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
    at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:285)
    at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:281)
    at org.apache.hadoop.hbase.security.User.getCurrent(User.java:185)
    at org.apache.hadoop.hbase.security.UserProvider.getCurrent(UserProvider.java:88)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:215)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
    at com.yahoo.ycsb.db.HBaseClient10.init(HBaseClient10.java:149)
    at com.yahoo.ycsb.DBWrapper.init(DBWrapper.java:86)
    at com.yahoo.ycsb.ClientThread.run(Client.java:424)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: No rules applied to user@EXAMPLE.COM
    at org.apache.hadoop.security.authentication.util.KerberosName.getShortName(KerberosName.java:389)
    at org.apache.hadoop.security.User.<init>(User.java:48)
    ... 26 more
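To make the expected behaviour of that rule concrete, here is a small Python illustration of what it should do (it only mimics the sed-style substitution; it is not Hadoop's actual KerberosName code). The fact that the client still reports NoMatchingRule suggests the rules were never loaded on the client side, which is consistent with the resolution noted further up the page, where passing core-site.xml to the tool fixed it:

import re

# Mimics (does not reuse) the auth-to-local rule RULE:[1:$1@$0](.*@EXAMPLE.COM)s/@.*//
principal = "user@EXAMPLE.COM"
name, realm = principal.split("@", 1)

# [1:$1@$0] rebuilds "name@realm" for single-component principals
candidate = "{0}@{1}".format(name, realm)

if re.match(r".*@EXAMPLE\.COM$", candidate):
    short_name = re.sub(r"@.*", "", candidate)    # the s/@.*// part strips the realm
    print(short_name)                             # prints "user"
else:
    # this branch corresponds to the NoMatchingRule error in the stack trace
    print("No rules applied to " + principal)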
11-07-2018
07:38 PM
Thanks @Robert Levas, this fixed it.
11-07-2018
01:28 PM
I enabled Kerberos using an MIT KDC. The MIT KDC has a trust set up with Active Directory. Say my AD realm is "EXAMPLE.COM" and my local realm is "HADOOP.CLUSTERNAME.EXAMPLE.COM". When I do kinit <username>@EXAMPLE.COM, I'm able to get a Kerberos ticket from Active Directory. This should allow me to use Hadoop as "username"; instead, it only lets me in as "username@EXAMPLE.COM".

Ex 1: once authenticated with Kerberos:
hadoop fs -put <localfile> /user/<username>/ -- not allowed
But:
hadoop fs -mkdir /user/<username>@EXAMPLE.COM
hadoop fs -put <localfile> -- this works

Ex 2: in the hbase shell, the user "username" has permission on a table but is not allowed to access it unless "username@EXAMPLE.COM" has access to the table. In other words, after kinit <username>@EXAMPLE.COM:
grant '<username>','R','tablename' -- will not let me access the table, whereas
grant '<username@EXAMPLE.COM>','R','tablename' -- will let me access the table.
Labels:
- Apache Hadoop
- Apache HBase
- Kerberos