05-05-2019 08:58 PM
> So if I want to fetch all defined MapReduce properties, can I use this API, or does it have any prerequisites?

Yes, you can. The default role group almost always exists even if role instances do not, but if it doesn't (such as in a heavily API-driven install), you can create one before you fetch.

> Also, does it require any privileges to access this API?

A read-only user should be able to fetch configs as a GET call over the API. However, for configs marked as secured (such as configs that carry passwords, etc.), retrieving the actual value requires admin privileges; to other users they will appear redacted.
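For reference, here is a minimal sketch of what such a GET call against the Cloudera Manager REST API could look like. It is not from the original thread: the host, credentials, API version, cluster/service names, and the role config group name are placeholder assumptions to adapt to your deployment.

```python
# Minimal sketch: fetch all properties of a role config group over the
# Cloudera Manager REST API as a read-only user. Every name below
# (host, credentials, cluster, service, role group) is a placeholder.
import requests

CM = "http://cm-host.example.com:7180"          # Cloudera Manager host (assumption)
API = f"{CM}/api/v19"                           # use the API version your CM reports
AUTH = ("readonly_user", "password")            # read-only credentials suffice for GET

cluster = "cluster1"                            # placeholder cluster name
service = "yarn"                                # service carrying the MapReduce configs
role_group = "yarn-NODEMANAGER-BASE"            # default role config group (placeholder)

# view=full returns every property with its default, not just the overridden ones
url = f"{API}/clusters/{cluster}/services/{service}/roleConfigGroups/{role_group}/config"
resp = requests.get(url, params={"view": "full"}, auth=AUTH)
resp.raise_for_status()

for item in resp.json().get("items", []):
    # Sensitive values (e.g. passwords) come back redacted for non-admin users
    print(item["name"], "=", item.get("value") or item.get("default"))
```

The same pattern should work against the service-level config endpoint if you need service-wide properties rather than role-group ones.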
01-22-2016 11:35 AM
2 Kudos
docu56048_OneFS-7.2-CLI-Administration-Guide.pdf

Configure HDFS authentication properties on the Hadoop client

If you want clients running Hadoop 2.2 and later to connect to an access zone through Kerberos, you must make some modifications to the core-site.xml and hdfs-site.xml files on the Hadoop clients.

Before you begin
Kerberos must be set as the HDFS authentication method and a Kerberos authentication provider must be configured on the cluster.

Procedure
1. Go to the $HADOOP_CONF directory on your Hadoop client.
2. Open the core-site.xml file in a text editor.
3. Set the value of the hadoop.security.token.service.use_ip property to false as shown in the following example:

   <property>
     <name>hadoop.security.token.service.use_ip</name>
     <value>false</value>
   </property>

4. Save and close the core-site.xml file.
5. Open the hdfs-site.xml file in a text editor.
6. Set the value of the dfs.namenode.kerberos.principal.pattern property to the Kerberos realm as shown in the following example:

   <property>
     <name>dfs.namenode.kerberos.principal.pattern</name>
     <value>hdfs/*@storage.company.com</value>
   </property>

7. Save and close the hdfs-site.xml file.
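Not part of the guide above, but as a quick sanity check after completing the procedure, a small sketch like the following can confirm that both properties ended up in the files under $HADOOP_CONF. The fallback path and the expected realm value are assumptions for illustration.

```python
# Verification sketch (assumption, not from the OneFS guide): check that the two
# Kerberos-related properties are present in the client's Hadoop config files.
import os
import xml.etree.ElementTree as ET

conf_dir = os.environ.get("HADOOP_CONF", "/etc/hadoop/conf")  # fallback path is an assumption

expected = {
    "core-site.xml": {"hadoop.security.token.service.use_ip": "false"},
    "hdfs-site.xml": {"dfs.namenode.kerberos.principal.pattern": "hdfs/*@storage.company.com"},
}

for filename, props in expected.items():
    path = os.path.join(conf_dir, filename)
    tree = ET.parse(path)
    # Hadoop *-site.xml files are a flat list of <property><name>/<value> pairs
    found = {p.findtext("name"): p.findtext("value") for p in tree.getroot().findall("property")}
    for name, value in props.items():
        status = "OK" if found.get(name) == value else "MISSING/DIFFERENT"
        print(f"{filename}: {name} = {found.get(name)!r} [{status}]")
```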