Example: Suppose you want to create a new user "newuser1" on HDFS.
So first create the user on your host (client machine) and add it to the "hadoop" group:
# useradd newuser1 -G hadoop
Then create a home directory for this user on HDFS:
# su - hdfs -c "hdfs dfs -mkdir /user/newuser1"
# su - hdfs -c "hdfs dfs -chown newuser1:hadoop /user/newuser1"
# su - hdfs -c "hdfs dfs -chmod 755 /user/newuser1"
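As a sanity check you can list /user and confirm the owner, group, and mode. The snippet below is a sketch: it assumes the hdfs client is on your PATH and prints a note instead if it is not.

```shell
# Sanity check: the home directory should be owned by newuser1:hadoop
# with mode drwxr-xr-x after the chown/chmod above.
USERDIR=/user/newuser1
if command -v hdfs >/dev/null 2>&1; then
    # On a cluster node this prints a line such as:
    # drwxr-xr-x   - newuser1 hadoop ... /user/newuser1
    su - hdfs -c "hdfs dfs -ls /user" | grep newuser1
else
    echo "hdfs client not found; run this on a cluster node"
fi
```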
Now you can switch to "newuser1" and work on HDFS, for example by putting a file into HDFS:
# su - newuser1
# hdfs dfs -put /etc/passwd /user/newuser1
# hdfs dfs -cat /user/newuser1/passwd
If you are using Ambari it is even easier: as soon as you create a user in Ambari, it will automatically create the home directory for that user on HDFS and set the permissions accordingly.
You can enable this feature via Ambari as described in the doc: https://docs.hortonworks.com/HDPDocuments/Ambari-188.8.131.52/bk_ambari-administration/content/create_use...
The users can be created using the steps below:
a) Find out from the user which machine they will be working from.
b) Create the user in the OS first.
c) Create the user in Hadoop by creating their home directory /user/username in HDFS.
d) Make sure the temp directory in HDFS has 777 permissions.
e) Using the chown command, change ownership of the home directory from the hdfs superuser to the new user, so that they can write only into their own directory and not into other users' directories.
f) Refresh the user-to-groups mappings on the NameNode: hdfs dfsadmin -refreshUserToGroupsMappings
g) If needed, set a space quota to limit the amount of data the user can store: hdfs dfsadmin -setSpaceQuota 50g /user/username
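Taken together, steps b) through g) can be sketched as one script. This is an illustration, not a definitive procedure: it assumes a running cluster with the hdfs client on PATH, an "hdfs" superuser, and uses "newuser1" as a placeholder username.

```shell
#!/bin/sh
# Sketch of steps b)-g); NEWUSER is a placeholder name.
NEWUSER=newuser1
if command -v hdfs >/dev/null 2>&1; then
    useradd -G hadoop "$NEWUSER"                                    # b) OS user in the hadoop group
    su - hdfs -c "hdfs dfs -mkdir -p /user/$NEWUSER"                # c) home directory in HDFS
    su - hdfs -c "hdfs dfs -chmod 777 /tmp"                         # d) temp directory writable by all
    su - hdfs -c "hdfs dfs -chown $NEWUSER:hadoop /user/$NEWUSER"   # e) hand ownership to the user
    su - hdfs -c "hdfs dfsadmin -refreshUserToGroupsMappings"       # f) refresh NameNode mappings
    su - hdfs -c "hdfs dfsadmin -setSpaceQuota 50g /user/$NEWUSER"  # g) optional space quota
else
    echo "hdfs client not found; run these commands on a cluster node" >&2
fi
```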