Creating new user and allocating space
Labels: Apache Ambari
Created on 11-21-2016 06:51 PM - edited 09-16-2022 03:48 AM
Hi,
I started a Hortonworks Cloud cluster using the AWS Marketplace and CloudFormation. I logged in as "admin" and created a new account, but that account does not have a /user/<username> directory in the Ambari Files View. How do I allocate and provision a home directory for the new user?
Thanks!
Created 11-21-2016 09:10 PM
From the command line, as the hdfs user:
# Create the user's home directory
$ hdfs dfs -mkdir /user/<username>
# Set ownership and permissions
$ hdfs dfs -chown <username> /user/<username>
$ hdfs dfs -chmod 700 /user/<username>
# Optionally, set a space quota
$ hdfs dfsadmin -setSpaceQuota <bytes_allocated> /user/<username>
# <bytes_allocated> is the number of bytes allowed in this directory (replicas count against it). Note that this allocates space for the directory, not for the user: if the user creates files in other HDFS directories, this quota does not limit them.
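To sanity-check the result, a couple of read-only commands (same <username> placeholder) should show the new directory and its quota:
$ hdfs dfs -ls /user
$ hdfs dfs -count -q /user/<username>
The -count -q output includes the space quota and the remaining space quota for the directory, so you can confirm the setSpaceQuota took effect.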
Created 11-22-2016 01:45 AM
I'm sorry, but at the moment I don't know how to execute commands as hdfs in Cloudbreak. However, the cloudbreak user *may* be able to sudo (sudo -u hdfs <COMMAND>). If you are not familiar with sudo, it looks something like this:
sudo -u hdfs hdfs dfs -mkdir /user/<username>
Don't let the "hdfs hdfs" pair confuse you: the first one is the username to run as, and the second one is the command.
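Applying the same pattern to the whole sequence from the earlier answer would look like this (a sketch, assuming the cloudbreak user has sudo rights to run commands as hdfs):
sudo -u hdfs hdfs dfs -mkdir /user/<username>
sudo -u hdfs hdfs dfs -chown <username> /user/<username>
sudo -u hdfs hdfs dfs -chmod 700 /user/<username>
sudo -u hdfs hdfs dfsadmin -setSpaceQuota <bytes_allocated> /user/<username>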
Give that a shot and let me know.
Created 11-22-2016 01:49 AM
@Venkat Rangan - I think I found the documentation you need to become the admin user:
As the default "cloudbreak" user doesn't have certain permissions (for example, it has no write access to HDFS), you must use the "admin" user to perform certain actions. To use the "admin" user instead of the default "cloudbreak" user, run sudo su - admin.
(http://docs.hortonworks.com/HDPDocuments/HDCloudAWS/HDCloudAWS-1.8.0/bk_hdcloud-aws/content/using/index.html)
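Following that doc, another route would be to switch to the admin user first and run the HDFS commands directly. A sketch only: it assumes the admin user maps to an HDFS superuser, since chown in HDFS normally requires superuser rights; if it fails, fall back to sudo -u hdfs.
$ sudo su - admin
$ hdfs dfs -mkdir /user/<username>
$ hdfs dfs -chown <username> /user/<username>
$ hdfs dfs -chmod 700 /user/<username>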
Created 11-21-2016 11:01 PM
James,
Thanks very much for your response. When I ssh in, I can only log in as the "cloudbreak" user. After getting onto the box, I tried:
su hdfs
and it failed with "su: Authentication failure". So how do I first set up the hdfs user?
Thanks!
Created 11-22-2016 01:53 AM
Yes - adding a "sudo -u hdfs" to all the commands worked. Accepting your answer. Thanks again.
Created 11-22-2016 01:54 AM
Awesome. Good luck.
