Member since: 07-20-2020
Posts: 11
Kudos Received: 0
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 10562 | 07-23-2020 06:18 AM |
07-23-2020 06:18 AM
Since the solution is scattered across many posts, I'm posting a short summary of what I did. I am running the HDP 2.6.5 image on VirtualBox.

1. Increased my virtual hard disk through Virtual Media Manager.
2. In the guest OS, partitioned the unused space.
3. Formatted the new partition as an ext4 file system.
4. Mounted the file system.
5. Updated /etc/fstab (I couldn't do this, as I did not find that file).
6. In Ambari, under the DataNode directories config, added the newly mounted file system as a comma-separated value.
7. Restarted HDFS.

Since my cluster did not have any files, I did not run the balancer:

sudo -u hdfs hdfs balancer

Thanks to @Shelton for his guidance.
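The steps above can be sketched as shell commands. This is a hedged outline, not a tested recipe: the device name (/dev/sda3) and mount point (/mnt/hdfsdata) are assumptions, so substitute your own values from `lsblk` before running anything.

```shell
# Steps 2-7 sketched; device and mount-point names are assumptions.

# 2. Partition the unused space (fdisk is interactive; create e.g. /dev/sda3):
#    fdisk /dev/sda

# 3. Format the new partition as ext4.
mkfs.ext4 /dev/sda3

# 4. Mount the file system.
mkdir -p /mnt/hdfsdata
mount /dev/sda3 /mnt/hdfsdata

# 5. Persist the mount across reboots (create /etc/fstab if it is missing).
echo '/dev/sda3  /mnt/hdfsdata  ext4  defaults  0 0' >> /etc/fstab

# 6. In Ambari: HDFS > Configs > DataNode directories, append the new path
#    as a comma-separated value, e.g.:
#    /hadoop/hdfs/data,/mnt/hdfsdata/hdfs/data

# 7. Restart HDFS; on a cluster that already holds data, then rebalance:
sudo -u hdfs hdfs balancer
```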
07-22-2020 07:48 PM
Thank you for your inputs. I have finally been able to expand the size of my HDFS.
07-22-2020 09:06 AM
@Shelton So, I've been able to create a new partition, format it as an ext4 filesystem, and mount it. How do I add this new partition to my DataNode? Is it as simple as putting the drive path in the Ambari DataNode config?
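For reference, the Ambari "DataNode directories" setting (the `dfs.datanode.data.dir` property underneath) takes a comma-separated list of local paths. A small sketch that splits such a value and checks each directory exists before you commit the config; the example paths are hypothetical:

```shell
#!/bin/sh
# Split a comma-separated DataNode directory list and report whether
# each path exists on this host. Example value below is made up.
check_datanode_dirs() {
    dirs=$1
    old_ifs=$IFS
    IFS=','
    for d in $dirs; do
        if [ -d "$d" ]; then
            echo "OK: $d"
        else
            echo "MISSING: $d"
        fi
    done
    IFS=$old_ifs
}

check_datanode_dirs "/hadoop/hdfs/data,/mnt/newdisk/hdfs/data"
```

After HDFS restarts, each listed directory is used as an independent storage location by the DataNode.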
07-21-2020 06:42 AM
It is a VDI. I have used Virtual Media Manager to increase the size of my disk. How can I get HDFS to expand and make use of the unallocated space? I'm assuming this is how one would do it:

1. Create a new partition in the guest OS and assign a mount point to it, then add that path to the DataNode directories; or
2. Extend the current partition to fill the unused disk space, so that the DataNode automatically sees the larger HDFS capacity.
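Option 2 can be sketched as below. This is an untested outline with assumed device names; on a CentOS 6-era sandbox, `growpart` comes from the cloud-utils-growpart package and may not be installed, and if the root filesystem sits on LVM you would need `pvresize`/`lvextend` instead.

```shell
# Option 2 sketched: grow the existing partition into the new free space.
# Device/partition names (/dev/sda, partition 1) are assumptions.

growpart /dev/sda 1    # extend partition 1 of /dev/sda into free space
resize2fs /dev/sda1    # grow the ext4 filesystem to fill the partition
df -h                  # confirm the extra space is now visible to the OS
```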
07-20-2020 08:15 PM
I don't think it is dynamically allocated, or at least it doesn't seem to be working. I've run out of space trying to load a ~70 GB file. How can I increase the capacity?
07-20-2020 04:04 PM
I'll look into it. I'll have to install gcc and then later Maven to run those shell scripts. Thanks for your input.
07-20-2020 01:05 PM
I am running Hortonworks Sandbox HDP 2.6.5 on VirtualBox. I have increased the size of my virtual hard disk (.vdi) to 500 GB. However, when I log in to Ambari and view the size of my disk, it shows only 106 GB. What should I do to increase the HDFS capacity from 106 GB to 500 GB?
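To see where the discrepancy lies, it can help to compare what the guest OS sees with what HDFS reports. A couple of standard commands, run inside the guest:

```shell
# What the guest OS sees: block devices and mounted filesystem sizes.
# If lsblk shows a 500 GB disk but df shows ~106 GB mounted, the extra
# space is unpartitioned and the OS (and thus HDFS) cannot use it yet.
lsblk
df -h

# What HDFS reports: configured capacity across DataNodes.
sudo -u hdfs hdfs dfsadmin -report
```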
Labels:
- Apache Ambari
- Apache Hadoop
- HDFS