
What is the relationship between a YARN container and block size in HDFS?

Rising Star

What is the relationship between a YARN container and block size in HDFS?

1 ACCEPTED SOLUTION

Super Guru

@ANSARI FAHEEM AHMED A YARN container is a collection of physical resources such as CPU, memory, and disk, while an HDFS block is the chunk of a file on the filesystem where the actual reads and writes happen.
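For illustration, here is a minimal sketch showing that the two are tuned by separate knobs: the HDFS block size is a filesystem property, while container resources are a YARN/MapReduce request. The property names are the stock Hadoop ones; the values are only illustrative.

```java
import org.apache.hadoop.conf.Configuration;

public class SeparateKnobs {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // HDFS side: block size controls how a file is chunked on the filesystem (256 MB here).
        conf.setLong("dfs.blocksize", 256L * 1024 * 1024);

        // YARN/MapReduce side: resources requested for each map container.
        conf.setInt("mapreduce.map.memory.mb", 2048);
        conf.setInt("mapreduce.map.cpu.vcores", 1);
    }
}
```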


3 REPLIES


Rising Star

@Rajkumar Singh: Yes, so that means there is no relation.

Master Guru

There is a bit of a relation. Normally MapReduce creates one map task for every block (unless small-split merging is switched on), and one map task runs in one container. So halving the block size means twice the number of containers running. (Again, this is not always true, since Pig/Tez merge small blocks together using something called CombineFileInputFormat.)
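A rough sketch of that effect, assuming a plain MapReduce job on text input: by default one split is created per HDFS block, each split becomes one map task, and each map task runs in one YARN container, so a 10 GB file with 128 MB blocks yields roughly 80 containers versus roughly 160 with 64 MB blocks. The input path and the 256 MB split cap below are only illustrative.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.CombineTextInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class CombineSmallBlocks {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "combine-small-blocks");

        // Default behaviour: one input split per HDFS block -> one map task -> one container.
        // CombineTextInputFormat instead packs several small blocks into a single split,
        // so fewer map tasks (and hence fewer containers) are launched.
        job.setInputFormatClass(CombineTextInputFormat.class);
        CombineTextInputFormat.setMaxInputSplitSize(job, 256L * 1024 * 1024); // cap each combined split at 256 MB

        FileInputFormat.addInputPath(job, new Path("/data/input")); // hypothetical input path
    }
}
```

This is essentially what Pig and Tez do automatically when they merge small blocks, which is why the container count does not always scale directly with the block size.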