Posts: 8
Registered: 01-11-2017
Accepted Solution

HDFS Directory Structure Best Practices


   Can someone point me to a good resource for "best practices" for a hadoop directory structure for storing raw data, intermediate files, output files, metadata etc in HDFS?   Do you segregate different data types into different directory structures?   Are the directory structures labeled per YYMMDD?  What would a typical HDFS directory structure look like when setting up to store data? 

Cloudera Employee
Posts: 48
Registered: 08-16-2016

Re: HDFS Directory Structure Best Practices

Eric Sammer (author of Hadoop Operations) has written a great answer on this topic here:

Hadoop Operations is a great book and has quite a few good tricks.
New Contributor
Posts: 1
Registered: 06-25-2018

Re: HDFS Directory Structure Best Practices

It depends on the data layers in your HDFS directory structure. For instance, if you have a raw layer and a standard layer, the following would be one common practice.


Raw is the first landing zone for data and needs to stay as close to the original data as possible. Standard is the staging layer, where data is converted into different file formats but no semantic changes have been made to it yet.


The structure for raw data and metadata is:



The structure of the standard data/metadata folder is: standard/businessarea/sourcesystem/data/date&time
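As a minimal sketch of that standard-layer convention: the business area ("sales"), source system ("crm"), and load date below are hypothetical values, and the actual `hdfs dfs` command is shown only as a comment since it assumes a live cluster.

```shell
# Build a standard-layer path following the convention
# standard/businessarea/sourcesystem/data/date&time.
AREA=sales               # business area (hypothetical)
SRC=crm                  # source system (hypothetical)
DT=2018/06/25            # load date partition, YYYY/MM/DD
TARGET="/standard/${AREA}/${SRC}/data/${DT}"
echo "$TARGET"           # -> /standard/sales/crm/data/2018/06/25
# On a real cluster the directory would then be created with:
#   hdfs dfs -mkdir -p "$TARGET"
```

Keeping the business area as the first path component below the layer makes it easy to grant or revoke access on a whole business area with a single directory-level policy.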



These conventions also make it easier to define Sentry/Ranger policies based on AD groups.