
how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

Super Mentor

@Michael Bronson

"Mycluster" needs to be replaced with the "fs.defaultFS" parameter of your HDFS config.


Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

Super Mentor

When NameNode HA is enabled on the cluster, "dfs.nameservices" is defined, and "fs.defaultFS" is determined based on it.


For example, if "dfs.nameservices=mycluster", then "fs.defaultFS" will ideally be "hdfs://mycluster".


If NameNode HA is not enabled, then "fs.defaultFS" will point directly at the NameNode host/port.
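
A short sketch of how the two cases look when queried from a cluster node (the nameservice name and host/port below are just illustrative):

hdfs getconf -confKey dfs.nameservices
# HA: prints the logical nameservice, e.g. mycluster, so fs.defaultFS = hdfs://mycluster
# non-HA: the key is unset, and fs.defaultFS points at the NameNode, e.g. hdfs://nn01.example.com:8020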

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

@Jay, in my cluster (HDFS --> Configs) I see dfs.nameservices=hdfsha, so it should be like this?


hadoop fs -rm -r -skipTrash hdfs://hdfsha/tmp/hive/hive/ 
Michael-Bronson

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

@Jay, actually it should be like this:


hadoop fs -rm -r -skipTrash hdfs://hdfsha/tmp/hive/hive/*

We need to add the "*" after the slash in order to delete only the contents under /tmp/hive/hive and not the folder itself (/tmp/hive/hive).
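
A quick sanity check after the delete (a sketch using the standard HDFS shell; the hdfsha nameservice comes from the thread above):

hadoop fs -ls hdfs://hdfsha/tmp/hive/hive | wc -l
# counts the direct children of /tmp/hive/hive, which is what the
# dfs.namenode.fs-limits.max-directory-items limit applies to
# (the count includes one extra "Found N items" header line)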

Michael-Bronson

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

Super Mentor

@Michael Bronson

Looks good. Yes, "mycluster" needs to be replaced with "hdfsha" in your command.

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

@Jay - nice


I see this option there:


hadoop fs -rm -r -skipTrash hdfs://mycluster/tmp/hive/hive/ 


this option will remove all folders under /tmp/hive/hive


but what is the value "mycluster"? (What do I need to replace it with?)

Michael-Bronson

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

Mentor

@Michael Bronson

Whenever you change this config parameter, the cluster needs to be made aware of the change. When you start Ambari, the underlying components don't get started unless you explicitly start them!

So you can start Ambari without starting YARN or HDFS.

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

@Geoffrey Shelton Okot - do you mean to restart the Ambari server (as in ambari-server restart) instead of restarting the HDFS and YARN services? (after we set dfs.namenode.fs-limits.max-directory-items)

Michael-Bronson

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

Super Mentor

@Michael Bronson

The parameter "dfs.namenode.fs-limits.max-directory-items" is HDFS specific, hence HDFS and the HDFS-dependent service components need to be restarted. The Ambari UI will show the required service components that need to be restarted.

No need to restart Ambari Server.
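
For anyone scripting this instead of clicking through the UI, the restart can also be driven via the Ambari REST API (a sketch; the cluster name "MyCluster", the admin:admin credentials, and the ambari-host:8080 endpoint are placeholders to adapt):

# Stop HDFS ("INSTALLED" is Ambari's stopped state)
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop HDFS"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://ambari-host:8080/api/v1/clusters/MyCluster/services/HDFS

# Start HDFS again so the NameNode picks up the new hdfs-site value
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start HDFS"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  http://ambari-host:8080/api/v1/clusters/MyCluster/services/HDFS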

Re: how to resolve the error about /tmp/hive/hive is exceeded: limit=1048576 items=1048576

Second, when I saved the parameter in Ambari (Ambari -> HDFS -> Configs -> Advanced -> Custom hdfs-site):

dfs.namenode.fs-limits.max-directory-items=2097152



I get:


The configuration changes could not be validated for consistency due to an unknown error. Your changes have not been saved yet. Would you like to proceed and save the changes? 



Is this parameter supported in HDP version 2.6.4?
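
One way I could check whether the deployed configs recognize the key (a sketch; assumes shell access to a node that has the HDFS client configs):

hdfs getconf -confKey dfs.namenode.fs-limits.max-directory-items
# prints the value from the deployed configs, e.g. 2097152, once the change has been saved and pushed out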

Michael-Bronson