Created 05-30-2019 06:27 PM
Hi, I am new to Hadoop.
I get the following errors when I try to run the command hdfs dfs -ls on a specific directory:
[t_hdhusr@node01 ~]$ hdfs dfs -ls /user/T_HDHUSR
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /var/log/hadoop/t_hdhusr/hadoop-mapreduce.jobsummary.gz (No such file or directory)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
        at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
        at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
        at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
        at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
        at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
        at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
        at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
        at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
        at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
        at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
        at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
        at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
        at org.apache.hadoop.fs.FsShell.<clinit>(FsShell.java:48)
ls: Permission denied: user=t_hdhusr, access=READ_EXECUTE, inode="/user/T_HDHUSR":T_HDHUSR:hdfs:drwx------
Thank you in advance for your help.
Created 05-30-2019 08:51 PM
This is a permission issue, which can be resolved by changing the ownership of the directory using the hdfs user, who is the HDFS superuser.
The error is: Permission denied: user=t_hdhusr, access=READ_EXECUTE, inode="/user/T_HDHUSR":T_HDHUSR:hdfs:drwx------
As the root user:
# su - hdfs
$ hdfs dfs -chown -R t_hdhusr:hdfs /user/T_HDHUSR
It seems t_hdhusr and T_HDHUSR are not interpreted as the same user; HDFS usernames are case-sensitive, so the directory owned by T_HDHUSR does not grant any access to t_hdhusr.
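To confirm the mismatch before changing anything, you can compare the user you are running as with the owner shown on the directory itself (a quick check, run from the same session as the failing command):
$ whoami
$ hdfs dfs -ls -d /user/T_HDHUSR
The owner column in the second output should show T_HDHUSR, which does not match the lowercase t_hdhusr you are running as.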
Now running the below should succeed
$ hdfs dfs -ls /user/T_HDHUSR
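If changing the ownership is not desirable, an alternative sketch (assuming HDFS ACLs are enabled, i.e. dfs.namenode.acls.enabled=true) is to keep the owner as-is and grant the lowercase user access through an ACL instead:
$ hdfs dfs -setfacl -R -m user:t_hdhusr:rwx /user/T_HDHUSR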
HTH
Created 05-30-2019 09:58 PM
Thank you for your answer.