Support Questions

Not able to access HDFS in cloudera VM

New Contributor

Hi Team,

I installed the Cloudera VM and started trying some basic stuff. First I just wanted to list the HDFS directories, so I issued the command below.

 

[cloudera@quickstart ~]$ hadoop fs -ls /
ls: Failed on local exception: java.net.SocketException: Network is unreachable; Host Details : local host is: "quickstart.cloudera/10.0.2.15"; destination host is: "quickstart.cloudera":8020;

 

Though ps -fu hdfs says both the NameNode and DataNode are running, the service command reports otherwise when I check the status.

 

[cloudera@quickstart ~]$ sudo service hadoop-hdfs-namenode status
Hadoop namenode is not running [FAILED]

 

 

Thinking all the problems would be resolved if I restarted all the services, I executed the command below.

 

[cloudera@quickstart conf]$ sudo /home/cloudera/cloudera-manager --express --force
[QuickStart] Shutting down CDH services via init scripts...
[QuickStart] Disabling CDH services on boot...
[QuickStart] Starting Cloudera Manager daemons...
[QuickStart] Waiting for Cloudera Manager API...
[QuickStart] Configuring deployment...
Submitted jobs: 92
[QuickStart] Deploying client configuration...
Submitted jobs: 93
[QuickStart] Starting Cloudera Management Service...
Submitted jobs: 101
[QuickStart] Enabling Cloudera Manager daemons on boot...

 

Now I thought all services would be up, so I checked the status of the namenode service again. It still reported failed.

 

[cloudera@quickstart ~]$ sudo service hadoop-hdfs-namenode status
Hadoop namenode is not running [FAILED]

 

Next I decided to manually stop and start the namenode service. Again, not much use.

 

[cloudera@quickstart ~]$ sudo service hadoop-hdfs-namenode stop
no namenode to stop
Stopped Hadoop namenode: [ OK ]
[cloudera@quickstart ~]$ sudo service hadoop-hdfs-namenode status
Hadoop namenode is not running [FAILED]
[cloudera@quickstart ~]$ sudo service hadoop-hdfs-namenode start
starting namenode, logging to /var/log/hadoop-hdfs/hadoop-hdfs-namenode-quickstart.cloudera.out
Failed to start Hadoop namenode. Return value: 1 [FAILED]

 

I checked the file /var/log/hadoop-hdfs/hadoop-hdfs-namenode-quickstart.cloudera.out. It only contained the following:

 

log4j:ERROR Could not find value for key log4j.appender.RFA
log4j:ERROR Could not instantiate appender named "RFA".

 

I also checked /var/log/hadoop-hdfs/hadoop-cmf-hdfs-NAMENODE-quickstart.cloudera.log.out and found the entry below when I searched for errors. Can anyone suggest the best way to get the services back on track? Unfortunately I am not able to access Cloudera Manager from the browser. Is there anything I can do from the command line?

 

 

2016-02-24 21:02:48,105 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[quickstart.cloudera], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[quickstart.cloudera], EVENTCODE=[EV_LOG_EVENT]}, content=Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!, timestamp=1456295437905} - 1 of 17 failure(s) in last 79302s
java.io.IOException: Error connecting to quickstart.cloudera/10.0.2.15:7184
at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.NettyTransceiver.getChannel(NettyTransceiver.java:249)
at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:198)
at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:133)
at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:122)
at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:196)
at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.SocketException: Network is unreachable

 

 

 

 


Re: Not able to access HDFS in cloudera VM

Master Collaborator

Note that once you launch Cloudera Manager, it will manage the services, so using Linux service management (e.g. service hadoop-hdfs-namenode status) won't give you the results you expect after that point. That is *some* of the confusion you're seeing.
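To see whether the NameNode process is actually up regardless of what the init scripts claim, one hedged option is to inspect the process table directly (nothing here is CDH-specific; the role names are just the usual Hadoop process names):

```shell
# Once Cloudera Manager takes over, HDFS roles run under its supervisor
# rather than the init scripts, so "service ... status" can report FAILED
# while the role is actually up. Look at the process table instead:
ps -eo comm,args | grep -E '[n]amenode|[d]atanode' || echo "no HDFS roles found"
```

The bracketed first letter keeps grep from matching its own command line; if only the fallback message prints, the roles really are down.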

 

It looks to me like the root cause is a problem with your virtual network device setup. Hadoop was initialized with the hostname quickstart.cloudera, and that hostname needs to be consistent from Hadoop's perspective. Some of the errors indicate it can't be resolved, so I suspect something has gone wrong in /etc/hosts. It should all work locally if /etc/hosts looks like this:

127.0.0.1	quickstart.cloudera	quickstart	localhost	localhost.domain

For Hadoop clients outside the VM to work, the hostname needs to map to the external IP, so on boot the VM will attempt to detect that IP and rewrite the hosts file (this script is at /usr/bin/cloudera-quickstart-ip), but depending on the virtual adapter type it may only be able to detect the loopback IP. I haven't seen a scenario where it fails any other way, but I would check your hosts file and see what the status of 'quickstart.cloudera' is.
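A minimal sketch of that check, run here against a sample copy in /tmp so it is safe to try anywhere (on the VM itself you would point HOSTS_FILE at /etc/hosts instead):

```shell
# Sample hosts file with the loopback mapping described above (an assumption
# for illustration; on the VM, inspect the real /etc/hosts).
HOSTS_FILE=/tmp/hosts.sample
cat > "$HOSTS_FILE" <<'EOF'
127.0.0.1	quickstart.cloudera	quickstart	localhost	localhost.domain
EOF

# Report which IP quickstart.cloudera maps to.
ip=$(awk '/quickstart\.cloudera/ {print $1; exit}' "$HOSTS_FILE")
case "$ip" in
  127.*) echo "quickstart.cloudera -> $ip (loopback: works inside the VM only)" ;;
  "")    echo "quickstart.cloudera missing from $HOSTS_FILE" ;;
  *)     echo "quickstart.cloudera -> $ip (external IP: reachable from outside)" ;;
esac
```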

Re: Not able to access HDFS in cloudera VM

New Contributor

I shut down the VM and started it again. That seemed to resolve the problem; I am able to access the HDFS file system and hence make use of the VM. Upon checking /etc/hosts, below is what I find. Please let me know if you see anything wrong with these entries.

 

127.0.0.1            localhost           localhost.domain

10.0.2.15            quickstart.cloudera        quickstart
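For what it's worth, those entries look healthy: the same kind of check as suggested earlier in the thread (run here against a sample copy of the two lines, not the live /etc/hosts) shows quickstart.cloudera mapping to the external IP rather than loopback, which is what clients outside the VM need:

```shell
# Copy of the two entries posted above (a sample file for illustration).
cat > /tmp/hosts.check <<'EOF'
127.0.0.1            localhost           localhost.domain
10.0.2.15            quickstart.cloudera        quickstart
EOF

# Print the IP that quickstart.cloudera maps to; on a healthy boot the
# /usr/bin/cloudera-quickstart-ip script writes the detected external IP here.
awk '/quickstart\.cloudera/ {print $1}' /tmp/hosts.check
```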

Re: Not able to access HDFS in cloudera VM

New Contributor

Hi,

 

I still have the same problem with log4j.appender.RFA even if I restart the namenode. :(
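One thing worth checking for that error is whether the RFA appender is actually defined in the log4j configuration the NameNode loads (typically a log4j.properties on its classpath; the exact path varies by install, so the snippet below greps a self-contained sample rather than your real config):

```shell
# Sample log4j.properties with a correctly defined RFA appender (assumed
# contents for illustration; check the file your NameNode actually loads).
cat > /tmp/log4j.properties.sample <<'EOF'
log4j.rootLogger=INFO,RFA
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
EOF

# If this grep prints nothing against your real config, log4j cannot
# instantiate "RFA" and you get exactly the two errors quoted above.
grep '^log4j.appender.RFA=' /tmp/log4j.properties.sample
```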


Re: Not able to access HDFS in cloudera VM

New Contributor
Try hdfs dfs -ls /