Support Questions

Flume

Solved

Flume

Hi,

 

When I start flume-ng, I am getting the error below:

at java.lang.Thread.run(Thread.java:662)
2014-08-04 17:54:50,417 WARN hdfs.HDFSEventSink: HDFS IO error
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /logs/prod/jboss/2014/08/04/web07.prod.hs18.lan.1407154543459.tmp could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1637)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:757)
        at sun.reflect.GeneratedMethodAccessor11.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)

 

 

 

 

How can we fix this problem?
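(For reference, a flume-ng agent of this era is typically launched with a command along these lines; the config-file path and agent name shown here are assumptions, not values from the post:)

    flume-ng agent --conf /etc/flume-ng/conf \
        --conf-file /etc/flume-ng/conf/flume.conf \
        --name agent1 \
        -Dflume.root.logger=INFO,console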

 

 

-Thank you

1 ACCEPTED SOLUTION

Re: Flume

I went through the logs and found that this is purely an HDFS issue, not a Flume issue,

 

so I fixed it as follows:

 

Step 1: Stopped all the services.

 

Step 2: Started the NameNode.
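(Roughly, with the stock Hadoop 1.x scripts; the exact service-management commands used on this cluster are an assumption on my part, and a CDH package install would use the service init scripts instead:)

    $HADOOP_HOME/bin/stop-all.sh                        # step 1: stop everything on the cluster
    $HADOOP_HOME/bin/hadoop-daemon.sh start namenode    # step 2: bring the NameNode back first
    $HADOOP_HOME/bin/hadoop-daemon.sh start datanode    # then each DataNode, run on every worker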

 

Step 3: When I tried to start the DataNodes on the 3 servers, one of the servers threw these error messages:

/var/log/  -- No such file or directory
/var/run -- No such file or directory

 

But those directories do exist, so I checked their permissions and found that they differed between the second server and the third server.

 

So I set the permissions on those directories to be in sync across the servers,
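(Something along these lines on the affected DataNode; /var/log/hadoop, /var/run/hadoop, and the hdfs:hadoop owner are only example values, since the real subdirectories and ownership were not in the error above. The point is just to compare against a healthy node and make them match:)

    ls -ld /var/log/hadoop /var/run/hadoop                  # compare output with a healthy DataNode
    chown -R hdfs:hadoop /var/log/hadoop /var/run/hadoop    # example owner only
    chmod 755 /var/log/hadoop /var/run/hadoop               # example mode only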

 

and then started all the services, and Flume is working fine.
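(As a quick check that the fix took, assuming the same Hadoop 1.x command-line tools, the Flume output path from the error above can be inspected:)

    hadoop fs -ls /logs/prod/jboss/2014/08/04/    # .tmp files should appear and then be renamed on roll
    hadoop fsck /logs/prod/jboss                  # should report the path as HEALTHY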

 

That's it.

 

 

-Thank you

2 REPLIES

Re: Flume

New Contributor

This means that the NameNode does not have an available DataNode to write to. This could be because of a variety of reasons. Please check the logs.
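(One quick way to confirm this and see which DataNodes the NameNode can actually use, assuming the Hadoop 1.x tooling that matches the stack trace above; the DataNode log location and file-name pattern are assumptions:)

    hadoop dfsadmin -report                                  # lists live/dead DataNodes and free capacity
    tail -n 100 /var/log/hadoop/hadoop-*-datanode-*.log     # then check each DataNode's log for errors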

