
Problems writing to hdfs from storm bolt


Hi experts

I have written a topology that is meant to write its results to HDFS. I am using a sequence file to do it, but I am getting the error below:

org.apache.hadoop.ipc.Client handleConnectionTimeout

INFO: Retrying connect to server: 35.176.11.138/35.176.11.138:50075. Already tried 0 time(s); maxRetries=45

Here is the code snippet:

SequenceFileBolt boltseq = new SequenceFileBolt()
        .withFsUrl("hdfs://35.176.11.138:50075")
        .withFileNameFormat(fileNameFormat)
        .withSequenceFormat(format)
        .withRotationPolicy(rotationPolicy)
        .withSyncPolicy(syncPolicy);

.....

b.setBolt("HDFS", boltseq).shuffleGrouping("WordCounterBolt");

I am not sure what I am doing wrong, because when I open http://35.176.11.138:50075 in a browser, I can see the datanode.

Can somebody please help?

Thank you

Regards

Naveen

1 ACCEPTED SOLUTION


Re: Problems writing to hdfs from storm bolt


I think you are using the datanode URL; you should be using the namenode URL instead.
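For reference, here is a sketch of what the corrected bolt configuration might look like. Port 50075 is the datanode's web UI port, which is why the browser shows the datanode page but the IPC client times out. The bolt's `withFsUrl()` should instead point at the namenode's RPC endpoint, i.e. the value of `fs.defaultFS` in core-site.xml. The host `namenode.example.com` and port 8020 below are assumptions (8020 is a common default namenode RPC port); substitute your cluster's actual namenode address:

```java
// Sketch only: point withFsUrl() at the namenode RPC endpoint
// (fs.defaultFS from core-site.xml), not a datanode's web UI port.
// "namenode.example.com:8020" is a placeholder; check your cluster config.
SequenceFileBolt boltseq = new SequenceFileBolt()
        .withFsUrl("hdfs://namenode.example.com:8020") // namenode RPC, not 50075
        .withFileNameFormat(fileNameFormat)
        .withSequenceFormat(format)
        .withRotationPolicy(rotationPolicy)
        .withSyncPolicy(syncPolicy);
```

You can confirm the right URL by running `hdfs getconf -confKey fs.defaultFS` on a cluster node.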

