Problems writing to HDFS from Storm bolt
Created ‎05-24-2017 04:52 PM
Hi experts
I have written a topology that writes its results to HDFS. I am using a sequence file to do it, but I am getting the error below:
org.apache.hadoop.ipc.Client handleConnectionTimeout
INFO: Retrying connect to server: 35.176.11.138/35.176.11.138:50075. Already tried 0 time(s); maxRetries=45
Here is the code snippet
SequenceFileBolt boltseq = new SequenceFileBolt()
        .withFsUrl("hdfs://35.176.11.138:50075")
        .withFileNameFormat(fileNameFormat)
        .withSequenceFormat(format)
        .withRotationPolicy(rotationPolicy)
        .withSyncPolicy(syncPolicy);
.....
b.setBolt("HDFS", boltseq).shuffleGrouping("WordCounterBolt");
I am not sure what I am doing wrong, because when I open http://35.176.11.138:50075 in a browser, I can see the DataNode.
Can somebody please help?
Thank you
Regards
Naveen
Created ‎05-25-2017 04:19 PM
I think you are using the DataNode URL; you should be using the NameNode URL instead.
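Port 50075 is the DataNode HTTP (web UI) port, so the bolt is trying to speak the HDFS RPC protocol to an HTTP endpoint and keeps timing out. The withFsUrl() value should point at the NameNode RPC address, i.e. whatever fs.defaultFS is set to in core-site.xml (commonly port 8020). A minimal sketch of the corrected configuration, assuming the NameNode happens to run on the same host at the default RPC port:

// Point the bolt at the NameNode RPC endpoint, not the DataNode web port.
// Host and port 8020 are assumptions; use your cluster's fs.defaultFS value.
SequenceFileBolt boltseq = new SequenceFileBolt()
        .withFsUrl("hdfs://35.176.11.138:8020")
        .withFileNameFormat(fileNameFormat)
        .withSequenceFormat(format)
        .withRotationPolicy(rotationPolicy)
        .withSyncPolicy(syncPolicy);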
