
Getting exception as Required field 'nimbus_uptime_secs' is unset


New Contributor

Hi, I am a beginner in Storm. To submit a Storm topology I am using the lines below:

conf.put(Config.NIMBUS_HOST, NIMBUS_NODE);

conf.put(Config.NIMBUS_THRIFT_PORT,6627);

StormSubmitter.submitTopology("test", conf, builder.createTopology());

In my pom we have storm-core-0.9.3, and I am using HDP 2.2.

Now we are migrating to HDP 2.4, which uses storm-core-0.10.0, so I changed my pom to storm-core-0.10.0, but I am getting the error: Required field 'nimbus_uptime_secs' is unset!

In my code I changed/added:

conf.put(Config.NIMBUS_HOST, NIMBUS_NODE);

conf.put(Config.NIMBUS_THRIFT_PORT, 6627);

ClusterSummary cs = new ClusterSummary();

cs.setFieldValue(_Fields.NIMBUS_UPTIME_SECS, 2);

StormSubmitter.submitTopology("test", conf, builder.createTopology());

but I am not able to figure out how to pass the set value to StormSubmitter. I tried moving to storm-core-1.0.1 and storm-core-0.11.0-snapshot, but those cause compilation issues in other dependent code. Thank you

1 ACCEPTED SOLUTION


Re: Getting exception as Required field 'nimbus_uptime_secs' is unset

Cloudera Employee

Storm in HDP 2.4 is based on Apache Storm 0.10.0 but is not the same: it includes the Nimbus H/A feature, which only became available upstream in Apache Storm 1.0.0.

You can change your storm-core dependency to match your cluster's Storm version, e.g. 0.10.0-2.4.x.xxx according to your HDP version, and see if it helps.

You may also want to add the Hortonworks repository (http://repo.hortonworks.com/content/repositories/releases) to your Maven pom or Gradle config.
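As a sketch, the repository and dependency in the pom could look like the fragment below; the version string is illustrative and must be replaced with your cluster's exact HDP build number:

```xml
<repositories>
  <repository>
    <id>hortonworks</id>
    <url>http://repo.hortonworks.com/content/repositories/releases</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <!-- illustrative HDP build number; match the storm version on your cluster -->
    <version>0.10.0.2.4.0.0-169</version>
    <!-- provided scope keeps storm-core out of the topology jar,
         since the cluster supplies it at runtime -->
    <scope>provided</scope>
  </dependency>
</dependencies>
```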

3 REPLIES


Re: Getting exception as Required field 'nimbus_uptime_secs' is unset

New Contributor

Thank you, Lim. I already tried what you mentioned, but in my scenario we are using the Apache Storm client, i.e., /Users/samrat/dev/StormCluster/apache-storm-0.10.0/bin/storm jar sample.jar com.bigd.tool.SampleMain ..... but now I think I should change my code to be compatible with apache-storm 1.0.1. Thank you once again, Lim, for your response.
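When submitting with a standalone client via bin/storm jar, the cluster location is normally taken from the client's conf/storm.yaml (or ~/.storm/storm.yaml) rather than set in code. A minimal sketch for Storm 0.10, where the host name is a placeholder for the sandbox address:

```yaml
# Illustrative client-side storm.yaml; replace the host with your Nimbus address.
nimbus.host: "sandbox.hortonworks.com"
nimbus.thrift.port: 6627
```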

Re: Getting exception as Required field 'nimbus_uptime_secs' is unset

New Contributor

I zipped the Storm distribution from the HDP 2.4 sandbox and unzipped it on my local machine, then created two folders: /usr/hdp/2.4.0.0-169/etc and /usr/hdp/2.4.0.0-169/storm. I created a default folder in etc and copied the hadoop file, and created a bin folder in storm and copied storm.distro and storm.py. I keep getting errors about one file or another missing, like:

No such file or directory: '/usr/hdp/2.4.0.0-169/storm/extlib'. Could somebody tell me whether there is a better solution or another procedure? I can't move to Apache Storm 1.0.1 because HDP 2.4 ships storm-0.10.0-2.4.0.0-169. Basically, I want to submit a job from the Storm client on my local machine to the cluster in the sandbox.
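For the missing-directory errors specifically, the Storm 0.10 layout expects a few directories (such as extlib) that can simply exist empty. A sketch of recreating that layout, assuming the distribution was unpacked under a placeholder HDP_HOME (the thread's actual path is /usr/hdp/2.4.0.0-169):

```shell
# Sketch: recreate the directory layout the HDP storm client script looks for.
# HDP_HOME is a placeholder; point it at your real unpack location.
HDP_HOME="${HDP_HOME:-./hdp-2.4.0.0-169}"
mkdir -p "$HDP_HOME/etc/default"          # holds the copied hadoop defaults file
mkdir -p "$HDP_HOME/storm/bin"            # holds storm.distro and storm.py
mkdir -p "$HDP_HOME/storm/extlib"         # empty dir is enough to clear the error above
mkdir -p "$HDP_HOME/storm/extlib-daemon"  # companion dir in the Storm 0.10 layout
ls "$HDP_HOME/storm"
```

Each missing-path error should point at the next directory or file to create or copy; this only patches the layout and does not replace copying the real client config from the sandbox.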