Member since: 12-29-2015
Posts: 8
Kudos Received: 0
Solutions: 0
07-14-2016
02:22 PM
I zipped up the Storm installation from the HDP 2.4 sandbox, unzipped it on my local machine, and created two folders: /usr/hdp/2.4.0.0-169/etc and /usr/hdp/2.4.0.0-169/storm. I created a default folder under etc and copied the hadoop file into it, then created a bin folder under storm and copied storm.distro and storm.py into it. I keep hitting one missing file after another, for example: No such file or directory: '/usr/hdp/2.4.0.0-169/storm/extlib'
Could somebody help me? Is there a better solution or a different procedure? I can't move to Apache Storm 1.0.1 because HDP 2.4 ships storm-0.10.0-2.4.0.0-169. Basically I want to submit a topology from a Storm client on my local machine to the cluster running in the sandbox (see the sketch below).
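If the goal is only remote submission, one option that may avoid recreating the whole /usr/hdp client tree is to submit programmatically with storm-core 0.10.0 on the classpath. This is only a minimal sketch under assumptions not stated in this thread: the Nimbus address 192.168.56.101, port 6627, the jar path, and the TestWordSpout placeholder are all hypothetical and would need to be replaced with the real topology and values.

import backtype.storm.Config;
import backtype.storm.StormSubmitter;
import backtype.storm.testing.TestWordSpout;
import backtype.storm.topology.TopologyBuilder;

public class RemoteSubmitSketch {
    public static void main(String[] args) throws Exception {
        // Tell StormSubmitter which jar to upload (hypothetical path to the packaged topology).
        System.setProperty("storm.jar", "/Users/samrat/dev/StormCluster/sample.jar");

        Config conf = new Config();
        conf.put(Config.NIMBUS_HOST, "192.168.56.101"); // hypothetical sandbox IP
        conf.put(Config.NIMBUS_THRIFT_PORT, 6627);

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("words", new TestWordSpout(), 1); // placeholder spout; wire the real topology here

        StormSubmitter.submitTopology("test", conf, builder.createTopology());
    }
}

The jar would still need to be built against the same storm-core version (0.10.0) that the sandbox runs.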
07-11-2016
07:32 AM
Hi Kulkarni, thank you for the reply. Do you want me to increase the RAM or the disk space? I have already allocated 8192 MB of RAM. I remember seeing an "HDFS 97% full" warning in Ambari; could that be causing the problem? I don't see any error, it just hangs at that point, and pressing Ctrl+C doesn't show anything.
07-09-2016
06:44 PM
Hi, I have the HDP 2.4 sandbox installed on Ubuntu 14.04. It has been working fine for the last few months and it holds a lot of important data. Suddenly it no longer starts: it stops at 'Starting Resource Manager', as shown in the image above. I have attached an image of the VirtualBox log below. There is no VBoxGuestAdditions in the /opt folder. I have also attached my recent command history, because I installed a few things recently and want to know whether they are causing the problem. I am new to the sandbox and have little admin knowledge; could somebody help me? Thanks in advance.
07-09-2016
10:48 AM
Thank you, Lim. I already tried what you mentioned, but in my scenario the client we use is Apache Storm, i.e.
/Users/samrat/dev/StormCluster/apache-storm-0.10.0/bin/storm jar sample.jar com.bigd.tool.SampleMain ..... but now I think I should change my code to be compatible with apache-storm 1.0.1 (see the import sketch below).
Thank you once again, Lim, for your response.
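For reference, the main source-level change when moving a topology from storm-core 0.10.x to 1.0.x is the package rename from backtype.storm to org.apache.storm; the class and method names are otherwise largely unchanged. A minimal sketch of what that looks like, where the Nimbus address and the TestWordSpout are hypothetical placeholders, and nimbus.seeds is the 1.0.x replacement for the deprecated nimbus.host:

import org.apache.storm.Config;                    // was backtype.storm.Config in 0.10.x
import org.apache.storm.StormSubmitter;            // was backtype.storm.StormSubmitter
import org.apache.storm.testing.TestWordSpout;     // was backtype.storm.testing.TestWordSpout
import org.apache.storm.topology.TopologyBuilder;  // was backtype.storm.topology.TopologyBuilder

import java.util.Arrays;

public class MigrationSketch {
    public static void main(String[] args) throws Exception {
        Config conf = new Config();
        // 1.0.x deprecates nimbus.host in favour of the nimbus.seeds list (Nimbus HA).
        conf.put(Config.NIMBUS_SEEDS, Arrays.asList("192.168.56.101")); // hypothetical Nimbus host
        conf.put(Config.NIMBUS_THRIFT_PORT, 6627);

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("words", new TestWordSpout(), 1); // placeholder; real spouts/bolts are unchanged by the rename

        StormSubmitter.submitTopology("test", conf, builder.createTopology()); // same call as in 0.10.x
    }
}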
07-07-2016
03:28 PM
Hi, I am a beginner in Storm. To submit a Storm topology I am using the following lines:
conf.put(Config.NIMBUS_HOST, NIMBUS_NODE);
conf.put(Config.NIMBUS_THRIFT_PORT, 6627);
StormSubmitter.submitTopology("test", conf, builder.createTopology());
In my pom we have storm-core 0.9.3 and I am using HDP 2.2. We are now migrating to HDP 2.4, which uses storm-core 0.10.0, so I changed my pom to storm-core 0.10.0, but I am getting this error:
Required field 'nimbus_uptime_secs' is unset!
In the code I changed/added:
conf.put(Config.NIMBUS_HOST, NIMBUS_NODE);
conf.put(Config.NIMBUS_THRIFT_PORT, 6627);
ClusterSummary cs = new ClusterSummary();
cs.setFieldValue(_Fields.NIMBUS_UPTIME_SECS, 2);
StormSubmitter.submitTopology("test", conf, builder.createTopology());
but I am not able to figure out how to pass the value I set to StormSubmitter. I tried moving to storm-core 1.0.1 and storm-core 0.11.0-SNAPSHOT, but those cause compilation issues in other dependent code. Thank you.
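For what it's worth, ClusterSummary is a Thrift struct that Nimbus returns to clients (for example from getClusterInfo()); the submitter never sends it, so setting NIMBUS_UPTIME_SECS on a client-side instance cannot influence submission. The 'Required field ... is unset' error usually points to a storm-core client jar that does not match the Storm version running on the cluster, so aligning the pom version with the cluster's storm-core (here 0.10.0) is the usual fix. A minimal sketch of how that field is normally obtained, assuming an unsecured cluster and a hypothetical Nimbus host:

import java.util.Map;

import backtype.storm.generated.ClusterSummary;
import backtype.storm.generated.Nimbus;
import backtype.storm.utils.NimbusClient;
import backtype.storm.utils.Utils;

public class ClusterInfoSketch {
    public static void main(String[] args) throws Exception {
        Map conf = Utils.readStormConfig();          // storm.yaml defaults from the classpath
        conf.put("nimbus.host", "192.168.56.101");   // hypothetical Nimbus host
        conf.put("nimbus.thrift.port", 6627);

        Nimbus.Client nimbus = NimbusClient.getConfiguredClient(conf).getClient();
        ClusterSummary summary = nimbus.getClusterInfo();
        // nimbus_uptime_secs is populated by Nimbus itself in this response.
        System.out.println("Nimbus uptime (secs): " + summary.get_nimbus_uptime_secs());
    }
}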
Labels: Apache Storm
12-30-2015
06:29 AM
Hi Rahul, thank you very much for the advice; your answer helped us a lot.
12-29-2015
08:52 AM
We are a group of 7 people, each with our own laptop, and we have installed the HDP 2.2 sandbox on each machine. We all work on the same office Wi-Fi network. My question is: how does each of us get a unique IP address for our own sandbox, so that we can SSH into our own HDP instance from our local machines and work independently? If we run ifconfig in the sandbox, all 7 of us see the same inet addr: 192.168.0.166 with a bridged network, 10.0.2.15 with a NAT network, and 192.168.56.101 with a host-only network.
Labels: Hortonworks Data Platform (HDP)