Hadoop Eclipse plugin

Contributor
I am getting the error below while executing a simple MapReduce program from Eclipse Mars.
Hadoop 2.6
Ubuntu 15.05 (hacked it to report 14.04 to avoid any compatibility issues with Hadoop)
hadoop-eclipse plugin 2.6
Note that I am able to connect to HDFS via the Eclipse plugin, since the DFS Locations view on the right-hand side shows the HDFS directory structure.
Please find below my Eclipse settings, and help me overcome the error when executing the MapReduce program.
(Please note that if I build a jar from the project in Eclipse and run it the traditional way from the command prompt, there are no issues: the output and results are clearly visible under DFS Locations in Eclipse. But I want to run the job entirely from within Eclipse.)
Error while executing the program:
Picked up JAVA_TOOL_OPTIONS: -javaagent:/usr/share/java/jayatanaag.jar
2016-01-08 07:32:46,255 INFO [main] datanode.DataNode (StringUtils.java:startupShutdownMessage(633)) - STARTUP_MSG:
/************************************************************
.....
...
..
STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
STARTUP_MSG: java = 1.8.0_66
************************************************************/
2016-01-08 07:32:46,271 INFO [main] datanode.DataNode (SignalLogger.java:register(91)) - registered UNIX signal handlers for [TERM, HUP, INT]
Usage: java DataNode [-regular | -rollback]
  -regular  : Normal DataNode startup (default).
  -rollback : Rollback a standard or rolling upgrade.
    Refer to HDFS documentation for the difference between standard
    and rolling upgrades.
2016-01-08 07:32:46,770 WARN [main] datanode.DataNode (DataNode.java:secureMain(2392)) - Exiting Datanode
2016-01-08 07:32:46,773 INFO [main] util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2016-01-08 07:32:46,778 INFO [Thread-1] datanode.DataNode (StringUtils.java:run(659)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at prince/127.0.1.1
************************************************************/
1 ACCEPTED SOLUTION

Rising Star

From what I understand, the Eclipse plugin has not been maintained as new versions of Hadoop have been released.

It appears that the command used to start the DataNode is missing a required argument (or is being passed one it does not recognize), which is why the DataNode prints its usage message and exits:

Usage: java DataNode [-regular | -rollback]
  -regular  : Normal DataNode startup (default).
  -rollback : Rollback a standard or rolling upgrade.
    Refer to HDFS documentation for the difference between standard
    and rolling upgrades.

The Apache HDT (Hadoop Development Tools) project had plans to fix this, but unfortunately, it has been retired due to lack of contributions.

http://hdt.incubator.apache.org/

One option to consider would be to ditch the Eclipse plugin and use "mini clusters", which provide a similar development experience without needing to connect to an external cluster or rely on the plugin at all.

https://wiki.apache.org/hadoop/HowToDevelopUnitTests
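
For example, Hadoop ships a MiniDFSCluster (available via the hadoop-minicluster Maven artifact) that spins up an in-process HDFS, so code can be run and debugged entirely from within Eclipse. Here is a minimal sketch, with illustrative paths and class names of my own choosing; it is just a test-time replacement for a real cluster:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;

public class MiniDfsExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Keep the mini cluster's storage in a local scratch directory.
        conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, "/tmp/minidfs");

        // Start an in-process NameNode plus one DataNode.
        MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf)
                .numDataNodes(1)
                .build();
        try {
            FileSystem fs = cluster.getFileSystem();
            Path input = new Path("/input/sample.txt"); // illustrative path
            fs.createNewFile(input);
            System.out.println("HDFS is up at " + fs.getUri());
            // A MapReduce driver can now be pointed at fs.getUri() and
            // stepped through in the Eclipse debugger like any Java program.
        } finally {
            cluster.shutdown();
        }
    }
}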

Another option would be to leverage the hadoop-mini-clusters project that I maintain. It simplifies the use of mini clusters by wrapping them in a common Builder pattern.

https://github.com/sakserv/hadoop-mini-clusters
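
As a rough sketch of what that looks like, assuming the builder setters documented in the project's README (names may have changed since, so check the repository for the current API):

import org.apache.hadoop.conf.Configuration;
import com.github.sakserv.minicluster.impl.HdfsLocalCluster;

public class HdfsLocalClusterExample {
    public static void main(String[] args) throws Exception {
        // Build an embedded HDFS cluster with the project's fluent builder.
        // Ports and directory names below are illustrative.
        HdfsLocalCluster hdfsLocalCluster = new HdfsLocalCluster.Builder()
                .setHdfsNamenodePort(12345)
                .setHdfsTempDir("embedded_hdfs")
                .setHdfsNumDatanodes(1)
                .setHdfsEnablePermissions(false)
                .setHdfsFormat(true)
                .setHdfsConfig(new Configuration())
                .build();

        hdfsLocalCluster.start();
        // ... run the MapReduce code under test against the embedded cluster ...
        hdfsLocalCluster.stop();
    }
}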

Hope that helps.

