java compile problem for HadoopDFSFileReadWrite.java

Solved

Explorer

I set up HDP 2.4 Sandbox on Virtualbox on my windows machine.

The pi program works. So, I guess the installation was OK.

Then, I tried to compile a Java program from the wiki (http://wiki.apache.org/hadoop/HadoopDFSReadWrite.java), but it did not work.

The following shows the compile command and the error messages. Basically, it does not recognize the org.apache.hadoop.conf package.

javac -cp 'hadoop classpath' HadoopDFSFileReadWrite.java
HadoopDFSFileReadWrite.java:22: error: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configuration;
^
HadoopDFSFileReadWrite.java:23: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FileSystem;
^

...

Is this package recognized by the Hortonworks sandbox distribution?

Many thanks.

1 ACCEPTED SOLUTION


Re: java compile problem for HadoopDFSFileReadWrite.java

Super Mentor

@ilhyung cho

In your command you are not using the correct quotation marks.

Incorrect:

javac -cp 'hadoop classpath' HadoopDFSFileReadWrite.java 

Correct:

javac -cp `hadoop classpath` HadoopDFSFileReadWrite.java 

.

Everything you type between backticks is evaluated (executed) by the shell before the main command runs, whereas the same is not the case with single quotation marks.

For Example:

[root@sandbox ~]# echo 'hadoop classpath'
hadoop classpath


[root@sandbox ~]# echo `hadoop classpath`
/usr/hdp/2.5.0.0-1245/hadoop/conf:/usr/hdp/2.5.0.0-1245/hadoop/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/.//*:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/./:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/.//*:/usr/hdp/2.5.0.0-1245/hadoop-yarn/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-yarn/.//*:/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/.//*::jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.5.0.0-1245/tez/*:/usr/hdp/2.5.0.0-1245/tez/lib/*:/usr/hdp/2.5.0.0-1245/tez/conf

.
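The difference is easy to verify with commands available on any system. This minimal sketch uses echo and date instead of hadoop classpath so it runs anywhere; the POSIX $(...) form shown is equivalent to backticks and nests more cleanly:

```shell
# Single quotes: the shell passes the text through literally; no command runs.
echo 'date'      # prints the word: date

# Command substitution: the shell runs the command first, then substitutes
# its output into the outer command line.
echo "$(date)"   # prints the current date and time

# The same mechanism makes the compile line work: `hadoop classpath` runs
# first and its output (the jar paths) becomes the -cp argument.
# (Requires a Hadoop client installation, so it is commented out here.)
# javac -cp "$(hadoop classpath)" HadoopDFSFileReadWrite.java
```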


3 REPLIES

Re: java compile problem for HadoopDFSFileReadWrite.java

Super Mentor

@ilhyung cho

Good to know that it works now. It would be great if you mark this answer as accepted (by clicking the accept link).


Re: java compile problem for HadoopDFSFileReadWrite.java

Explorer

Thanks for the quick reply. The book I follow does not show them as backticks. Now I remember that Unix shell stuff. ;-)

It works.
