
Unable to connect to HDFS through Java; my Hadoop version is 2.9.0. It gives the following exception:

18/06/19 18:40:43 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "hdfs"
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3332)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3352)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
	at com.Jambo.App.main(App.java:21)


My code is:

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class App {
    public static void main(String[] args) throws IOException {
        System.out.println("Hello World!");
        System.out.println("---143---");

        String localPath = "/home/user1/Documents/hdfspract.txt";
        String uri = "hdfs://172.16.32.139:9000";
        String hdfsDir = "hdfs://172.16.32.139:9000/fifo_tbl";

        // Connect to the HDFS namenode and copy the local file into the target directory.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        fs.copyFromLocalFile(new Path(localPath), new Path(hdfsDir));
    }
}
1 ACCEPTED SOLUTION

Super Guru

@Saurabh Ambre,

Try adding the two conf.set lines below and see if it works:

Configuration conf = new Configuration();
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
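To see why this helps: FileSystem.get resolves a URI scheme by looking up the fs.&lt;scheme&gt;.impl entry in the Configuration and instantiating that class by name; when the mapping for "hdfs" is missing, you get exactly the "No FileSystem for scheme" error above. Here is a minimal, self-contained sketch of that mechanism (not Hadoop's actual code; it uses a plain map and a stand-in class purely for illustration):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of how a FileSystem implementation is resolved from configuration.
// Hadoop's real lookup also consults ServiceLoader entries bundled in the
// hadoop-hdfs jar; if those are absent, the fs.hdfs.impl property is the fix.
public class SchemeResolutionSketch {
    static final Map<String, String> conf = new HashMap<>();

    static Object getFileSystemClass(String scheme) {
        String impl = conf.get("fs." + scheme + ".impl");
        if (impl == null) {
            // Mirrors the reported error when no implementation is registered.
            throw new UnsupportedOperationException(
                    "No FileSystem for scheme \"" + scheme + "\"");
        }
        try {
            // Instantiate the configured implementation class by name.
            return Class.forName(impl).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        try {
            getFileSystemClass("hdfs"); // no mapping yet: fails
        } catch (UnsupportedOperationException e) {
            System.out.println(e.getMessage());
        }
        // Registering a mapping (as conf.set does above) makes resolution succeed.
        // java.util.ArrayList is only a stand-in class for this sketch.
        conf.put("fs.hdfs.impl", "java.util.ArrayList");
        System.out.println(getFileSystemClass("hdfs").getClass().getName());
    }
}
```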


Please "Accept" the answer if this works; it will be helpful for other community users.


-Aditya





Thanks for the reply.

After adding the above code, the previous exception is resolved, but it now throws a new exception. I have added hadoop-hdfs-2.9.0.jar, yet I still get the following:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(Ljava/lang/String;)Ljava/net/InetSocketAddress;
	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3242)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:121)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3291)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3259)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:470)
	at Opts.main(Opts.java:14)
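A java.lang.NoSuchMethodError like this usually means two different Hadoop versions are mixed on the classpath (for example, hadoop-common from one release and hadoop-hdfs from another). A hypothetical pom.xml fragment pinning the Hadoop artifacts to one version might look like the following; the hadoop.version property name is just an illustration:

```xml
<!-- Hypothetical fragment: keep all Hadoop artifacts on the same version
     so that runtime method signatures match across jars. -->
<properties>
  <hadoop.version>2.9.0</hadoop.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
</dependencies>
```

It is also worth checking (e.g. with `mvn dependency:tree`) that no older Hadoop jar is being pulled in transitively.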

Super Guru

@Saurabh Ambre,

Glad to know the previous issue is resolved. It is always good to create a separate thread for each issue, so please open a new question for this one so that the main thread doesn't get derailed. In the new question, include the complete stack trace and attach your pom.xml. Feel free to tag me in it. Please accept the answer above.