Support Questions
Find answers, ask questions, and share your expertise

Cannot connect to HDFS through Java. My Hadoop version is 2.9.0. Code and error are as follows


New Contributor
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.net.URI;

public class HdfsCon {
	public static void main(String[] args) throws Exception {
		// NameNode RPC endpoint
		String hdfsUrl = "hdfs://172.16.32.139:9000";
		Configuration conf = new Configuration();
		// Obtain a FileSystem handle for the cluster
		FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
		String src = "/home/user1/Documents/hive-site.xml"; // local source file
		String dst = "/fifo_tbl";                           // destination path in HDFS
		fs.copyFromLocalFile(new Path(src), new Path(dst));
		fs.close();
	}
}


It gives me the following exception:

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
	at org.apache.hadoop.ipc.Client.call(Client.java:1113)
	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
	at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
	at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
	at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
	at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
	at HdfsCon.main(HdfsCon.java:20)


1 REPLY

Re: Cannot connect to HDFS through Java. My Hadoop version is 2.9.0. Code and error are as follows

Cloudera Employee

Hello @Saurabh Ambre,

The error message you are getting is caused by a version mismatch between the Hadoop client library you are compiling against and the version of HDFS running on the server.

For example, using version 3.1 of the client JARs against version 2.7 of the server. In your trace, "client version 4" corresponds to a Hadoop 1.x client JAR, while "Server IPC version 9" indicates a Hadoop 2.x server, so an old Hadoop 1.x JAR is most likely on your compile classpath.

==> What version of HDP are you using?

Hortonworks provides Maven repos you can use in your Java IDE to facilitate getting the correct libraries. See here: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_command-line-installation/content/downlo...
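As a minimal sketch, if you are compiling against the stock Apache Hadoop 2.9.0 release (not HDP), the dependency in your `pom.xml` would look like the fragment below; if you are on HDP, swap in the HDP-specific version string and repository from the link above:

```xml
<!-- Client JARs matching the 2.9.0 server; the version must
     match exactly what the cluster is running -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.9.0</version>
</dependency>
```

After updating the dependency, remove any stray Hadoop 1.x JARs from the build path so the old client classes are no longer picked up.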
