Archives of Support Questions (Read Only)

This is an archived board kept read-only for historical reference. Information and links may no longer be available or relevant. To ask a new question, please post a new topic on the appropriate active board.

I am trying to create an HDFS directory using Java, and I get a java.lang.NoSuchMethodError for org.apache.hadoop.ipc.RPC.getProxy.

Super Collaborator

FileSystem.get(config) throws a java.lang.NoSuchMethodError for org.apache.hadoop.ipc.RPC.getProxy while trying to create HDFS directories.

config.addResource(new Path(String.format("%s/core-site.xml", TestProperties.HDFS_CONF_DIR)));
config.addResource(new Path(String.format("%s/hdfs-site.xml", TestProperties.HDFS_CONF_DIR)));
config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
config.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
FileSystem dfs = FileSystem.get(config);

The complete stack trace is attached:

java.lang.NoSuchMethodError: org.apache.hadoop.ipc.RPC.getProxy(Ljava/lang/Class;JLjava/net/InetSocketAddress;Lorg/apache/hadoop/security/UserGroupInformation;Lorg/apache/hadoop/conf/Configuration;Ljavax/net/SocketFactory;ILorg/apache/hadoop/io/retry/RetryPolicy;Z)Lorg/apache/hadoop/ipc/VersionedProtocol;
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:135)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:280)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:179)
    at org.apache.atlas.regression.util.HDFSUtil.createDirectory(HDFSUtil.java:46)
    at org.apache.atlas.regression.tests.FalconIntegrationTest.setUp(FalconIntegrationTest.java:43)

The Hadoop version in the cluster is 2.7.3, and pom.xml also specifies 2.7.3. What could be the issue?
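Editor's note: a NoSuchMethodError like this usually means the class was loaded from a different jar version than the one compiled against. A hedged diagnostic sketch that prints which jar actually supplied a class at run time (shown with a stdlib class so it runs anywhere; with Hadoop on the classpath you would pass org.apache.hadoop.ipc.RPC instead):

```java
import java.security.CodeSource;

// Locate the jar (or directory) a class was loaded from. Useful when a
// NoSuchMethodError suggests two versions of the same library on the classpath.
public class WhichJar {
    static String locationOf(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        CodeSource src = c.getProtectionDomain().getCodeSource();
        // Classes from the bootstrap loader (e.g. java.lang.String) have no CodeSource.
        return src == null ? "<bootstrap class path>" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // With Hadoop on the classpath you would pass "org.apache.hadoop.ipc.RPC".
        String target = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(target + " loaded from: " + locationOf(target));
    }
}
```

If the printed location is not the jar version you expect, the classpath is mixing Hadoop versions.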

1 ACCEPTED SOLUTION

Super Collaborator

Adding this dependency resolved the issue:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.3</version>
</dependency>
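Editor's note: a related hardening step, not from the original thread, is to pin all Hadoop artifacts to one version in a dependencyManagement block so a transitive dependency cannot pull in a mismatched Hadoop jar. A sketch, assuming Hadoop 2.7.3 as in this thread:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.7.3</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```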

Thanks!


3 Replies

Super Guru
@ssainath

This is usually a pom issue where some jar is missing, but I am assuming that's not your case. How do you run the app: with "java -jar ..." or with "hadoop jar ..."? You should be running it as "hadoop jar ...", so that the cluster's own Hadoop jars end up on the classpath.
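Editor's note: NoSuchMethodError is a linkage error, thrown when code compiled against one version of a method signature finds a different version at run time. A small runnable sketch that probes defensively, via reflection, whether a loaded class exposes a given method (a stdlib class is used so the sketch runs anywhere; with Hadoop present you would probe org.apache.hadoop.ipc.RPC for getProxy):

```java
import java.lang.reflect.Method;

public class ProbeMethod {
    // Return true if the named class exposes a public method with the given
    // name, regardless of signature -- a cheap way to see which API version
    // of a class the classpath actually loaded.
    static boolean hasMethod(String className, String methodName) throws ClassNotFoundException {
        for (Method m : Class.forName(className).getMethods()) {
            if (m.getName().equals(methodName)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        // With Hadoop present: hasMethod("org.apache.hadoop.ipc.RPC", "getProxy")
        System.out.println(hasMethod("java.util.List", "stream"));  // true
        System.out.println(hasMethod("java.util.List", "noSuch"));  // false
    }
}
```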

Super Collaborator

Thanks @mqureshi. The code is part of a test suite; I run it via Maven (mvn test).
