PROBLEM: While implementing AutoHDFS for the storm-hdfs integration, the following error was observed when submitting the topology:
2017-05-19 11:21:44.865 o.a.s.h.c.s.AutoHDFS [ERROR] Could not populate HDFS credentials.
java.lang.RuntimeException: Failed to get delegation tokens.
	at org.apache.storm.hdfs.common.security.AutoHDFS.getHadoopCredentials(AutoHDFS.java:242)
	at org.apache.storm.hdfs.common.security.AutoHDFS.populateCredentials(AutoHDFS.java:76)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at clojure.lang.Reflector.invokeMatchingMethod(Reflector.java:93)
	at clojure.lang.Reflector.invokeInstanceMethod(Reflector.java:28)
	at org.apache.storm.daemon.nimbus$mk_reified_nimbus$reify__11226.submitTopologyWithOpts(nimbus.clj:1544)
	at org.apache.storm.generated.Nimbus$Processor$submitTopologyWithOpts.getResult(Nimbus.java:2940)
	at org.apache.storm.generated.Nimbus$Processor$submitTopologyWithOpts.getResult(Nimbus.java:2924)
	at org.apache.storm.thrift.ProcessFunction.process(ProcessFunction.java:39)
	at org.apache.storm.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
	at org.apache.storm.security.auth.SaslTransportPlugin$TUGIWrapProcessor.process(SaslTransportPlugin.java:138)
	at org.apache.storm.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2214)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2746)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2759)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
	at org.apache.storm.hdfs.common.security.AutoHDFS$1.run(AutoHDFS.java:217)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:360)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1704)
	at org.apache.storm.hdfs.common.security.AutoHDFS.getHadoopCredentials(AutoHDFS.java:213)
	... 17 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2120)
CAUSE: The Hadoop HDFS classes were missing from the topology's classpath because the required dependencies were not declared in pom.xml, which produced:
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
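As the inner stack trace shows, the failure originates in Hadoop's FileSystem factory: AutoHDFS calls FileSystem.get(...), and Hadoop resolves the implementation class for the hdfs:// scheme, which is org.apache.hadoop.hdfs.DistributedFileSystem from the hadoop-hdfs artifact. A minimal Java sketch of that lookup (the NameNode URI is a placeholder for illustration):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsLookupSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Resolving an hdfs:// URI forces Hadoop to load
        // org.apache.hadoop.hdfs.DistributedFileSystem; without the
        // hadoop-hdfs artifact on the classpath this fails with the
        // ClassNotFoundException shown above. (URI is a placeholder.)
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);
        System.out.println("Resolved implementation: " + fs.getClass().getName());
    }
}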
SOLUTION: To resolve this issue, add the following dependencies to pom.xml (the version shown matches the HDP 2.5.3 build used here; adjust it to match your cluster):
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.7.3.2.5.3.0-37</version>
  <exclusions>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.7.3.2.5.3.0-37</version>
  <exclusions>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
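After rebuilding, a quick way to confirm the fix before resubmitting the topology is a simple Class.forName check against the class the stack trace reported as missing (a sketch, not part of the original fix); running mvn dependency:tree also confirms that hadoop-hdfs is now resolved:

public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            // The class that was reported missing in the stack trace above.
            Class<?> cls = Class.forName("org.apache.hadoop.hdfs.DistributedFileSystem");
            System.out.println("Found " + cls.getName() + " on the classpath");
        } catch (ClassNotFoundException e) {
            System.err.println("hadoop-hdfs is still missing from the classpath");
        }
    }
}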