
Exception while performing Storm operation



Hi All,

I only have the exception below to go on; the logs give no other details.

Could you help me with this?

**************

2018-01-08 13:44:35.167 c.t.t.b.HdfsStateConvertor [ERROR] Error preparing HdfsState: Running in secure mode, but config doesn't have a keytab
java.io.IOException: Running in secure mode, but config doesn't have a keytab
    at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:300) ~[stormjar.jar:?]
    at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:271) ~[stormjar.jar:?]
    at org.apache.storm.hdfs.security.HdfsSecurityUtil.login(HdfsSecurityUtil.java:72) ~[stormjar.jar:?]
    at com.tdameritrade.trident.bolt.HdfsStateConvertor$Options.prepare(HdfsStateConvertor.java:140) [stormjar.jar:?]
    at com.tdameritrade.trident.bolt.HdfsStateConvertor.prepare(HdfsStateConvertor.java:504) [stormjar.jar:?]
    at com.tdameritrade.trident.bolt.HdfsStateConvertedFactory.makeState(HdfsStateConvertedFactory.java:35) [stormjar.jar:?]
    at org.apache.storm.trident.planner.SubtopologyBolt.prepare(SubtopologyBolt.java:68) [storm-core-1.1.0.2.6.2.0-205.jar:1.1.0.2.6.2.0-205]
    at org.apache.storm.trident.topology.TridentBoltExecutor.prepare(TridentBoltExecutor.java:245) [storm-core-1.1.0.2.6.2.0-205.jar:1.1.0.2.6.2.0-205]
    at org.apache.storm.daemon.executor$fn__10454$fn__10467.invoke(executor.clj:794) [storm-core-1.1.0.2.6.2.0-205.jar:1.1.0.2.6.2.0-205]
    at org.apache.storm.util$async_loop$fn__553.invoke(util.clj:482) [storm-core-1.1.0.2.6.2.0-205.jar:1.1.0.2.6.2.0-205]
    at clojure.lang.AFn.run(AFn.java:22) [clojure-1.7.0.jar:?]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_92]
2018-01-08 13:44:35.187 o.a.s.d.executor [INFO] Prepared bolt b-0:(12)
2018-01-08 13:44:35.272 o.a.s.h.s.HdfsSecurityUtil [INFO] Logging in using keytab as AutoHDFS is not specified for topology.auto-credentials
2018-01-08 13:44:35.272 c.t.t.b.HdfsStateConvertor [INFO] Preparing HDFS Bolt...
2018-01-08 13:44:35.329 c.t.t.b.HdfsStateConvertor [ERROR] Error preparing HdfsState: Running in secure mode, but config doesn't have a keytab
java.io.IOException: Running in secure mode, but config doesn't have a keytab
    [same stack trace as above]
2018-01-08 13:44:35.344 o.a.s.d.executor [INFO] Prepared bolt b-0:(15)
2018-01-08 13:44:37.401 o.a.h.h.s.DomainSocketFactory [WARN] The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-01-08 13:44:37.625 o.a.h.h.s.DomainSocketFactory [WARN] The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-01-08 13:44:37.639 c.t.t.b.HdfsStateConvertor [ERROR] Error preparing HdfsState: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "XXXX.XXX.XXX/10.143.104.16"; destination host is: "XXXX.XXX.XXX":8020;
java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "XXXX.XXX.XXX/10.143.104.16"; destination host is: "XXXX.XXX.XXX":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:782) ~[stormjar.jar:?]
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558) ~[stormjar.jar:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1498) ~[stormjar.jar:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1398) ~[stormjar.jar:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[stormjar.jar:?]
    at com.sun.proxy.$Proxy59.create(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:313) ~[stormjar.jar:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_92]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_92]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_92]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_92]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291) ~[stormjar.jar:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203) ~[stormjar.jar:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185) ~[stormjar.jar:?]
    at com.sun.proxy.$Proxy60.create(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1822) ~[stormjar.jar:?]
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1701) ~[stormjar.jar:?]
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1636) ~[stormjar.jar:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:480) ~[stormjar.jar:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:476) ~[stormjar.jar:?]
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[stormjar.jar:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:476) ~[stormjar.jar:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:417) ~[stormjar.jar:?]
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:930) ~[stormjar.jar:?]
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911) ~[stormjar.jar:?]
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:807) ~[stormjar.jar:?]
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:796) ~[stormjar.jar:?]
    at com.tdameritrade.trident.bolt.HdfsStateConvertor$HdfsFileOptions.createOutputFile(HdfsStateConvertor.java:267) ~[stormjar.jar:?]
    at com.tdameritrade.trident.bolt.HdfsStateConvertor$Options.prepare(HdfsStateConvertor.java:142) [stormjar.jar:?]
    at com.tdameritrade.trident.bolt.HdfsStateConvertor.prepare(HdfsStateConvertor.java:504) [stormjar.jar:?]
    at com.tdameritrade.trident.bolt.HdfsStateConvertedFactory.makeState(HdfsStateConvertedFactory.java:35) [stormjar.jar:?]
    at org.apache.storm.trident.planner.SubtopologyBolt.prepare(SubtopologyBolt.java:68) [storm-core-1.1.0.2.6.2.0-205.jar:1.1.0.2.6.2.0-205]
    at org.apache.storm.trident.topology.TridentBoltExecutor.prepare(TridentBoltExecutor.java:245) [storm-core-1.1.0.2.6.2.0-205.jar:1.1.0.2.6.2.0-205]
    at org.apache.storm.daemon.executor$fn__10454$fn__10467.invoke(executor.clj:794) [storm-core-1.1.0.2.6.2.0-205.jar:1.1.0.2.6.2.0-205]
    at org.apache.storm.util$async_loop$fn__553.invoke(util.clj:482) [storm-core-1.1.0.2.6.2.0-205.jar:1.1.0.2.6.2.0-205]
    at clojure.lang.AFn.run(AFn.java:22) [clojure-1.7.0.jar:?]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_92]
Caused by: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:787) ~[stormjar.jar:?]
    at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397) ~[stormjar.jar:?]
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1620) ~[stormjar.jar:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1451) ~[stormjar.jar:?]
    ... 34 more
2018-01-08 13:44:37.643 o.a.s.d.executor [INFO] Prepared bolt b-2:(21)
2018-01-08 13:44:38.093 c.t.t.b.HdfsStateConvertor [ERROR] Error preparing HdfsState: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "XXXX.XXX.XXX/10.143.104.16"; destination host is: "XXXX.XXX.XXX":8020;
java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "XXXX.XXX.XXX/10.143.104.16"; destination host is: "XXXX.XXX.XXX":8020;
    [same stack trace as above, repeated twice]

******************************************

1 REPLY

Re: Exception while performing Storm operation


The cluster is not Kerberized... Can someone help me with this?
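If the cluster really is not Kerberized, the two errors fit one explanation: the Hadoop client configuration packaged inside stormjar.jar declares Kerberos security, so the HDFS client first demands a keytab ("Running in secure mode, but config doesn't have a keytab") and then refuses the NameNode's SIMPLE-auth handshake. As a sketch (not a confirmed fix — which core-site.xml ends up in the topology jar depends on your build), the client-side setting to check is `hadoop.security.authentication`:

```xml
<!-- core-site.xml as seen by the topology (sketch for a non-Kerberized cluster).
     A value of "kerberos" here is what makes the client insist on a keytab
     and reject SIMPLE auth from the NameNode. -->
<configuration>
  <property>
    <name>hadoop.security.authentication</name>
    <value>simple</value>
  </property>
</configuration>
```

Alternatively, if the client config must stay Kerberos-capable (e.g. the same jar is deployed against secure clusters), standard Hadoop also has `ipc.client.fallback-to-simple-auth-allowed`, which, when set to `true`, lets a secure-configured client fall back to SIMPLE auth instead of failing.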