Member since: 05-06-2016
Posts: 11
Kudos Received: 1
Solutions: 0
08-30-2017 04:24 PM
Hi Matt - It is a customized processor that our client is building. I just got confirmation that the work is still in progress. Hope it works, and thanks for the response.
08-29-2017 10:48 AM
Is there any KB article on the NiFi processor ImportSqoopFull? We are facing issues while using the ImportSqoopFull NiFi processor, and we are seeing errors in the bulletin board as well. Any reference material on this is much appreciated.
Labels: Apache NiFi
03-29-2017 03:47 AM
Thanks a lot, Vipin. I have implemented the above changes but am still facing the issues. I have attached the latest files as well: cross-realm-details-89602.txt, distcp-error-cross-clusters-89602.txt

==================== /var/log/krb5kdc.log (Server: ambaristandby.myhadoop.com) ====================
Mar 29 05:20:21 ambaristandby.myhadoop.com krb5kdc[1177](info): AS_REQ (4 etypes {18 17 16 23}) 172.21.58.120: ISSUE: authtime 1490757621, etypes {rep=18 tkt=18 ses=18}, HTTP/standbyms.myhadoop.com@EXAMPLE.COM for krbtgt/EXAMPLE.COM@EXAMPLE.COM
Mar 29 05:20:21 ambaristandby.myhadoop.com krb5kdc[1177](info): AS_REQ (4 etypes {18 17 16 23}) 172.21.58.120: ISSUE: authtime 1490757621, etypes {rep=18 tkt=18 ses=18}, yarn/standbyms.myhadoop.com@EXAMPLE.COM for krbtgt/EXAMPLE.COM@EXAMPLE.COM
Mar 29 05:20:21 ambaristandby.myhadoop.com krb5kdc[1177](info): AS_REQ (4 etypes {18 17 16 23}) 172.21.58.120: ISSUE: authtime 1490757621, etypes {rep=18 tkt=18 ses=18}, rm/standbyms.myhadoop.com@EXAMPLE.COM for krbtgt/EXAMPLE.COM@EXAMPLE.COM
Mar 29 05:20:21 ambaristandby.myhadoop.com krb5kdc[1177](info): AS_REQ (4 etypes {18 17 16 23}) 172.21.58.120: ISSUE: authtime 1490757621, etypes {rep=18 tkt=18 ses=18}, zookeeper/standbyms.myhadoop.com@EXAMPLE.COM for krbtgt/EXAMPLE.COM@EXAMPLE.COM
Mar 29 05:21:07 ambaristandby.myhadoop.com krb5kdc[1177](info): TGS_REQ (6 etypes {18 17 16 23 1 3}) 172.21.58.116: UNKNOWN_SERVER: authtime 0, varnika@EXAMPLE.COM for nn/ms.myhadoop.com@EXAMPLE.COM, Server not found in Kerberos database
Mar 29 05:21:08 ambaristandby.myhadoop.com krb5kdc[1177](info): TGS_REQ (6 etypes {18 17 16 23 1 3}) 172.21.58.116: UNKNOWN_SERVER: authtime 0, varnika@EXAMPLE.COM for nn/ms.myhadoop.com@EXAMPLE.COM, Server not found in Kerberos database
Mar 29 05:21:09 ambaristandby.myhadoop.com krb5kdc[1177](info): TGS_REQ (6 etypes {18 17 16 23 1 3}) 172.21.58.116: UNKNOWN_SERVER: authtime 0, varnika@EXAMPLE.COM for nn/ms.myhadoop.com@EXAMPLE.COM, Server not found in Kerberos database
Mar 29 05:21:09 ambaristandby.myhadoop.com krb5kdc[1177](info): TGS_REQ (6 etypes {18 17 16 23 1 3}) 172.21.58.116: UNKNOWN_SERVER: authtime 0, varnika@EXAMPLE.COM for nn/ms.myhadoop.com@EXAMPLE.COM, Server not found in Kerberos database
Mar 29 05:21:10 ambaristandby.myhadoop.com krb5kdc[1177](info): TGS_REQ (6 etypes {18 17 16 23 1 3}) 172.21.58.116: UNKNOWN_SERVER: authtime 0, varnika@EXAMPLE.COM for nn/ms.myhadoop.com@EXAMPLE.COM, Server not found in Kerberos database
Mar 29 05:21:14 ambaristandby.myhadoop.com krb5kdc[1177](info): TGS_REQ (6 etypes {18 17 16 23 1 3}) 172.21.58.116: UNKNOWN_SERVER: authtime 0, varnika@EXAMPLE.COM for nn/ms.myhadoop.com@EXAMPLE.COM, Server not found in Kerberos database
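As an aside on reading the log: the repeated UNKNOWN_SERVER lines mean this KDC has no nn/ms.myhadoop.com@EXAMPLE.COM entry in its database, i.e. the client is asking the EXAMPLE.COM KDC for a service principal it does not hold. On a long log, the rejected principals can be listed with a small helper (a sketch; the function name is mine, and /var/log/krb5kdc.log is simply the default MIT KDC log path):

```shell
# missing_principals LOGFILE
# Print the unique service principals the KDC rejected with
# UNKNOWN_SERVER, i.e. principals absent from its database.
missing_principals() {
  grep 'UNKNOWN_SERVER' "$1" \
    | sed -n 's/.* for \([^,]*\),.*/\1/p' \
    | sort -u
}

# e.g.: missing_principals /var/log/krb5kdc.log
```

Each principal listed this way either needs to be created in this realm or, for cross-realm setups, needs the client's krb5.conf to map its host to the other realm.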
03-29-2017 03:46 AM
Thanks a lot, Juan. I have implemented the above changes but am still facing the issues. It seems there is some other change I need to make. Please help. I have attached the latest files as well: cross-realm-details-89602.txt, distcp-error-cross-clusters-89602.txt

(The /var/log/krb5kdc.log excerpt from ambaristandby.myhadoop.com is identical to the one in the reply above.)
03-29-2017 03:39 AM
Hi Vipin & Juan - Thanks a lot for your suggestions. I have implemented all of them but am still facing the issue. It seems I have missed something. Please look at the latest configuration and error details and let me know: cross-realm-details-89602.txt, distcp-error-cross-clusters-89602.txt
03-20-2017 06:26 AM
Hi Team - I am facing issues communicating between two kerberized clusters in different REALMs, running on Oracle VirtualBox. I followed the link below. Any help on this is much appreciated; thanks in advance.
https://community.hortonworks.com/articles/18686/kerberos-cross-realm-trust-for-distcp.html
Details: HDP 2.5.3.0, Ambari 2.4.2.0, OS: CentOS 6.8, Java: JDK 1.7
Attachments: cross-realm-details.txt, distcp-error-cross-clusters.txt
Cluster-PRIMARY REALM: EXAMPLE.COM
Cluster-DR REALM: HORTONWORKS.COM
I am also not able to perform DistCp between the clusters:
[ambari-qa@ambaristandby ~]$ hadoop distcp hdfs://172.21.58.120:8020/user/ambari-qa/distcp.txt hdfs://172.21.58.111:8020/user/ambari-qa/distcp_test/
Error:
[hdfs@ambarinode ~]$ hdfs dfs -ls hdfs://172.21.58.120:8020/user/
17/03/19 10:04:27 WARN security.UserGroupInformation: Not attempting to re-login since the last re-login was attempted less than 600 seconds before.
17/03/19 10:04:27 WARN security.UserGroupInformation: Not attempting to re-login since the last re-login was attempted less than 600 seconds before.
17/03/19 10:04:30 WARN security.UserGroupInformation: Not attempting to re-login since the last re-login was attempted less than 600 seconds before.
17/03/19 10:04:34 WARN security.UserGroupInformation: Not attempting to re-login since the last re-login was attempted less than 600 seconds before.
17/03/19 10:04:35 WARN ipc.Client: Couldn't setup connection for hdfs-dr@HORTONWORKS.COM to /172.21.58.120:8020
org.apache.hadoop.ipc.RemoteException(javax.security.sasl.SaslException): GSS initiate failed
at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:375)
at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:595)
at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:397)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:762)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:758)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:757)
at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
at org.apache.hadoop.ipc.Client.call(Client.java:1449)
at org.apache.hadoop.ipc.Client.call(Client.java:1396)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:816)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2158)
at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1423)
at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1419)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1419)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1674)
at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235)
at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:350)
17/03/19 10:04:35 WARN retry.RetryInvocationHandler: Exception while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over null. Not retrying because try once and fail.
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't setup connection for hdfs-dr@HORTONWORKS.COM to /172.21.58.120:8020; Host Details : local host is: "ambarinode.myhadoop.com/172.21.58.111"; destination host is: "172.21.58.120":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:782)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1556)
at org.apache.hadoop.ipc.Client.call(Client.java:1496)
at org.apache.hadoop.ipc.Client.call(Client.java:1396)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:816)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2158)
at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1423)
at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1419)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1419)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1674)
at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235)
at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:350)
Caused by: java.io.IOException: Couldn't setup connection for hdfs-dr@HORTONWORKS.COM to /172.21.58.120:8020
at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:712)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:683)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:770)
at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
at org.apache.hadoop.ipc.Client.call(Client.java:1449)
... 29 more
Caused by: org.apache.hadoop.ipc.RemoteException(javax.security.sasl.SaslException): GSS initiate failed
at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:375)
at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:595)
at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:397)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:762)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:758)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:757)
... 32 more
ls: Failed on local exception: java.io.IOException: Couldn't setup connection for hdfs-dr@HORTONWORKS.COM to /172.21.58.120:8020; Host Details : local host is: "ambarinode.myhadoop.com/172.21.58.111"; destination host is: "172.21.58.120":8020;
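For context, the cross-realm-trust article linked above relies on two pieces of configuration, and a "GSS initiate failed" paired with UNKNOWN_SERVER in the KDC log usually means one of them is incomplete: (1) matching cross-realm krbtgt principals (krbtgt/HORTONWORKS.COM@EXAMPLE.COM and krbtgt/EXAMPLE.COM@HORTONWORKS.COM, created with the same password and kvno on both KDCs), and (2) krb5.conf entries on every client host so service principals resolve to the correct realm. A minimal krb5.conf sketch follows; the realm names come from this thread, but the KDC hostnames and host-to-realm mappings are illustrative guesses that must be replaced with the actual values:

```ini
# Sketch only -- realm names are from the thread; the KDC hostnames and
# the per-host mappings below are assumptions to be verified.
[realms]
  EXAMPLE.COM = {
    kdc = ambaristandby.myhadoop.com        # KDC host per krb5kdc.log above
    admin_server = ambaristandby.myhadoop.com
  }
  HORTONWORKS.COM = {
    kdc = ambarinode.myhadoop.com           # assumed DR KDC host
    admin_server = ambarinode.myhadoop.com
  }

[domain_realm]
  # Both clusters share the myhadoop.com DNS suffix, so a single wildcard
  # mapping cannot distinguish them; per-host entries are needed. If
  # nn/ms.myhadoop.com actually lives in the DR realm, the UNKNOWN_SERVER
  # errors above are consistent with this mapping being absent.
  ms.myhadoop.com        = HORTONWORKS.COM  # assumption -- verify
  standbyms.myhadoop.com = EXAMPLE.COM
  .myhadoop.com          = EXAMPLE.COM
```

On the Hadoop side, `hadoop.security.auth_to_local` needs RULE entries covering principals from both realms, and `dfs.namenode.kerberos.principal.pattern` may need to be set to `*` so the client accepts a NameNode principal from the foreign realm.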
Labels: Apache Hadoop