Member since: 09-08-2017
Posts: 15
Kudos Received: 0
Solutions: 0
09-09-2020
09:13 AM
Hi @Debangshu, it worked with both 1.10.0 and 1.11.3. Thanks, mate, for the resolution. Thanks, David
09-03-2020
06:14 AM
Does it work with NiFi 1.12.0, or is 1.11.3 the better choice?
09-03-2020
04:31 AM
Hi, if "root" is trying to write the file to HDFS, then a privilege issue should indeed show up, since HDFS ACLs are synced with Sentry privileges. Thanks, David
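P.S. A quick way to verify what Sentry has actually synced onto the target directory is to dump its ACLs; the path below is just a placeholder for your target:
hdfs dfs -getfacl /path/to/target/dir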
09-03-2020
02:24 AM
Is this applicable to Hadoop 2.6.0 (CDH 5.16.2)?
08-13-2020
07:21 AM
Hi Timothy, here are some logs which might give you some insight. Today I ruled out networking and dual-NIC issues: both clusters are in the same subnet, none of these VMs has a dual NIC, and all traffic flows back and forth seamlessly. NameNode logs:
2020-08-13 16:01:26,120 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for mapred/principle@user.queue (auth:KERBEROS)
2020-08-13 16:01:26,126 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for mapred/principle@user.queue (auth:KERBEROS) for protocol=interface user.queue.user.queue.user.queue
2020-08-13 16:01:28,672 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for principle@user.queue (auth:KERBEROS)
2020-08-13 16:01:28,679 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for principle@user.queue (auth:KERBEROS) for protocol=interface user.queue.user.queue.user.queue
2020-08-13 16:01:28,704 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* allocateBlock: /user/abc/puthdfs_test/.user.queue. user.queue.64.55-1545405130172 blk_1075343453_1603222{blockUCState=UNDER_CONSTRUCTION, primaryNodeIndex=-1, replicas=[ReplicaUnderConstruction[[DISK]DS-53536364-33f4-40d6-85c2-508abf7ff023:NORMAL:00.00.64.58:50010|RBW], ReplicaUnderConstruction[[DISK]DS-abba7d97-925a-4299-af86-b58fef9aaa12:NORMAL:00.00.64.84:50010|RBW], ReplicaUnderConstruction[[DISK]DS-286b28e8-d035-4b8c-a2dd-aabb08666234:NORMAL:00.00.64.56:50010|RBW]]}
2020-08-13 16:01:28,727 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* allocateBlock: /user/abc/puthdfs_test/.user.queue. user.queue.64.55-1545405130172 blk_1075343454_1603223{blockUCState=UNDER_CONSTRUCTION, primaryNodeIndex=-1, replicas=[ReplicaUnderConstruction[[DISK]DS-abba7d97-925a-4299-af86-b58fef9aaa12:NORMAL:00.00.64.84:50010|RBW], ReplicaUnderConstruction[[DISK]DS-286b28e8-d035-4b8c-a2dd-aabb08666234:NORMAL:00.00.64.56:50010|RBW], ReplicaUnderConstruction[[DISK]DS-d6f56418-6e18-4317-a8ec-4a5b15757728:NORMAL:00.00.64.57:50010|RBW]]}
2020-08-13 16:01:28,734 WARN org.apache.hadoop.hdfs.server.blockmanagement.BlockPlacementPolicy: Failed to place enough replicas, still in need of 1 to reach 3 (unavailableStorages=[], storagePolicy=BlockStoragePolicy{HOT:7, storageTypes=[DISK], creationFallbacks=[], replicationFallbacks=[ARCHIVE]}, newBlock=true) For more information, please enable DEBUG log level on user.queue.user.queue.user.queue.BlockPlacementPolicy and user.queue.user.queue.NetworkTopology
2020-08-13 16:01:28,735 WARN org.apache.hadoop.hdfs.protocol.BlockStoragePolicy: Failed to place enough replicas: expected size is 1 but only 0 storage types can be selected (replication=3, selected=[], unavailable=[DISK], removed=[DISK], policy=BlockStoragePolicy{HOT:7, storageTypes=[DISK], creationFallbacks=[], replicationFallbacks=[ARCHIVE]})
DataNode Logs:
2020-08-13 16:00:41,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving user.queue.64.55-1545405130172:blk_1075343452_1603221 src: /00.00.64.58:55510 dest: /00.00.64.57:50010
2020-08-13 16:00:41,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /00.00.64.58:55510, dest: /00.00.64.57:50010, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1029630366_107, offset: 0, srvID: cb4e7a77-f5d6-49a5-abab-58d060602ec7, blockid: user.queue.64.55-1545405130172:blk_1075343452_1603221, duration: 54439548
2020-08-13 16:00:41,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: user.queue.64.55-1545405130172:blk_1075343452_1603221, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2020-08-13 16:00:45,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1075343452_1603221 file /data/dfs/dn/current/user.queue.64.55-1545405130172/current/finalized/subdir24/subdir112/blk_1075343452 for deletion
2020-08-13 16:00:45,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted user.queue.64.55-1545405130172 blk_1075343452_1603221 file /data/dfs/dn/current/user.queue.64.55-1545405130172/current/finalized/subdir24/subdir112/blk_1075343452
2020-08-13 16:01:28,743 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: user.queue.user.queue:50010:DataXceiver error processing unknown operation src: /00.00.64.67:59988 dst: /00.00.64.57:50010
java.io.IOException:
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.readSaslMessage(DataTransferSaslUtil.java:217)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.doSaslHandshake(SaslDataTransferServer.java:364)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.getEncryptedStreams(SaslDataTransferServer.java:178)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.receive(SaslDataTransferServer.java:110)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:193)
at java.lang.Thread.run(Thread.java:748)
2020-08-13 16:01:41,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving user.queue.64.55-1545405130172:blk_1075343458_1603227 src: /00.00.64.56:58556 dest: /00.00.64.57:50010
2020-08-13 16:01:41,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /00.00.64.56:58556, dest: /00.00.64.57:50010, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1081910632_107, offset: 0, srvID: cb4e7a77-f5d6-49a5-abab-58d060602ec7, blockid: user.queue.64.55-1545405130172:blk_1075343458_1603227, duration: 8787325
2020-08-13 16:01:41,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: user.queue.64.55-1545405130172:blk_1075343458_1603227, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2020-08-13 16:01:45,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1075343458_1603227 file /data/dfs/dn/current/user.queue.64.55-1545405130172/current/finalized/subdir24/subdir112/blk_1075343458 for deletion
2020-08-13 16:01:45,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted user.queue.64.55-1545405130172 blk_1075343458_1603227 file /data/dfs/dn/current/user.queue.64.55-1545405130172/current/finalized/subdir24/subdir112/blk_1075343458
2020-08-13 16:02:13,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving user.queue.64.55-1545405130172:blk_1075343459_1603228 src: /00.00.64.55:43446 dest: /00.00.64.57:50010
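As a next step I might also raise the placement-policy logger to DEBUG, as the warning in the NameNode log suggests; something along these lines, where the NameNode host and HTTP port are placeholders for ours:
hadoop daemonlog -setlevel <namenode-host>:50070 org.apache.hadoop.hdfs.server.blockmanagement.BlockPlacementPolicy DEBUG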
Thanks David
08-13-2020
02:47 AM
Hi Timothy, I can reach Hive and insert data into tables as well; that works perfectly fine. Also, the NiFi cluster and the CDH cluster are in the same subnet. We already had a solutions architect from Cloudera assess our clusters from a security standpoint and certify everything as good. I am able to connect to HDFS from other applications seamlessly, with the same security standards. Also, in bootstrap.conf I have "java.arg.16=-Djavax.security.auth.useSubjectCredsOnly=true". Thanks, David
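P.S. If it would help with the next round of logs, I could also switch on Kerberos debugging in bootstrap.conf with an extra JVM argument; the argument number below is just an example and would need to be one that is not already in use:
java.arg.17=-Dsun.security.krb5.debug=true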
08-12-2020
01:50 AM
Hi Timothy, I have FreeIPA as the IAM, which handles all Kerberos-related transactions. Here is what I am trying to do:
1) Create a separate service account for NiFi, use that service account to start NiFi, and add it to the roles/groups that have the required permissions on HDFS and in Sentry, so that when the flow is triggered the PutHDFS processor has all the permissions it needs on the CDH cluster.
2) Trigger the flow in the local NiFi instance installed on my PC and capture the logs from the NiFi instance, the NameNode and the DataNodes while the flow is running; then trigger the same flow in the NiFi cluster (running the processors on the primary node only) and capture the NiFi, NameNode and DataNode logs again; and finally compare the two sets of logs to see if anything stands out.
I have eliminated all the security-related options, because local security, firewalls and file permissions are already out of the equation. Thanks, David
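P.S. Before comparing logs I will also sanity-check the new service account directly from one of the NiFi cluster nodes, roughly like this (keytab path, principal and target directory are placeholders):
kinit -kt /path/to/nifi-svc.keytab nifi-svc@REALM
klist
hdfs dfs -put test.txt /user/abc/puthdfs_test/
If that works from the shell but PutHDFS still fails, it points back at the NiFi side rather than at HDFS permissions.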
08-11-2020
01:37 PM
It's actually the same; however, the NiFi service is run by root, and I use a service account principal and its keytab in the PutHDFS processor. So are you saying that "root" does not have sufficient privileges to write the file into HDFS?
08-11-2020
12:53 PM
Hi Timothy, Here is the dfs admin report:
[hdfssuperuser@abc ~]$ hadoop dfsadmin -report
Configured Capacity: 8748844187648 (7.96 TB)
Present Capacity: 8649609633023 (7.87 TB)
DFS Remaining: 6585870316863 (5.99 TB)
DFS Used: 2063739316160 (1.88 TB)
DFS Used%: 23.86%
Live datanodes (4):
Name: 0.1.0.1:50010
Hostname: abc
Rack: /default
Decommission Status : Normal
Configured Capacity: 2187211046912 (1.99 TB)
DFS Used: 673284505773 (627.05 GB)
Non DFS Used: 24917778259 (23.21 GB)
DFS Remaining: 1488606111329 (1.35 TB)
DFS Used%: 30.78%
DFS Remaining%: 68.06%
08-11-2020
11:42 AM
Hi Timothy,
"The NameNode may be overloaded. Check the logs for messages that say 'discarding calls...'" - The NameNode is fine: when I run a NiFi instance on my laptop with the same flow and the same configuration, I am able to write the file to the same cluster successfully.
"There may not be enough (any) DataNode nodes running for the data to be written. Again, check the logs." - All DataNodes are up and running; I will check the logs and get back to you.
"Every DataNode on which the blocks were stored might be down (or not connected to the NameNode; it is impossible to distinguish the two)." - I am able to put a file from the command line.
08-11-2020
10:26 AM
Hi Timothy, here are the details. Please let me know what log information you are looking for.
NiFi version - 1.11.4
HDFS version - 2.6.0+cdh5.16.2+2863
Encrypted file system? - No KMS; HDFS has Kerberos & SSL enabled
Cloud? - VM, on-prem
OS version - RHEL 7
JDK version / JVM version - jdk1.8.0_162
CDH/HDP/CDP version - CDH 5.16.2
PutHDFS settings -
1) Hadoop Configuration Resources - hdfs-site.xml, core-site.xml
2) Kerberos Credentials Service - Keytab location - keytab placed on all nodes in the same path
3) Directory - HDFS - /path/to/folder
4) Conflict Resolution Strategy - Replace
What kind of data? Example data - a simple text file
08-11-2020
01:50 AM
Hello, I get the error message below in my NiFi logs when I try to write a file to HDFS. When I write a file from within the Hadoop cluster it works fine, but from NiFi it fails with the message below. My NiFi service is started by root, and from a local NiFi instance I am able to write the file to HDFS, whereas from the NiFi cluster I am unable to do so. Any help would be highly appreciated. I am trying another solution; if that works I will post it here.
2020-08-10 13:41:59,519 INFO [NiFi Web Server-32056] o.a.n.c.s.StandardProcessScheduler Starting LogMessage[id=b4bc6d2c-0173-1000-0000-00002905a41b]
2020-08-10 13:41:59,519 INFO [NiFi Web Server-32056] o.a.n.controller.StandardProcessorNode Starting LogMessage[id=b4bc6d2c-0173-1000-0000-00002905a41b]
2020-08-10 13:41:59,519 INFO [NiFi Web Server-32056] o.a.n.c.s.StandardProcessScheduler Starting LogMessage[id=b4bd264b-0173-1000-0000-000018f91304]
2020-08-10 13:41:59,519 INFO [NiFi Web Server-32056] o.a.n.controller.StandardProcessorNode Starting LogMessage[id=b4bd264b-0173-1000-0000-000018f91304]
2020-08-10 13:41:59,519 INFO [NiFi Web Server-32056] o.a.n.c.s.StandardProcessScheduler Starting GetFile[id=b4d14ae8-0173-1000-ffff-ffffe680a6a0]
2020-08-10 13:41:59,519 INFO [NiFi Web Server-32056] o.a.n.controller.StandardProcessorNode Starting GetFile[id=b4d14ae8-0173-1000-ffff-ffffe680a6a0]
2020-08-10 13:41:59,519 INFO [NiFi Web Server-32056] o.a.n.c.s.StandardProcessScheduler Starting PutHDFS[id=4d34342b-2901-125d-917f-567e466964c8]
2020-08-10 13:41:59,519 INFO [NiFi Web Server-32056] o.a.n.controller.StandardProcessorNode Starting PutHDFS[id=4d34342b-2901-125d-917f-567e466964c8]
2020-08-10 13:41:59,519 INFO [Timer-Driven Process Thread-6] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled GetFile[id=b4d14ae8-0173-1000-ffff-ffffe680a6a0] to run with 1 threads
2020-08-10 13:41:59,519 INFO [Timer-Driven Process Thread-2] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled LogMessage[id=b4bc6d2c-0173-1000-0000-00002905a41b] to run with 1 threads
2020-08-10 13:41:59,519 INFO [Timer-Driven Process Thread-5] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled LogMessage[id=b4bd264b-0173-1000-0000-000018f91304] to run with 1 threads
2020-08-10 13:41:59,543 INFO [Timer-Driven Process Thread-10] o.a.hadoop.security.UserGroupInformation Login successful for user abc@UX.xyzCORP.NET using keytab file /home/abc/confFiles/abc.keytab
2020-08-10 13:41:59,544 INFO [Timer-Driven Process Thread-10] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled PutHDFS[id=4d34342b-2901-125d-917f-567e466964c8] to run with 1 threads
2020-08-10 13:41:59,595 INFO [Thread-9481] o.a.h.h.p.d.sasl.SaslDataTransferClient SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-08-10 13:41:59,599 INFO [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Exception in createBlockOutputStream blk_1075334640_1594409
java.io.EOFException: null
at java.io.DataInputStream.readByte(DataInputStream.java:267)
at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:308)
at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:329)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFieldsLegacy(BlockTokenIdentifier.java:240)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFields(BlockTokenIdentifier.java:221)
at org.apache.hadoop.security.token.Token.decodeIdentifier(Token.java:200)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:530)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:342)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:276)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:245)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:203)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:193)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1731)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1679)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
2020-08-10 13:41:59,599 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Abandoning BP-1824237254-0.00.64.55-1545405130172:blk_1075334640_1594409
2020-08-10 13:41:59,601 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Excluding datanode DatanodeInfoWithStorage[0.00.64.57:50010,DS-d6f56418-6e18-4317-a8ec-4a5b15757728,DISK]
2020-08-10 13:41:59,605 INFO [Thread-9481] o.a.h.h.p.d.sasl.SaslDataTransferClient SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-08-10 13:41:59,606 INFO [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Exception in createBlockOutputStream blk_1075334641_1594410
java.io.EOFException: null
at java.io.DataInputStream.readByte(DataInputStream.java:267)
at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:308)
at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:329)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFieldsLegacy(BlockTokenIdentifier.java:240)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFields(BlockTokenIdentifier.java:221)
at org.apache.hadoop.security.token.Token.decodeIdentifier(Token.java:200)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:530)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:342)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:276)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:245)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:203)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:193)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1731)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1679)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
2020-08-10 13:41:59,606 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Abandoning BP-1824237254-0.00.64.55-1545405130172:blk_1075334641_1594410
2020-08-10 13:41:59,608 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Excluding datanode DatanodeInfoWithStorage[0.00.64.56:50010,DS-286b28e8-d035-4b8c-a2dd-aabb08666234,DISK]
2020-08-10 13:41:59,612 INFO [Thread-9481] o.a.h.h.p.d.sasl.SaslDataTransferClient SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-08-10 13:41:59,612 INFO [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Exception in createBlockOutputStream blk_1075334642_1594411
java.io.EOFException: null
at java.io.DataInputStream.readByte(DataInputStream.java:267)
at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:308)
at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:329)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFieldsLegacy(BlockTokenIdentifier.java:240)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFields(BlockTokenIdentifier.java:221)
at org.apache.hadoop.security.token.Token.decodeIdentifier(Token.java:200)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:530)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:342)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:276)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:245)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:203)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:193)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1731)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1679)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
2020-08-10 13:41:59,612 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Abandoning BP-1824237254-0.00.64.55-1545405130172:blk_1075334642_1594411
2020-08-10 13:41:59,614 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Excluding datanode DatanodeInfoWithStorage[0.00.64.58:50010,DS-53536364-33f4-40d6-85c2-508abf7ff023,DISK]
2020-08-10 13:41:59,618 INFO [Thread-9481] o.a.h.h.p.d.sasl.SaslDataTransferClient SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-08-10 13:41:59,619 INFO [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Exception in createBlockOutputStream blk_1075334643_1594412
java.io.EOFException: null
at java.io.DataInputStream.readByte(DataInputStream.java:267)
at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:308)
at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:329)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFieldsLegacy(BlockTokenIdentifier.java:240)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFields(BlockTokenIdentifier.java:221)
at org.apache.hadoop.security.token.Token.decodeIdentifier(Token.java:200)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:530)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:342)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:276)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:245)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:203)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:193)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1731)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1679)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
2020-08-10 13:41:59,619 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Abandoning BP-1824237254-0.00.64.55-1545405130172:blk_1075334643_1594412
2020-08-10 13:41:59,621 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Excluding datanode DatanodeInfoWithStorage[0.00.64.84:50010,DS-abba7d97-925a-4299-af86-b58fef9aaa12,DISK]
2020-08-10 13:41:59,621 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer DataStreamer Exception
java.io.IOException: Unable to create new block.
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1694)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
2020-08-10 13:41:59,621 WARN [Thread-9481] org.apache.hadoop.hdfs.DataStreamer Could not get block locations. Source file "/user/abc/puthdfs_test/.test.txt" - Aborting...block==null
2020-08-10 13:41:59,626 ERROR [Timer-Driven Process Thread-2] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=4d34342b-2901-125d-917f-567e466964c8] Failed to write to HDFS due to org.apache.nifi.processor.exception.ProcessException: IOException thrown from PutHDFS[id=4d34342b-2901-125d-917f-567e466964c8]: java.io.IOException: Could not get block locations. Source file "/user/abc/puthdfs_test/.test.txt" - Aborting...block==null: org.apache.nifi.processor.exception.ProcessException: IOException thrown from PutHDFS[id=4d34342b-2901-125d-917f-567e466964c8]: java.io.IOException: Could not get block locations. Source file "/user/abc/puthdfs_test/.test.txt" - Aborting...block==null
org.apache.nifi.processor.exception.ProcessException: IOException thrown from PutHDFS[id=4d34342b-2901-125d-917f-567e466964c8]: java.io.IOException: Could not get block locations. Source file "/user/abc/puthdfs_test/.test.txt" - Aborting...block==null
at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2347)
at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2292)
at org.apache.nifi.processors.hadoop.PutHDFS$1.run(PutHDFS.java:320)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1710)
at org.apache.nifi.processors.hadoop.PutHDFS.onTrigger(PutHDFS.java:250)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1176)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Could not get block locations. Source file "/user/abc/puthdfs_test/.test.txt" - Aborting...block==null
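One more thing I plan to verify, although I am not sure yet whether it is the culprit: that the data-transfer protection settings the NiFi nodes pick up from the copied site files match what the cluster actually uses. Roughly, from a NiFi node (this assumes the hdfs client is installed there and that hdfs-site.xml/core-site.xml sit next to the keytab under /home/abc/confFiles; adjust the path if not):
HADOOP_CONF_DIR=/home/abc/confFiles hdfs getconf -confKey dfs.encrypt.data.transfer
HADOOP_CONF_DIR=/home/abc/confFiles hdfs getconf -confKey dfs.data.transfer.protection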
06-11-2020
04:47 AM
I get the error below whenever I try to connect from NiFi to the Cloudera (5.16.2) HiveServer2. Kerberos is enabled, and SSL is enabled for the Hive service as well. Hive runs on OpenJDK, while NiFi runs on Oracle JDK 1.8.0_162. The JDBC URI I use in the NiFi processor is jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET
2020-06-11 10:03:34,853 INFO [NiFi Web Server-709] o.a.n.controller.StandardProcessorNode Starting PutHiveQL[id=734f0b34-d3fc-16bb-ffff-ffffeacb958f] 2020-06-11 10:03:34,853 INFO [Timer-Driven Process Thread-8] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled PutHiveQL[id=734f0b34-d3fc-16bb-ffff-ffffeacb958f] to run with 1 threads 2020-06-11 10:03:34,857 INFO [Timer-Driven Process Thread-7] org.apache.hive.jdbc.Utils Supplied authorities: hostname:10000 2020-06-11 10:03:34,857 INFO [Timer-Driven Process Thread-7] org.apache.hive.jdbc.Utils Resolved authority: hostname:10000 2020-06-11 10:03:34,860 INFO [Timer-Driven Process Thread-7] org.apache.hive.jdbc.HiveConnection Will try to open client transport with JDBC Uri: jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET 2020-06-11 10:03:34,861 INFO [Timer-Driven Process Thread-7] org.apache.hive.jdbc.HiveConnection Could not open client transport with JDBC Uri: jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET 2020-06-11 10:03:34,861 INFO [Timer-Driven Process Thread-7] org.apache.hive.jdbc.HiveConnection Transport Used for JDBC connection: null 2020-06-11 10:03:34,862 ERROR [Timer-Driven Process Thread-7] o.a.nifi.dbcp.hive.HiveConnectionPool HiveConnectionPool[id=734f0b3c-d3fc-16bb-ffff-fffffcb9ca34] Error getting Hive connection: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET: Invalid status 21) org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET: Invalid status 21) at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549) at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388) at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044) at org.apache.nifi.dbcp.hive.HiveConnectionPool.lambda$getConnection$0(HiveConnectionPool.java:369) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656) at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:369) at org.apache.nifi.dbcp.DBCPService.getConnection(DBCPService.java:49) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:87) at com.sun.proxy.$Proxy227.getConnection(Unknown Source) at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$1(PutHiveQL.java:209) at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:97) at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:295) at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114) at 
org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184) at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:295) at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1176) at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213) at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117) at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET: Invalid status 21 at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:176) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38) at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582) at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556) at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545) ... 31 common frames omitted Caused by: org.apache.thrift.transport.TTransportException: Invalid status 21 at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232) at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:277) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204) ... 
37 common frames omitted 2020-06-11 10:03:34,862 ERROR [Timer-Driven Process Thread-7] o.apache.nifi.processors.hive.PutHiveQL PutHiveQL[id=734f0b34-d3fc-16bb-ffff-ffffeacb958f] org.apache.nifi.processors.hive.PutHiveQL$$Lambda$960/419761608@2c9b9d46 failed to process due to org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET: Invalid status 21); rolling back session: org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET: Invalid status 21) org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET: Invalid status 21) at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:384) at org.apache.nifi.dbcp.DBCPService.getConnection(DBCPService.java:49) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:87) at com.sun.proxy.$Proxy227.getConnection(Unknown Source) at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$1(PutHiveQL.java:209) at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:97) at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:295) at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114) at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184) at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:295) at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1176) at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213) at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117) at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET: Invalid status 21) at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549) at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388) at
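To take NiFi out of the picture first, I am planning to try the exact same URL with beeline from the NiFi host after a kinit with the same principal, roughly:
beeline -u "jdbc:hive2://hostname:10000/;ssl=true;sslTrustStore=/data/nifi-1.10.0/tls/truststore.jks;principal=hive/hostname@AA.BBCC.NET"
(A trustStorePassword parameter may also be needed in the URL, depending on how the truststore was created.) If beeline connects fine, the problem is on the NiFi/driver side rather than with HiveServer2.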
06-28-2019
02:36 AM
Hi Dennis, can we install the CFM parcels to use NiFi in the Cloudera Enterprise cluster? How about the licensing: does it need to be purchased separately, or is it included? I think it is only if we go with CDP that we need a license to use NiFi? Could you please shed some light on the licensing part. Thanks, David
04-23-2018
07:32 AM
Hello All, Am trying to connect JMeter from my windows machine to Cloudera Cluster, I have placed the Hive JDBC jar files in ../lib/ folder, gave properties in Jaas.conf file and system.properties, did all the required things and I still get the below error... 2018-04-20 10:29:23,776 INFO o.a.j.g.u.MenuFactory: Skipping org.apache.jmeter.assertions.BSFAssertion 2018-04-20 10:29:24,525 INFO o.a.j.g.u.MenuFactory: Skipping org.apache.jmeter.extractor.BSFPostProcessor 2018-04-20 10:29:24,588 INFO o.a.j.g.u.MenuFactory: Skipping org.apache.jmeter.modifiers.BSFPreProcessor 2018-04-20 10:29:24,651 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for text/html is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser 2018-04-20 10:29:24,651 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for application/xhtml+xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser 2018-04-20 10:29:24,651 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for application/xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser 2018-04-20 10:29:24,651 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for text/xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser 2018-04-20 10:29:24,651 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for text/vnd.wap.wml is org.apache.jmeter.protocol.http.parser.RegexpHTMLParser 2018-04-20 10:29:24,651 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for text/css is org.apache.jmeter.protocol.http.parser.CssParser 2018-04-20 10:29:25,400 INFO o.a.j.e.KeyToolUtils: Exception checking for keytool existence, will return false, try another way. 2018-04-20 10:29:26,542 INFO o.a.j.e.KeyToolUtils: keytool found at 'C:\Program Files\Java\jre1.8.0_162\bin\keytool' 2018-04-20 10:29:26,542 INFO o.a.j.p.h.p.ProxyControl: HTTP(S) Test Script Recorder SSL Proxy will use keys that support embedded 3rd party resources in file C:\Users\extdku\Downloads\apache-jmeter-3.3\apache-jmeter-3.3\bin\proxyserver.jks 2018-04-20 10:29:28,165 INFO o.a.j.g.u.MenuFactory: Skipping org.apache.jmeter.protocol.java.sampler.BSFSampler 2018-04-20 10:29:28,212 INFO o.a.j.s.FileServer: Default base='C:\Users\extdku\Downloads\apache-jmeter-3.3\apache-jmeter-3.3\bin' 2018-04-20 10:29:28,290 INFO o.a.j.g.u.MenuFactory: Skipping org.apache.jmeter.protocol.mongodb.config.MongoSourceElement 2018-04-20 10:29:28,290 INFO o.a.j.g.u.MenuFactory: Skipping org.apache.jmeter.protocol.mongodb.sampler.MongoScriptSampler 2018-04-20 10:29:29,757 INFO o.a.j.g.u.MenuFactory: Skipping org.apache.jmeter.timers.BSFTimer 2018-04-20 10:29:29,788 INFO o.a.j.g.u.MenuFactory: Skipping org.apache.jmeter.visualizers.BSFListener 2018-04-20 10:29:30,287 INFO o.a.j.s.SampleResult: Note: Sample TimeStamps are START times 2018-04-20 10:29:30,287 INFO o.a.j.s.SampleResult: sampleresult.default.encoding is set to ISO-8859-1 2018-04-20 10:29:30,287 INFO o.a.j.s.SampleResult: sampleresult.useNanoTime=true 2018-04-20 10:29:30,287 INFO o.a.j.s.SampleResult: sampleresult.nanoThreadSleep=5000 2018-04-20 10:29:37,155 INFO o.a.j.g.a.Load: Loading file: C:\Users\extdku\Desktop\Klimatratt.jmx 2018-04-20 10:29:37,155 INFO o.a.j.s.FileServer: Set new base='C:\Users\extdku\Desktop' 2018-04-20 10:29:37,373 INFO o.a.j.s.SaveService: Testplan (JMX) version: 2.2. 
Testlog (JTL) version: 2.2 2018-04-20 10:29:37,436 INFO o.a.j.s.SaveService: Using SaveService properties file encoding UTF-8 2018-04-20 10:29:37,436 INFO o.a.j.s.SaveService: Using SaveService properties version 3.2 2018-04-20 10:29:37,451 INFO o.a.j.s.SaveService: Loading file: C:\Users\extdku\Desktop\Klimatratt.jmx 2018-04-20 10:29:38,044 INFO o.a.j.s.FileServer: Set new base='C:\Users\extdku\Desktop' 2018-04-20 10:29:40,166 INFO o.a.j.e.StandardJMeterEngine: Running the test! 2018-04-20 10:29:40,244 INFO o.a.j.s.SampleEvent: List of sample_variables: [] 2018-04-20 10:29:40,244 INFO o.a.j.s.SampleEvent: List of sample_variables: [] 2018-04-20 10:29:40,291 INFO o.a.j.g.u.JMeterMenuBar: setRunning(true, *local*) 2018-04-20 10:29:40,431 INFO o.a.j.e.StandardJMeterEngine: Starting ThreadGroup: 1 : Input Parameters 2018-04-20 10:29:40,431 INFO o.a.j.e.StandardJMeterEngine: Starting 1 threads for group Input Parameters. 2018-04-20 10:29:40,431 INFO o.a.j.e.StandardJMeterEngine: Thread will continue on error 2018-04-20 10:29:40,431 INFO o.a.j.t.ThreadGroup: Starting thread group... number=1 threads=1 ramp-up=1 perThread=1000.0 delayedStart=false 2018-04-20 10:29:40,431 INFO o.a.j.t.ThreadGroup: Started thread group number 1 2018-04-20 10:29:40,431 INFO o.a.j.e.StandardJMeterEngine: All thread groups have been started 2018-04-20 10:29:40,447 INFO o.a.j.t.JMeterThread: Thread started: Input Parameters 1-1 2018-04-20 10:29:40,463 INFO o.a.h.j.Utils: Supplied authorities: abc.ux.xxxx.net:10000 2018-04-20 10:29:40,463 INFO o.a.h.j.Utils: Resolved authority: abc .ux. xxxx .net:10000 2018-04-20 10:29:40,650 ERROR o.a.h.u.Shell: Failed to locate the winutils binary in the hadoop binary path java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:404) ~[hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:419) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.util.Shell.<clinit>(Shell.java:412) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:168) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.security.Groups.<init>(Groups.java:132) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.security.Groups.<init>(Groups.java:100) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:435) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:337) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:304) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:891) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:857) [hadoop-common-2.6.0-cdh5.11.0.jar:?] 
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge.createClientWithConf(HadoopThriftAuthBridge.java:85) [hive-exec-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:54) [hive-service-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:415) [hive-jdbc-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:200) [hive-jdbc-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167) [hive-jdbc-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) [hive-jdbc-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2304) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2290) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.jmeter.protocol.jdbc.config.DataSourceElement$DataSourceComponentImpl.getConnection(DataSourceElement.java:326) [ApacheJMeter_jdbc.jar:3.3 r1808647] at org.apache.jmeter.protocol.jdbc.config.DataSourceElement.getConnection(DataSourceElement.java:191) [ApacheJMeter_jdbc.jar:3.3 r1808647] at org.apache.jmeter.protocol.jdbc.sampler.JDBCSampler.sample(JDBCSampler.java:79) [ApacheJMeter_jdbc.jar:3.3 r1808647] at org.apache.jmeter.threads.JMeterThread.executeSamplePackage(JMeterThread.java:498) [ApacheJMeter_core.jar:3.3 r1808647] at org.apache.jmeter.threads.JMeterThread.processSampler(JMeterThread.java:424) [ApacheJMeter_core.jar:3.3 r1808647] at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:255) [ApacheJMeter_core.jar:3.3 r1808647] at java.lang.Thread.run(Unknown Source) [?:1.8.0_162] 2018-04-20 10:29:40,884 ERROR o.a.t.t.TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(Unknown Source) ~[?:1.8.0_162] at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[hive-exec-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [hive-exec-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [hive-exec-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [hive-exec-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) [hive-exec-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_162] at javax.security.auth.Subject.doAs(Unknown Source) [?:1.8.0_162] at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920) [hadoop-common-2.6.0-cdh5.11.0.jar:?] at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [hive-exec-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:202) [hive-jdbc-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167) [hive-jdbc-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) [hive-jdbc-1.1.0-cdh5.11.0.jar:1.1.0-cdh5.11.0] at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2304) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2290) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533) [commons-dbcp2-2.1.1.jar:2.1.1] at org.apache.jmeter.protocol.jdbc.config.DataSourceElement$DataSourceComponentImpl.getConnection(DataSourceElement.java:326) [ApacheJMeter_jdbc.jar:3.3 r1808647] at org.apache.jmeter.protocol.jdbc.config.DataSourceElement.getConnection(DataSourceElement.java:191) [ApacheJMeter_jdbc.jar:3.3 r1808647] at org.apache.jmeter.protocol.jdbc.sampler.JDBCSampler.sample(JDBCSampler.java:79) [ApacheJMeter_jdbc.jar:3.3 r1808647] at org.apache.jmeter.threads.JMeterThread.executeSamplePackage(JMeterThread.java:498) [ApacheJMeter_core.jar:3.3 r1808647] at org.apache.jmeter.threads.JMeterThread.processSampler(JMeterThread.java:424) [ApacheJMeter_core.jar:3.3 r1808647] at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:255) [ApacheJMeter_core.jar:3.3 r1808647] at java.lang.Thread.run(Unknown Source) [?:1.8.0_162] Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Unknown Source) ~[?:1.8.0_162] at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Unknown Source) ~[?:1.8.0_162] at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Unknown Source) ~[?:1.8.0_162] at sun.security.jgss.GSSManagerImpl.getMechanismContext(Unknown Source) ~[?:1.8.0_162] at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source) ~[?:1.8.0_162] at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source) ~[?:1.8.0_162] ... 
26 more
2018-04-20 10:29:40,884 INFO o.a.j.t.JMeterThread: Thread is done: Input Parameters 1-1
2018-04-20 10:29:40,884 INFO o.a.j.t.JMeterThread: Thread finished: Input Parameters 1-1
2018-04-20 10:29:40,884 INFO o.a.j.e.StandardJMeterEngine: Notifying test listeners of end of test
2018-04-20 10:29:40,884 INFO o.a.j.g.u.JMeterMenuBar: setRunning(false, *local*)
Jaas.conf
========
JMeter {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=true
doNotPrompt=true
useKeyTab=true
keyTab="C:\Users\extdku\Downloads\apache-jmeter-3.3\apache-jmeter-3.3\bin\hive.keytab"
principal="hive/FQDN@REALM
debug=true;
};
Krb5
====
[libdefaults]
default_realm = UX.XXXXX.NET ;
dns_lookup_realm = true ;
#dns_lookup_kdc = true ;
#rdns = false ;
#ticket_lifetime = 24h ;
#renew_lifetime = 7d ;
#forwardable = yes ;
#udp_preference_limit = 0
[realms]
UX.XXXXX.NET = {
kdc = abc.ux.xxx.net
admin_server = abc.ux.xxx.net
}
[domain_realm]
.ux.xxxx.net = UX.XXXXX.NET
ux.xxxx.net = UX.XXXXX.NET
System.Properties
==============
java.security.krb5.conf="C:\Users\extdku\Downloads\apache-jmeter-3.3\apache-jmeter-3.3\bin\krb5.conf"
java.security.auth.login.config=jaas.conf
-Dsun.security.krb5.debug=true
-Djava.security.debug=gssloginconfig,configfile,configparser,logincontext
-Djava.library.path=C:\Users\extdku\Downloads\apache-jmeter-3.3\apache-jmeter-3.3\bin
Quick responses are highly appreciated. These features are available in HDP, though with no clear instructions. Thanks, David
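P.S. For comparison, this is the shape I believe the JAAS entry should take (keytab path and principal are placeholders; note the closing quote after the principal, which appears to be missing above):
JMeter {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="C:\path\to\hive.keytab"
principal="hive/FQDN@REALM"
doNotPrompt=true
debug=true;
};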