Member since: 02-29-2016 · Posts: 51 · Kudos Received: 13 · Solutions: 0
08-10-2017
12:45 PM
Hi all, I have created an encryption zone, but I am not able to copy data into it as USER_1, which belongs to GROUP_1. I get the following error:

copyFromLocal: User:USER_1 not allowed to do 'DECRYPT_EEK' on 'key1'

In the Ranger KMS policies I have given full access to the group GROUP_1, but I am still facing this issue. Do group-level policies not apply to Ranger KMS, or is there some configuration I have to tweak to make it work? Please help me understand this issue; any clue or suggestion is appreciated. FYI, the cluster is Kerberized. Thanks in advance.
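For reference, a minimal sketch of the failing scenario. The keytab path, zone path, and file name are placeholders, not taken from the post:

```shell
# Authenticate as USER_1 on the Kerberized cluster (keytab path is hypothetical)
kinit -kt /etc/security/keytabs/user_1.keytab USER_1

# List encryption zones to confirm the zone and the key it uses
# (requires sufficient privileges)
hdfs crypto -listZones

# The copy that fails with "not allowed to do 'DECRYPT_EEK' on 'key1'"
hdfs dfs -copyFromLocal data.txt /secure_zone/
```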
07-28-2017
09:32 AM
There is a property in the Ranger KMS configuration that blacklists the hdfs user. Either remove the hdfs user from that property, or try the following:

1. Create another user.
2. Give that user permission to decrypt and encrypt the key in the Ranger KMS policy.
3. Run the command as that user.

Check if it works.
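In stock HDP the blacklist mentioned above is the hadoop.kms.blacklist.DECRYPT_EEK property in the kms-site configuration, which lists hdfs by default. A rough sketch of the workaround, with hypothetical user, key, and path names:

```shell
# In Ambari -> Ranger KMS -> Configs, inspect the kms-site property
#   hadoop.kms.blacklist.DECRYPT_EEK
# By default it contains hdfs, which blocks the hdfs user from DECRYPT_EEK
# regardless of Ranger KMS policies.

# Workaround, per the steps above:
useradd testuser                      # 1. create another user
# 2. in the Ranger KMS UI, grant testuser "Decrypt EEK" (and "Get Metadata")
#    on the key in the relevant KMS policy
su - testuser -c "hdfs dfs -copyFromLocal data.txt /secure_zone/"   # 3. rerun as that user
```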
07-26-2017
05:48 PM
@Don Bosco Durai Can you give an example of creating an encryption key in Ranger KMS using the REST API?
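Since Ranger KMS implements the standard Hadoop KMS REST API, a key-creation call should look roughly like the sketch below. The host name is a placeholder, the port 9292 is the HDP default for Ranger KMS, and on a Kerberized cluster curl needs SPNEGO (--negotiate):

```shell
# Create an encryption key named "key1" via the Hadoop KMS REST API
curl --negotiate -u : \
     -X POST "http://kms-host.example.com:9292/kms/v1/keys" \
     -H "Content-Type: application/json" \
     -d '{
           "name"   : "key1",
           "cipher" : "AES/CTR/NoPadding",
           "length" : 128
         }'

# List key names to verify the key was created
curl --negotiate -u : "http://kms-host.example.com:9292/kms/v1/keys/names"
```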
07-25-2017
12:11 PM
@Andrew Grande So did it work? Did all the REST APIs documented for Hadoop KMS work against Ranger KMS? If yes, please give a sample URL for creating an encryption key.
03-21-2016
01:19 PM
1 Kudo
It worked for a few days' worth of data. By the way, can you please elaborate on the meaning of these properties?
03-17-2016
05:56 PM
1 Kudo
The HDP version is HDP 2.1.2.0-402. I will also follow the steps you have given.
03-16-2016
06:29 PM
1 Kudo
Actually it is a SELECT query that is causing this issue, so do you want me to set these properties while storing the data in the table?
03-16-2016
07:57 AM
4 Kudos
My Hive query is failing even after setting the property ipc.maximum.data.length to the maximum int value, 2147483647, with the following stack trace:

Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large.  May be malicious.  Use CodedInputStream.setSizeLimit() to increase the size limit.
at com.google.protobuf.InvalidProtocolBufferException.sizeLimitExceeded(InvalidProtocolBufferException.java:110)
at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:755)
at com.google.protobuf.CodedInputStream.readRawByte(CodedInputStream.java:769)
at com.google.protobuf.CodedInputStream.readRawVarint64(CodedInputStream.java:462)
at com.google.protobuf.CodedInputStream.readUInt64(CodedInputStream.java:188)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics.<init>(OrcProto.java:4330)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics.<init>(OrcProto.java:4280)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics$1.parsePartialFrom(OrcProto.java:4454)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics$1.parsePartialFrom(OrcProto.java:4449)
at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics.<init>(OrcProto.java:12224)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics.<init>(OrcProto.java:12171)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics$1.parsePartialFrom(OrcProto.java:12260)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics$1.parsePartialFrom(OrcProto.java:12255)
at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.<init>(OrcProto.java:12898)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.<init>(OrcProto.java:12845)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata$1.parsePartialFrom(OrcProto.java:12934)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata$1.parsePartialFrom(OrcProto.java:12929)
at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.parseFrom(OrcProto.java:13045)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl$MetaInfoObjExtractor.<init>(ReaderImpl.java:426)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.<init>(ReaderImpl.java:295)
at org.apache.hadoop.hive.ql.io.orc.OrcFile.createReader(OrcFile.java:197)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getRecordReader(OrcInputFormat.java:999)
at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:65)
... 16 more
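For context, ipc.maximum.data.length is set in core-site.xml and caps the size of a Hadoop RPC message (the default is 64 MB). A sketch of the setting the post describes:

```xml
<!-- core-site.xml: raise the maximum RPC message size (default 67108864, i.e. 64 MB) -->
<property>
  <name>ipc.maximum.data.length</name>
  <value>2147483647</value>
</property>
```

Note that the stack trace above comes from the ORC reader parsing file metadata with protobuf, whose own 64 MB CodedInputStream limit is separate from the RPC setting. That may be why raising ipc.maximum.data.length alone did not help here.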
Labels: Apache Hadoop, Apache Hive
02-29-2016
03:29 PM
That is fine. So, can I get any help here with building this old release?
02-29-2016
03:22 PM
Download link for the Hive source code: https://github.com/hortonworks/hive-release/tree/HDP-2.1.2.0. We are using 0.13 for our development and we are facing an issue, so I want to do remote debugging on it.