Support Questions


HDFS Encryption Zone - hdfs user not able to access file in encryption zone, even after providing access in Ranger

Expert Contributor

screen-shot-2017-01-23-at-63028-pm.png

screen-shot-2017-01-23-at-63106-pm.png

Hi - I'm evaluating and implementing HDP Data at Rest Encryption, and the hdfs user is not able to access a file placed in an HDFS encryption zone.

Here is what was done:

- created HDFS folder /enczone1

- created key testkeyfromcli and an encryption zone using that key

- added two files to the encryption zone: /enczone1/myfile.txt and /enczone1/myfile_1.txt

- using Ranger, created a policy to give read/write access to user hdfs

- using Ranger, granted access to key testkeyfromcli

One other step I did was to run the following command, to ensure the superuser does not have access to myfile.txt:

sudo -u hdfs hadoop fs -setfattr -n security.hdfs.unreadable.by.superuser /enczone1/myfile.txt
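For reference, the setup steps above can be sketched as one command sequence (a minimal sketch assuming an HDP sandbox with Ranger KMS; shown as a dry run via echo, since these commands need a live cluster):

```shell
# Dry-run sketch of the encryption-zone setup described above.
# 'run' only echoes each command; remove the echo to execute for real.
run() { echo "+ $*"; }

run hadoop key create testkeyfromcli                    # create the key in Ranger KMS
run sudo -u hdfs hdfs dfs -mkdir /enczone1              # zone dir must exist and be empty
run sudo -u hdfs hdfs crypto -createZone -keyName testkeyfromcli -path /enczone1
run hdfs dfs -put myfile.txt /enczone1/                 # files written here are encrypted
run sudo -u hdfs hdfs crypto -listZones                 # verify the zone and its key
```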

On running the commands below, I'm unable to access /enczone1/myfile.txt, which is the expected result.

However, I'm also unable to access /enczone1/myfile_1.txt; the error says user hdfs is not allowed to 'DECRYPT_EEK' on 'testkeyfromcli', even though access has already been granted to user hdfs (as seen in the attached screenshots).

Any ideas?

----------------------------------------------------------------------------------------------------------------

[root@sandbox ~]# sudo -u hdfs hdfs dfs -cat /enczone1/myfile.txt
cat: Access is denied for hdfs since the superuser is not allowed to perform this operation.

[root@sandbox ~]# sudo -u hdfs hdfs dfs -cat /enczone1/myfile_1.txt
cat: User:hdfs not allowed to do 'DECRYPT_EEK' on 'testkeyfromcli'

[root@sandbox ~]# sudo -u hdfs hdfs crypto -listZones
/zone_encr   key1
/enczone1    testkeyfromcli
/enczone2    testkeyfromcli
/enczone3    key2

1 ACCEPTED SOLUTION

Cloudera Employee

@Karan Alang

The hdfs superuser is blacklisted via the property hadoop.kms.blacklist.DECRYPT_EEK (set in Ambari). That is the likely reason you are unable to decrypt as the hdfs user. It is recommended that HDFS superusers not be given the privilege to decrypt data. Try granting the decrypt permission to another user who has basic read permission on /enczone1/myfile_1.txt.
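The blacklist this reply refers to is an XML property in the KMS configuration; on this sandbox it appears in /etc/ranger/kms/conf/dbks-site.xml (the exact file can vary by install):

```xml
<!-- Users listed here are denied DECRYPT_EEK even if a Ranger key
     policy grants them access to the key -->
<property>
  <name>hadoop.kms.blacklist.DECRYPT_EEK</name>
  <value>hdfs</value>
</property>
```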


10 REPLIES

Expert Contributor

@Carter Everett, @Ali Bajwa - any ideas on this?

Expert Contributor

@Kuldeep Kulkarni - any ideas on this?


Rising Star

This is by design: we want only authorized end users and applications to be able to access decrypted files. The hdfs superuser is usually a Hadoop admin, and by design we provide this "separation of duties": admin users who operate the cluster cannot see decrypted content in TDE-encrypted folders, an additional safeguard against the threat of rogue admins.

Expert Contributor

@Mahesh M. Pillai, @svenkat - it seems I'm not able to get the hive user to insert data into a Hive table, although I've granted access to user 'hive' using Ranger.

Here are the details:

- created HDFS location /encrypt/hive

- created an encryption zone

- changed the scratchdir to a location in the encryption zone (/encrypt/hive/tmp) and set permissions to 777

- created table 'testhive' in location '/encrypt/hive':

0: jdbc:hive2://sandbox.hortonworks.com:10000> create table testhive (rno int, fname string, lname string) location '/encrypt/hive/testhive';

- when I try to add a row to the table, I get the following error:

------------------------------------------

0: jdbc:hive2://sandbox.hortonworks.com:10000> insert into testhive values(1, 'karan', 'alang');
Error: Error while compiling statement: FAILED: SemanticException [Error 10293]: Unable to create temp file for insert values User:hive not allowed to do 'DECRYPT_EEK' on 'testkeyfromcli' (state=42000,code=10293)

-----------------------------------------------

Attached is a screenshot of the Ranger permissions for user hive on location /encrypt/hive/.

screen-shot-2017-01-24-at-30823-pm.png

Please note the value of hadoop.kms.blacklist.DECRYPT_EEK in /etc/ranger/kms/conf/dbks-site.xml:

<property>
  <name>hadoop.kms.blacklist.DECRYPT_EEK</name>
  <value>hdfs</value>
</property>

Expert Contributor

@Mahesh M. Pillai, @svenkat - was able to fix this by granting user 'hive' the 'DECRYPT_EEK' permission on 'testkeyfromcli'.

Expert Contributor

@Mahesh M. Pillai, @svenkat - getting one more issue when I create a table using the command below. Any ideas on how to fix this?

> create table testtable2 location '/encrypt/hive/testtable2' as select * from sample_07 limit 5;

----------------------------------------------

INFO : Moving data to: /encrypt/hive/testtable2 from hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/.hive-staging_hive_2017-01-24_23-28-52_250_4192737411546010800-4/-ext-10001
ERROR : Failed with exception Unable to move source hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/.hive-staging_hive_2017-01-24_23-28-52_250_4192737411546010800-4/-ext-10001 to destination /encrypt/hive/testtable2
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to move source hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/.hive-staging_hive_2017-01-24_23-28-52_250_4192737411546010800-4/-ext-10001 to destination /encrypt/hive/testtable2
    at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2692)
    at org.apache.hadoop.hive.ql.exec.MoveTask.moveFile(MoveTask.java:106)
    at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:223)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1720)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1477)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1254)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1118)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1113)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
    at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71)
    at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.ipc.RemoteException(java.io.IOException): /apps/hive/warehouse/.hive-staging_hive_2017-01-24_23-28-52_250_4192737411546010800-4/-ext-10001 can't be moved into an encryption zone.

Cloudera Employee

@Karan Alang

From what I understand, data cannot be 'moved' from outside an encryption zone into an encryption zone in HDFS.

In your case, the temporary data from the source table (sample_07) is loaded into the staging directory, which by default is created under /apps/hive/warehouse/.hive-staging*, and that directory does not belong to the encryption zone (/encrypt/hive) you created.

The hive user then tries to "move" this data from the staging directory into the encryption zone, and that fails.

So you can point the staging directory at the encryption zone by setting the following Hive property:

set hive.exec.stagingdir=/encrypt/hive/tmp/;

I hope it works for you.
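To persist this beyond a single beeline session, the same property could also be set in hive-site.xml (a sketch; the path assumes the /encrypt/hive zone from this thread):

```xml
<property>
  <name>hive.exec.stagingdir</name>
  <!-- staging dir inside the encryption zone, so the final move is a
       same-zone rename instead of a cross-zone move -->
  <value>/encrypt/hive/tmp/</value>
</property>
```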

Expert Contributor

@Mahesh M. Pillai - hive.exec.stagingdir was already set to /encrypt/hive/tmp/.

There is an additional variable that had to be changed: hive.metastore.warehouse.dir. I changed it from the existing value (/apps/hive/warehouse) to a location in the encryption zone (/encrypt/hive), and that fixed the problem.

----------------------------------

INFO : Moving data to: /encrypt/hive/testtable2 from hdfs://sandbox.hortonworks.com:8020/encrypt/hive/.hive-staging_hive_2017-01-25_22-54-41_396_5265658181234256688-1/-ext-10001
INFO : Table default.testtable2 stats: [numFiles=1, numRows=5, totalSize=211, rawDataSize=206]
No rows affected (47.001 seconds)
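For reference, the warehouse-directory change described above would look like this in hive-site.xml (a sketch using the path from this thread; note that moving the warehouse dir affects every table created in the default location):

```xml
<property>
  <name>hive.metastore.warehouse.dir</name>
  <!-- default table location moved into the encryption zone -->
  <value>/encrypt/hive</value>
</property>
```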