Member since: 10-17-2017
Posts: 13
Kudos Received: 0
Solutions: 0
05-16-2018 10:33 AM
Hi, we are researching how to approach "Data at Rest Encryption" and "Data on the Wire Encryption" for HBase. So far we have explored Ranger KMS for encrypting data at rest in our Hadoop cluster. My question: if we move to AWS, where EBS provides encryption of data at rest, is that sufficient, or should Ranger KMS also be present to support encryption? Please provide some input so I can better understand how this works. Thanks.
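For context, this is roughly how we have been setting up Ranger KMS-backed encryption zones on our current cluster; the key name and directory below are placeholders, and it assumes Ranger KMS is already configured as the Hadoop key provider:
# Create a key in the KMS and an HDFS encryption zone that uses it
# (key name and path are placeholders).
hadoop key create hbaseKey -size 256
hdfs dfs -mkdir /apps/hbase/encrypted
hdfs crypto -createZone -keyName hbaseKey -path /apps/hbase/encrypted
hdfs crypto -listZones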
04-27-2018 06:34 AM
Hi, I am using Ranger to test authorization of HBase tables by enabling the HBase plug-in. I have created a few test users on my Linux box and have used the Ranger APIs to create policies that give specific permissions to those test users. Everything works fine. But I want to know whether I can grant/revoke permissions for those HBase tables through Phoenix. I tried running a GRANT command from Phoenix, but it did not work; it threw a syntax error. I need to understand whether this is possible while Ranger is running in my cluster. Thanks.
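For reference, this is roughly how I have been creating the policies through the Ranger REST API; the service name, policy name, table, and user below are placeholders, and the JSON layout may differ slightly between Ranger versions:
# Sketch: grant read/write on an HBase table via a Ranger policy
# instead of a Phoenix GRANT (names in the JSON are placeholders).
curl -u admin:admin -X POST -H "Content-Type: application/json" \
  http://localhost:6080/service/public/v2/api/policy \
  -d '{
        "service": "cluster_hbase",
        "name": "test_table_rw",
        "resources": {
          "table": { "values": ["TEST_TABLE"] },
          "column-family": { "values": ["*"] },
          "column": { "values": ["*"] }
        },
        "policyItems": [{
          "users": ["testuser1"],
          "accesses": [
            { "type": "read", "isAllowed": true },
            { "type": "write", "isAllowed": true }
          ]
        }]
      }'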
04-02-2018 01:15 PM
Can I add that putting secrets in your s3a:// path is dangerous, as they will end up in Hadoop logs across the cluster.
Best: put them in a JCEKS file in HDFS or another secure keystore.
Good: set them as options in the hadoop/hbase configurations.
Weak: setting them on the command line with -D options (visible with a ps command).
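A rough sketch of the JCEKS option; the keystore path and bucket name are placeholders:
# Store the S3A credentials in a JCEKS keystore on HDFS
# (you will be prompted for each value).
hadoop credential create fs.s3a.access.key -provider jceks://hdfs/user/hbase/s3.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs/user/hbase/s3.jceks

# Reference the keystore instead of embedding keys in the URL.
hadoop fs -D hadoop.security.credential.provider.path=jceks://hdfs/user/hbase/s3.jceks \
  -ls s3a://my-bucket/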
03-13-2018 04:30 PM
@Swetha Nelwad Please check my answer at the link below. https://community.hortonworks.com/questions/147239/hbase-tables-are-not-visible-through-phoenix-clien.html Hope this helps.
02-28-2018 02:09 PM
Does your user have the required permissions? See also:
https://community.hortonworks.com/questions/47197/phoenix-backup.html
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_data-access/content/ch_hbase_bar.html
You have to specify table names, so specify the SYSTEM ones as well. From the docs, under "Creating and Maintaining a Complete Backup Image":
The first step in running the backup-and-restore utilities is to perform a full backup and to store the data in a separate image from the source. At a minimum, you must do this to get a baseline before you can rely on incremental backups.
Important tip: record the backup ID that appears at the end of a successful backup. If the source cluster fails and you need to recover the dataset with a restore operation, having the backup ID readily available can save time.
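A rough sketch of what that looks like on the command line; the destination path and table names are placeholders, and the exact option syntax varies between HBase/HDP versions (newer builds use -t, older HDP builds take the table list as a positional argument):
# Full backup that includes the Phoenix SYSTEM tables (placeholders).
hbase backup create full hdfs://namenode-host:8020/user/hbase/backups \
  -t MY_TABLE,SYSTEM.CATALOG,SYSTEM.SEQUENCE,SYSTEM.STATS,SYSTEM.FUNCTION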
02-19-2018 06:51 AM
Hi, I am trying to take a full backup using the command "hbase backup create full </local/path>". In my HBase, for testing purposes, there is only one user table created and all the others are default HBase tables. I get the error below when I run the backup command. Can someone please tell me what configuration I need to add in order to resolve this timeout issue?
Backup session finished. Status: FAILURE
2018-02-19 06:10:24,928 ERROR [main] util.AbstractHBaseTool: Error running command-line tool
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hbase.errorhandling.ForeignException): org.apache.hadoop.hbase.errorhandling.TimeoutException: Timeout elapsed! Source:Timeout caused Foreign Exception Start:1519020556896, End:1519020616896, diff:60000, max:60000 ms
at org.apache.hadoop.hbase.errorhandling.ForeignExceptionDispatcher.rethrowException(ForeignExceptionDispatcher.java:83)
at org.apache.hadoop.hbase.backup.master.LogRollMasterProcedureManager.execProcedure(LogRollMasterProcedureManager.java:129)
at org.apache.hadoop.hbase.master.procedure.MasterProcedureUtil.execProcedure(MasterProcedureUtil.java:93)
at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.executeFromState(FullTableBackupProcedure.java:525)
at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.executeFromState(FullTableBackupProcedure.java:69)
at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:107)
at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:500)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:1086)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:888)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:841)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$200(ProcedureExecutor.java:77)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$1.run(ProcedureExecutor.java:443)
I have already tried adding the property below to my hbase-site.xml file, but I still get the timeout:
<property>
  <name>hbase.snapshot.region.timeout</name>
  <value>300000</value>
</property>
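For reference, this is roughly how I have been looking for more detail on the failing procedure; the log paths are the usual HDP defaults and may differ on your cluster:
# Look for the failing log-roll procedure on the master and region servers
# around the failure time (log locations are placeholders).
grep -iE "TimeoutException|LogRoll" /var/log/hbase/hbase-*-master-*.log
grep -iE "TimeoutException|LogRoll" /var/log/hbase/hbase-*-regionserver-*.log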
12-06-2017 05:53 AM
Hi, I have upgraded my HDP to version 2.6.2 and am testing Ranger 0.7. I added a new user using the REST API from the host machine as below:
curl -u admin:admin -v -i -s -X POST -H "Accept: application/json" -H "Content-Type: application/json" http://localhost:6080/service/xusers/users -d @/file_path/newuser.json
The command executed successfully, creating a new user with id = 23, and when I run the curl command below to get this user, it returns the recently added user. But the Ranger UI does not show the new user. Can anyone please tell me why this is happening?
curl -u admin:admin -v -i -s -X GET http://localhost:6080/service/xusers/users/23
Thanks.
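For reference, a rough sketch of the payload I am sending in newuser.json; the field names follow the xusers user model as I understand it and the values are placeholders, so adjust for your Ranger version:
# Hypothetical newuser.json for the /service/xusers/users endpoint.
cat > /file_path/newuser.json <<'EOF'
{
  "name": "testuser1",
  "firstName": "Test",
  "lastName": "User",
  "emailAddress": "testuser1@example.com",
  "password": "Password123",
  "userRoleList": ["ROLE_USER"],
  "status": 1
}
EOF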
11-15-2017 08:18 AM
1 Kudo
@Swetha Nelwad It looks like a documentation issue with link 2: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.3/bk_support-matrices/content/ch_matrices-ambari.html#ambari_stack In order to manage HDP 2.6 you will need Ambari 2.5 or above; HDP 2.6 is compatible with Ambari 2.5.x and 2.6.x.
11-15-2017 08:23 AM
@Swetha Nelwad Apologies for the late response on this thread. Since Ranger port 6080 is not open yet, it may be either a firewall issue blocking access to the port, or a Ranger issue that prevents the port from opening; in the latter case, looking at the Ranger Admin logs will help. It is not a Quick Links issue.
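A quick sketch of the checks I would run on the Ranger host; the log path is the usual HDP default and may differ on your install:
# Is anything listening on port 6080?
netstat -tlnp | grep 6080        # or: ss -ltnp | grep 6080

# Is a firewall rule in the way?
iptables -L -n | grep 6080

# If the port never opens, check the Ranger Admin log (default HDP location).
tail -n 200 /var/log/ranger/admin/xa_portal.log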