Member since: 02-03-2016
Posts: 123
Kudos Received: 23
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3832 | 04-13-2017 08:09 AM
04-07-2017 08:01 AM
It worked. Thanks a lot. I have also accepted the best answer.
04-07-2017 07:13 AM
Yes, you are absolutely correct. Can this part be removed somehow? We are actually using "sed" to change the flag to false, as we are trying to automate the whole process. If you could guide us on removing that part, it would be a great help. Thanks and Regards, Rajdip
04-07-2017 07:01 AM
Hello, I am not sure if I am missing any steps here, but I am getting an error while executing. I have followed @Jay SenSharma's comments and was able to get the JSON output and update the flag in the JSON. But when uploading the changed JSON using PUT, I am facing an error and it is not working. Note that Ranger is up and I can perform operations from the UI, but the REST API PUT is not working (maybe my mistake). Also, the IP mentioned in the command below (which I have changed here) hosts the Ranger service. We need your help as we are stuck, and every time we have to do the work manually, which is what we want to avoid.

CURL command used to PUT the changed JSON:
curl -i -u admin:admin -H "Content-Type: application/json" -X PUT -d@/tmp/10_2.json http://xx.xx.xx.207:6080/service/plugins/policies/10

Error thrown:
HTTP/1.1 404 Not Found
Server: Apache-Coyote/1.1
Set-Cookie: RANGERADMINSESSIONID=03A8D6199168A17D4C19D442E8C55617; Path=/; HttpOnly
X-Frame-Options: DENY
Content-Length: 0
Date: Fri, 07 Apr 2017 06:56:50 GMT

Modified JSON (returned with HTTP/1.1 200 OK):
Server: Apache-Coyote/1.1
Set-Cookie: RANGERADMINSESSIONID=EDCBDAFF124C9802A79BFD945662BC1A; Path=/; HttpOnly
X-Frame-Options: DENY
Content-Type: application/json
Transfer-Encoding: chunked
Date: Fri, 07 Apr 2017 07:00:49 GMT

{"id":10,"guid":"c8afaae2-a4cc-4c25-b4b2-75ae9b0227eb","isEnabled":false,"createdBy":"Admin","updatedBy":"Admin","createTime":1491448221000,"updateTime":1491448221000,"version":1,"service":"TCSGEINTERNALCLUSTER_hive","name":"tcs_ge_user data masking test 2","policyType":1,"description":"tcs_ge_user data masking test 2","resourceSignature":"2cb6661609e66abfd9fbceaeac2be9d0","isAuditEnabled":true,"resources":{"database":{"values":["wells_fargo_poc"],"isExcludes":false,"isRecursive":false},"column":{"values":["card_number"],"isExcludes":false,"isRecursive":false},"table":{"values":["test_masked_2"],"isExcludes":false,"isRecursive":false}},"policyItems":[],"denyPolicyItems":[],"allowExceptions":[],"denyExceptions":[],"dataMaskPolicyItems":[{"accesses":[{"type":"select","isAllowed":true}],"users":["tcs_ge_user"],"groups":["tcs_ge_user"],"conditions":[],"delegateAdmin":false,"dataMaskInfo":{"dataMaskType":"MASK_HASH"}}],"rowFilterPolicyItems":[]}
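To give a clearer picture, here is a rough Python sketch of the automation we are aiming for, using the same /service/plugins/policies/{id} endpoint instead of curl plus sed (the host, credentials, and policy id below are only placeholders, and this assumes the PUT itself works once the 404 above is sorted out):

```python
import json
import requests

# Placeholders for our environment: Ranger admin host, credentials, policy id.
RANGER_URL = "http://xx.xx.xx.207:6080"
POLICY_ID = 10
AUTH = ("admin", "admin")
HEADERS = {"Content-Type": "application/json"}

# Fetch the current policy definition from Ranger.
resp = requests.get("{0}/service/plugins/policies/{1}".format(RANGER_URL, POLICY_ID),
                    auth=AUTH, headers=HEADERS)
resp.raise_for_status()
policy = resp.json()

# Flip the flag in memory instead of editing the JSON file with sed.
policy["isEnabled"] = False

# Push the modified policy back to the same endpoint.
resp = requests.put("{0}/service/plugins/policies/{1}".format(RANGER_URL, POLICY_ID),
                    auth=AUTH, headers=HEADERS, data=json.dumps(policy))
resp.raise_for_status()
print("Policy", POLICY_ID, "isEnabled =", policy["isEnabled"])
```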
04-06-2017 03:35 PM
Hello, We use Ranger for column filtering and data masking. Our use case is that we ingest data into Hive from source systems using Talend / Informatica, but while executing the jobs we are getting an error. What we found is that if a data masking policy is enabled on a particular Hive table, data cannot be inserted into it; however, we can re-enable the policy once the data insert is complete. Can you please guide us on how Ranger data masking policies can be disabled and enabled using a Unix command, so that we can include those steps in the data ingestion workflow and avoid any manual intervention? Looking for your guidance. Thanks and Regards, Rajdip
Labels:
- Apache Hive
- Apache Ranger
02-01-2017 02:39 PM
So can you please guide me on how to convert this to a PySpark script that gives the same results? I am very sorry for posting such a naive query, but I need help here.
02-01-2017 08:18 AM
Hello Friends, I am absolutely new to Hadoop and Spark, so I am trying to build up my knowledge of both. Currently I am facing a big problem with PySpark coding. We have a log analytics use case written in Python which runs successfully; however, we are thinking of converting the code to PySpark to gain speed, and I am absolutely stuck on how to convert this Python code to PySpark. I really need your help on how to do it, and I will use this learning experience on future assignments. I am uploading the log files and the .py script for reference: sample.zip. Please help us with this conversion.
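In case it helps anyone answer, here is a minimal PySpark sketch of the general direction we are considering (the file path and the "ERROR" filter are only placeholders, since the real parsing logic lives in the attached script):

```python
from pyspark.sql import SparkSession

# Start a Spark session (the master/cluster settings would come from our environment).
spark = SparkSession.builder.appName("log-analytics").getOrCreate()

# Read the raw log file, one line per row in the "value" column.
# "/tmp/sample.log" is a placeholder for the files inside sample.zip.
logs = spark.read.text("/tmp/sample.log")

# Placeholder analysis: count the lines containing "ERROR".
# The real rules would be ported from the attached Python script.
error_count = logs.filter(logs.value.contains("ERROR")).count()
print("ERROR lines:", error_count)

spark.stop()
```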
Labels:
- Apache Spark
01-17-2017 01:25 PM
Hi Guys -- still looking for your reply. Can you please guide me?
01-17-2017 07:20 AM
We have tried the same file and the same strategy in a small dev cluster (2 nodes) with HDP 2.5.3 and it completed successfully. Is this an issue with the HDP version? Please, we need your urgent help here. Also, we can't find much in the YARN logs; there are only INFO messages. Really looking for help and advice.
01-16-2017 01:31 PM
This PoC is part of a CDC strategy in HBase. We also have two other strategies, such as using importtsv and creating HFiles, but we are not sure why we are facing the issue above. Really looking for your help to address it.
01-16-2017 01:16 PM
Hi, We are using org.apache.hadoop.hive.hbase.HBaseStorageHandler to insert data into HBase through a Hive external table for a PoC, but every time we are facing the error below. Need your help in resolving this issue.

Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:50, Vertex vertex_1484566407737_0004_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0 (state=08S01,code=2)

Hadoop distribution: HDP 2.4
File size: 2 GB
Scripts attached: script.txt
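For context, this is roughly the pattern we are following, sketched here through PyHive just to show it in Python (the connection details, table names, and column mapping are placeholders; the actual DDL and insert are in the attached script.txt):

```python
from pyhive import hive

# Placeholder connection to HiveServer2 in our cluster.
conn = hive.connect(host="hiveserver2.example.com", port=10000, username="hive")
cursor = conn.cursor()

# Hive external table backed by HBase via HBaseStorageHandler
# (table names and the column mapping are illustrative only).
cursor.execute("""
CREATE EXTERNAL TABLE IF NOT EXISTS hbase_target (
  rowkey STRING,
  col1   STRING
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:col1')
TBLPROPERTIES ('hbase.table.name' = 'target_table')
""")

# Insert from a staging Hive table into HBase through the storage handler;
# this is the kind of insert that fails for us with the Tez vertex error above.
cursor.execute("INSERT INTO TABLE hbase_target SELECT rowkey, col1 FROM staging_table")

cursor.close()
conn.close()
```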
Labels:
- Apache HBase
- Apache Hive