Member since
05-12-2017
37
Posts
1
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 489 | 07-05-2021 10:06 PM |
01-27-2022
07:47 PM
Hi everyone, when I try to assign a permission to a group for a namespace in HBase through the hbase shell, it throws a NullPointerException. The same thing happens when I try to check a user's permissions for the namespace.
... View more
Labels:
- Apache HBase
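For reference, a namespace-level grant in the hbase shell prefixes both the group name and the namespace with @; a minimal sketch (the group and namespace names below are placeholders, and it assumes HBase authorization is enabled on the cluster):

```
# Grant read/write/execute/create/admin on namespace 'my_ns' to group 'my_group'
grant '@my_group', 'RWXCA', '@my_ns'

# List the permissions recorded for the namespace
user_permission '@my_ns'
```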
07-07-2021
11:20 PM
Hi all, we have replicated Hive tables from CDH to CDP through a Hive replication policy (BDR). We do not see those tables when we connect to Beeline and run "show tables;", nor in Hue, but we are able to run "select * from table", and we can see the tables in Ranger. Can anyone help with this issue? Thanks, Srini Podili
... View more
Labels:
07-05-2021
10:06 PM
1 Kudo
Hi all, it worked. I had made a mistake: I forgot to add the "accesses" key before the second group's permission entry. Thanks, Srinivas
... View more
07-05-2021
08:57 PM
Hi all, I need to create a Hive policy through the REST API in a CDP environment with two groups. Here we are granting permissions at the group level, not the user level: one group with "ALL" permissions and a second group with "SELECT" permission. I have created the policy through the REST API with one group with "ALL" permissions, but how do I specify the second group with "SELECT" permission in the same create-policy command? I tried the method below, but it did not work. CURLURL="https://host:6182/service/public/v2/api/policy/"
CURLDATA='{ "isEnabled":true,"service":"cm_hive","name":"policy_test1","description":"Policy for employees database access","isAuditEnabled":true,"resources":{"database":{"values":["DBNAME"],"isExcludes":false,"isRecursive":false},"table":{"values":["*"],"isExcludes":false,"isRecursive":false}},"policyItems":[{"accesses":[{"type":"ALL","isAllowed":true}],"users":[""],"groups":["GROUP1"],[{"type":"SELECT","isAllowed":true}],"users":[""],"groups":["GROUP2"]"conditions":[],"delegateAdmin":false}],"denyPolicyItems":[],"allowExceptions":[],"denyExceptions":[],"dataMaskPolicyItems":[],"rowFilterPolicyItems":[]}'
RESPONSE=`curl -k -iv -u username:password -H "Content-Type: application/json" -X POST "$CURLURL" -d "$CURLDATA"` Thanks in advance! Srini Podili
... View more
Labels:
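As the accepted solution above notes, the payload fails because the second group's permission is nested inside the first policy item without its own "accesses" key. Each group needs its own object in the policyItems array. A sketch in Python that builds the corrected payload (host, service, database, and group names are the placeholders from the post):

```python
import json

# Each group gets its own entry in policyItems, each with its own
# "accesses" list -- the part the original one-line payload merged.
policy = {
    "isEnabled": True,
    "service": "cm_hive",
    "name": "policy_test1",
    "description": "Policy for employees database access",
    "isAuditEnabled": True,
    "resources": {
        "database": {"values": ["DBNAME"], "isExcludes": False, "isRecursive": False},
        "table": {"values": ["*"], "isExcludes": False, "isRecursive": False},
    },
    "policyItems": [
        {"accesses": [{"type": "ALL", "isAllowed": True}],
         "users": [], "groups": ["GROUP1"],
         "conditions": [], "delegateAdmin": False},
        {"accesses": [{"type": "SELECT", "isAllowed": True}],
         "users": [], "groups": ["GROUP2"],
         "conditions": [], "delegateAdmin": False},
    ],
    "denyPolicyItems": [], "allowExceptions": [], "denyExceptions": [],
    "dataMaskPolicyItems": [], "rowFilterPolicyItems": [],
}

payload = json.dumps(policy)
```

The resulting string can then be POSTed with the same curl command as before, passing it to `-d` in place of `$CURLDATA`.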
07-05-2021
05:04 AM
Hi, I need to create a Hive policy with two groups: one group with "ALL" permissions for user "x" and a second group with "SELECT" permission for user "y". I have created the policy through the REST API with one group with "ALL" permissions, but how do I specify the second group with "SELECT" permission in the same create-policy command? Thanks in advance! Srini Podili
... View more
07-02-2021
05:06 AM
Hi Asish, I found this info in the Apache documentation: https://cwiki.apache.org/confluence/display/Hive/Configuration+Properties#ConfigurationProperties-HiveConfigurationProperties Thanks, Srinivas
... View more
07-02-2021
04:07 AM
Hi Asish, I'm new to Hadoop, so sorry for asking in detail. So the value for hive.security.authorization.sqlstd.confwhitelist is hive\.msck\.path\.validation|hive\.msck\.repair\.batch\.size? Can you please help? Thanks, Srini Podili
... View more
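For what it's worth, rather than replacing the whole built-in whitelist, Hive also offers an append variant; a sketch for hive-site.xml using exactly the pattern from the post above (property name per the Apache Hive configuration docs):

```xml
<!-- Appends these parameters to the built-in SQL Standard auth whitelist
     instead of replacing it -->
<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>hive\.msck\.path\.validation|hive\.msck\.repair\.batch\.size</value>
</property>
```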
07-02-2021
02:21 AM
Hi, we are also getting the same issue. Which property should be changed here? This is from my environment. Please let me know. Thanks in advance, Srini Podili
... View more
06-23-2021
09:25 PM
Hi All,
We have a script that creates HDFS directories, a Hive database, and an HBase namespace, and also grants permissions, in a CDH environment with Sentry.
We create roles and grant permissions to groups at the database level through the same script.
Now we have a different cluster running CDP, and we want to update the script for Ranger.
Can anyone tell us whether we can use the same script (database permissions) on CDP with Ranger, or do we need to make changes?
Thanks
Srini Podili
... View more
04-01-2020
06:43 AM
@jsensharma I have the same issue as above. I exported the path and the script ran fine. I can see the alert "[Custom] Host Mount Point Usage" added on the Ambari alerts page, but I'm not seeing any alerts. I followed all the steps.
... View more
02-11-2020
02:54 AM
Hi, I see this mentioned in the link above: "AMS data would be stored in 'hbase.rootdir' identified above. Backup and remove the AMS data. If the Metrics Service operation mode is 'embedded', then the data is stored in OS files. Use regular OS commands to backup and remove the files in hbase.rootdir." So do we need to remove the directory structure, or only the files inside the folders? Please let me know.
... View more
05-07-2018
05:53 PM
We couldn't create a new processor or load a template. It says something like "transaction still in progress", but it never ends.
... View more
Labels:
- Apache NiFi
05-07-2018
04:26 PM
But the issue remains the same.
... View more
05-07-2018
04:25 PM
@Matt Clarke I did not find "HTTP requests" or "Request Counts Per URI" in nifi-app.log. I have increased nifi.cluster.node.protocol.threads to 50 and nifi.web.jetty.threads to 400. My cluster has 16 nodes, 3 of which are NiFi nodes. Thanks, Srinivas
... View more
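For reference, the two settings described above live in nifi.properties on each NiFi node; the values shown are the ones from this post, not general recommendations:

```
# nifi.properties (a restart is required after changing these)
nifi.cluster.node.protocol.threads=50
nifi.web.jetty.threads=400
```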
05-07-2018
02:58 PM
@Davide Vergari, I made the changes above, but the issue remains the same, even after increasing both to 60 seconds.
... View more
05-07-2018
02:56 PM
@Matt Clarke, I made the changes above, but the issue remains the same, even after increasing both to 60 seconds.
... View more
05-07-2018
02:55 PM
I made the changes above, but the issue remains the same, even after increasing both to 60 seconds.
... View more
09-28-2017
05:00 PM
@Rajkumar Singh
Accepted all dependent configurations (default values). Restarted Hive and every other related service (stale configs). Created a test table: CREATE TABLE resource.hello_acid (key int, value int)
PARTITIONED BY (load_date date)
CLUSTERED BY(key) INTO 3 BUCKETS
STORED AS ORC TBLPROPERTIES ('transactional'='true');
Inserted a few rows: INSERT INTO hello_acid partition (load_date='2016-03-03') VALUES (1, 1);
INSERT INTO hello_acid partition (load_date='2016-03-03') VALUES (2, 2);
INSERT INTO hello_acid partition (load_date='2016-03-03') VALUES (3, 3);
Everything looked great at this point. I was able to add rows and query the table as usual. This is the content of the table partition's HDFS directory (/apps/hive/warehouse/resource.db/hello_acid/load_date=2016-03-03/): 3 delta directories (1 per insert transaction).
In Hive, I issued a minor compaction command. If I understood it right, this should have merged all delta directories into one. It didn't work! ALTER TABLE hello_acid partition (load_date='2016-03-03') COMPACT 'minor';
Next, I issued a major compaction command. This should have deleted all delta files and created a base file with all the info. It didn't work either! Finally, I ran this last command: SHOW COMPACTIONS;

Database | Table | Partition | Type | State | Worker | Start Time
---|---|---|---|---|---|---
resource | hello_acid | load_date=2016-03-03 | MINOR | failed | hadoop-master1.claro.com.co-52 | 1506440161747
resource | hello_acid | load_date=2016-03-03 | MAJOR | failed | hadoop-master2.claro.com.co-46 | 1506440185353
... View more
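For what it's worth, failed compactions are often worth cross-checking against the Metastore-side compactor settings; a sketch of the relevant hive-site.xml keys, shown as key=value (a general checklist, not a diagnosis of the specific failures above):

```
hive.compactor.initiator.on=true   # enables the compaction initiator on the Metastore
hive.compactor.worker.threads=2    # must be > 0 or no worker ever picks up the request
hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager  # required for ACID tables
```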
07-20-2017
05:47 PM
@kalai selvan, all 3 DataNodes are going down frequently. They go down one after the other, and seemingly one of the nodes gets hit harder than the rest.
... View more
07-18-2017
12:45 PM
@jsensharma, @nkumar, we have a cluster running HDP 2.5 with 3 worker nodes and around 9.1 million blocks, with an average block size of 0.5 MB. Could this be the reason for the frequent JVM pauses?
... View more
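To put those numbers in perspective, a back-of-the-envelope count of replicas per DataNode (assuming the default replication factor of 3, which the post does not state):

```python
blocks = 9_100_000   # blocks reported for the cluster
replication = 3      # assumed default replication factor
datanodes = 3        # worker nodes in the cluster

# Every replica is tracked in a DataNode's heap, so with 3-way
# replication spread over 3 nodes, each node holds roughly one
# replica of every block in the cluster.
replicas_per_dn = blocks * replication // datanodes
print(replicas_per_dn)  # 9100000
```

With an average block of 0.5 MB against a typical 128 MB block size, this looks like a small-files problem: millions of tiny replicas per node inflate DataNode heap usage and block-report work, which could plausibly surface as long JVM pauses.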
07-17-2017
07:31 AM
@jsensharma, did you check hdfs-site.xml and core-site.xml? Please have a look and let me know if any changes are needed.
... View more
07-16-2017
09:44 AM
@jsensharma what is the recommendation for the DataNode heap size and new-generation heap size? Currently I have the DataNode heap size set to 24 GB and the new-generation heap size to 10 GB.
... View more
07-16-2017
09:36 AM
@jsensharma What I found curious is that the cached memory grew a lot just before the node stopped sending heartbeats. Do you know why that would be? cache.jpg
... View more
07-16-2017
09:27 AM
@jsensharma I did not see the DataNode process generating "hs_err_pid" files under /var/log/hadoop/$USER.
... View more
07-16-2017
09:17 AM
@jsensharma, I have already added -XX:CMSInitiatingOccupancyFraction=60 -XX:+UseCMSInitiatingOccupancyOnly to HADOOP_DATANODE_OPTS, in both the if and the else branches.
... View more
07-16-2017
08:20 AM
@nkumar, I tried increasing the Java heap memory for the DataNodes from 16 GB to 24 GB, but the issue persists.
... View more