Member since: 10-24-2015
Posts: 207
Kudos Received: 18
Solutions: 4

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4379 | 03-04-2018 08:18 PM |
| | 4299 | 09-19-2017 04:01 PM |
| | 1795 | 01-28-2017 10:31 PM |
| | 970 | 12-08-2016 03:04 PM |
05-28-2019
08:57 AM
@Vinay Thanks for the reply. I want to know how to restrict a Hive database so that it cannot have more than one policy. Thanks for your time.
05-28-2019
03:35 AM
Hi, I have a very simple question. Can a Hive database have multiple Ranger policies? Is there any way to restrict a Hive database from having more than one Ranger policy?
Labels:
- Apache Hive
- Apache Ranger
12-31-2018
03:13 AM
@Jay Kumar SenSharma It works, great! There was a typo in your command. This works for me:
http://$AMS_COLLECTOR_HOSTNAME:6188/ws/v1/timeline/metrics?metricNames=yarn.QueueMetrics.Queue=root.default.AvailableMB._max&appId=resourcemanager&startTime=1545613074&endTime=1546217874

Thanks again.
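For anyone else hitting this, here is a minimal curl sketch of the working call; $AMS_COLLECTOR_HOSTNAME and the epoch-second start/end times are placeholders taken from this thread, so substitute your own values.

```bash
# Minimal sketch of the working AMS query: fetch the max AvailableMB of the
# root.default queue from the Resource Manager metrics over a time window.
# The hostname and the epoch-second timestamps below are placeholders.
AMS_COLLECTOR_HOSTNAME=ams-collector.example.com   # hypothetical hostname
curl -s "http://${AMS_COLLECTOR_HOSTNAME}:6188/ws/v1/timeline/metrics?metricNames=yarn.QueueMetrics.Queue=root.default.AvailableMB._max&appId=resourcemanager&startTime=1545613074&endTime=1546217874"
```

Quoting the URL matters: left unquoted, the shell treats each & as a background operator and the query parameters never reach the collector.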
12-31-2018
02:19 AM
@Jay Kumar SenSharma Thank you so much for the explanation and commands, but I am getting a 404 Not Found error when I try to run this command using curl. I see there are some special characters and was wondering whether they have anything to do with it. For example, the path below has a ☆ before tTime, and after max there is a curly equals sign (≈). http://$AMS_COLLECTOR_HOSTNAME:6188/ws/v1/timeline/metrics?metricNames=yarn.QueueMetrics.Queue=root.default.AvailableMB._max≈pId=resourcemanager☆tTime=1545613074&endTime=1546217874
12-30-2018
08:53 PM
Hi, I am trying to find the Ambari Metrics REST API path that returns the memory usage for a particular queue (Q1) over the last week. How can I get that?
Labels:
- Apache Ambari
11-12-2018
07:19 PM
Hi, we use an application that runs Hive and Spark jobs. From our application (an edge node of the Hadoop cluster) we want a script that checks whether the connections to Hive and Spark are working. Hive uses Beeline with logins, and we cannot embed a login in the script. Is there any other way to monitor the Hive and Spark JDBC connections, for example using curl? Please suggest.
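If a login-free check at the network level is acceptable, one hedged option (my own assumption, not something confirmed in this thread) is a bare TCP reachability test against the HiveServer2 and Spark Thrift Server ports. The hostnames and ports below are placeholders, and this only confirms the ports accept connections, not that a JDBC session would authenticate or run a query.

```bash
# Hedged sketch: port-level reachability check only. Hostnames are placeholders;
# 10000 is the common HiveServer2 port and 10016 the common Spark Thrift Server
# port on HDP. This does not validate authentication or execute any SQL.
nc -z -w 5 hiveserver2.example.com 10000 && echo "HiveServer2 port reachable"
nc -z -w 5 sparkthrift.example.com 10016 && echo "Spark Thrift Server port reachable"
```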
Labels:
- Apache Hive
- Apache Spark
10-30-2018
12:37 AM
@nyadav Hi, we are having the same issue. Even though the GPL compression libraries are installed by default, I am still getting this error. Please let me know what you did to resolve it.
10-15-2018
10:06 PM
I have a user, user1, in group group1. I use a third-party tool that writes Spark event logs to a directory in HDFS; for now I am using the user's home directory to hold the logs temporarily.

I created the directory with: hdfs dfs -mkdir /user/user1/sparkeventlogs

It was created under user1:group2, so I changed ownership to the right group: hdfs dfs -chown -R user1:group1 /user/user1/sparkeventlogs

I also added an ACL to that directory with setfacl, and getfacl shows the correct user and group assigned; both have rwx permissions.

Now, when the job is run by a user in that AD group, it fails with permission denied, reporting user1:group2 drwx------ when the directory is actually user1:group1 drwxrwx----.

We have Ranger enabled but I don't have access to it. Thanks for your help.
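For reference, a consolidated sketch of the steps described above; the exact setfacl flags and ACL spec are my assumption, since the post does not show the command that was actually run.

```bash
# Sketch of the steps from the post. The setfacl ACL spec is an assumption;
# the post only says an ACL granting rwx to the user and group was added.
hdfs dfs -mkdir /user/user1/sparkeventlogs
hdfs dfs -chown -R user1:group1 /user/user1/sparkeventlogs
hdfs dfs -setfacl -R -m user:user1:rwx,group:group1:rwx /user/user1/sparkeventlogs
hdfs dfs -getfacl /user/user1/sparkeventlogs   # verify the ACL entries
```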
Labels:
- Apache Hadoop
04-04-2018
04:11 PM
@dthakkar @Sindhu Yes, I did set -Dmapreduce.job.queuename=<queue_name>, but if you look at the YARN jobs list, two applications run: the first uses the queue named in that property and the second uses the default queue. I have no idea why it launches two separate jobs. I resolved this by configuring queue mappings and increasing the AM resource percent. Thanks.
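For completeness, a hedged sketch of what that looks like; the jar, main class, queue name, and property values below are placeholders, and the capacity-scheduler values are illustrative assumptions, not the ones used on this cluster.

```bash
# Submitting to a specific queue via -D (only honored if the driver uses
# ToolRunner/GenericOptionsParser). Jar, class, queue, and paths are placeholders.
hadoop jar my-app.jar com.example.Main -Dmapreduce.job.queuename=Q1 /input /output

# The fix mentioned above lives in capacity-scheduler.xml; the values here are
# illustrative assumptions:
#   yarn.scheduler.capacity.queue-mappings=u:user1:Q1
#   yarn.scheduler.capacity.maximum-am-resource-percent=0.4
```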
04-01-2018
03:13 PM
@Goutham Koneru @Artem Ervit @Ram Venkatesh Hi, I am trying to install Python 3 on my HDP 2.5.3 cluster. How does this affect components other than Spark? Is this recommended in production? Can I use Anaconda instead?
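One hedged option, assuming Anaconda is installed under a path like /opt/anaconda3 (a placeholder), is to point only PySpark at Python 3 via the PYSPARK_PYTHON variables, leaving the system Python used by Ambari agents and other HDP components untouched; whether that is advisable for your production cluster is exactly the question above.

```bash
# Hedged sketch (an assumption, not from this thread): point PySpark at an
# Anaconda Python 3 install without changing the system python that other
# HDP components rely on. The Anaconda path is a placeholder.
export PYSPARK_PYTHON=/opt/anaconda3/bin/python
export PYSPARK_DRIVER_PYTHON=/opt/anaconda3/bin/python
pyspark
```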