Member since: 04-03-2019
Posts: 962
Kudos Received: 1743
Solutions: 146
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 11335 | 03-08-2019 06:33 PM
 | 4837 | 02-15-2019 08:47 PM
 | 4140 | 09-26-2018 06:02 PM
 | 10510 | 09-07-2018 10:33 PM
 | 5564 | 04-25-2018 01:55 AM
01-07-2016
03:37 AM
1 Kudo
Thanks @Artem Ervits. I'm aware of the Nagios/Ganglia deprecation in later versions of Ambari 🙂 I just gave the reference to give a better idea of the question. Thanks again! I will check the documentation.
01-07-2016
03:04 AM
8 Kudos
I would like to add a custom alert for a single node or multiple nodes in the cluster, apart from the Hadoop service alerts already defined under the "Alert" section of Ambari. I know that we can do this using Nagios, however I just wanted to know if it is possible via Ambari. For example, I would like to monitor the disk space of my /xyz directory, which contains input data for some Hadoop jobs.
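For reference, newer Ambari versions support script-based alert definitions registered through the REST API. The following is only a sketch: the alert name, script path, credentials, and cluster name are placeholders, and the exact payload fields may vary by Ambari version.

curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  -d '{
    "AlertDefinition": {
      "name": "xyz_dir_disk_usage",
      "label": "Disk usage for /xyz",
      "description": "Checks free space on the volume holding /xyz",
      "service_name": "AMBARI",
      "component_name": "AMBARI_AGENT",
      "scope": "HOST",
      "enabled": true,
      "interval": 5,
      "source": {
        "type": "SCRIPT",
        "path": "/var/lib/ambari-server/resources/host_scripts/alert_xyz_disk.py"
      }
    }
  }' \
  http://<ambari-server>:8080/api/v1/clusters/<cluster-name>/alert_definitions

The referenced script (a hypothetical name here) would implement the actual disk-space check and report OK/WARNING/CRITICAL back to the Ambari agent.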
Labels:
- Apache Ambari
01-06-2016
04:04 AM
@Vidya SK - Go to the Ambari UI -> select the Hive service -> you should see the MySQL server there -> just hover over the link and you should get the name of the MySQL server. Alternatively, you can get it from hive-site.xml: grep -A 2 mysql /etc/hive/conf/hive-site.xml
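If grepping for "mysql" returns too many matches, you can target the metastore JDBC URL property directly (assuming the standard property name javax.jdo.option.ConnectionURL):

# Print the JDBC connection URL property and the line after it (the <value> element)
grep -A 1 "javax.jdo.option.ConnectionURL" /etc/hive/conf/hive-site.xml

The host name inside the returned jdbc:mysql://... value is the MySQL server.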
01-06-2016
03:58 AM
3 Kudos
@Vidya SK Run the below command to list the tables: sqoop list-tables --connect jdbc:mysql://<mysql-server>/<database> --username <username> --password <password>
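For example, with placeholder host, database, and user names (using -P to prompt for the password instead of putting it on the command line):

sqoop list-tables \
  --connect jdbc:mysql://mysql01.example.com/hivemeta \
  --username sqoop_user -P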
01-05-2016
03:22 PM
4 Kudos
Hey @Ashnee Sharma, please refer to this: https://community.hortonworks.com/questions/4496/how-to-migrate-hive-data-over-to-new-cluster.html
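For a single table, one common approach is Hive EXPORT/IMPORT combined with distcp; a minimal sketch with placeholder database, table, and NameNode names:

# On the old cluster:
hive -e "EXPORT TABLE mydb.mytable TO '/tmp/exports/mytable';"
# Copy the export to the new cluster:
hadoop distcp hdfs://<old-namenode>:8020/tmp/exports/mytable hdfs://<new-namenode>:8020/tmp/exports/mytable
# On the new cluster:
hive -e "IMPORT TABLE mydb.mytable FROM '/tmp/exports/mytable';"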
01-05-2016
05:00 AM
@Suresh Bonam Unfortunately, no. We either need to pass it via the command line or set "exectype=tez" in pig.properties via Ambari.
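For reference, the pig.properties entry is a single line (a sketch; the exact Ambari config section name may differ by HDP version):

# in pig.properties (e.g. Ambari -> Pig -> Configs -> Advanced pig-properties)
exectype=tez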
01-05-2016
04:25 AM
5 Kudos
@Suresh Bonam To use both options, use the below command:
pig -useHCatalog -x tez -f script.pig
To use only HCatalog:
pig -useHCatalog -f script.pig
To use only Tez:
pig -x tez -f script.pig
12-26-2015
12:49 PM
1 Kudo
@Sergey Orlov - As per the error given in the question:
FAILED: HiveException java.security.AccessControlException: Permission denied: user=tech_dmon, access=WRITE, inode="/user/p/data":tech_p:bgd_p:drwxr-xr-x
Could you please confirm whether the inode name is /user/tech_p/data or /user/p/data? In either case, you are running your Oozie coordinator/workflow as the tech_dmon user and it is trying to write into some other user's directory in HDFS instead of /user/tech_dmon/, hence the permission denied error. Could you please check the application logs of this job and let me know the values of the below properties:
user.name
mapreduce.job.user.name
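If the job really is supposed to write into that directory, one option (a sketch, assuming HDFS ACLs are enabled via dfs.namenode.acls.enabled=true and the path from the error message is correct) is to grant the submitting user write access instead of changing ownership:

sudo -u hdfs hdfs dfs -setfacl -m user:tech_dmon:rwx /user/p/data
sudo -u hdfs hdfs dfs -getfacl /user/p/data   # verify the new ACL entry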
12-24-2015
11:00 AM
@Sergey Orlov - Could you please share your job.properties and workflow.xml? I can see in the error log that the inode is ""; it should be something like "hdfs://<namenode-host>:8020/<hdfs-path>".
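For comparison, a minimal job.properties sketch (host names, port numbers, and paths are placeholders; the ResourceManager port in particular varies by distribution):

nameNode=hdfs://<namenode-host>:8020
jobTracker=<resourcemanager-host>:8050
queueName=default
oozie.wf.application.path=${nameNode}/user/<user>/workflows/myapp

A fully qualified nameNode value like this is typically what prevents HDFS paths in the workflow from resolving to an empty inode.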