Member since
04-03-2019
962
Posts
1743
Kudos Received
146
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 15021 | 03-08-2019 06:33 PM |
| | 6180 | 02-15-2019 08:47 PM |
| | 5101 | 09-26-2018 06:02 PM |
| | 12609 | 09-07-2018 10:33 PM |
| | 7449 | 04-25-2018 01:55 AM |
08-02-2016
11:10 PM
@Vaibhav Kumar - Can you please open a new question for this new error? I can see that the originally reported issue got resolved, and I have accepted the answer by @Joy. Please post the new error as a separate question, and you can tag @Joy or other people there to get quick attention 🙂
08-02-2016
10:54 PM
2 Kudos
@Abdul Qadeer I see what you are asking. Basically, you want your application to run on a dedicated NodeManager. Have a look at YARN's node label feature: https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/NodeLabel.html Please do let me know if this is not what you are looking for.
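As a rough sketch of what the linked documentation describes, node labels are managed with `yarn rmadmin` against a live cluster. The label name "dedicated" and the host "nm1.example.com" below are placeholders, not values from this thread:

```shell
# Add a label to the cluster (exclusive=true means only apps that request
# this label get containers on the labeled nodes):
yarn rmadmin -addToClusterNodeLabels "dedicated(exclusive=true)"

# Map the dedicated NodeManager host to the label:
yarn rmadmin -replaceLabelsOnNode "nm1.example.com=dedicated"

# Verify the labels are registered:
yarn cluster --list-node-labels
```

Your application (or its queue, via the capacity scheduler configuration) then requests the label so its containers land only on the labeled NodeManager.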
08-02-2016
10:47 PM
@kishore sanchina - If this information helps, then please accept the appropriate answer to close this question 🙂 Thank you.
08-02-2016
10:46 PM
1 Kudo
@kishore sanchina Let's say you want to set ulimit for the user 'kishore'. Please log in to the system as root, edit /etc/security/limits.conf, and add the lines below:
kishore - nofile 32768
kishore - nproc 65536
Here we are basically increasing the open files limit to 32K and the number of processes limit to 65K.
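As a sketch of the edit and a quick verification (run as root; 'kishore' is the example user from this thread):

```shell
# Append the limits to /etc/security/limits.conf (requires root):
cat >> /etc/security/limits.conf <<'EOF'
kishore - nofile 32768
kishore - nproc  65536
EOF

# The new limits apply on the user's next login; verify with:
# -n = max open files, -u = max user processes
su - kishore -c 'ulimit -n; ulimit -u'
```

Note that already-running processes keep their old limits; the affected service has to be restarted under a fresh login session to pick up the new values.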
08-02-2016
04:17 PM
3 Kudos
@kishore sanchina I had fixed it yesterday by increasing the ulimit. The errors shown in the provided logs were pointing to a ulimit issue. nofile is the variable for the open files limit, and nproc is the number of processes limit for the hue user.
08-01-2016
08:14 PM
2 Kudos
@kishore sanchina Can you please check your /etc/hue/conf/hue.ini file for the variables below?
[[database]]
engine=sqlite3
name=/var/lib/hue/desktop.db
Can you please check whether the path to desktop.db (or whatever DB name your configuration uses) exists? Looking at the error, it appears to be a path issue, a permission issue, or a missing database.
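A small hypothetical helper for that check (the function name and the default path are my own sketch; substitute the `name` value from your `[[database]]` section):

```shell
# Sanity-check the Hue SQLite DB path from hue.ini.
# /var/lib/hue/desktop.db mirrors the example value above.
DB_PATH="${DB_PATH:-/var/lib/hue/desktop.db}"

check_hue_db() {
  path="$1"
  # The parent directory must exist before Hue can create or open the DB file
  if [ ! -d "$(dirname "$path")" ]; then
    echo "missing-dir"; return 1
  elif [ ! -e "$path" ]; then
    echo "missing-file"; return 1
  elif [ ! -w "$path" ]; then
    echo "not-writable"; return 1
  fi
  echo "ok"
}

check_hue_db "$DB_PATH" || echo "fix the path or its permissions in hue.ini"
```

Run it as the user Hue runs under (often `hue`), since the writability check is per-user.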
08-01-2016
06:43 AM
3 Kudos
@Yibing Liu +1 to @Joy's answer. Also, I know you must have taken care of this already, but if there are any stale entries in the /etc/hosts file, please remove them from all the agents.
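As a hypothetical aid for that cleanup, a one-line helper like this could flag hostnames that appear on more than one line of a hosts file, which is a common form of stale entry (the function name and logic are my own sketch, not from the thread):

```shell
# Print any hostname that appears on more than one non-comment line of a
# hosts file. Fields 2..N of each line are hostnames/aliases; duplicates
# across lines usually mean a stale entry left behind after an IP change.
find_duplicate_hosts() {
  awk '!/^#/ { for (i = 2; i <= NF; i++) print $i }' "$1" | sort | uniq -d
}
```

Run `find_duplicate_hosts /etc/hosts` on each agent; an empty result means no duplicated names.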
08-01-2016
06:41 AM
3 Kudos
@Saurabh Kumar
Please have a look at the documents below; this information is useful for recovery:
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_upgrading_hdp_manually/content/configure-yarn-mr-22.html
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_upgrading_hdp_manually/content/start-webhcat-20.html
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_upgrading_hdp_manually/content/start-tez-22.html
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_installing_manually_book/content/upload_pig_hive_sqoop_tarballs_to_hdfs.html
08-01-2016
06:32 AM
5 Kudos
@ripunjay godhani Don't mount all the partitions on the same disk; it will create a lot of disk I/O contention. I would suggest partitioning the disks according to your requirements and using dedicated disks for each component (DataNodes, NameNodes, etc.). Also, please have a look at the links below for Hadoop performance tuning:
http://crazyadmins.com/tune-hadoop-cluster-to-get-maximum-performance-part-1/
http://crazyadmins.com/tune-hadoop-cluster-to-get-maximum-performance-part-2/
07-31-2016
06:45 AM
@Vaibhav Kumar +1 to the answer given by @Joy. Can you please check his answer and try to set JAVA_HOME properly?