Member since: 06-17-2015
Posts: 61
Kudos Received: 20
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1976 | 01-21-2017 06:18 PM |
| | 2403 | 08-19-2016 06:24 AM |
| | 1728 | 06-09-2016 03:23 AM |
| | 2879 | 05-27-2016 08:27 AM |
06-20-2016
01:56 PM
Hi Ahmad, great to know that you managed, but that's strange, it is supposed to work. What command did you use? I use the command below in my Ambari setup and it works perfectly on different OSes:

ambari-server setup -j /usr/java/jdk1.8.0_74

For more details, please refer to http://docs.hortonworks.com/HDPDocuments/Ambari-2.1.0.0/bk_ambari_reference_guide/content/ch_changing_the_jdk_version_on_an_existing_cluster.html

By the way, which OS are you working with? Also, I don't think $PATH necessarily needs to be updated, since we are already specifying "-j /usr/java/jdk1.8.0_74". Just make sure the Java home is accessible on all machines.
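For reference, a minimal sketch of the full flow, assuming the JDK lives at /usr/java/jdk1.8.0_74 (substitute your own install path):

```bash
# Point Ambari at the custom JDK (run on the Ambari server host).
ambari-server setup -j /usr/java/jdk1.8.0_74

# Restart Ambari so the new JDK setting takes effect.
ambari-server restart

# On every cluster node, verify that the same JDK path exists and runs.
/usr/java/jdk1.8.0_74/bin/java -version
```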
06-20-2016
01:42 PM
Hi Eric,
Thanks for the answer. Can you please clarify a bit more: do you agree with having the KDC master on a separate server in a production scenario, or not? Do you see any issues with having a KDC slave in case the master KDC goes down? Thanks, Ripunjay
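For context, client-side failover between a master and a slave KDC is normally handled by listing both in krb5.conf. A minimal sketch, assuming placeholder hostnames kdc1.example.com / kdc2.example.com and realm EXAMPLE.COM (none of these are from this thread):

```
[realms]
  EXAMPLE.COM = {
    # Clients try KDCs in the order listed, so a slave KDC keeps
    # authentication working if the master goes down.
    kdc = kdc1.example.com
    kdc = kdc2.example.com
    # Admin operations (kadmin, password changes) still require
    # the master.
    admin_server = kdc1.example.com
  }
```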
06-20-2016
02:03 AM
Thanks a lot @Predrag, this is what I was looking for.
06-19-2016
04:16 PM
1 Kudo
Hi Hadoop experts, can you please advise on a Kerberos implementation approach: is it recommended to have the KDC on one of the Hadoop nodes or on a separate server in a production environment? I tried searching the Hortonworks website but only found https://community.hortonworks.com/articles/17336/choosing-kerberos-approach-for-hadoop-cluster-in-a.html Please share your suggestions and ideas for a production environment.
Labels:
- Apache Hadoop
- Kerberos
06-09-2016
03:23 AM
This error comes when we try to load a coprocessor JAR that was compiled with a higher Java version while $PATH points to a lower Java version. We can override the Java used by configuring it properly in Cloudera Manager as shown, and the issue will be resolved.
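To confirm the mismatch before touching the configuration, you can inspect the class file version inside the coprocessor JAR; major version 52 corresponds to Java 8. A quick sketch, with placeholder JAR and class names:

```bash
# Extract one class from the coprocessor JAR (a JAR is a zip archive).
unzip -o my-coprocessor.jar com/example/MyCoprocessor.class

# "major version: 52" in the output means the class requires Java 8.
javap -verbose com/example/MyCoprocessor.class | grep "major version"

# Compare with the Java version the HBase processes actually run.
java -version
```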
06-09-2016
03:20 AM
I was facing an issue enabling/disabling a table in HBase; I was unable to do it because of an error in the HBase logs: unsupported major.minor version 52.0.
Labels:
- Apache HBase
05-30-2016
06:03 AM
Hi @Rushikesh Deshmukh, this article provides an overview table for quickly comparing the different backup approaches: http://blog.cloudera.com/blog/2013/11/approaches-to-backup-and-disaster-recovery-in-hbase/ I used distcp as well, but that did not work for me: the data was copied, but I ran into issues when running hbck afterwards. If you want to create a backup on the same cluster, then CopyTable and snapshots are very easy; for inter-cluster backups, snapshots work well. Let me know if you need more details. This link is also really useful and clear: http://hbase.apache.org/0.94/book/ops.backup.html
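A minimal sketch of the snapshot route, with placeholder table and snapshot names (the ExportSnapshot tool ships with HBase):

```bash
# Take a snapshot from the HBase shell (cheap; no data is copied).
echo "snapshot 'mytable', 'mytable_snap'" | hbase shell

# Restore on the same cluster by cloning the snapshot into a new table.
echo "clone_snapshot 'mytable_snap', 'mytable_restored'" | hbase shell

# For inter-cluster backup, export the snapshot to the other
# cluster's HDFS.
hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
  -snapshot mytable_snap \
  -copy-to hdfs://backup-cluster:8020/hbase
```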
05-27-2016
08:27 AM
1 Kudo
I reinstalled and the problem was fixed; I don't know what the issue was. This time the installation went fine.
05-19-2016
03:03 PM
Hi Kuldeep, sorry for replying late. I see that symlinks are broken on 2 machines, and the 1 machine whose symlinks are good is missing some of the required symlinks as well. How can I resolve this issue? How come we get such unexpected behavior during installation, with some symlinks broken and some good? Is this a bug in the Hortonworks installer? What do you think? Thanks, Ripunjay Godhani
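For anyone hitting the same problem, broken symlinks can be located with find. A quick sketch, assuming the default HDP install root /usr/hdp (adjust if yours differs):

```bash
# List broken symlinks under the HDP tree on each node;
# -xtype l matches links whose target does not exist (GNU find).
find /usr/hdp -xtype l

# On a healthy node, list all symlinks and their targets for comparison.
find /usr/hdp -type l -printf '%p -> %l\n' | sort
```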