Member since: 02-01-2019
Posts: 650
Kudos Received: 143
Solutions: 117
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2741 | 04-01-2019 09:53 AM |
| | 1426 | 04-01-2019 09:34 AM |
| | 6707 | 01-28-2019 03:50 PM |
| | 1529 | 11-08-2018 09:26 AM |
| | 3728 | 11-08-2018 08:55 AM |
08-25-2016
05:28 AM
@Sami Ahmad If I'm not wrong, you are trying to copy data to a different directory within the same cluster. You can simply use the copy command: `hadoop fs -cp hdfs:///user/sami/ hdfs:///user/zhang`
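A minimal sketch of the same-cluster copy, using the paths from the question (the `distcp` variant is an assumption on my part, useful only if the directory is large):

```shell
# Plain copy within one cluster; scheme prefix is optional on the same cluster.
hadoop fs -cp /user/sami /user/zhang

# Confirm the data landed in the target directory.
hadoop fs -ls /user/zhang

# For very large directories, distcp runs the copy as a parallel MapReduce job.
hadoop distcp hdfs:///user/sami hdfs:///user/zhang
```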
08-22-2016
02:05 PM
1 Kudo
@bigdata.neophyte
As per this JIRA, https://issues.apache.org/jira/browse/HADOOP-8934, the `-t` option is supported from Hadoop 2.8 onward. This is possibly a doc bug in the Apache documentation.
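For context, HADOOP-8934 added sort flags to the `ls` shell command. A quick sketch of how they are used on Hadoop 2.8+ (the path `/user` is just an example):

```shell
# Sort listing by modification time, newest first.
hdfs dfs -ls -t /user

# Combine with -r to reverse the order (oldest first).
hdfs dfs -ls -t -r /user

# Sort by file size, largest first.
hdfs dfs -ls -S /user
```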
08-16-2016
10:25 AM
@pan bocun Check whether you have the correct permissions and ownership on the directory /mnt: it should have hdfs:hadoop ownership and 755 permissions.
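A sketch of setting the suggested ownership and permissions, assuming /mnt here refers to an HDFS path (if it is a local OS directory, use plain `chown`/`chmod` instead, run with sufficient privileges):

```shell
# Set ownership to hdfs:hadoop as recommended (run as the HDFS superuser).
hdfs dfs -chown hdfs:hadoop /mnt

# Set 755 permissions: owner rwx, group and others r-x.
hdfs dfs -chmod 755 /mnt

# Verify the directory itself (not its contents) shows drwxr-xr-x hdfs hadoop.
hdfs dfs -ls -d /mnt
```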
08-16-2016
09:58 AM
1 Kudo
@sujitha sanku: "hadoop" is the root password of the MySQL server in the HDP 2.5 sandbox.
08-16-2016
07:59 AM
@Antonio Ye
Can you check whether you have enough storage in your sandbox? `hdfs dfsadmin -report`
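A short sketch of checking free space; the report shows configured capacity, DFS used, and remaining space per DataNode (the `du` call is an extra assumption, handy for finding which path is eating the space):

```shell
# Cluster-wide and per-DataNode capacity, used, and remaining space.
hdfs dfsadmin -report

# Human-readable usage per top-level HDFS directory, to spot the culprit.
hdfs dfs -du -h /
```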
08-08-2016
05:14 AM
2 Kudos
@kishore sanchina If you are seeing this error, you are running out of free disk space. Increasing yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage will help resolve your issue, but it is highly recommended to clean up the disk and restart the NodeManager to avoid further disk-related issues.
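For reference, a sketch of the corresponding yarn-site.xml fragment (in Ambari-managed clusters this is set under the YARN service configs instead; the value 95.0 is an example, not a recommendation):

```xml
<!-- Per-disk utilization cutoff for the NodeManager disk health checker.
     Above this percentage the disk is marked bad and containers stop
     being scheduled onto it. Default is 90.0. -->
<property>
  <name>yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage</name>
  <value>95.0</value>
</property>
```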
07-29-2016
06:19 AM
@Sunile Manjee
As of Cloudbreak 1.3.0, Kerberos on Cloudbreak is still in Technical Preview (http://sequenceiq.com/cloudbreak-docs/latest/kerberos/).
07-26-2016
07:23 AM
@ARUN Right now there is no way to do this in Ambari. You will have to write custom scripts to take the backup. Use the import/export or snapshot features of HBase to take the backups.
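A sketch of both HBase backup approaches mentioned above; the table name, snapshot name, and backup paths are hypothetical placeholders:

```shell
# Approach 1: snapshot (cheap, point-in-time, taken from the hbase shell).
echo "snapshot 'my_table', 'my_table_snap'" | hbase shell -n

# Export the snapshot to a backup location (or another cluster's NameNode).
hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
  -snapshot my_table_snap \
  -copy-to hdfs:///backups/hbase

# Approach 2: MapReduce export/import of a single table to sequence files.
hbase org.apache.hadoop.hbase.mapreduce.Export my_table /backups/my_table
hbase org.apache.hadoop.hbase.mapreduce.Import my_table /backups/my_table
```

Snapshots are generally preferred for regular backups since they avoid a full table scan; Export/Import is simpler when you only need to move one table's data.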
07-18-2016
05:59 PM
@henryon wen Setting "mail.smtp.ssl.enable : true" in the Ambari SMTP configuration should help resolve your issue.