Member since
02-08-2016
793
Posts
669
Kudos Received
85
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3141 | 06-30-2017 05:30 PM |
| | 4099 | 06-30-2017 02:57 PM |
| | 3404 | 05-30-2017 07:00 AM |
| | 3983 | 01-20-2017 10:18 AM |
| | 8628 | 01-11-2017 02:11 PM |
10-24-2016
05:41 PM
1 Kudo
@Jessika314 ninja
In addition to what @dgoodhand mentioned, also make sure of the below points (a quick sketch of the commands follows):
1. The hostname is set correctly; "hostname" and "hostname -f" should both return the expected FQDN.
2. "/etc/hosts" has the correct IP address and hostname entries on every host ("cat /etc/hosts").
3. iptables is disabled on all hosts in the cluster ("iptables -L").
4. SELinux is disabled ("sestatus").
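A minimal sketch of these checks, to be run on every host in the cluster (the expected FQDN is whatever Ambari has registered for that host):

```bash
hostname; hostname -f      # both should return the expected FQDN
cat /etc/hosts             # verify the IP address / hostname entries
iptables -L                # should list no rules if iptables is disabled
sestatus                   # should report SELinux as disabled
```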
10-24-2016
12:53 PM
@Jessika314 ninja Stop both NameNodes. If you can recall which NameNode was active last time, start only that one and leave the other down. Make sure ZKFC and the JournalNodes are running while you do this. Once the NameNode is started, check whether it shows Active status in the Ambari UI. If yes, wait for the NameNode to come out of safe mode; if it does not show Active, check the ZKFC and NameNode logs. Once the NameNode is out of safe mode, start the other NameNode and check its status in the Ambari UI (a quick sketch of the checks follows). Let me know if that helps.
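A minimal sketch of the state checks, assuming an HDFS client on the NameNode host; "nn1" and "nn2" are placeholder NameNode IDs, so use the ones from your hdfs-site.xml:

```bash
hdfs haadmin -getServiceState nn1   # should report "active" for the restarted NameNode
hdfs haadmin -getServiceState nn2   # check the other NameNode once it is started
hdfs dfsadmin -safemode get         # wait until this reports "Safe mode is OFF"
```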
10-24-2016
10:24 AM
@jayachandra BABU The command should be used on the CLI. If you are not using the CLI, you might go for something like an FTP server - https://sites.google.com/a/iponweb.net/hadoop/Home/hdfs-over-ftp If you want to continuously fetch data from a remote system and dump it into Hadoop, you can also look at Flume/Kafka.
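For the one-off CLI route, a minimal sketch; remotehost, /data/file.csv and /landing are placeholder names, and it assumes a gateway node with the HDFS client installed:

```bash
scp user@remotehost:/data/file.csv /tmp/file.csv   # pull the file onto the gateway node
hdfs dfs -mkdir -p /landing                        # create a target directory in HDFS
hdfs dfs -put /tmp/file.csv /landing/              # copy it from the local FS into HDFS
```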
10-24-2016
06:27 AM
@pankaj singh Please find the steps to move the Ambari server to a different host - https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.0/bk_ambari_reference_guide/content/ch_amb_ref_moving_the_ambari_server.html
10-24-2016
04:41 AM
++ @Kuldeep Kulkarni Can you help...
10-24-2016
04:22 AM
@Rainer Geissendoerfer Please check the below steps to debug (a quick sketch of the commands follows):
1. Check if the Hive service is up using the "ps -aef | grep hive" command.
2. Launch the Hive CLI and try accessing tables.
3. Telnet <hive_host>:<port> from localhost.
4. Check the database connection for Hive.
It seems the checks are failing for the HCAT service.
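A quick sketch of those checks; hive-host and 10000 (the default HiveServer2 port) are placeholders, so substitute your actual host and port:

```bash
ps -aef | grep -i hive         # 1. is the Hive service running?
hive -e "show databases;"      # 2. can the Hive CLI reach the metastore?
telnet hive-host 10000         # 3. is the Hive port reachable from this host?
```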
10-23-2016
07:55 AM
@jayachandra BABU You have multiple options to do it (see the sketch after this list):
- Copy the zip file to a local filesystem path on the gateway node and use the "hadoop fs -put" or "hadoop fs -copyFromLocal" command to copy the file from the local FS to HDFS.
- Configure the NFS gateway for HDFS and mount HDFS on a local FS path, where you can copy the files directly using scp or the cp command.
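A minimal sketch of both options; /tmp/archive.zip, /data and /mnt/hdfs are placeholder paths, and <nfs_gateway_host> is whichever node runs the HDFS NFS gateway:

```bash
# Option 1: copy from the gateway node's local filesystem into HDFS
hadoop fs -mkdir -p /data
hadoop fs -put /tmp/archive.zip /data/

# Option 2: with the HDFS NFS gateway configured, mount HDFS and use cp
mount -t nfs -o vers=3,proto=tcp,nolock <nfs_gateway_host>:/ /mnt/hdfs
cp /tmp/archive.zip /mnt/hdfs/data/
```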
10-19-2016
05:43 AM
@cduby For provisioning a cluster there are multiple options you can go with; below are a few:
1. Using Cloudbreak, please check this link - http://hortonworks.com/apache/cloudbreak/
2. Using Sahara, please check this link - http://docs.openstack.org/developer/sahara/userdoc/overview.html
3. Using Blueprints, please check this link - http://crazyadmins.com/automate-hdp-installation-using-ambari-blueprints-part-1/ (a quick sketch of this route follows)
[For more information adding @Kuldeep Kulkarni]
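For option 3, a minimal sketch of the Ambari Blueprint flow; ambari-host, the admin credentials, and the JSON file names are placeholders:

```bash
# Register the blueprint with the Ambari server
curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  -d @my-blueprint.json http://ambari-host:8080/api/v1/blueprints/my-blueprint

# Create the cluster from a host-mapping (cluster creation) template
curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  -d @cluster-template.json http://ambari-host:8080/api/v1/clusters/mycluster
```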
10-18-2016
08:55 AM
@Allen Niu It seems to be a problem with downloading packages from the repository. Are you using a local repository? You can kill the action here in the Ambari web UI, and let's try downloading a single package from the CLI using the yum command "yum install hdp-select". This will confirm you have access to download packages from the internet/local repository. Let me know if that works from the CLI.
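A quick sketch of that check; the repository IDs reported by "yum repolist" will depend on whether you use the public HDP repos or a local mirror:

```bash
yum clean all              # clear any stale repo metadata
yum repolist enabled       # confirm the HDP/Ambari repos are reachable
yum install -y hdp-select  # try installing a single package from the CLI
```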
10-18-2016
07:08 AM
Hi @Allen Niu Can you attach a screenshot here of the screen on which you are stuck? Also do check the below (a quick sketch of the commands follows):
1. Log in on the Ambari server and change to this path - $ cd /var/lib/ambari-agent/data/
2. $ ls -ltr output-* and $ ls -ltr error-*
3. Check the last output or error file to see if there are any useful logs.
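A short sketch of that check; tailing the most recent error file is just one way to get at the latest logs:

```bash
cd /var/lib/ambari-agent/data/
ls -ltr output-* error-*                  # newest files are listed last
tail -n 100 "$(ls -t error-* | head -1)"  # look at the most recent error file
```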