Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2717 | 04-27-2020 03:48 AM |
| | 5279 | 04-26-2020 06:18 PM |
| | 4445 | 04-26-2020 06:05 PM |
| | 3567 | 04-13-2020 08:53 PM |
| | 5377 | 03-31-2020 02:10 AM |
06-20-2017
06:05 AM
@Harald Bögeholz
Is "118.138.237.168" the address of your Ambari server? And are you accessing the Ambari Files View directly, or through a web server sitting in front of it?
06-20-2017
05:05 AM
@Harald Bögeholz Which version of Ambari Server are you using? To use the Ambari Files View, you need to perform some additional setup first: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-views/content/configuring_your_cluster_for_files_view.html
1. (Proxy User Setup) If you are running your Ambari server as the root user, add the following properties via Ambari UI --> Services --> HDFS --> Configs --> Advanced tab --> Custom core-site:
hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*
Instead of "*" as the value of "hadoop.proxyuser.root.hosts", you can also define a comma-separated list of the hosts/addresses where the Files View will be running. This is needed because, for the Files View to access HDFS, the Ambari Server daemon hosting the view must act as a proxy user for HDFS. That allows Ambari to submit requests to HDFS on behalf of the users using the Files View.
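For reference, a sketch of how those two entries would look if expressed directly in core-site.xml; Ambari manages this file, so in practice they are added through the Custom core-site section of the UI rather than by hand:

```xml
<!-- Sketch only: Ambari normally manages core-site.xml, so add these via the
     Ambari UI (Custom core-site) rather than editing the file directly. -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
```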
Based on your error, we can see that you are running your Ambari server as the "root" user, but the IP address from which the Files View runs is not allowed to be proxied; that is why you are getting this error: Unauthorized connection for super-user: root from IP 118.138.237.168
2. If you log in to the Ambari Files View as the "admin" user, make sure you have created a home directory for that user in HDFS, as follows. (If you log in to the Files View as some other user, create that user's directory in HDFS in advance.)
Example:
su - hdfs
hdfs dfs -mkdir /user/admin
hdfs dfs -chown admin:hadoop /user/admin
3. Now you should be able to log in to the Ambari Files View and perform various operations without any issue.
06-20-2017
03:12 AM
1 Kudo
@Vamsi N Regarding your query, "But cannot find a link to explore HDFS views in Ambari": you can use the HDFS "Files View". Ambari UI --> "Manage Ambari" --> "Views" --> FILES --> click "Create Instance". Regarding the Hue view, you can also find the "HUE TO AMBARI Migration" view in Ambari: Ambari UI --> "Manage Ambari" --> "Views" --> HUETOAMBARI_MIGRATION --> click "Create Instance".
For more information please see: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.2.0/bk_ambari-views/content/creating_a_htv_instance.html
The proxy-user settings can be set to "*", which means allowing every group and requests from every host (or you can use a comma-separated list of hostnames/groups in the respective properties):
hadoop.proxyuser.hue.groups=*
hadoop.proxyuser.hue.hosts=*
06-20-2017
02:37 AM
@Sami Ahmad Can you please check the following link: https://docs.hortonworks.com/HDPDocuments/Ambari-2.5.1.0/bk_ambari-installation/content/hdp_26_repositories.html There you will find the "md5" sums of the tarballs. Please compare the md5 of your downloaded tarball with the published one to double-check whether you got a corrupted repo or the correct one. Tarball md5 | asc
http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.1.0/HDP-2.6.1.0-centos6-rpm.tar.gz
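As a minimal sketch of the comparison (the tarball path and checksum below are placeholders; substitute the real download and the md5 string published on the repo page):

```shell
# Sketch: compare a downloaded tarball's md5 against the published value.
# The file and the "expected" value are placeholders for this demo.
tarball=/tmp/demo-tarball.tar.gz
printf 'demo contents' > "$tarball"                 # stand-in for the real download
expected=$(md5sum "$tarball" | awk '{print $1}')    # in practice, paste this from the docs
actual=$(md5sum "$tarball" | awk '{print $1}')
if [ "$actual" = "$expected" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH - re-download the tarball"
fi
```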
06-19-2017
12:20 PM
@Linlin Li I see that you are trying to install Ambari 2.4.1 on a 32-bit operating system. Ambari packages are available for 64-bit systems only. Please see the following link, which states that "64-bit operating systems are supported" and lists the supported operating systems:
https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.1.0/bk_ambari-installation/content/operating_systems_requirements.html
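A quick way to confirm the machine architecture before attempting the install (a read-only check; a supported host reports x86_64):

```shell
# Print the machine hardware name; Ambari packages are built for x86_64 only.
arch=$(uname -m)
if [ "$arch" = "x86_64" ]; then
  echo "64-bit ($arch): supported by Ambari packages"
else
  echo "$arch: not a supported architecture for Ambari packages"
fi
```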
06-19-2017
11:43 AM
1 Kudo
@Farrukh Munir If you want to enable HA for individual services (for example, "Enable NameNode HA"), you can try the following: Ambari UI --> HDFS --> "Service Actions" (drop-down) --> "Enable NameNode HA". Please see: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.1.0/bk_ambari-user-guide/content/how_to_configure_namenode_high_availability.html
Similarly, you can enable HA for other services, such as "Enable ResourceManager HA": https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.1/bk_ambari-user-guide/content/how_to_configure_resourcemanager_high_availability.html
NOTE: The following information is about enabling HA for Ambari itself. Unfortunately, there is currently no HA feature available for Ambari. However, you can refer to the following JIRAs, where discussion is ongoing in this regard: https://issues.apache.org/jira/browse/AMBARI-17126 https://issues.apache.org/jira/browse/AMBARI-7896 The following link also has some good information on the same topic: https://community.hortonworks.com/questions/402/how-to-setup-high-availability-for-ambari-server.html
06-19-2017
04:07 AM
1 Kudo
@Brendan Smith Have you done the following? Open Firefox's "about:config", search for the following two properties, and set their values to the hostnames/domains that are secured:
In the example below I am using the sandbox domain name; in your case, use the domain name/hostname that you use to access the Falcon UI.
network.negotiate-auth.delegation-uris=sandbox.hortonworks.com,.hortonworks.com
network.negotiate-auth.trusted-uris=sandbox.hortonworks.com,.hortonworks.com
In the above, define the hostname/domain that you are actually using. Now get the Falcon keytab onto the local machine (laptop) where the browser is running, do the kinit, and then refresh the browser. For more information, please refer to: https://community.hortonworks.com/articles/28537/user-authentication-from-windows-workstation-to-hd.html
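Equivalently, the same two preferences can be persisted in a user.js file in the Firefox profile directory (a sketch; the domain values are the sandbox examples from above and must be replaced with your own):

```js
// Sketch: the two about:config preferences persisted in the Firefox profile's
// user.js file. Replace the domains with the ones you actually use.
user_pref("network.negotiate-auth.delegation-uris", "sandbox.hortonworks.com,.hortonworks.com");
user_pref("network.negotiate-auth.trusted-uris", "sandbox.hortonworks.com,.hortonworks.com");
```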
06-19-2017
12:45 AM
@Veena N Can you please check the following on the host where you have installed the VNC server?
1. Confirm that your Ambari server machine has the correct package installed ("yum install tigervnc-server"), as described in the following doc: https://access.redhat.com/documentation/en-US/Red_Hat_Enterprise_Linux/7/html/System_Administrators_Guide/ch-TigerVNC.html
2. Check whether the tigervnc service is running:
# ps ax | grep -i vnc
3. Use the "netstat" command to list the port opened by the "tigervnc-server" process:
# netstat -tnlpa | grep $PID_VNC
4. If the port is open, check whether "iptables" (the firewall) is turned off:
# service iptables status
# service iptables stop
5. Double-check that the IP address of the host is correct before connecting to it. Use the following command to list the IP addresses:
# ifconfig
Ideally, you do not need a VNC server to access the Ambari server at all. You can simply use "SSH" (on Windows, PuTTY-based SSH access) to log in to the VM host and perform various operations, and you can access the VM's web UI from your local machine.
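The checks above can be sketched as one small script; the service-based iptables commands are RHEL 6-era assumptions (on systemd hosts you would use systemctl and firewalld instead), and each step is guarded so it degrades gracefully where a tool is missing:

```shell
# Sketch of the diagnostic steps above, with guards for missing tools.
if command -v ps >/dev/null 2>&1; then
  ps ax | grep -i '[v]nc' || echo "no VNC process found"
fi
if command -v netstat >/dev/null 2>&1; then
  netstat -tnlpa 2>/dev/null | grep -i vnc || echo "no VNC listening port found"
fi
if command -v service >/dev/null 2>&1; then
  service iptables status 2>/dev/null || echo "iptables not running or not installed"
fi
# Count lines with an address, using whichever interface tool is available.
addresses=$( (ip addr 2>/dev/null || ifconfig 2>/dev/null) | grep -ci 'inet' )
echo "interface lines with an inet address: $addresses"
```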
06-18-2017
03:32 PM
@Sami Ahmad
Try using the file name "$HOME/.wgetrc". The wget initialization file can reside in the following locations:
1. "/usr/local/etc/wgetrc" (global, for all users)
(OR)
2. "$HOME/.wgetrc" (for a single user)
You can try defining a no_proxy entry inside the "$HOME/.wgetrc" file. "no_proxy" takes a comma-separated list of domains that should bypass the proxy, overriding the list specified in the environment.
Please see: [0] https://www.gnu.org/software/wget/manual/html_node/Sample-Wgetrc.html [1] https://www.gnu.org/software/wget/manual/html_node/Wgetrc-Location.html [2] https://www.gnu.org/software/wget/manual/html_node/Proxies.html
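A minimal sketch of adding the entry (the domain list is a placeholder, and the demo writes to a temporary file so it does not touch your real "$HOME/.wgetrc"):

```shell
# Sketch: add a no_proxy line to a wgetrc-style file. In practice the target
# would be "$HOME/.wgetrc"; a temp file keeps the demo harmless.
demo_wgetrc=$(mktemp)
printf 'no_proxy = localhost,127.0.0.1,.example.com\n' >> "$demo_wgetrc"
grep '^no_proxy' "$demo_wgetrc"
```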
06-18-2017
09:23 AM
@Andreas Knapp - Any attempt to upgrade the "glibc" version might affect many other components installed at the OS level, as well as the Hadoop services. Hence it is not usually advisable to install a "glibc" package downloaded from anywhere other than the one shipped with the OS. The combination of a later glibc with the existing packages has not been tested, although most programs may still work because of backward compatibility. The HDP components are also tested/certified on RHEL/CentOS releases that contain the glibc versions shipped with the distribution. So if you really want a higher version of "glibc", you should try RHEL 7, which includes a newer glibc.
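A quick, read-only way to see which glibc version the OS currently ships (on RHEL/CentOS, "rpm -q glibc" shows the packaged version as well):

```shell
# Print the first line of "ldd --version", which reports the glibc release;
# the fallback covers hosts without ldd on the PATH.
glibc_line=$( (ldd --version 2>/dev/null || echo "ldd not found") | head -n 1 )
echo "$glibc_line"
```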