Member since: 02-08-2016
Posts: 793
Kudos Received: 669
Solutions: 85
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2388 | 06-30-2017 05:30 PM |
| | 3104 | 06-30-2017 02:57 PM |
| | 2539 | 05-30-2017 07:00 AM |
| | 3009 | 01-20-2017 10:18 AM |
| | 5922 | 01-11-2017 02:11 PM |
02-23-2016
04:45 AM
I see that there are no ready-made jars available on the internet for Ambari Shell. The best approach is to build/compile the ambari-shell code in a test/local environment and then use the jar directly elsewhere; you do not need to compile it every time. I can even send you the jar by email and you can test it directly in your environment with the command below. Just make sure the OS and Java version are the same.
# java -jar /opt/ambari-shell/build/libs/ambari-shell-0.1.DEV.jar --ambari.server=localhost --ambari.port=8080 --ambari.user=admin --ambari.password=admin
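For convenience, the command above can be wrapped in a small launcher that checks the Java version first. This is only a sketch: the jar path and Ambari credentials are the defaults from this post, and the wrapper/function names are my own.

```shell
#!/bin/sh
# Sketch: launcher for the ambari-shell jar built in the steps below.
# Adjust JAR and the --ambari.* options for your environment.

JAR=/opt/ambari-shell/build/libs/ambari-shell-0.1.DEV.jar

# Extract the major version from a "1.x" style Java version string,
# e.g. "1.7.0_79" -> 7 (plain text parsing, no JVM needed).
java_major() {
    echo "$1" | sed 's/^1\.\([0-9]*\)\..*/\1/'
}

launch() {
    ver=$(java -version 2>&1 | awk -F'"' '/version/ {print $2}')
    if [ "$(java_major "$ver")" -lt 7 ]; then
        echo "ambari-shell needs Java 7 or above (found $ver)" >&2
        return 1
    fi
    java -jar "$JAR" \
        --ambari.server=localhost --ambari.port=8080 \
        --ambari.user=admin --ambari.password=admin
}

# launch   # uncomment to run on a host with the jar in place
```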
02-17-2016
06:44 PM
11 Kudos
Please follow the steps below to install ambari-shell on your system.
Environment setup details:
VirtualBox: 5.0.14
OS: CentOS release 6.7 (64-bit)
Ambari-shell code: https://github.com/sequenceiq/ambari-shell.git
Gradle version: gradle-2.11 [ https://services.gradle.org/distributions/gradle-2.11-all.zip ]
Java version: 1.7.0_79
Installation steps:
1. Log in as root/superuser.
2. Install the "git" package using rpm/yum: # yum install -y git [Note: Make sure your system has internet access and /etc/yum.repos.d/CentOS-Base.repo is in place, unless you have a local repository configured to download packages.]
3. Download Gradle from the link given above and unzip it in a directory. Here I will download it to the /opt directory:
# cd /opt
# wget https://services.gradle.org/distributions/gradle-2.11-all.zip
# unzip gradle-2.11-all.zip
4. Export the Gradle home path:
# export PATH=$PATH:/opt/gradle-2.11/bin
5. Make sure you have Java installed and JAVA_HOME set in your environment:
[root@test opt]# rpm -qa | grep java-1.7.0
java-1.7.0-openjdk-1.7.0.79-2.5.5.4.el6.x86_64
# export JAVA_HOME=/usr/jdk64/jdk1.7.0_67/
Note: I have Java installed in "/usr/jdk64". Replace the path with wherever Java is installed on your system.
6. Download the "ambari-shell" code from git and compile it using Gradle:
# cd /opt
# git clone https://github.com/sequenceiq/ambari-shell.git
# cd /opt/ambari-shell
# gradle clean build
7. The above command compiles the ambari-shell code; you should see output similar to the sample below:
......
:processResources
:classes
:jar
:startScripts
:distTar
:distZip
:bootRepackage
:assemble
:compileTestJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:processTestResources
:testClasses
:test
:check
:build
BUILD SUCCESSFUL
Total time: 32.558 secs
8. Once the build is successful, it creates a jar located at the path below:
[root@test ~]# ls /opt/ambari-shell/build/libs/
ambari-shell-0.1.DEV.jar ambari-shell-0.1.DEV.jar.original
9. After compiling the project, the shell is ready to use (make sure you use Java 7 or above):
# java -jar /opt/ambari-shell/build/libs/ambari-shell-0.1.DEV.jar --ambari.server=localhost --ambari.port=8080 --ambari.user=admin --ambari.password=admin
Below is sample output after executing the above command.
10. Log in to ambari-shell and try building your cluster. Please refer to "https://github.com/sequenceiq/ambari-shell" for more details.
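The steps above can be collected into a single script. This is a minimal sketch assuming the same versions and paths as the walkthrough (CentOS 6, run as root, internet access available); the function names are my own.

```shell
#!/bin/sh
# Sketch: install ambari-shell end to end, following the steps above.
set -e

GRADLE_VERSION=2.11

# Build the Gradle distribution URL for a given version.
gradle_dist_url() {
    echo "https://services.gradle.org/distributions/gradle-$1-all.zip"
}

install_ambari_shell() {
    yum install -y git unzip              # step 2 (unzip added for step 3)
    cd /opt                               # step 3: fetch and unpack Gradle
    wget "$(gradle_dist_url "$GRADLE_VERSION")"
    unzip "gradle-${GRADLE_VERSION}-all.zip"
    export PATH=$PATH:/opt/gradle-${GRADLE_VERSION}/bin   # step 4
    git clone https://github.com/sequenceiq/ambari-shell.git   # step 6
    cd /opt/ambari-shell
    gradle clean build                    # step 7: produces the jar in build/libs/
}

# install_ambari_shell   # uncomment to run on a prepared host
```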
02-17-2016
04:26 AM
1 Kudo
Hi @Vikas Gadade - I don't think that is the case. Even with a Kerberized cluster, the user only needs an account on the gateway/client node. Make sure the proper keytabs are in place. Hadoop services use delegation tokens to access nodes and execute tasks within the Kerberized cluster.
02-16-2016
03:52 AM
4 Kudos
Hi @Revathy Mourouguessane - 10.0.2.15 is a NAT network IP, so you will not be able to access it from your machine. If you need to browse the Ambari login [http://10.0.2.15:8080] from your browser, here is what I suggest:
1. Go to the Sandbox settings.
2. Click on "Network".
3. Change the network setting from "NAT" to "Bridged Adapter".
4. Restart the network service [service network restart] for the new settings to take effect. [Note: if the settings do not take effect, you sometimes need to reboot the Sandbox.]
5. You will now see that the IP address has changed and is in the same network as your machine [Windows/Linux].
6. Try browsing with the new IP address from your browser.
Also, for ssh access, as @Neeraj Sabharwal mentioned, use the command below:
ssh -p 2222 root@127.0.0.1
Let me know if that works.
02-12-2016
07:01 PM
13 Kudos
slapd-conf.tar.gz
Environment: Hortonworks Sandbox "HDP_2.3.2_virtualbox"
Ranger version: 0.5.0.2.3
LDAP version: openldap-2.4.40-7.el6_7.x86_64
LDAP setup url: LDAP setup tutorial
[Note: The LDAP setup here does not have SSL implemented. Please find the working "slapd.conf" attached.]
Steps:
1. Configure the openldap server on the sandbox as mentioned in the LDAP setup url above.
2. Log in to the Ambari console as the "admin" user.
3. Click on the "Ranger" service -> "Configs".
4. Go to "LDAP Settings" and set the properties below:
ranger.ldap.user.searchfilter=(uid={0})
ranger.ldap.user.dnpattern=cn=Manager,dc=hortonworks,dc=com
ranger.ldap.url=ldap://127.0.0.1:389
ranger.ldap.referral=ignore
ranger.ldap.group.roleattribute=uid
ranger.ldap.bind.password=*****
ranger.ldap.bind.dn=cn=Manager,dc=hortonworks,dc=com
ranger.ldap.base.dn=dc=hortonworks,dc=com
Note: ldap.bind.password=<admin password of openldap>
5. Go to "Advanced ranger-admin-site" and set the properties below:
ranger.ldap.group.searchfilter=(member=uid={0},ou=Users,dc=hortonworks,dc=com)
ranger.ldap.group.searchbase=dc=hortonworks,dc=com
6. Go to "Advanced ranger-ugsync-site" and set the properties below:
ranger.usersync.ldap.username.caseconversion=none
ranger.usersync.group.memberattributename=member
ranger.usersync.group.nameattribute=cn
ranger.usersync.group.objectclass=groupofnames
ranger.usersync.group.searchbase=dc=hortonworks,dc=com
ranger.usersync.group.searchenabled=false
ranger.usersync.group.searchscope=sub
ranger.usersync.group.usermapsyncenabled=false
ranger.usersync.ldap.user.searchscope=sub
ranger.usersync.ldap.user.searchbase=ou=Users,dc=hortonworks,dc=com
ranger.usersync.ldap.user.objectclass=person
ranger.usersync.ldap.user.nameattribute=uid
ranger.usersync.ldap.url=ldap://127.0.0.1:389
ranger.usersync.ldap.searchBase=dc=hortonworks,dc=com
ranger.usersync.ldap.referral=ignore
ranger.usersync.ldap.ldapbindpassword=*****
ranger.usersync.ldap.groupname.caseconversion=none
ranger.usersync.ldap.binddn=cn=Manager,dc=hortonworks,dc=com
ranger.usersync.ldap.bindalias=ranger.usersync.ldap.bindalias
ranger.usersync.source.impl.class=org.apache.ranger.ldapusersync.process.LdapUserGroupBuilder
ranger.usersync.sink.impl.class=org.apache.ranger.unixusersync.process.PolicyMgrUserGroupBuilder
7. Restart all affected components for the "Ranger" service.
8. Browse the "Ranger UI" at http://<ranger-host>:6080.
9. Log in to the Ranger UI as "admin".
10. Make sure you have the following entry in slapd.conf:
access to *
    by anonymous read
    by * none
11. Click on "Settings" -> "Users/Groups" and make sure you can see the LDAP users in the "Users" section.
12. Once you can view the users in the Ranger UI, log out from the "admin" user.
13. Now try logging in to the Ranger UI as an LDAP user.
02-12-2016
06:59 PM
2 Kudos
Hi @Arti Wadhwani @Predrag Minovic I was able to successfully integrate Ranger with LDAP. I have mentioned detailed steps in the link below - https://community.hortonworks.com/articles/16696/ranger-ldap-integration.html
02-09-2016
07:01 AM
1 Kudo
@ARUNKUMAR RAMASAMY The root directory "/" permissions are 755 [i.e. rwxr-xr-x]. By default these permissions follow the Linux standard [i.e. the umask]; the umask for the hdfs user is "022". The owner and group are set to hdfs:hdfs.
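The 755 default is just the full directory mode 0777 with the 022 umask bits cleared; a quick shell check of that arithmetic (the function name is my own):

```shell
#!/bin/sh
# Default directory mode = 0777 with the umask bits masked out:
# 0777 & ~0022 = 0755, i.e. rwxr-xr-x.
default_dir_mode() {
    umask=$1
    printf '%o\n' $(( 0777 & ~0$umask ))
}

# default_dir_mode 022 prints 755
```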
02-09-2016
02:30 AM
8 Kudos
Hi @ARUNKUMAR RAMASAMY No, the user does not need an account on all the nodes of the cluster; an account on the edge node is enough. For a new user there are two types of directories we need to create before the user accesses the cluster:
1. User home directory [created on the Linux filesystem, i.e. /home/<username>]
2. User HDFS directory [created on the HDFS filesystem, i.e. /user/<username>]
As Neeraj mentioned, you only need to create the HDFS home directory [i.e. /user/<username>] from the edge node. You can still run jobs on the cluster with the new user even if you haven't created his home directory in Linux.
==============
Below are 2 scenarios:
a. I added a new user on the edge node using the command:
#useradd <username>
Before launching a job on the cluster, I need to create the HDFS directory for the user:
#sudo -u hdfs hadoop fs -mkdir </user/{username}>
#sudo -u hdfs hadoop fs -chown -R <username>:<grp_name> </user/{username}>
b. If the user comes from an LDAP server, you only need to make your edge node an LDAP client and create the directory in HDFS using the commands below:
#sudo -u hdfs hadoop fs -mkdir </user/{username}>
#sudo -u hdfs hadoop fs -chown -R <username>:<grp_name> </user/{username}>
Let me know if this clarifies what you are looking for.
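Both scenarios end with the same pair of HDFS commands, so they can be wrapped in one helper. A sketch, run as root on the edge node; `provision_user` and `hdfs_home` are hypothetical names, and the group defaults to the username.

```shell
#!/bin/sh
# Sketch: create the local account (scenario a) and the HDFS home
# directory (both scenarios) for a new cluster user.

# HDFS home directory path for a given user.
hdfs_home() {
    echo "/user/$1"
}

provision_user() {
    user=$1
    grp=${2:-$user}
    useradd "$user"    # local account on the edge node (skip for LDAP users)
    sudo -u hdfs hadoop fs -mkdir "$(hdfs_home "$user")"
    sudo -u hdfs hadoop fs -chown -R "$user:$grp" "$(hdfs_home "$user")"
}

# provision_user alice hadoop-users   # uncomment to run on the edge node
```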
02-08-2016
10:29 AM
4 Kudos
Hi @Pradeep kumar
1. If you have installed the Ambari server, you can find the version using the steps below. Point your browser to http://{ambari.server.hostname}:8080. Log in to the Ambari Server using the default username/password admin/admin [unless you have changed the login password]. Once logged in, open the "admin" dropdown located on the top right side of the web UI and click on "About". This will display the Ambari server version. Please find a screenshot of the same. As mentioned by @Benjamin Leonhardi, you can also use the yum and rpm commands to display the Ambari server and agent versions, as shown below. [Note: Make sure you have root/superuser login on the server.]
2. If you have not installed the Ambari server or agent and just want to explore this information prior to installation, please refer to this link for version and release details: Ambari Releases and Versions [http://hortonworks.com/hadoop/ambari/#section_5]
Let me know if this helps with what you are looking for.
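The rpm-based check mentioned above can be scripted. A sketch, where `ambari_version` is a hypothetical helper and the package name in the comment is an illustrative example, not output captured from this post:

```shell
#!/bin/sh
# Sketch: list installed ambari packages and their versions via rpm.

# Extract the version field from an ambari rpm package name,
# e.g. "ambari-server-2.1.2.1-418.x86_64" -> "2.1.2.1".
ambari_version() {
    echo "$1" | sed 's/^ambari-[a-z]*-\([0-9.]*\)-.*/\1/'
}

check_versions() {
    for pkg in $(rpm -qa | grep '^ambari-'); do
        echo "$pkg -> version $(ambari_version "$pkg")"
    done
}

# check_versions   # uncomment to run as root on the Ambari host
```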