
Installation of HUE using Ambari for HDP 2.6.5

Expert Contributor

Team,

I am following the link below to install Hue managed by Ambari.

HDP version: 2.6.5

Ambari version:

https://github.com/EsharEditor/ambari-hue-service

After following a few steps, when I try to install the HUE service from Ambari I get the error below.

 

2020-03-31 16:43:07,626 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2020-03-31 16:43:07,630 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2020-03-31 16:43:07,631 - Group['kms'] {}
2020-03-31 16:43:07,634 - Group['livy'] {}
2020-03-31 16:43:07,634 - Group['spark'] {}
2020-03-31 16:43:07,634 - Group['ranger'] {}
2020-03-31 16:43:07,635 - Group['hue'] {}
2020-03-31 16:43:07,642 - Adding group Group['hue']
2020-03-31 16:43:07,670 - Group['hdfs'] {}
2020-03-31 16:43:07,671 - Group['zeppelin'] {}
2020-03-31 16:43:07,672 - Group['hadoop'] {}
2020-03-31 16:43:07,673 - Group['users'] {}
2020-03-31 16:43:07,673 - Group['knox'] {}
2020-03-31 16:43:07,675 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,679 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,682 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,685 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,687 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-03-31 16:43:07,688 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,690 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger'], 'uid': None}
2020-03-31 16:43:07,691 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-03-31 16:43:07,693 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2020-03-31 16:43:07,694 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,696 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,697 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,699 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,700 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-03-31 16:43:07,701 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,703 - User['hue'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,721 - Adding user User['hue']
2020-03-31 16:43:08,285 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2020-03-31 16:43:08,288 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,292 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,295 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,297 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,298 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,301 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,303 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-03-31 16:43:08,307 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2020-03-31 16:43:08,317 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2020-03-31 16:43:08,317 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2020-03-31 16:43:08,318 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-03-31 16:43:08,319 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-03-31 16:43:08,320 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2020-03-31 16:43:08,333 - call returned (0, '57467')
2020-03-31 16:43:08,333 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 57467'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2020-03-31 16:43:08,340 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 57467'] due to not_if
2020-03-31 16:43:08,340 - Group['hdfs'] {}
2020-03-31 16:43:08,342 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2020-03-31 16:43:08,343 - User['admin'] {'fetch_nonlocal_groups': True}
2020-03-31 16:43:08,345 - FS Type: 
2020-03-31 16:43:08,346 - Directory['/etc/hadoop'] {'mode': 0755}
2020-03-31 16:43:08,362 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2020-03-31 16:43:08,363 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2020-03-31 16:43:08,363 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2020-03-31 16:43:08,377 - Repository['HDP-2.6-repo-301'] {'append_to_file': False, 'base_url': 'http://private-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.128-2', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-301', 'mirror_list': None}
2020-03-31 16:43:08,384 - File['/etc/yum.repos.d/ambari-hdp-301.repo'] {'content': '[HDP-2.6-repo-301]\nname=HDP-2.6-repo-301\nbaseurl=http://private-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.128-2\n\npath=/\nenabled=1\ngpgcheck=0'}
2020-03-31 16:43:08,385 - Writing File['/etc/yum.repos.d/ambari-hdp-301.repo'] because contents don't match
2020-03-31 16:43:08,385 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.1050 is not created due to its tags: set([u'GPL'])
2020-03-31 16:43:08,385 - Repository['HDP-UTILS-1.1.0.22-repo-301'] {'append_to_file': True, 'base_url': 'http://private-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-301', 'mirror_list': None}
2020-03-31 16:43:08,388 - File['/etc/yum.repos.d/ambari-hdp-301.repo'] {'content': '[HDP-2.6-repo-301]\nname=HDP-2.6-repo-301\nbaseurl=http://private-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.128-2\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-301]\nname=HDP-UTILS-1.1.0.22-repo-301\nbaseurl=http://private-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2020-03-31 16:43:08,388 - Writing File['/etc/yum.repos.d/ambari-hdp-301.repo'] because contents don't match
2020-03-31 16:43:08,388 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:08,836 - Skipping installation of existing package unzip
2020-03-31 16:43:08,836 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:08,919 - Skipping installation of existing package curl
2020-03-31 16:43:08,919 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:09,003 - Skipping installation of existing package hdp-select
2020-03-31 16:43:09,007 - The repository with version 2.6.5.128-2 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2020-03-31 16:43:09,011 - Skipping stack-select on HUE because it does not exist in the stack-select package structure.
2020-03-31 16:43:09,257 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2020-03-31 16:43:09,260 - Package['wget'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:09,428 - Skipping installation of existing package wget
2020-03-31 16:43:09,429 - Package['tar'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:09,561 - Skipping installation of existing package tar
2020-03-31 16:43:09,562 - Package['asciidoc'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:09,646 - Installing package asciidoc ('/usr/bin/yum -d 0 -e 0 -y install asciidoc')
2020-03-31 16:43:10,410 - Execution of '/usr/bin/yum -d 0 -e 0 -y install asciidoc' returned 1. Error: Nothing to do
Loaded plugins: product-id
Cannot upload enabled repos report, is this client registered?
2020-03-31 16:43:10,410 - Failed to install package asciidoc. Executing '/usr/bin/yum clean metadata'
2020-03-31 16:43:10,623 - Retrying to install package asciidoc after 30 seconds
2020-03-31 16:43:41,768 - The repository with version 2.6.5.128-2 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2020-03-31 16:43:41,773 - Skipping stack-select on HUE because it does not exist in the stack-select package structure.
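
For context, the failing step is yum's attempt to install the asciidoc dependency. A diagnostic sketch (output will vary per node) to reproduce it outside Ambari:

# Re-run the exact install command Ambari executed, to see the full yum error
/usr/bin/yum -d 0 -e 0 -y install asciidoc

# List reachable repos; asciidoc normally comes from the OS base repo,
# not from the HDP repos that Ambari just wrote
yum repolist enabled

# The "Cannot upload enabled repos report, is this client registered?"
# message suggests a RHEL node without an active subscription or local mirror
subscription-manager status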

 

 

Any advice or solution is highly appreciated.

 

Regards

Bharad

 

25 REPLIES


@bhara I believe this question was asked and answered here. Hope that helps. 

 

 

 

Bill Brooks, Community Moderator

Super Guru

@bhara I replied on the other topic you responded on.    

 

I am shifting into high gear to get my management pack ready for the original Ambari Hue Service (3.x).

 

I should have it ready here this morning:

 

https://github.com/steven-dfheinz/HDP3-Hue-Service

https://github.com/steven-dfheinz/dfhz_hue_mpack

Expert Contributor

@stevenmatison 

 

Thank you, sir. We are on HDP 2.6.x, and a Hue 4.x management pack would be really helpful.

 

Thanks

Bharad

Super Guru

Working on HDP 2.4 with Hue 3.11 now. These are the minimal required versions to get the original repo working with gethue.com. Hue and Hortonworks have changed since that repo was created, which is why it no longer works.


As soon as I am done I will make another one for HDP 2.x with Hue 4.x.

 

 

Expert Contributor

Thank you, sir. Just wondering what the timeline looks like for both of them to be out.

 

Regards

Bharad

Super Guru

@bhara Sorry, I don't know how long this stuff takes; sometimes it takes me a few days. I have had some issues with the HDP 2 management pack.

 

I did get the Hue Service working and created a repo for those required changes.   You can find that here:

https://github.com/steven-dfheinz/HDP2-Hue-Service

 

Install is very easy and it should work out of the box if there are no dependency issues. The dependencies are documented at gethue.com. For my test, the EPEL repo was what I needed to get CentOS 7.4 ready to install everything.

 

Here are the commands I ran on the node in my test to install Ambari and HDP 2.6.5:

yum install nano git wget -y
yum --enablerepo=extras install epel-release -y
wget -nv http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.6.2.0/ambari.repo -O /etc/yum.repos.d/ambari.repo
yum install java java-devel mariadb mariadb-server mysql-connector-java ambari-server ambari-agent -y
ambari-server setup -s && \
ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar && \
ambari-server start && \
ambari-agent start && \
service mariadb start && \
chkconfig mariadb on

Next I installed the cluster through Ambari, and once everything was running, the actual Hue service install is just:

 

sudo git clone https://github.com/steven-dfheinz/HDP2-Hue-Service.git /var/lib/ambari-server/resources/stacks/HDP/2.6/services/HUE
service ambari-server restart

Then install the HUE service from the Ambari UI via Add Service.
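
If the service does not appear, a quick sanity check (a sketch; the host and admin credentials are placeholders) is to ask the Ambari stacks API whether the HUE service definition was picked up after the restart:

# Should return the HUE service definition if the clone landed in the right place
curl -u admin:admin http://AMBARI_HOST:8080/api/v1/stacks/HDP/versions/2.6/services/HUE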

 

Super Guru

I am working on a repo for Hue 4.6.0 now:

 

https://github.com/steven-dfheinz/HDP2-Hue4-Service

 

I am installing it now. This should work, but I want to make some disclosures so you understand: this is a custom service and is not supported. It will work just fine, but you will have to do more work after the install.

 

The change from Hue 3 to Hue 4 involves quite a few more differences than just those required to make the original service work. Hue 3 shipped its configuration in "pseudo-distributed.ini"; Hue 4 uses a default "hue.ini" file with minimal changes.

 

Once you install, you will need to edit the hue.ini config via Ambari as you work toward getting the Hue plugins configured: HDFS, Hive, HBase, Zeppelin, etc. A sketch of the kind of edits involved is below.
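
For example, pointing Hue at the cluster's HDFS and HiveServer2 typically means hue.ini edits along these lines (a sketch; the hostnames are placeholders, and the full set of properties is described in the Hue docs):

[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      # NameNode address and WebHDFS endpoint (placeholder hosts)
      fs_defaultfs=hdfs://namenode-host:8020
      webhdfs_url=http://namenode-host:50070/webhdfs/v1

[beeswax]
  # HiveServer2 host and port (placeholder host)
  hive_server_host=hiveserver2-host
  hive_server_port=10000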

If you have any issues please open a new question here and tag me in it.

 

 

Expert Contributor

@stevenmatison I have tried the above 4.6 approach and I am getting the error below.

 

creating build/lib.linux-x86_64-2.7
copying _mysql_exceptions.py -> build/lib.linux-x86_64-2.7
creating build/lib.linux-x86_64-2.7/MySQLdb
copying MySQLdb/__init__.py -> build/lib.linux-x86_64-2.7/MySQLdb
copying MySQLdb/converters.py -> build/lib.linux-x86_64-2.7/MySQLdb
copying MySQLdb/connections.py -> build/lib.linux-x86_64-2.7/MySQLdb
copying MySQLdb/cursors.py -> build/lib.linux-x86_64-2.7/MySQLdb
copying MySQLdb/release.py -> build/lib.linux-x86_64-2.7/MySQLdb
copying MySQLdb/times.py -> build/lib.linux-x86_64-2.7/MySQLdb
creating build/lib.linux-x86_64-2.7/MySQLdb/constants
copying MySQLdb/constants/__init__.py -> build/lib.linux-x86_64-2.7/MySQLdb/constants
copying MySQLdb/constants/CR.py -> build/lib.linux-x86_64-2.7/MySQLdb/constants
copying MySQLdb/constants/FIELD_TYPE.py -> build/lib.linux-x86_64-2.7/MySQLdb/constants
copying MySQLdb/constants/ER.py -> build/lib.linux-x86_64-2.7/MySQLdb/constants
copying MySQLdb/constants/FLAG.py -> build/lib.linux-x86_64-2.7/MySQLdb/constants
copying MySQLdb/constants/REFRESH.py -> build/lib.linux-x86_64-2.7/MySQLdb/constants
copying MySQLdb/constants/CLIENT.py -> build/lib.linux-x86_64-2.7/MySQLdb/constants
running build_ext
building '_mysql' extension
creating build/temp.linux-x86_64-2.7
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -Dversion_info=(1,2,5,'final',1) -D__version__=1.2.5 -I/usr/include/mysql -I/usr/include/python2.7 -c _mysql.c -o build/temp.linux-x86_64-2.7/_mysql.o -m64
_mysql.c:44:23: fatal error: my_config.h: No such file or directory
 #include "my_config.h"
                       ^
compilation terminated.
error: command 'gcc' failed with exit status 1
make[2]: *** [/usr/local/hue-4.6.0/desktop/core/build/MySQL-python-1.2.5/egg.stamp] Error 1
make[2]: Leaving directory `/usr/local/hue-4.6.0/desktop/core'
make[1]: *** [.recursive-install-bdist/core] Error 2
make[1]: Leaving directory `/usr/local/hue-4.6.0/desktop'
make: *** [install-desktop] Error 2

Super Guru

That is a dependency issue during Hue's "make install" command. Your specific error (the missing my_config.h header) points to the Python development tools and the MySQL client development headers.

 

You need to make sure you have the environment ready before installing Hue.

 

https://docs.gethue.com/administrator/installation/dependencies/

 

I put them all into the service, but if your environment does not have repos that can deliver the dependencies, the installs will fail.
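
On a CentOS 7 node, the packages behind this particular failure would typically be installed along these lines (a sketch based on the dependency list linked above; on CentOS 7 the my_config.h header is supplied by mariadb-devel / mysql-devel):

# Build toolchain plus the Python and MySQL/MariaDB development headers
# the MySQL-python egg needs to compile
yum install -y gcc gcc-c++ make python-devel mariadb-devel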

 

 

What operating system are you testing with?