
Can I install Hue in HDP 3.1.0

Explorer

Hello there,

I'm using HDP 3.1.0. I want to use Hue for Hive and Oozie.

I found this guide for HDP 2.6.5: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_command-line-installation/content/instal...

Can I install Hue in HDP 3.1.0?

13 REPLIES

Expert Contributor

You can install Hue, but you would need to get it from gethue.com and set it up the way it's documented there. The HDP 2.6.x release of Hue may not work with HDP 3.x.
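Roughly, a source build of Hue looks like this (a sketch only; check the prerequisites and full instructions on gethue.com before relying on it):

    git clone https://github.com/cloudera/hue.git
    cd hue
    make apps                      # builds the virtualenv and compiles all Hue apps
    build/env/bin/hue runserver    # development server, port 8000 by default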

Explorer

Yes, I am going to try it and will post my results. I hope it will succeed.

Hello @luan ha

Did you get Hue working on HDP 3.x?

New Contributor

Yes. You can follow the latest Hue guide against HDP 3 and manually tweak whatever Ambari does not handle, as sketched below.
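For example, Hive and Oozie are wired up in desktop/conf/hue.ini; the host names below are placeholders for your own HDP services:

    [beeswax]
      # HiveServer2 endpoint of the HDP cluster
      hive_server_host=your-hiveserver2-host
      hive_server_port=10000

    [liboozie]
      # Oozie server URL
      oozie_url=http://your-oozie-host:11000/oozie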

Hi @AKRAM JEBALI

We will try this installation and configuration next week (authentication, Grafana integration, ...).

To be continued

New Contributor

Hi friends,

Does the Hue install work with HDP 3.1?

I tried to install Hue with HDP 3.1, which also requires installing hadoop-httpfs.

I set up the configuration with the hadoop-httpfs service script, but the hadoop-httpfs service fails to start.

Has anyone successfully installed Hue with HDP 3.1?


PS: I have NameNode HA on my Hadoop cluster.
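(For context: with NameNode HA, Hue cannot talk to a single NameNode's WebHDFS, which is why hadoop-httpfs comes into play. Once HttpFS is running, hue.ini points at it; a minimal sketch with a placeholder host:)

    [hadoop]
      [[hdfs_clusters]]
        [[[default]]]
          # go through HttpFS (default port 14000) rather than one NameNode's WebHDFS
          webhdfs_url=http://your-httpfs-host:14000/webhdfs/v1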


kerwin~


Hi @kerwin chen

I finally gave up on installing Hue and chose to install Data Analytics Studio (DAS) instead. I compiled the product from the open-source code on GitHub and installed the mpack and the product on my edge node.


Mathieu

New Contributor

Hi Mathieu Couron,

Can you share the steps for installing DAS from the GitHub code?

Thanks!





Hi @kerwin1217 

These are the steps to install DAS on HDP 3.1 (OS: CentOS) from the open-source code:

  • First clone https://github.com/hortonworks/data_analytics_studio and run mvn install (you must not be behind a proxy; that does not work). See the command sketch after this list.
  • Then install the mpack you obtain (mpack/target/hdp3-data-analytics-studio-mpack-1.2.0.tar.gz) as described here (do not run Add Service at this point).
  • Create an RPM (my cluster is on CentOS), hdp3_data_analytics_studio-1.2.0-0.0.x86_64.rpm, containing data_analytics_studio-event-processor-1.2.0.jar and data_analytics_studio-webapp-1.2.0.jar (the jars obtained in step 1). The RPM should copy the jars to /usr/das/1.2.0.0.0/data_analytics_studio/lib and create the /etc/das/conf and /var/log/das directories.
  • Put the RPM on your local yum repo.
  • I use a Postgres database, so I created a das database owned by a das user.
  • In Ambari, use Manage Ambari to modify the HDP 3.1 version and point it at the yum repo for DAS.
  • In Ambari, add the DAS service (uncheck "create database").
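As a rough command sketch of steps 1, 2 and 5 (the mpack path and the das names come from this thread; the password is a placeholder you should change):

    git clone https://github.com/hortonworks/data_analytics_studio
    cd data_analytics_studio
    mvn clean install    # must not run behind a proxy

    # on the Ambari server node
    ambari-server install-mpack \
      --mpack=mpack/target/hdp3-data-analytics-studio-mpack-1.2.0.tar.gz --verbose
    ambari-server restart

    # Postgres: das database owned by a das user
    sudo -u postgres psql -c "CREATE USER das WITH PASSWORD 'changeme';"
    sudo -u postgres psql -c "CREATE DATABASE das OWNER das;"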

Hope this can help.

Mathieu

New Contributor
An exception is thrown when installing DAS:

Failed to execute goal com.github.eirslett:frontend-maven-plugin:1.4:npm (npm install) on project das-dp-ui-app: Failed to run task: 'npm install' failed. org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

Hi @IvanLeung

It may be the same error as http://apache-nifi-developer-list.39713.n7.nabble.com/failed-to-build-nifi-1-8-0-SNAPSHOT-on-OSX-hig...

You can try uninstalling Node if you previously installed it on your machine, then re-run mvn clean install.
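For example, on CentOS (assuming Node was installed through yum; the frontend-maven-plugin then downloads its own node/npm during the build):

    sudo yum remove -y nodejs npm
    mvn clean install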

New Contributor

Thanks for your reply! But the repository dps-apps.git does not seem to exist.

Explorer

I face this error when building with "mvn clean package -P mpack":

[WARNING] The requested profile "mpack" could not be activated because it does not exist.
[ERROR] Failed to execute goal on project das-commons: Could not resolve dependencies for project com.hortonworks.das:das-commons:jar:1.2.0-SNAPSHOT: Failure to find org.apache.hadoop:hadoop-common:jar:3.1.0.3.0.0.0-1634 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced -> [Help 1]
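(The HDP-built artifacts such as hadoop-common 3.1.0.3.0.0.0-1634 are not on Maven Central, so the Hortonworks repository has to be added to your pom.xml, or inside a profile in ~/.m2/settings.xml; the URL below is the one in use around the time of this thread and may have moved since. Because the failure is cached, re-run with mvn -U to force resolution to be reattempted:)

    <repositories>
      <repository>
        <id>hortonworks-releases</id>
        <url>https://repo.hortonworks.com/content/repositories/releases/</url>
      </repository>
    </repositories>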