Member since: 12-30-2015
Posts: 19
Kudos Received: 5
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 270 | 02-28-2018 07:56 PM |
02-28-2018
07:56 PM
Pivotal HDB (the commercial version of Apache HAWQ) has been discontinued. You would need to build Apache HAWQ from the open source repo: http://github.com/apache/incubator-hawq. The instructions for building HAWQ are here: https://cwiki.apache.org/confluence/display/HAWQ/Build+and+Install
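For reference, a minimal build sketch on a development box, assuming the autotools flow described in that wiki page (dependency setup varies by platform and is omitted here):
$ git clone https://github.com/apache/incubator-hawq.git
$ cd incubator-hawq
$ ./configure --prefix=/usr/local/hawq
$ make -j4
$ make install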
02-28-2018
07:55 PM
1 Kudo
Pivotal HDB (the commercial version of Apache HAWQ) has been discontinued. You would need to build Apache HAWQ from the open source repo: http://github.com/apache/incubator-hawq. The instructions for building HAWQ are here: https://cwiki.apache.org/confluence/display/HAWQ/Build+and+Install
01-09-2018
10:57 PM
@grabowski14 What platform is your Hadoop cluster running on (e.g. OSX, CentOS 6, etc.)? You must have the PXF server running on all the data nodes as well as on the name node. It is best to use Ambari to install HAWQ and PXF, as it guides you through the setup.
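A quick way to check whether PXF is actually up on a given host is to hit its REST endpoint; port 51200 is the default mentioned elsewhere in this thread, and the ProtocolVersion path is what the HDB-era PXF exposes (verify against your version):
$ curl -s http://<datanode-host>:51200/pxf/ProtocolVersion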
01-08-2018
06:40 PM
Please take a look at pxf-private.classpath to make sure the Hive jars have the correct paths. Also, take a look at pxf-service.log under the Tomcat instance where PXF is running. The above error is being returned from the PXF Java server to the C client.
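Something along these lines can confirm the Hive entries resolve and surface the server-side stack trace; the paths below assume a typical HDP layout and may differ on your cluster:
$ grep -i hive /etc/pxf/conf/pxf-private.classpath
$ ls -l /usr/hdp/current/hive-client/lib/hive-exec-*.jar
$ tail -n 100 /var/log/pxf/pxf-service.log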
11-21-2017
09:30 PM
I am running into the same issue. Any ideas?
10-25-2017
06:36 PM
Here are some sample entries from my pg_hba.conf for the HAWQ master:
local all all ident
host all all 127.0.0.1/28 trust
host all gpadmin ::1/128 trust
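After editing pg_hba.conf on the master, the configuration has to be reloaded; assuming the hawq utility supports the same reload flag as Greenplum's gpstop, something like this should pick up the change without a full restart:
$ hawq stop cluster -u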
10-23-2017
04:07 AM
It looks like something is already running on port 51200 and blocking the PXF webapp. You can check what is bound to that port with:
$ netstat -plan | grep 51200
01-03-2017
08:08 PM
Leonard, I was referring to your last message on Dec 20, where you mentioned that you were unable to initialize HAWQ 2.0.1 with HDP 2.4 VM. Let me try installing HAWQ with HDP 2.5 VM sometime this week.
01-03-2017
07:37 PM
Hi Leonard, I would need the URL that you used to download HDP 2.4 VM. Is your sandbox using docker or vagrant?
12-15-2016
09:42 PM
Pivotal HDB 2.1 does not delete the hawqmaster.py script. It includes a plugin that copies metainfo.xml into the HDP 2.5 stack directory inside Ambari so that Ambari recognizes HDB 2.1 as part of the stack.
I can take a look at your cluster if you deploy it to an AWS instance using the HDP 2.5 Docker sandbox. One other option would be to uninstall SOLR before upgrading.
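If you want to verify that the plugin actually placed metainfo.xml, the stack directory on the Ambari server can be inspected directly; the path below assumes a default Ambari install location:
$ ls /var/lib/ambari-server/resources/stacks/HDP/2.5/services/ | grep -i hawq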
12-14-2016
06:18 PM
Hi Leonard, I checked both the Docker and VirtualBox sandboxes for HDP 2.5 and they both contain hawqmaster.py. I have not had time to try the install of HDB 2.1 yet. The upgrade to Ambari 2.4.2 is independent of the HDP install (it looks like the upgrade of the SOLR component has a problem).
12-10-2016
05:51 PM
Hi Leonard, the installation that you have sounds like a stripped-down version of Ambari. Please download Ambari 2.4.2 from Hortonworks (Ambari for CentOS 6) and install the HDP 2.5 stack using Ambari. Please also post the URL of the tarball that you used for your current setup so we can investigate further.
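For CentOS 6 the install typically looks like the commands below; the repo URL follows the standard Hortonworks public-repo pattern for Ambari 2.4.2.0 but should be double-checked against the Ambari install guide:
$ wget -nv http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.4.2.0/ambari.repo -O /etc/yum.repos.d/ambari.repo
$ yum install ambari-server
$ ambari-server setup
$ ambari-server start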
12-09-2016
11:02 PM
Which version of Ambari are you using? The file hawqmaster.py is part of Ambari (not HDB 2.1).
12-09-2016
08:24 PM
Leonard, if Ambari is installed correctly, then /var/lib/ambari-agent/cache/common-services/HAWQ/2.0.0/* should be present on each node. As part of bootstrapping the hosts in the cluster, the Ambari server copies the common-services directory (along with other resources) to each of the hosts.
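A quick sanity check on any host, using the cache path mentioned above:
$ find /var/lib/ambari-agent/cache/common-services/HAWQ/2.0.0/ -name 'hawqmaster.py'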
02-02-2016
06:56 PM
2 Kudos
The following curl command gives the token (but uses the name node directly):
$ curl -s --negotiate -u : "http://<active-namenode-hostname>:50070/webhdfs/v1/?op=GETDELEGATIONTOKEN"
Is it possible to use the dfs.nameservices property to get the token instead?
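WebHDFS over plain HTTP has to talk to the active NameNode, so one option is to resolve which one is active first and then run the curl above against that host; here nn1/nn2 stand for whatever NameNode IDs are configured in dfs.ha.namenodes.<nameservice> (an assumption for illustration):
$ hdfs haadmin -getServiceState nn1
$ hdfs haadmin -getServiceState nn2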
01-04-2016
08:05 PM
Thanks again! Here is what worked for me (using Python 2.6.6). I still need to investigate how to run the tests from the Ambari root directory.
$ pip install discover
$ export PYTHONPATH=~/git/ambari/ambari-common/src/test/python
$ cd ~/git/ambari/ambari-server/src/test/python/stacks/2.3/common
$ python -m discover -v
Test that HAWQSTANDBY is not recommended on a single node cluster ... ok
...
----------------------------------------------------------------------
Ran 21 tests in 0.318s
OK
12-31-2015
10:00 PM
Hi Artem, thanks a lot for your input. I was looking to test the stack advisor recommendations and validations, e.g. ambari/ambari-server/src/test/python/stacks/2.3/common/test_stack_advisor.py
12-30-2015
04:30 AM
1 Kudo
e.g.:
$ mvn -Dtest=TestHDP23StackAdvisor test