I just installed HDP 2.5 on my physical cluster (not Sandbox) and found that there is no Hive 2.1 as promised. I see both Spark 1.6.2 and Spark 2.0.0, but only one Hive.
What to do? I really want Hive 2.1...
Hive 2.1 is installed automatically whenever Hive is installed. This is done partly to ensure that Hive 1 and Hive 2 always use the same metastore, so there is no "split brain". To use Hive 2 on an Ambari-managed cluster, enable Interactive Query under the Hive service in Ambari. A separate HiveServer2 endpoint will be provisioned that connects specifically to Hive 2.
Now you can connect to Hive 2 with your tool of choice, either via ZooKeeper discovery (preferred, pictured here) or by connecting directly to port 10500 on the HiveServer2 Interactive host, shown above in the configuration screen.
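For example, from Beeline the two connection modes described above look like the following. These are connection-string sketches: the hostnames are placeholders for your own cluster, and the ZooKeeper namespace (`hiveserver2-hive2` is the usual default for the interactive server in HDP 2.5) should be confirmed against your Hive configs in Ambari.

```shell
# Direct connection to the HiveServer2 Interactive endpoint (default port 10500).
# Replace hs2i-host.example.com with your HiveServer2 Interactive host.
beeline -u "jdbc:hive2://hs2i-host.example.com:10500/default" -n hive

# ZooKeeper service discovery (preferred). Replace the zk hosts with your
# ZooKeeper quorum; note the interactive server registers under its own
# namespace (check hive.server2.zookeeper.namespace in the Hive configs).
beeline -u "jdbc:hive2://zk1.example.com:2181,zk2.example.com:2181,zk3.example.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2" -n hive
```

The discovery form is preferred because clients don't need to know which host runs the interactive server; ZooKeeper resolves the current endpoint at connect time.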
Please respond to this question. I am sure that you can see for yourself and adjust your response, as well as the down votes on similar questions. HiveServer2 does not imply that Hive 2.1 is part of HDP 2.5. Check the facts.
According to "Stack and Versions" I have only one Hive (see screenshot), but in /etc I have two folders with different sets of files.
Does that mean two Hives are installed?
I can't see the screenshot. My comment was directed at carter. I disagree with carter's response; I don't think it is accurate. A HiveServer2 process does not imply Hive 2.0. The Hive 2.0 features here are LLAP-only and in tech preview.
@carter Thanks a lot for the detailed explanation!
Although I still wasn't able to run it ((
I had to ask another question: https://community.hortonworks.com/questions/55387/cannot-start-hiveserver2-interactive-llap.html
@Alena Melnikova @carter Hello, did you build your HDP 2.5 cluster successfully? I am deploying an HDP 2.5 cluster; the other services install properly, but the Hive installation fails with an error about missing python-argparse. I installed argparse manually, but I still get the same error. I'd really like to know where the problem is. Thank you!