Member since: 08-23-2016
Posts: 261
Kudos Received: 201
Solutions: 106
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1770 | 01-26-2018 07:28 PM
 | 1408 | 11-29-2017 04:02 PM
 | 35364 | 11-29-2017 03:56 PM
 | 3537 | 11-28-2017 01:01 AM
 | 968 | 11-22-2017 04:08 PM
04-27-2018
08:10 PM
@Marshal Tito The hostname should be the node you have installed HiveServer2 on. The username/password should be a Hadoop user you have created (or, if using the Sandbox, one of the pre-created users such as maria_dev). Within Azure, make sure the port you are connecting over (10000) is open from your Tableau system to your Hadoop cluster. On public clouds, closed firewall ports are usually the issue.
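A quick way to confirm the port is reachable from the Tableau side is a plain TCP connect test. This is only a sketch; the host below is a hypothetical placeholder for your HiveServer2 node.

```shell
# Check whether the HiveServer2 port (10000) accepts TCP connections,
# using bash's /dev/tcp; hs2.example.com is a hypothetical placeholder
host="hs2.example.com"
port=10000
if timeout 3 bash -c "</dev/tcp/${host}/${port}" 2>/dev/null; then
  status="open"
else
  status="closed or filtered"
fi
echo "port ${port} ${status} on ${host}"
```

If the port reports closed or filtered, fix the Azure network security group rule before troubleshooting anything on the Tableau side.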
04-27-2018
08:00 PM
I haven't seen it work that way; I think in the background it is generating SQL queries and sending them over the JDBC driver.
01-26-2018
07:28 PM
Hi @Pierre Gunet-Caplain I don't believe multiple HS2 interactive instances are currently supported. I'll see if I can find out anything further within the community.
11-29-2017
04:02 PM
1 Kudo
Hi @sijeesh kunnotharamal As per the Apache Knox project website, Phoenix is a supported service of Knox. http://knox.apache.org/ Specifically, see this section of the User's Guide: http://knox.apache.org/books/knox-0-13-0/user-guide.html#Avatica
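In the Knox topology, Phoenix access goes through the AVATICA service role pointing at the Phoenix Query Server. A minimal sketch of the topology entry, assuming a hypothetical Query Server host on the default port 8765:

```xml
<!-- Entry in a Knox topology file (e.g. conf/topologies/default.xml);
     phoenix-qs.example.com is a hypothetical placeholder for your
     Phoenix Query Server host -->
<service>
    <role>AVATICA</role>
    <url>http://phoenix-qs.example.com:8765</url>
</service>
```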
11-29-2017
03:56 PM
2 Kudos
Hi @Nilesh If you are using HDP via Ambari, you can use the Stacks and Versions feature to see all of the installed components and versions from the stack. Via the command line, you can navigate to /usr/hdp/current/kafka-broker/libs and see the jar files with the versions. See the attachments for examples of each. I don't believe there is a --version type of command in any of the Kafka CLI scripts, from what I remember. screen-shot-2017-11-29-at-84935-am.png screen-shot-2017-11-29-at-85206-am.png
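The jar filenames themselves encode the version, so you can pull it out with shell parameter expansion. A sketch using a hypothetical filename of the form found under /usr/hdp/current/kafka-broker/libs:

```shell
# Hypothetical jar name as listed in /usr/hdp/current/kafka-broker/libs;
# in practice you would capture it with something like:
#   jar=$(basename /usr/hdp/current/kafka-broker/libs/kafka_*.jar)
jar="kafka_2.10-0.10.1.2.6.3.0-235.jar"
# Drop the "kafka_<scala-version>-" prefix and the ".jar" suffix
version="${jar#kafka_*-}"
version="${version%.jar}"
echo "$version"
```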
11-28-2017
01:01 AM
1 Kudo
Hi @Mike Bit If you want to use a CLI to access Hive, the recommended client tool is called Beeline. It is normally included in the client tools installation, so you probably already have it ready to go. From your client machine/host/server, you use the beeline client to connect to the HiveServer2 JDBC URL, and from there everything is the usual SQL commands. In Ambari, you can easily copy the HiveServer2 JDBC URL directly from the Hive service config screen and paste it right into a beeline connect string. So for example, if my machines were ssahi[0-2].hortonworks.com, where I was running ZooKeeper, and I was using hive/hive as the user/password, my beeline command to open the client and connect to Hive might look like: beeline -u "jdbc:hive2://ssahi1.hortonworks.com:2181,ssahi0.hortonworks.com:2181,ssahi2.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" -n hive -p hive You can find more on Beeline in the project documentation, including example syntax and config options to make the output display nicer: https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-Beeline–CommandLineShell
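To avoid hand-editing that long connect string, you can assemble the ZooKeeper-discovery JDBC URL in the shell first. A sketch assuming hypothetical ZooKeeper hosts zk0-zk2 on the default port 2181:

```shell
# Hypothetical ZooKeeper quorum; replace with the hosts shown in Ambari
zk_quorum="zk0.example.com:2181,zk1.example.com:2181,zk2.example.com:2181"
url="jdbc:hive2://${zk_quorum}/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2"
echo "$url"
# Then connect with:  beeline -u "$url" -n hive -p hive
```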
11-22-2017
04:08 PM
1 Kudo
Hi @Sofiane CHIBANI The releases are listed, and will be updated, here: https://hortonworks.com/services/training/certification/hdpca-certification/ Currently, HDP 2.3 and Ambari 2.1 are the releases covered by the exam. I did hear there was work underway to upgrade them, but no timeline was available to me. The website, the practice exams, and the links will be updated when the exam changes over to newer versions. Good luck!
11-20-2017
06:33 PM
My personal preference is HDFS put over Flume, if those are the options. Even better would be HDF, but it sounds like a simple HDFS put would solve it.
11-09-2017
06:10 PM
Hi @Larry Wilson The connect string that Hive View 2.0 is using looks OK; however, the Hive View logs show an underlying error: java.sql.SQLException: org.apache.hive.jdbc.ZooKeeperHiveClientException: Unable to read HiveServer2 configs from ZooKeeper Do you have HiveServer2 HA enabled? I would start by restarting ZooKeeper, then restarting Hive, all from Ambari, and trying again. Can you do this and report back the results?
11-09-2017
05:20 PM
Hi @Philip Walenta You can definitely create your own installable services through Ambari's service definition; it is extensible. One word of caution, though: third-party services/software can sometimes affect Ambari-based upgrades. If you choose to use them, be sure to thoroughly test your upgrades in a non-prod environment first, as always. You can find several examples of Ambari services in the HCC repo here: https://community.hortonworks.com/search.html?f=&type=repo&redirect=search%2Fsearch&sort=relevance&q=ambari+service and also the Ambari docs' service definition guide: https://cwiki.apache.org/confluence/display/AMBARI/Defining+a+Custom+Stack+and+Services I don't have a timeline on when Druid may be upgraded, unfortunately.
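The core of an Ambari service definition is a metainfo.xml under the stack's services directory. A minimal sketch for a hypothetical service named MYSERVICE (the service name, display name, and script path are placeholders, not taken from the Ambari docs verbatim):

```xml
<?xml version="1.0"?>
<metainfo>
  <schemaVersion>2.0</schemaVersion>
  <services>
    <service>
      <name>MYSERVICE</name>
      <displayName>My Service</displayName>
      <comment>Example third-party service</comment>
      <version>1.0.0</version>
      <components>
        <component>
          <name>MYSERVICE_MASTER</name>
          <category>MASTER</category>
          <cardinality>1</cardinality>
          <commandScript>
            <!-- Python script implementing install/start/stop/status -->
            <script>scripts/master.py</script>
            <scriptType>PYTHON</scriptType>
            <timeout>600</timeout>
          </commandScript>
        </component>
      </components>
    </service>
  </services>
</metainfo>
```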