Member since: 08-23-2016
Posts: 261
Kudos Received: 201
Solutions: 106

My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 1040 | 01-26-2018 07:28 PM |
| 855 | 11-29-2017 04:02 PM |
| 30010 | 11-29-2017 03:56 PM |
| 1911 | 11-28-2017 01:01 AM |
| 476 | 11-22-2017 04:08 PM |
04-27-2018
08:10 PM
@Marshal Tito The hostname should be the node on which you have installed HiveServer2. The username/password should be a Hadoop user you have created (or, if using the Sandbox, one of the pre-created users such as maria_dev). Within Azure, make sure the port you are connecting over (10000) is open from your Tableau system to your Hadoop cluster. On public clouds, closed firewall ports are usually the issue.
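A quick way to confirm the port is reachable from the Tableau machine is a simple connectivity test (a minimal sketch; the hostname below is a placeholder for your HiveServer2 node):

    # check that HiveServer2's default binary port (10000) is reachable
    nc -vz hiveserver2.example.com 10000
    # if nc is not installed, a plain bash TCP connect works too
    timeout 5 bash -c 'cat < /dev/null > /dev/tcp/hiveserver2.example.com/10000' && echo "port 10000 open"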
04-27-2018
08:00 PM
I haven't seen it work that way; I think in the background it is generating SQL queries and sending them over the JDBC driver.
02-14-2018
08:27 PM
Hi, my pleasure. The docs are decent at describing how it works and where the data comes from. You can probably start your research from this page and follow its links as needed: https://cwiki.apache.org/confluence/display/AMBARI/Metrics
02-13-2018
05:53 PM
Hi @Anurag Mishra The tables are listed in the Apache wiki here: https://cwiki.apache.org/confluence/display/AMBARI/Phoenix+Schema
02-12-2018
06:24 PM
Hi @Anurag Mishra AMS uses an embedded HBase/Phoenix layer. You can query it, and there is a decent write-up on how to do that here: https://community.hortonworks.com/articles/71206/using-phoenix-sqlline-utility-to-browse-ambari-met.html
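For reference, here is a minimal sketch of what that looks like on the Metrics Collector host (the port and znode assume a default embedded-mode AMS install; check ams-hbase-site for your actual values):

    # connect the bundled sqlline utility to the embedded AMS HBase/Phoenix
    /usr/lib/ambari-metrics-collector/bin/sqlline.py localhost:61181:/ams-hbase-unsecure
    # then, inside sqlline, browse recent raw metrics, e.g.:
    # SELECT METRIC_NAME, HOSTNAME, SERVER_TIME FROM METRIC_RECORD LIMIT 10;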
01-26-2018
07:28 PM
Hi @Pierre Gunet-Caplain I don't believe multiple HS2 interactive instances are currently supported. I'll see if I can find out anything further within the community.
01-26-2018
04:53 PM
Hi @Claus Jorgensen See if this post helps: https://community.hortonworks.com/questions/103201/what-do-i-enter-into-this-tableau-data-connector-t.html
12-21-2017
09:56 PM
Hi @Marshal Tito If you are using a modern version of HDP, you may wish to try using the Beeline CLI. Here's an answer that can help get you started: https://community.hortonworks.com/questions/148531/run-hive-queries-in-command-shell-cmd-not-in-ambar.html?childToView=148535#answer-148535
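If it helps, a minimal Beeline connection looks roughly like this (hostname, user, and password are placeholders):

    # connect directly to HiveServer2 on its default binary port
    beeline -u "jdbc:hive2://hiveserver2.example.com:10000/default" -n myuser -p mypassword
    # then run normal SQL, e.g.:
    # SHOW DATABASES;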
12-21-2017
09:37 PM
Hi @Jon Page If you have an application ID, you can view the YARN logs either from the YARN Resource Manager UI (Ambari -> YARN -> Quick Links) or via the CLI: yarn logs -applicationId <app ID>
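For example (the application ID below is made up; grab a real one from the ResourceManager UI or via yarn application -list):

    # dump the aggregated logs for one application into a local file
    yarn logs -applicationId application_1510000000000_0001 > app_logs.txt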
11-29-2017
04:44 PM
Hi @sbx_hadoop Azim, You can do this via the Grafana UI and Ambari Metrics, and even see it in action on the HDP Sandbox. If you are using the Ambari Metrics system, ensure the Grafana UI is started, then open it via Ambari Metrics Service -> Quick Links -> Grafana UI. There is a prebuilt dashboard called System - Servers with panels for CPU utilization by user and across the system. Each tile's menu also lets you export the data (CSV/JSON, etc.). http://sandbox-hdp.hortonworks.com:3000/dashboard/db/system-servers (see attachment: screen-shot-2017-11-29-at-94031-am.png)
11-29-2017
04:02 PM
1 Kudo
Hi @sijeesh kunnotharamal As per the Apache Knox project website, Phoenix is a supported service of Knox. http://knox.apache.org/ Specifically, see this section of the User's Guide: http://knox.apache.org/books/knox-0-13-0/user-guide.html#Avatica
11-29-2017
03:56 PM
2 Kudos
Hi @Nilesh If you are using HDP via Ambari, you can use the Stacks and Versions feature to see all of the installed components and their versions from the stack. Via the command line, you can navigate to /usr/hdp/current/kafka-broker/libs and see the jar files with their versions. See the attachments for examples of each. From what I remember, there is no --version type of command in any of the Kafka CLI scripts. (attachments: screen-shot-2017-11-29-at-84935-am.png, screen-shot-2017-11-29-at-85206-am.png)
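For example, a quick way to read the version off the broker jars (the path assumes an HDP layout):

    # the core Kafka jar name embeds the Scala and Kafka versions, e.g. kafka_<scala-version>-<kafka-version>.jar
    ls /usr/hdp/current/kafka-broker/libs | grep '^kafka_'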
11-28-2017
01:01 AM
1 Kudo
Hi @Mike Bit If you want to use a CLI to access Hive, the recommended client tool is Beeline. It is normally included in the client tools installation, so you probably already have it ready to go. From your client machine/host/server, you use the beeline client to connect to the HiveServer2 JDBC URL, and everything from there is the usual SQL commands. In Ambari, you can easily copy the HiveServer2 JDBC URL directly from the Hive service config screen and paste it right into a beeline connect string. For example, if my machines were ssahi[0-2].hortonworks.com, where I was running ZooKeeper, and I was using hive/hive as the user/password, my beeline command to open the client and connect to Hive might look like:

beeline -u "jdbc:hive2://ssahi1.hortonworks.com:2181,ssahi0.hortonworks.com:2181,ssahi2.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" -n hive -p hive

You can find more on Beeline, including example syntax and configuration to make the output display more nicely, in the project documentation here: https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-Beeline–CommandLineShell
11-22-2017
04:08 PM
1 Kudo
Hi @Sofiane CHIBANI The releases are listed, and will be updated, here: https://hortonworks.com/services/training/certification/hdpca-certification/ Currently, HDP 2.3 and Ambari 2.1 are the releases that are part of the exam. I did hear there was work underway to upgrade them, but no timeline was available to me. The website, the practice exams, and the links will be updated when the exam changes over to newer versions. Good luck!
11-20-2017
06:33 PM
My personal preference is an HDFS put over Flume if those are the options. Even better would be HDF, but it sounds like a simple HDFS put would solve it.
11-20-2017
04:09 PM
Hi @Mudassar Hussain You should practice all of the Exam Objectives listed near the bottom of this page: https://hortonworks.com/services/training/certification/hdpca-certification/ If you are comfortable with hands-on work for each of the objectives, you should have no problem with the exam.
11-09-2017
06:10 PM
Hi @Larry Wilson The connect string that Hive View 2.0 is using looks OK; however, the Hive View logs show an underlying error: java.sql.SQLException: org.apache.hive.jdbc.ZooKeeperHiveClientException: Unable to read HiveServer2 configs from ZooKeeper Do you have HiveServer2 HA enabled? I would start by stopping/starting ZooKeeper, then stopping/starting Hive, all from Ambari, and trying again. Can you do this and report back the results?
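If it helps to dig further, you can also check whether HiveServer2 has registered itself in ZooKeeper at all (a rough sketch; the host, port, and znode assume a default HDP setup):

    # open a ZooKeeper shell against one of your ZooKeeper nodes
    /usr/hdp/current/zookeeper-client/bin/zkCli.sh -server zk1.example.com:2181
    # inside the shell, HiveServer2 instances normally register under the hiveserver2 namespace
    # ls /hiveserver2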
11-09-2017
05:38 PM
Hi @Charlie Halpern-Hamu I've just downloaded a fresh copy of the HDP 2.6.1 Sandbox and attempted to load the geolocation tutorial's truck.csv into HDFS and into Hive using Hive View 2.0, following the steps in the tutorial you are using. With the maria_dev user, I was able to successfully load the data into both HDFS and Hive without a problem. I've seen errors around this twice in the past: once with VMware (can you try with VirtualBox?) and once with not quite enough RAM assigned to the VM. If you can confirm you have at least 8 GB assigned to the VM, and can paste the entire error line, that would help us find a resolution.
11-09-2017
05:20 PM
Hi @Philip Walenta You can definitely create your own installable services through Ambari's service definitions; Ambari is extensible. One word of caution, though: third-party services/software can sometimes affect Ambari-based upgrades. If you choose to use them, be sure to thoroughly test your upgrades in a non-prod environment first, as always. You can find several examples of Ambari services in the HCC repo here: https://community.hortonworks.com/search.html?f=&type=repo&redirect=search%2Fsearch&sort=relevance&q=ambari+service and also in the Ambari docs on defining custom stacks and services: https://cwiki.apache.org/confluence/display/AMBARI/Defining+a+Custom+Stack+and+Services Unfortunately, I don't have a timeline on when Druid may be upgraded.
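As a rough illustration of where a custom service definition lives (a sketch assuming an HDP 2.6 stack and a made-up service name; the wiki link above describes the real structure and required files):

    # a custom service is a directory under the stack's services folder
    mkdir -p /var/lib/ambari-server/resources/stacks/HDP/2.6/services/MYSERVICE/package/scripts
    # metainfo.xml (service name, version, components) and the control scripts go in there,
    # then restart ambari-server so it picks up the new service definition
    ambari-server restart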
11-09-2017
04:39 PM
Hi @Larry Wilson The error could actually be in the Hive View log, and not necessarily in the HiveServer2 log. You might want to check /var/log/ambari-server/hive20-view/hive20-view.log to see what the connect string error is; that could help point us to the right resolution.
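For example, something like this on the Ambari Server host will usually surface the relevant error quickly:

    # show the most recent errors/exceptions from the Hive View 2.0 log
    grep -iE 'error|exception' /var/log/ambari-server/hive20-view/hive20-view.log | tail -n 20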
11-08-2017
09:09 PM
Hi @Larry Wilson I just did a fresh download of the 2.6.1 Sandbox for VirtualBox and was able to use the 'admin' account to load the Files View and the Hive View 2.0 without any issues. Do you have the following entry in your system's hosts file? 127.0.0.1 sandbox.hortonworks.com sandbox
11-08-2017
08:52 PM
Hi @Rashid Mehdi I just tried it now on my Mac laptop, and the file download does indeed start. After submitting the registration form, I waited maybe 1-2 seconds and was then asked where to save the file. I am using Chrome and not Safari, if that helps.
11-08-2017
06:18 PM
The docs are indeed being updated, thanks!
11-08-2017
06:18 PM
1 Kudo
Hi @David Williamson That is odd; the only time I've seen it is as a browser refresh issue like I mentioned. As far as logs go, on the Ranger Admin server you can check /var/log/ranger/admin/xa_portal.log and see if any errors are in there.
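For example, you can watch that log while reproducing the problem:

    # follow the Ranger Admin portal log and look for errors as you retry the action
    tail -f /var/log/ranger/admin/xa_portal.log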
11-08-2017
06:07 PM
1 Kudo
Hi @fnu rasool There are some good links online, but this is one of the better ones despite its age: https://hortonworks.com/blog/deploying-hadoop-cluster-amazon-ec2-hortonworks/
11-07-2017
11:24 PM
1 Kudo
Hi @David Williamson There's no known issue that I'm aware of. When you install the plugin, you'll likely need to restart any affected services, and it is also a good idea to refresh the Ranger UI (sometimes just logging out/in isn't good enough; try the CTRL+SHIFT+R browser refresh).
11-06-2017
04:56 PM
1 Kudo
Hi @yassine sihi Having both Ambari and its MySQL database on the same machine does not normally cause any issues logging in; most of my lab clusters are set up in that fashion. I think Jay is on the right track to look at the grants and hostnames. I've also seen a similar login error when a user tried to access MySQL without using the server's FQDN - ensure FQDNs are used whenever possible.
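To check the grants and hostnames Jay mentioned, something along these lines works (a sketch for a MySQL 5.x-era install; substitute your actual Ambari DB user, password, and the Ambari server's FQDN):

    # list which hosts the ambari DB user may connect from
    mysql -u root -p -e "SELECT User, Host FROM mysql.user WHERE User = 'ambari';"
    # if the Ambari server's FQDN is missing, grant access for it
    mysql -u root -p -e "GRANT ALL PRIVILEGES ON ambari.* TO 'ambari'@'ambari-server.example.com' IDENTIFIED BY 'your-password'; FLUSH PRIVILEGES;"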
10-24-2017
02:21 PM
Hi @Andrey Emelyanenko Nothing that I've heard of internally just yet. Ambari, however, is extensible. You can always write an extension that allows you to add/manage/delete Airflow as a service if you wish. There are many examples online including one in the HCC Repo: https://community.hortonworks.com/repos/3912/r-service-ambari-service.html
10-23-2017
06:16 PM
Hi @Evan Tattam I don't think multiple YARN ATS (Application Timeline Server) instances are currently supported by the Apache YARN project just yet; it is likely a roadmap feature. https://hadoop.apache.org/docs/r2.7.3/hadoop-yarn/hadoop-yarn-site/TimelineServer.html I would assume it is a documentation issue for now and proceed with the remainder of the components. I'll follow up internally.
10-20-2017
09:45 PM
1 Kudo
Hi @sbx hadoop You should probably check with your local Hortonworks team for a proper answer. I don't think you need to worry about logically/physically moving anything, just about ensuring that you have the proper support subscriptions in place for the HDP and HDF products that you need. Again, your local Hortonworks team can provide an official answer to your inquiry.