Member since: 10-02-2015
Posts: 20
Kudos Received: 51
Solutions: 0
12-22-2016
03:18 PM
Have a look at this article. While it discusses DbVisualizer, not SQuirreL, the underlying problem and solution are the same.
12-21-2016
08:18 PM
4 Kudos
Introduction
SQL development tools like DbVisualizer, SQuirreL SQL, and DataGrip are popular options for database development. Although these tools don't offer native Hive support, they can easily be configured to connect to Hive using JDBC. While connecting these tools to clusters without Kerberos is relatively straightforward, connecting them to kerberized clusters can be complex and error prone. This article, in combination with a project I created (Hive JDBC Uber Jar), aims to simplify and standardize this process.
Prerequisites
There are a few key things that must be properly configured before attempting to connect to a kerberized cluster. A full description of these tasks is out of scope for this article, but at a high level, make sure that:
- You have downloaded the latest release of my Hive JDBC Uber Jar and placed it somewhere sensible
- DbVisualizer and/or DataGrip have been successfully installed on your workstation
- The krb5.conf file on your workstation matches the one on your cluster
- You have a valid Kerberos principal that can access the appropriate services on your cluster
- You can successfully kinit from your workstation against the realm specified in your krb5.conf file
DbVisualizer Setup
kinit with an appropriate principal and launch DbVisualizer
Open DbVisualizer preferences ("DbVisualizer" > "Preferences") and add the following properties. DbVisualizer will need to be restarted after applying these changes.
-Dsun.security.krb5.debug=true
-Djavax.security.auth.useSubjectCredsOnly=false
Open the Driver Manager dialog ("Tools" > "Driver Manager...") and hit the "Create a new driver" icon.
Fill in the information as seen below. For the "Driver File Paths", point to the hive-jdbc-uber-x.jar file that you just downloaded.
jdbc:hive2://<server>:<port10000>/<database>
Create a new connection ("Database" > "Create Database Connection") and fill out the details based on your cluster as seen below. Please note that you must append the "principal" to the "database" parameter for kerberized connections.
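For example, a complete kerberized connection URL might look like the following (the host, port, database, and principal shown are illustrative; substitute the values from your own cluster's hive-site.xml):
jdbc:hive2://hiveserver.example.com:10000/default;principal=hive/hiveserver.example.com@EXAMPLE.COM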
Hit the "Connect" button to test the connection. You should see something like the following in the "Connection Message" text area if the connection is successful.
Apache Hive
1.2.1000.2.5.3.0-37
null
null
You are now ready to execute your first query against Hive using DbVisualizer!
JetBrains DataGrip Setup
kinit with an appropriate principal and launch DataGrip
Under "File" > "Data Sources...", create a new Driver. Make sure you load the hive-jdbc-uber-x.jar that you just downloaded.
jdbc:hive2://{host}:{port}/{database}[;<;,{:identifier}={:param}>]
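As with DbVisualizer, the principal must be appended to the database portion of the URL for kerberized connections. A filled-in example (host, realm, and principal are illustrative):
jdbc:hive2://hiveserver.example.com:10000/default;principal=hive/hiveserver.example.com@EXAMPLE.COM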
Create a new "Project Data Source" using the new Driver. On the "General" tab, do the following:
Then add the following flags to "VM Options" on the "Advanced" tab.
-Dsun.security.krb5.debug=true
-Djavax.security.auth.useSubjectCredsOnly=false
After creating the "Project Data Source", test the connection. You should see the following:
You are now ready to execute your first query against Hive using DataGrip!
A note about the Hive JDBC Uber Jar
When I first created this project, the intent was to gather all required Hive dependencies into one single jar file to simplify scenarios like the one described here. This worked very well for connecting to non-kerberized clusters, but when I began to test against kerberized clusters I hit the following exception:
java.lang.RuntimeException: Illegal Hadoop Version: Unknown (expected A.B.* format)
at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:168)
at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:143)
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopThriftAuthBridge(ShimLoader.java:129)
at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:54)
at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:414)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:191)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:155)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
This exception occurs because a class named org.apache.hadoop.util.VersionInfo fails to find a file called *-version-info.properties when loaded by some tools. A number of articles on the web suggest resolving this "classpath" issue by copying jars into unnatural places or hacking tool startup scripts. Neither approach sat well with me. Instead, I enhanced the way org.apache.hadoop.util.VersionInfo locates the required properties file and included this updated version of the code in my jar. For more details, check out the README.
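If you want to sanity check the jar outside of any GUI tool, a minimal smoke test like the sketch below can help. This is my illustration rather than part of the project: the class name is made up, and the host, port, database, and principal are placeholders for your own cluster's values. It assumes you have already run kinit and that the uber jar is on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import org.apache.hadoop.util.VersionInfo;

public class KerberizedHiveSmokeTest {
    public static void main(String[] args) throws Exception {
        // Register the Hive JDBC driver explicitly; older driver versions
        // are not discovered automatically by DriverManager.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // If this prints "Unknown", the *-version-info.properties file was
        // not found on the classpath and you will hit the exception above.
        System.out.println("Hadoop version: " + VersionInfo.getVersion());

        // Placeholder URL -- substitute your HiveServer2 host, port,
        // database, and Kerberos principal.
        String url = "jdbc:hive2://hiveserver.example.com:10000/default;"
                + "principal=hive/hiveserver.example.com@EXAMPLE.COM";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW DATABASES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

Compile and run it with the uber jar on the classpath and the same VM flag used above, e.g. java -Djavax.security.auth.useSubjectCredsOnly=false -cp hive-jdbc-uber-x.jar:. KerberizedHiveSmokeTest.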
06-09-2016
07:10 PM
This was the hint I needed. Here is a link to the Vagrantfile I used to test. It includes both the Kerberos command prerequisites and the Ambari Blueprint with related calls. The key, for me, was ensuring this was run before submitting the blueprint.
# make sure Kerberos packages are installed
yum install krb5-libs krb5-server krb5-workstation -y
# modify Kerberos files
sed -i "s/kerberos.example.com/hdp-common-secure.hdp.local/gI" /etc/krb5.conf
sed -i "s/EXAMPLE.COM/hdp.local/gI" /etc/krb5.conf
sed -i "s/#//g" /etc/krb5.conf
sed -i "s/EXAMPLE.COM/hdp.local/gI" /var/kerberos/krb5kdc/kadm5.acl
# create Kerberos database and add principal. "Bbh2z8HrVx" is my master password
kdb5_util create -s -P Bbh2z8HrVx
kadmin.local -q 'addprinc -pw admin admin/admin' -w Bbh2z8HrVx
# start and enable Kerberos services
systemctl start krb5kdc
systemctl enable krb5kdc
systemctl start kadmin
systemctl enable kadmin
06-08-2016
01:10 AM
1 Kudo
I'm trying to create an Ambari Blueprint that will provision a single-node cluster using Kerberos (see https://issues.apache.org/jira/browse/AMBARI-13431 and Ambari Blueprint Example). My confusion is around the "credentials" block in the cluster creation template. All available documentation includes this snippet:
"credentials" : [
{
"alias" : "kdc.admin.credential",
"principal" : "admin/admin",
"key" : "admin",
"type" : "TEMPORARY"
}
]
My question is this: are the principal and key (password) included above intended to describe new credentials (to be created and used by Ambari), or existing credentials previously created by calling something like:
kadmin.local -q "addprinc admin/admin"
It boils down to what Kerberos configuration is required before using Blueprints to install and configure the cluster. In other words, how much of this should be done before creating the cluster via Blueprints?
Labels:
- Apache Ambari
05-18-2016
01:47 PM
2 Kudos
@Ancil McBarnett check out the repo I've been maintaining that uses Maven to pull the required jars: https://community.hortonworks.com/repos/33592/hive-jdbc-uber-jar.html
05-16-2016
07:07 PM
1 Kudo
I started there but didn't see anything that looked like what I needed. There are a few options to PUT or POST, but I didn't see anything that mapped users to queues. Let's see what else we can find.
05-16-2016
06:53 PM
2 Kudos
Is there a Java or REST API for mapping users to capacity queues, similar to what is accomplished by setting "yarn.scheduler.capacity.queue-mappings" in "capacity-scheduler.xml"? I'm looking for a programmatic approach to manage queues as new users are added to the system.
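For reference, that property takes comma-separated entries of the form u:<user>:<queue> or g:<group>:<queue>; for example (user, group, and queue names below are made up):
u:alice:analytics,g:etl-users:etl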
Labels:
- Apache YARN
05-12-2016
06:37 PM
1 Kudo
I copied falcon.jar from /usr/hdp/current/falcon-server/client/lib and ran prepare-war. That did the trick. Many thanks!
05-12-2016
06:07 PM
3 Kudos
After upgrading Ambari from 2.2.1.1 to 2.2.2.0, I get the following error when attempting to start Oozie. My HDP version is HDP-2.4.0.0-169. Any thoughts or workarounds?
2016-05-11 21:53:45,763 FATAL Services - SERVER[deadpool.lab.local] E0113: class not found [org.apache.oozie.extensions.OozieELExtensions]
org.apache.oozie.service.ServiceException: E0113: class not found [org.apache.oozie.extensions.OozieELExtensions]
at org.apache.oozie.service.ELService.findMethod(ELService.java:226)
at org.apache.oozie.service.ELService.extractFunctions(ELService.java:104)
at org.apache.oozie.service.ELService.init(ELService.java:135)
at org.apache.oozie.service.Services.setServiceInternal(Services.java:386)
at org.apache.oozie.service.Services.setService(Services.java:372)
at org.apache.oozie.service.Services.loadServices(Services.java:305)
at org.apache.oozie.service.Services.init(Services.java:213)
at org.apache.oozie.servlet.ServicesLoader.contextInitialized(ServicesLoader.java:46)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:802)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:676)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:602)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:503)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1322)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:325)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1068)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1060)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:759)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
2016-05-11 21:53:45,768 INFO Services - SERVER[deadpool.lab.local] Shutdown
Labels:
- Apache Ambari
- Apache Oozie