Member since: 09-09-2016
31 Posts
5 Kudos Received
1 Solution
My Accepted Solutions

Title | Views | Posted |
---|---|---|
 | 4307 | 01-10-2018 02:25 PM |
09-05-2018
08:39 PM
Thanks Andrew. I thought that was probably the answer. Hoping there was a workaround.
08-30-2018
02:28 PM
Adding to this: the reason I'd want to use the example.com DNS name instead of a specific server's FQDN is obviously fault tolerance. Out of curiosity, though, what would happen to our environment if we set up LDAPS for Kerberos using one of the AD server FQDNs, and the server we used was removed from the cluster or crashed later on down the road?
08-29-2018
07:39 PM
I'm setting up Kerberos with an existing Active Directory as the KDC and I'm having an issue communicating with the LDAPS server. We have a cluster of servers for AD — say server1.example.com, server2.example.com, and server3.example.com — and the company just uses example.com to connect. I've set up LDAP integration with Ambari for user access to the portal via ambari-server setup-ldap; I did it without SSL, and using ldap://example.com as the LDAP server works fine.

With LDAPS, however, ldaps://example.com:636 doesn't work. I get an error in ambari-server.log: "java.security.cert.CertificateException: No subject alternative DNS name matching example.com found". I have imported the CA cert and each individual server's certificate into my keystore and put the CA in /etc/pki/ca-trust/source/anchors/activedirectory.pem, but I still can't get it to work for example.com. I can get it to work for server1.example.com and each of the others individually, just not for the example.com DNS name.

I don't have control over certificate creation on the AD LDAPS side; these certs were self-signed by the AD servers and each server has its own certificate. Is there any way to tell Ambari to accept invalid certs for the Kerberos wizard, or any other way to get the broader domain name to work? Thanks in advance for any help.
Labels:
- Apache Ambari
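That "No subject alternative DNS name matching example.com found" error means the certificate the server presents does not list example.com among its Subject Alternative Names, so the JVM's hostname verification rejects it regardless of what is in the truststore. A quick way to confirm — a sketch assuming openssl is available, with a hypothetical host and port to substitute:

```shell
# Hypothetical host/port -- substitute one of your AD servers.
LDAPS_HOST=example.com
LDAPS_PORT=636
URL="ldaps://${LDAPS_HOST}:${LDAPS_PORT}"
echo "checking certificate for $URL"
# Dump the SAN list of the certificate the server presents; for the broad
# name to pass verification, 'example.com' must appear in this list.
echo | timeout 5 openssl s_client -connect "${LDAPS_HOST}:${LDAPS_PORT}" 2>/dev/null \
  | openssl x509 -noout -text 2>/dev/null \
  | grep -A1 'Subject Alternative Name' \
  || echo "could not read SANs from ${LDAPS_HOST}:${LDAPS_PORT}"
```

If example.com is absent from every server's SAN list, no amount of truststore work will make ldaps://example.com pass verification; the certificates would have to be reissued with that name, or the individual FQDNs used.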
01-24-2018
08:30 PM
@wyang Do you have any insight into why I can't get hbase-spark to work with Spark 2.2?
01-10-2018
02:25 PM
I created a Hive table with HBase integration and was able to read from that table in my Spark job to resolve this for now.
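For anyone landing here, the workaround above looks roughly like this in DDL form — a sketch with hypothetical table, column, and column-family names, wrapped in shell for submission via beeline:

```shell
# Hypothetical names throughout; maps a Hive table onto an existing HBase
# table via the hive-hbase-handler storage handler.
DDL=$(cat <<'EOF'
CREATE EXTERNAL TABLE hbase_backed_table (rowkey STRING, val STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:val")
TBLPROPERTIES ("hbase.table.name" = "my_hbase_table");
EOF
)
printf '%s\n' "$DDL"
# On the cluster, run it with e.g.: beeline -u "$HIVE_JDBC_URL" -e "$DDL"
```

Spark then reads the Hive table like any other, sidestepping the hbase-spark connector entirely.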
01-02-2018
08:50 PM
@Dongjoon Hyun What dependency are you using to get it to work with 2.2? I'm getting a "missing or invalid dependency detected while loading class file HBaseContext.class" error. Looking at the Hortonworks repo (http://repo.hortonworks.com/content/repositories/releases/), it looks like version 1.1.0.2.6.3.0-235 is built for Spark 2.2, but the matching hbase-spark dependency POM still lists Spark 2.1.1 as the Spark version. I'm guessing that's probably my issue; if you were able to get it to work, maybe I'm just doing something wrong.
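For reference, the coordinates in question would look something like this in a pom.xml — the version is the one from the repo above, while the groupId is my assumption for the HDP-built artifact:

```xml
<!-- Sketch: HDP-built hbase-spark connector plus the repository it lives in -->
<repositories>
  <repository>
    <id>hortonworks-releases</id>
    <url>http://repo.hortonworks.com/content/repositories/releases/</url>
  </repository>
</repositories>
<dependencies>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-spark</artifactId>
    <version>1.1.0.2.6.3.0-235</version>
  </dependency>
</dependencies>
```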
01-02-2018
07:15 PM
Looking for suggestions on how to read HBase tables using Spark 2.2. I currently have HDP 2.6.3 installed and have started using Spark 2.2. We had been using Spark 1.6.3 with the Spark HBase connector, and that worked all right, but it doesn't seem to work with Spark 2. I also see a lot of references to using Phoenix, but Phoenix doesn't support Spark 2 until version 4.10, and HDP is still on 4.7. Does anyone have suggestions or examples of how they are interacting with HBase on Spark 2?
Labels:
- Apache HBase
- Apache Spark
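One approach that may be worth trying on HDP 2.6.3 is launching Spark 2 with the HDP-built hbase-spark connector jar and the cluster's hbase-site.xml on the classpath. The jar path below is an assumption about where an HDP install puts it; adjust to your layout.

```shell
# Hypothetical path -- check your HDP install for the actual jar location.
HBASE_SPARK_JAR=/usr/hdp/current/hbase-client/lib/hbase-spark.jar
LAUNCH="spark-shell --jars ${HBASE_SPARK_JAR} --files /etc/hbase/conf/hbase-site.xml"
# Print rather than exec, since this sketch can't assume a live cluster:
echo "$LAUNCH"
```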
12-12-2017
04:46 PM
I do have HADOOP_USER_CLASSPATH_FIRST set to true. How do I find where the Hadoop classpath is set? In the hadoop-env file it's just set as HADOOP_CLASSPATH=${HADOOP_CLASSPATH}${JAVA_JDBC_LIBS}.
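That hadoop-env line just appends to whatever HADOOP_CLASSPATH already holds; the effective classpath the launcher scripts build can be inspected with the `hadoop classpath` command. A sketch, falling back to the raw variable if `hadoop` isn't on PATH:

```shell
# Print the resolved classpath one entry per line so individual jars are
# easy to spot; HADOOP_USER_CLASSPATH_FIRST controls whether the
# HADOOP_CLASSPATH entries are prepended or appended to the stock jars.
RESOLVED=$( { hadoop classpath 2>/dev/null || printf '%s' "${HADOOP_CLASSPATH:-}"; } | tr ':' '\n' )
printf '%s\n' "$RESOLVED"
```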
12-08-2017
05:10 PM
I'm getting a vertex failed error when I try to run a query using Hive Interactive (LLAP). The actual error is a NoSuchMethodError inside an org.apache.hadoop.ipc.RemoteException, but I'm not sure whether that's the root cause. The query joins 3 large tables; it works fine if I query just one of the tables, but as soon as I join another one in, it fails with the error below. Most of the vertex-failed questions I've found online have to do with memory, but those error messages mention memory explicitly and mine does not, and I've tried all of the recommendations for those issues without any different result. The query with the joins seems to work if I turn off LLAP, but it takes a really long time and I want to be able to use this feature if possible. Does anyone know what the issue might be? I'm stuck on this one.

Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 3, vertexId=vertex_1512748246177_0021_2_01, diagnostics=[Task failed, taskId=task_1512748246177_0021_2_01_000001, diagnostics=[TaskAttempt 0 failed, info=[org.apache.hadoop.ipc.RemoteException(java.lang.NoSuchMethodError): org.apache.log4j.MDC.put(Ljava/lang/String;Ljava/lang/String;)V
at org.apache.hadoop.hive.llap.daemon.impl.ContainerRunnerImpl.submitWork(ContainerRunnerImpl.java:214)
at org.apache.hadoop.hive.llap.daemon.impl.LlapDaemon.submitWork(LlapDaemon.java:547)
at org.apache.hadoop.hive.llap.daemon.impl.LlapProtocolServerImpl.submitWork(LlapProtocolServerImpl.java:101)
at org.apache.hadoop.hive.llap.daemon.rpc.LlapDaemonProtocolProtos$LlapDaemonProtocol$2.callBlockingMethod(LlapDaemonProtocolProtos.java:16728)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
Labels:
- Apache Hive
- Apache Tez
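A NoSuchMethodError on org.apache.log4j.MDC.put(String, String) usually means the class was resolved from a different jar than the one the calling code was compiled against — for instance, a plain log4j 1.2 jar (whose put() takes (String, Object)) winning over a bridge jar, or vice versa. A sketch for finding every jar that supplies that class on an LLAP node; the /usr/hdp root is an assumption for HDP:

```shell
# List every log4j-related jar under the HDP install that contains
# org.apache.log4j.MDC; more than one hit is a likely source of the
# signature mismatch.
SCANNED=no
for jar in $(find /usr/hdp -name '*log4j*.jar' 2>/dev/null); do
  if unzip -l "$jar" 2>/dev/null | grep -q 'org/apache/log4j/MDC.class'; then
    echo "MDC provided by: $jar"
  fi
done
SCANNED=yes
echo "scan complete"
```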
07-17-2017
03:30 PM
I like to use SQL Workbench/J. We use this tool for querying Amazon Redshift and Phoenix as well. It's free and I haven't had any problems with it. I tried SQuirreL, but it was finicky and I didn't like it much. I haven't tried DBVisualizer. http://www.sql-workbench.net/