Member since: 12-18-2015
Posts: 5
Kudos Received: 4
Solutions: 0
04-28-2016
03:04 AM
Finally, I was able to authenticate with LDAP from HiveServer2. The issue was with the LDAP Directory Server I was using; after changing it to the Virtual Directory Server, it started working. Now I have another issue: I need to authenticate against multiple organizational units, one being the user accounts from OU=PEOPLE and the other being the service accounts from OU=NONPEOPLE. With Hive 1.2.1, I am not able to set hive.server2.authentication.ldap.baseDN with both OUs, though it works if I set one at a time. I also tried hive.server2.authentication.ldap.customLDAPQuery, but that did not work either.
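One possible route, sketched here under the assumption that users bind as uid=<name> directly under each OU and that the Hive build in use supports hive.server2.authentication.ldap.userDNPattern (added in Hive releases after 1.2.1): list one DN pattern per OU, colon-separated, and HiveServer2 tries each pattern in turn.
<property>
  <!-- Colon-separated list of DN patterns; %s is replaced with the login name. -->
  <name>hive.server2.authentication.ldap.userDNPattern</name>
  <value>uid=%s,OU=PEOPLE,dc=domain,dc=com:uid=%s,OU=NONPEOPLE,dc=domain,dc=com</value>
</property>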
04-22-2016
01:28 PM
In beeline, I get the below error:
Error: Could not open client transport with JDBC Uri: jdbc:hive2://<hiveserver2>:10000/default;user=LDAP_Userid;password=LDAP_Password: Peer indicated failure: Error validating the login (state=08S01,code=0)
0: jdbc:hive2://<hiveserver2>:100 (closed)>
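For reference, the same connection attempt can also be made with beeline's -n/-p options instead of embedding the credentials in the URI; <hiveserver2> and the LDAP credentials below are the placeholders from the post:
beeline -u "jdbc:hive2://<hiveserver2>:10000/default" -n LDAP_Userid -p LDAP_Password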
04-22-2016
12:51 AM
1 Kudo
I configured HiveServer2 to use LDAP by adding the properties below to hive-site.xml through Ambari, as given here, and then restarted HiveServer2 and all the dependent services through Ambari.
<property>
  <name>hive.server2.authentication</name>
  <value>LDAP</value>
</property>
<property>
  <name>hive.server2.authentication.ldap.url</name>
  <value>ldap://ldaphostserver.com:389</value>
</property>
<property>
  <name>hive.server2.authentication.ldap.baseDN</name>
  <value>dc=domain,dc=com</value>
</property>
After completing the above changes, when I try to connect to Hive through JDBC with the LDAP userid/password, or use the Hive view in Ambari, I get the error: "Could not establish connection to <HiveServer2Host>:10000: org.apache.thrift.transport.TTransportException: Peer indicated failure: Error validating the login: org.apache.thrift.transport.TTransportException: Peer indicated failure: Error validating the login". The Java JDBC connection string used to connect:
DriverManager.getConnection("jdbc:hive2://<HiveServer2Host>:10000/<dbname>", "ldapuid", "ldappwd");
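For completeness, a minimal self-contained sketch of the same JDBC connection attempt; the class name and the SHOW TABLES query are illustrative additions, and <HiveServer2Host>/<dbname> remain placeholders from the post:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveLdapJdbcTest {
    public static void main(String[] args) throws Exception {
        // Load the HiveServer2 JDBC driver (hive-jdbc must be on the classpath).
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Connect with the LDAP userid/password; HiveServer2 validates them against LDAP.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://<HiveServer2Host>:10000/<dbname>", "ldapuid", "ldappwd");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}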
Labels: Apache Hive
02-19-2016
07:21 PM
1 Kudo
Thanks all for your replies. After adding fs.s3a.proxy.port and fs.s3a.proxy.host to core-site.xml, as suggested by stevel, I am able to move HDFS files directly to AWS S3 using the s3a:// URI scheme from the distcp tool.
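For anyone following along, a sketch of the two core-site.xml entries mentioned above; the proxy host and port values are placeholders for your own proxy:
<property>
  <name>fs.s3a.proxy.host</name>
  <value>proxy.mycompany.com</value>
</property>
<property>
  <name>fs.s3a.proxy.port</name>
  <value>8080</value>
</property>
With those in place, distcp can write to the bucket using an s3a:// destination URI.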
12-18-2015
03:02 AM
2 Kudos
I used hadoop distcp as given below:
hadoop distcp hdfs://hdfs_host:hdfs_port/hdfs_path/hdfs_file.txt s3n://s3_aws_access_key_id:s3_aws_access_key_secret@my_bucketname/
My Hadoop cluster is behind the company HTTP proxy server, and I can't figure out how to specify the proxy when connecting to S3. The error I get is:
ERROR tools.DistCp: Invalid arguments: org.apache.http.conn.ConnectTimeoutException: Connect to my_bucketname.s3.amazonaws.com:443 timed out.
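The 02-19-2016 post above records the eventual fix: the fs.s3a proxy settings. A sketch of passing those settings per-job as generic -D options rather than editing core-site.xml; the proxy host and port below are placeholders, and the AWS credentials are assumed to be configured separately (e.g. via fs.s3a.access.key and fs.s3a.secret.key):
# Placeholder proxy values; substitute your own proxy host and port.
hadoop distcp \
  -Dfs.s3a.proxy.host=proxy.mycompany.com \
  -Dfs.s3a.proxy.port=8080 \
  hdfs://hdfs_host:hdfs_port/hdfs_path/hdfs_file.txt \
  s3a://my_bucketname/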
Labels: Apache Hadoop