Created 10-23-2020 05:38 AM
I trust you are doing well!
We came across an issue connecting to Hive from Anaconda. The Hive version is 3.1.0 on Ambari 2.7.4, which is a multi-node cluster. The Python connection to Hive works fine from the RHEL server, but the same environment fails to connect via Anaconda from Windows 10. The conda version is 4.9.0. Please find the exact error below:
D:\ProgramData\Anaconda3\lib\site-packages\thrift_sasl\__init__.py in open(self)
83 if not ret:
84 raise TTransportException(type=TTransportException.NOT_OPEN,
---> 85 message=("Could not start SASL: %s" % self.sasl.getError()))
86
87 # Send initial response
TTransportException: Could not start SASL: b'Error in sasl_client_start (-4) SASL(-4): no mechanism available: Unable to find a callback: 2'
Python Code
============
import pandas as pd
import sasl
from pyhive import hive
con = hive.Connection(host="X.X.X.X", port=10000)
cur=con.cursor()
sq_str = "select * from table"  # select list ('*') was missing
df = pd.read_sql(sq_str, con)  # was 'conn', which is undefined
print(df)
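Aside from the SASL error, the snippet above had two bugs (a missing select list and an undefined `conn` variable). A minimal sketch of the corrected `read_sql` pattern, using sqlite3 purely as a stand-in since no Hive server is reachable here:

```python
import sqlite3
import pandas as pd

# sqlite3 is only a stand-in for the Hive connection; with PyHive you
# would instead use: con = hive.Connection(host="X.X.X.X", port=10000)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE demo (id INTEGER, name TEXT)")
con.executemany("INSERT INTO demo VALUES (?, ?)", [(1, "a"), (2, "b")])

sq_str = "select * from demo"    # note the '*': the select list was missing
df = pd.read_sql(sq_str, con)    # pass the connection object itself
print(df)
```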
I would appreciate your quick help.
Thanks
Created 10-24-2020 08:06 AM
@K_K Check out the thread below; it might help you.
Created 10-28-2020 11:15 PM
Hello GangWar,
Thanks for your response.
But the issue is not related to a Kerberos connection. It's a four-node Ambari (2.7.4) cluster. The Python connection works fine from the RHEL environment, but the same connection does not work from the Windows (Anaconda) environment. I hope this makes our issue clear.
Any help on this would be appreciated.
Thanks,
KK
Created 10-30-2020 02:39 AM
@K_K I don’t have a Windows machine to reproduce this behaviour, but I tried Googling and found some links that might help you.
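On Windows, the usual culprit behind "Could not start SASL ... no mechanism available" is that the C-based `sasl` package (the Cyrus SASL bindings) did not install or build correctly, so thrift_sasl has no working client to start. A commonly reported workaround is to install the pure-Python `pure-sasl` package instead; this is a hedged suggestion that I could not verify without a Windows box. A small probe to see which SASL implementations your Anaconda environment can actually import:

```python
# Probe which SASL client implementations are importable in this
# environment. 'sasl' is the C extension (hard to build on Windows);
# 'puresasl' comes from the pure-Python 'pure-sasl' package.
def available_sasl_impls():
    impls = []
    for mod in ("sasl", "puresasl"):
        try:
            __import__(mod)
            impls.append(mod)
        except ImportError:
            pass
    return impls

print(available_sasl_impls())
```

If the list comes back empty, the failure is in the local SASL installation rather than on the Hive side, which would match the connection working from RHEL.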