Trust you are doing fine!!
We came across an issue connecting to Hive from Anaconda. The Hive version is 3.1.0 on an Ambari 2.7.4 multi-node cluster. The Python connection to Hive works fine from the RHEL server, but the same environment cannot be connected to via Anaconda from Windows 10. The conda version is 4.9.0. Please find the exact error below:
D:\ProgramData\Anaconda3\lib\site-packages\thrift_sasl\__init__.py in open(self)
83 if not ret:
84 raise TTransportException(type=TTransportException.NOT_OPEN,
---> 85 message=("Could not start SASL: %s" % self.sasl.getError()))
87 # Send initial response
TTransportException: Could not start SASL: b'Error in sasl_client_start (-4) SASL(-4): no mechanism available: Unable to find a callback: 2'
import pandas as pd
from pyhive import hive

conn = hive.Connection(host="X.X.X.X", port=10000)
sql_str = "SELECT * FROM table"  # placeholder query
df = pd.read_sql(sql_str, conn)
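As an aside, the "no mechanism available" error on Windows usually indicates that the Cyrus SASL library behind the `sasl` package cannot locate its PLAIN mechanism plugin. A quick way to see which SASL backend an Anaconda environment can actually use (a minimal diagnostic sketch, not part of the original post) is:

```python
# Minimal diagnostic: report which SASL client implementation is importable.
# thrift_sasl prefers the C-based "sasl" package; newer versions are reported
# to fall back to the pure-Python "puresasl" package when it is unavailable.
import importlib.util

def sasl_backend():
    """Return the name of the first importable SASL implementation, or None."""
    if importlib.util.find_spec("sasl") is not None:
        return "sasl (Cyrus SASL wrapper)"
    if importlib.util.find_spec("puresasl") is not None:
        return "puresasl (pure Python)"
    return None

if __name__ == "__main__":
    print("SASL backend:", sasl_backend())
```

If this prints the Cyrus wrapper but the connection still fails with the callback error, the C library itself is present but its mechanism plugins are not, which is a common situation on Windows.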
Your quick help would be much appreciated.
@K_K Check out the thread below; it might help you.
Thanks for your response.
But the issue is not related to the Kerberos connection. It's a four-node Ambari (2.7.4) cluster. The Python connection works fine from the RHEL environment, but the same connection does not work from the Windows (Anaconda) environment. I hope that makes our issue clear.
Any help on this would be much appreciated.
@K_K I don’t have a Windows machine to reproduce this behaviour, but I tried Googling it and found some links that might help you.
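For reference, one workaround that is often suggested for this error on Windows (an assumption here — I have not verified it against this cluster) is to replace the C-based `sasl` package with the pure-Python implementation, since prebuilt Cyrus SASL mechanism plugins are hard to come by on Windows:

```shell
# Hypothetical workaround: swap the Cyrus SASL wrapper for pure-Python SASL.
# thrift-sasl 0.4.2+ is reported to fall back to puresasl when "sasl" is absent.
pip uninstall -y sasl
pip install pure-sasl "thrift-sasl>=0.4.2"
```

After that, the same `hive.Connection(...)` call can be retried; and if HiveServer2 is configured with `hive.server2.authentication=NOSASL`, passing `auth="NOSASL"` to the connection avoids the SASL layer entirely.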