I am trying to set up the Spark Thrift Server so that I can connect the Tableau Desktop BI tool to it. My cluster is kerberized. I am running the following script:
export SPARK_HOME=/opt/spark-2.0.2-bin-hadoop2.7/
export SPARK_CONF_DIR=/etc/cluster/spark/
export HADOOP_CONF_DIR=/opt/spark-2.0.2-bin-hadoop2.7/hadoop/
./sbin/start-thriftserver.sh --master yarn --queue=DUQ20 --executor-memory 512m --hiveconf hive.server2.thrift.port=10001 --hiveconf hive.metastore.kerberos.keytab.file=/PATH/TO/FILE.keytab --hiveconf hive.metastore.kerberos.principal=SERVICE_ID@<SOME_URL>.COM
When I run the script and check the logs, I get the following error:
java.io.IOException: Kerberos principal name does NOT have the expected hostname part: DUQ20SRVEDL01@DEVMAPLE.DEVFG.RBC.COM
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:401)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
It seems the principal name is expected to be in the following format:
hive/_HOST@SOME_URL.COM
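As I understand it (an assumption on my part, not something I have confirmed in the Hadoop source), the client expands the `_HOST` placeholder to the machine's canonical hostname before comparing it against the configured principal, roughly like this:

```shell
# Sketch of the _HOST substitution as I understand it (assumption):
# hive/_HOST@SOME_URL.COM  ->  hive/<canonical-hostname>@SOME_URL.COM
echo "hive/_HOST@SOME_URL.COM" | sed "s/_HOST/$(hostname -f)/"
```

Since my principal (`SERVICE_ID@...`) has no hostname part at all, that would explain the "does NOT have the expected hostname part" message.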
Questions
-----------
1- How do I make it accept my principal as is?
2- Can I generate a principal of the form hive/_HOST@SOME_URL.COM, along with the associated keytab?
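For question 2, this is what I am considering trying on an MIT Kerberos KDC, in case it is relevant (the admin principal, hostname, and keytab path below are placeholders, not values from my cluster):

```shell
# Sketch: create a host-specific hive service principal and export its
# keytab with MIT Kerberos kadmin. Requires KDC admin rights; the admin
# principal, host, and paths here are assumptions/placeholders.
kadmin -p admin/admin -q "addprinc -randkey hive/thrift-host.example.com@SOME_URL.COM"
kadmin -p admin/admin -q "ktadd -k /etc/security/keytabs/hive.service.keytab hive/thrift-host.example.com@SOME_URL.COM"
```

Is that the right approach, or is there a way to make the Thrift server accept a principal without a hostname part?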