Spark Thrift Server for ODBC/JDBC connection

New Contributor

Team,

I have a requirement to install a Thrift server for Spark so that Tableau can connect via ODBC/JDBC. Can someone help me with the steps to install the Thrift server?

5 Replies

Re: Spark Thrift Server for ODBC/JDBC connection

You can use...

su spark

./sbin/start-thriftserver.sh --master yarn-client --executor-memory 512m --hiveconf hive.server2.thrift.port=10015

(Note: valid TCP ports only go up to 65535, so the port must be something like 10015, not 100015.)

For more information, see the link below:

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_installing_manually_book/content/startin...
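Once the server is up, a quick way to confirm the JDBC endpoint works before pointing Tableau at it is beeline, which ships with both Hive and Spark. The hostname, port, and user below are assumptions; match them to the `--hiveconf` settings you started the server with:

```shell
# Connect to the Spark Thrift Server over JDBC and run a trivial query.
# Adjust host, port, and user (-n) to your own setup.
./bin/beeline -u "jdbc:hive2://localhost:10015" -n spark -e "SHOW DATABASES;"
```

If beeline connects and lists databases, an ODBC/JDBC client like Tableau should be able to reach the same endpoint.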

Re: Spark Thrift Server for ODBC/JDBC connection

New Contributor

Hi Mukesh,

Thanks for the update. But does the above command install the Thrift server, or just start it? I need to install it. My other question is: do I need to run the above command every time I use Spark SQL? Please suggest.


Re: Spark Thrift Server for ODBC/JDBC connection

No, you don't need to run the command every time you use Spark SQL; once started, the Thrift server keeps running and serves incoming connections.

The Thrift server is also used by Hive, with a default port of 10000. To avoid a conflict, I move the Spark Thrift Server to a different port, such as 10015.
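When you move the Spark Thrift Server off Hive's default port, every client DSN has to use the matching port. A minimal sketch of building the JDBC URL a client such as Tableau or beeline would use; the host and port here are placeholder assumptions, so substitute your own values:

```shell
# Build the JDBC URL for a Spark Thrift Server running on a non-default port.
# THRIFT_HOST and THRIFT_PORT are placeholder assumptions for illustration.
THRIFT_HOST=localhost
THRIFT_PORT=10015
echo "jdbc:hive2://${THRIFT_HOST}:${THRIFT_PORT}"
```

The port in this URL must match whatever you passed via `--hiveconf hive.server2.thrift.port=...` when starting the server.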

Re: Spark Thrift Server for ODBC/JDBC connection

Expert Contributor

You may want to consider using Ambari to install and manage the Spark Thrift Server. Here's a link that explains how (for HDP 2.3): https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_spark-guide/content/install-sts-after-sp...

Re: Spark Thrift Server for ODBC/JDBC connection

Super Guru

If you install standalone Spark, it comes with the Thrift server.

If you install HDP 2.5, you get Spark and the Thrift server included, all managed through Ambari.

Download the version you want (probably Spark 1.6.2 or 2.0.1) from http://spark.apache.org/downloads.html and unzip it, then just run the start script.

See here for details: http://spark.apache.org/docs/latest/sql-programming-guide.html
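The standalone route can be sketched end to end as a shell session. The release version, Hadoop build, and download mirror below are assumptions for illustration; pick whatever matches your environment:

```shell
# Download a prebuilt Spark release (version and mirror are examples; adjust
# to your environment), unpack it, and start the Thrift server.
wget https://archive.apache.org/dist/spark/spark-1.6.2/spark-1.6.2-bin-hadoop2.6.tgz
tar -xzf spark-1.6.2-bin-hadoop2.6.tgz
cd spark-1.6.2-bin-hadoop2.6

# Start on a non-default port to avoid clashing with HiveServer2 (port 10000).
./sbin/start-thriftserver.sh --master yarn-client \
  --hiveconf hive.server2.thrift.port=10015
```

When you are done, `./sbin/stop-thriftserver.sh` shuts the server down again.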