Support Questions
Find answers, ask questions, and share your expertise

How to run Spark Sql on Spark Thrift Server


Contributor

I have installed the Spark Thrift Server on one of the nodes. Can anyone suggest how to execute Spark SQL on the Spark Thrift Server, or how to run a Spark job against it?

3 REPLIES

Re: How to run Spark Sql on Spark Thrift Server

Contributor

@shyam gurram You can try the example below:

1. Inside spark-shell, paste the following in :paste mode:

val df = spark
  .read
  .option("url", "jdbc:hive2://localhost:10000")
  .option("dbtable", "people")
  .format("jdbc")
  .load

2. This connects to the Spark Thrift Server at localhost on port 10000.

3. It reads the people table, which is assumed to already exist.
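If the goal is just to run SQL statements against the Spark Thrift Server, the simplest client is beeline, which ships with Hive and Spark. A minimal sketch, assuming the server listens on localhost:10000 with simple (non-Kerberos) authentication and that a people table exists:

```shell
# Connect to the Spark Thrift Server over JDBC and run ad-hoc SQL
beeline -u "jdbc:hive2://localhost:10000"
# At the beeline prompt:
#   SHOW TABLES;
#   SELECT * FROM people LIMIT 10;
```

Queries submitted this way run inside the Thrift Server's SparkContext, so they should also show up in its Spark UI.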

Re: How to run Spark Sql on Spark Thrift Server

Contributor

@lraheja and team,

I am still unable to see the job on the Spark History Server.

I followed the steps below:

1) spark-shell

2) scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

3) scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS employee(id INT, name STRING, age INT) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'")

I have an employee.txt file in my home directory (/home/sg865w/employee.txt).

4) scala> sqlContext.sql("LOAD DATA LOCAL INPATH 'employee.txt' INTO TABLE employee")

5) scala> val result = sqlContext.sql("FROM employee SELECT id, name, age")

6) scala> result.show()

7) :paste

val df = spark
  .read
  .option("url", "jdbc:hive2://localhost:10000")
  .option("dbtable", "people")
  .format("jdbc")
  .load

8) Ctrl+D. The output message I got said that spark in "val df = spark ..." is undefined.
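The "spark undefined" message is consistent with running a Spark 1.x shell (the HiveContext usage in step 2, and the SET spark.sql.hive.version=1.2.1 line in the log below, both point that way): the predefined spark session object only exists from Spark 2.0 on, while a 1.x spark-shell predefines only sc and sqlContext. A rough 1.x equivalent of the pasted snippet, as a sketch (it assumes a Hive JDBC driver is on the classpath; the driver class name here is an assumption):

```scala
// Spark 1.x: use the predefined sqlContext instead of the 2.x `spark` session.
// "people" and the localhost:10000 URL are carried over from the example above.
val df = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:hive2://localhost:10000")
  .option("driver", "org.apache.hive.jdbc.HiveDriver") // assumed driver class
  .option("dbtable", "people")
  .load()
```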

Another error I got:

1) When I try to start the spark-thrift-server, it throws the following error.

Error Message:

17/01/13 00:49:45 INFO metastore: Connected to metastore.
17/01/13 00:49:45 INFO SessionState: Created local directory: /tmp/cc5a2235-5a42-4e82-acd8-0b9c9677c3f1_resources
17/01/13 00:49:45 INFO SessionState: Created HDFS directory: /tmp/hive/sg865w/cc5a2235-5a42-4e82-acd8-0b9c9677c3f1
17/01/13 00:49:45 INFO SessionState: Created local directory: /tmp/sg865w/cc5a2235-5a42-4e82-acd8-0b9c9677c3f1
17/01/13 00:49:45 INFO SessionState: Created HDFS directory: /tmp/hive/sg865w/cc5a2235-5a42-4e82-acd8-0b9c9677c3f1/_tmp_space.db
SET spark.sql.hive.version=1.2.1
17/01/13 00:49:46 ERROR HiveThriftServer2: Error starting HiveThriftServer2
org.apache.hive.service.ServiceException: Unable to login to kerberos with given principal/keytab
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIService.init(SparkSQLCLIService.scala:58)
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79)
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$class.initCompositeService(SparkSQLCLIService.scala:79)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.initCompositeService(HiveThriftServer2.scala:263)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.init(HiveThriftServer2.scala:283)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:85)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: Login failure for hive/blpd217.bhdc.att.com@BRHMLAB01.LAB.ATT.COM from keytab /etc/security/keytabs/hive.service.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:987)
    at org.apache.hive.service.auth.HiveAuthFactory.loginFromKeytab(HiveAuthFactory.java:198)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIService.init(SparkSQLCLIService.scala:53)
    ... 20 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
    at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:856)
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:719)
    at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:584)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:784)
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:203)
    at javax.security.auth.login.LoginContext$5.run(LoginContext.java:721)
    at javax.security.auth.login.LoginContext$5.run(LoginContext.java:719)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:718)
    at javax.security.auth.login.LoginContext.login(LoginContext.java:590)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:978)
    ... 22 more

[The server then shuts down: the Jetty ServletContextHandlers and the Spark web UI at http://130.5.106.7:4042 are stopped, the SparkContext is stopped, a NullPointerException is logged from the HiveThriftServer2 shutdown hook, and the temp directories /opt/data/data01/paxata/spark-8fdd13a5-0db8-42e7-b172-9b910d82df55 and /tmp/spark-79a77df7-0ab3-4be6-bea4-472bd8597cab are deleted.]

Awaiting your reply. Please let me know if you have any questions.


Re: How to run Spark Sql on Spark Thrift Server

Expert Contributor

Was the Spark Thrift Server installed via Ambari, or manually? It looks like the hive principal being used does not match the keytab provided for Kerberos.
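Before anything else, it is worth verifying the keytab itself, using the principal and path taken from the error message above. A sketch of the usual checks, run as the user that starts the Thrift Server:

```shell
# List the principals stored in the keytab; the hive service principal
# from the error message should appear here.
klist -kt /etc/security/keytabs/hive.service.keytab

# Try to obtain a ticket with that keytab; a failure here reproduces the
# "Unable to obtain password from user" login error outside of Spark.
kinit -kt /etc/security/keytabs/hive.service.keytab hive/blpd217.bhdc.att.com@BRHMLAB01.LAB.ATT.COM

# Confirm the file is readable by the user running the Thrift Server.
ls -l /etc/security/keytabs/hive.service.keytab
```

If klist shows a different principal, or kinit fails the same way, the problem is in the keytab or its permissions rather than in Spark.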
