Getting a "Table not found" exception in the %sql interpreter after registering a table in the %livy.sparkr interpreter

Super Collaborator

Hi experts,

I am using the %livy.sparkr interpreter to do some analysis on airline data. I have registered my SparkR DataFrame as a temporary table, which works successfully, but when I run:

%sql

select * from tableName

it says the table is not found. When I start SparkR, it launches a new application in YARN, and when I run the select *, it launches another application (see yarn-applications.png). I assume the table got registered in the first sqlContext, while the select * runs against another sqlContext which does not have this table.

Is there a way to do both operations in just one application? If that is not possible, how do I share the same sqlContext across all the applications?
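
For reference, the registration step in my %livy.sparkr paragraph looks roughly like this (the data path and table name are simplified placeholders):

%livy.sparkr

# read the airline data (the path here is only an example)
df <- read.df("/tmp/airline/2008.csv", source = "csv", header = "true")
# register the SparkR DataFrame as a temporary table
# (createOrReplaceTempView on Spark 2.x; registerTempTable on Spark 1.x)
createOrReplaceTempView(df, "tableName")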

Thanks in advance.

4 REPLIES

Super Collaborator

For now, %livy.sql can only access tables registered in %livy.spark, but not in %livy.pyspark or %livy.sparkr.
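
In other words, the pattern that works through Livy is to register the table from a %livy.spark (Scala) paragraph, roughly like this (df, the path, and the table name are placeholders):

%livy.spark

// load or build a DataFrame in the Scala paragraph
// (with Spark 2.x Livy exposes the `spark` session; on 1.x use sqlContext)
val df = spark.read.option("header", "true").csv("/tmp/airline/2008.csv")
// register it so that %livy.sql can see it
df.createOrReplaceTempView("tableName")

%livy.sql

select * from tableName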

Super Collaborator

Thanks for the reply @jzhang. Is there any way to initialize the new sqlContext object from the first one?

Super Collaborator

Livy currently doesn't support Spark context sharing between interpreters. You can use Zeppelin's built-in Spark interpreter to achieve this (%spark, %spark.sql).
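
With the built-in interpreter, all paragraphs in the %spark group share one SparkContext, so the equivalent would look roughly like this (again just a sketch with placeholder names):

%spark

// the same SparkSession/SparkContext is shared by the whole %spark interpreter group
val df = spark.read.option("header", "true").csv("/tmp/airline/2008.csv")
df.createOrReplaceTempView("tableName")

%spark.sql

select * from tableName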

Super Collaborator

That makes perfect sense, thanks a lot.