Support Questions

Can temporary tables be shared in a sqlContext between sparkr and pyspark?


If I have a sqlContext where I create temporary tables in a livy.sparkr interpreter, is there a way to get them to be shared in a livy.pyspark paragraph?

For example, if I create a temp table from livy.sparkr:

# Load the Hive table into a SparkR DataFrame, register it as a temp table, and cache it
test_table <- sql(sqlContext, "SELECT * FROM mydatabase.mytable")
registerTempTable(test_table, "test_table_tmp")
cacheTable(sqlContext, "test_table_tmp")
head(test_table, 5)
  dxver    dx ccs_cat
1     9 01000   C0001
2     9 01001   C0001
3     9 01002   C0001
4     9 01003   C0001

I can see it in a separate sparkr paragraph:

head(sql(sqlContext, "Show tables"),100)
       tableName isTemporary
1 test_table_tmp        TRUE
2      patient12       FALSE
3            src       FALSE
4   temp_decimal       FALSE

But it doesn't seem to appear in a livy pyspark paragraph:

sqlContext.sql("show tables").show(20)
+------------+-----------+
|   tableName|isTemporary|
+------------+-----------+
|   patient12|      false|
|         src|      false|
|temp_decimal|      false|
|   test_json|      false|
| test_json_1|      false|
+------------+-----------+

This is running in Zeppelin under HDP 2.6 and using Spark 1.6.3.


@William Brooks It is a work in progress, and currently you cannot share a context between sparkr and pyspark. You can, however, share a context between livy.spark and livy.sql.
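As a sketch of the sharing that does work today, a temp table registered in a %livy.spark (Scala) paragraph can be queried from a %livy.sql paragraph in the same note, since both run in the same livy session. This assumes the table names from the question and the Spark 1.6 API:

```
%livy.spark
// Register a temp table in the shared livy session (Spark 1.6 API)
val testTable = sqlContext.sql("SELECT * FROM mydatabase.mytable")
testTable.registerTempTable("test_table_tmp")
```

```
%livy.sql
-- Query the same temp table from a SQL paragraph in the same livy session
SELECT * FROM test_table_tmp LIMIT 5
```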


Is there a roadmap/timeline or existing Jira for this feature?

@William Brooks

Please refer to these JIRAs:

The feature you are requesting will be available with the %livy interpreter (not the %spark interpreter) once these two JIRAs are resolved.