Created 08-14-2017 09:15 PM
If I create temporary tables through the sqlContext in a livy.sparkr interpreter, is there a way to share them with a livy.pyspark paragraph?
For example, if I create a temp table from livy.sparkr:
%livy.sparkr
test_table <- sql(sqlContext, "SELECT * FROM mydatabase.mytable")
registerTempTable(test_table, "test_table_tmp")
cacheTable(sqlContext, "test_table_tmp")
head(test_table, 5)
  dxver    dx ccs_cat
1     9 01000   C0001
2     9 01001   C0001
3     9 01002   C0001
4     9 01003   C0001
I can see it in a separate sparkr paragraph:
%livy.sparkr
head(sql(sqlContext, "Show tables"), 100)
       tableName isTemporary
1 test_table_tmp        TRUE
2      patient12       FALSE
3            src       FALSE
4   temp_decimal       FALSE
But it doesn't appear in a livy.pyspark paragraph:
%livy.pyspark
sqlContext.sql("show tables").show(20)
+------------+-----------+
|   tableName|isTemporary|
+------------+-----------+
|   patient12|      false|
|         src|      false|
|temp_decimal|      false|
|   test_json|      false|
| test_json_1|      false|
+------------+-----------+
This is running in Zeppelin 0.7.0.2.6 under HDP 2.6 and using Spark 1.6.3.
Created 08-14-2017 09:42 PM
@William Brooks It is a work in progress, and currently you might not be able to share context between livy.sparkr and livy.pyspark. You can, however, share context between livy.spark and livy.sql.
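For reference, a minimal sketch of that currently supported path, assuming the livy interpreter group exposes the same sqlContext to both paragraphs (the table reuses mydatabase.mytable from the question above; the rest is illustrative):

%livy.spark
val testTable = sqlContext.sql("SELECT * FROM mydatabase.mytable")
testTable.registerTempTable("test_table_tmp")
sqlContext.cacheTable("test_table_tmp")

%livy.sql
SELECT * FROM test_table_tmp LIMIT 5

Because %livy.spark and %livy.sql run against the same livy session, the temp table registered in the Scala paragraph is visible to the SQL paragraph.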
Created 08-23-2017 04:23 PM
Is there a roadmap/timeline or existing Jira for this feature?
Created 09-13-2017 12:01 AM
Please refer to these JIRAs:
https://issues.cloudera.org/browse/LIVY-194
https://issues.apache.org/jira/browse/LIVY-325
The feature you are requesting will be available with the %livy interpreter (and not the %spark interpreter) once these two JIRAs are resolved.
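Until then, one possible workaround (not mentioned in this thread, so treat it as an assumption) is to persist the data as a regular table in the Hive metastore instead of a temp table, assuming the livy sqlContext is a HiveContext backed by the cluster's metastore; any interpreter pointed at that metastore can then read it. The table name test_table_shared below is hypothetical:

%livy.sparkr
sql(sqlContext, "CREATE TABLE mydatabase.test_table_shared AS SELECT * FROM mydatabase.mytable")

%livy.pyspark
sqlContext.sql("SELECT * FROM mydatabase.test_table_shared").show(5)

This trades the in-memory temp table for a persisted copy, so it is only a stopgap until the shared-session support tracked in the JIRAs above lands.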