(Zeppelin) pyspark read hive TypeError: 'JavaPackage' object is not callable
Labels: Apache Hive, Apache Spark, Apache Zeppelin
Created ‎12-22-2022 12:17 AM
When I try to run this note:
%spark2.pyspark
import pyspark
from pyspark_llap import HiveWarehouseSession
hive = HiveWarehouseSession.session(spark).build()
df = hive.execute(" show databases ")
the result is:
  File "/tmp/spark-00e8a412-9248-4a78-803f-3ca9d8ddac2d/userFiles-2f95cdc1-3caa-42d7-8356-5b022b67b3b9/pyspark_hwc-1.0.0.3.1.0.0-78.zip/pyspark_llap/sql/session.py", line 228, in session
    return HiveWarehouseBuilder.session(session)
  File "/tmp/spark-00e8a412-9248-4a78-803f-3ca9d8ddac2d/userFiles-2f95cdc1-3caa-42d7-8356-5b022b67b3b9/pyspark_hwc-1.0.0.3.1.0.0-78.zip/pyspark_llap/sql/session.py", line 44, in session
    jvm.com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(jsparkSession))
TypeError: 'JavaPackage' object is not callable
How do I fix it? Thanks.
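For context, the sketch below is a pure-Python simulation (no Spark or py4j required) of the mechanism behind this error: when the HWC assembly jar is not on the Spark driver's classpath, py4j cannot resolve the dotted name to a Java class, so the chain resolves to a `JavaPackage` placeholder, which is not callable. The `JavaPackage` class here is a hypothetical stand-in, not the real py4j implementation.

```python
# Minimal sketch of why py4j raises "'JavaPackage' object is not
# callable": an unresolved Java class name falls back to a package
# placeholder object, and calling a placeholder fails with TypeError.

class JavaPackage:
    """Stand-in for py4j's JavaPackage: every attribute access just
    yields a deeper JavaPackage, and the object itself is not callable."""
    def __init__(self, name):
        self.name = name

    def __getattr__(self, attr):
        # Missing class on the classpath -> another JavaPackage, not a class.
        return JavaPackage(self.name + "." + attr)

jvm = JavaPackage("jvm")
# Same attribute chain as in the traceback above:
builder = jvm.com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder
try:
    builder.session(None)  # same shape of call as in session.py line 44
except TypeError as e:
    print(e)  # 'JavaPackage' object is not callable
```

In the real environment the fix is therefore not in the Python code but in making the HWC jar and zip visible to the Spark interpreter, as the accepted answer below describes.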
Created ‎12-26-2022 07:55 PM
I solved it by reconfiguring spark.jars and spark.submit.pyFiles in the Spark interpreter settings.
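For reference, a typical pair of Zeppelin Spark interpreter properties for HWC looks like the following. The file names match the pyspark_hwc version visible in the traceback, but the paths are an assumption based on a standard HDP 3.1 layout and must be adjusted to the actual locations on your cluster:

```properties
# Assumed paths for an HDP 3.1 install; verify against your cluster.
spark.jars            /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.1.0.0-78.jar
spark.submit.pyFiles  /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-1.0.0.3.1.0.0-78.zip
```

After changing these properties, restart the Spark interpreter so the new classpath takes effect.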
Created ‎12-22-2022 04:03 AM
Hi @myzard, is the Hive Warehouse Connector (HWC) configured with the interpreter? If so, ensure that HWC is configured correctly with the settings mentioned in the HWC documentation.
Created ‎12-22-2022 06:29 PM
Yes, I already did that, but the error still occurs.
Created ‎12-24-2022 12:25 AM
@myzard Can you share a screenshot of the interpreter settings?
