Spark exception when reading a Parquet file

New Contributor

When I try to read a Parquet file from an Azure Data Lake container in Databricks, I get a Spark exception. Below is my query:

# Read the Parquet file from the mounted Azure Data Lake path
data = spark.read.parquet("/mnt/data/country/abb/countrydata.parquet")

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 14.0 failed 4 times, most recent failure: Lost task 0.3 in stage 14.0 (TID 35) (10.135.39.71 executor 0): org.apache.spark.SparkException: Exception thrown in awaitResult:

What does this mean, and what do I need to do to fix it?


Expert Contributor

@shamly Can you share the full stack trace?