
calculating median on grouped data

Explorer

Hello! I was trying to use Spark to calculate the median of grouped values in a DataFrame, but haven't had much success. I tried agg(), but median() is not available; I tried applying rank() with a window function, but the rank was not computed per group; I also tried pivoting the table to avoid the grouping step, but the DataFrame is huge (8 million rows) and the job fails repeatedly. Calculating a median should be straightforward, since data analysts use it all the time. Maybe I'm missing something obvious?


Thanks!!

1 ACCEPTED SOLUTION

Master Collaborator

Calculating a median or other quantiles is in general much harder than computing a moment like a mean. Look for functions in Spark that compute quantiles rather than for a median function -- the median is just the 0.5 quantile. There is an efficient approximate implementation for DataFrames in Spark.
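For reference, a minimal PySpark sketch of that approximate-quantile approach (assuming Spark 2.1 or later; the group and value column names are just placeholders for your own schema):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("grouped-median").getOrCreate()

# Placeholder data: "group" and "value" are hypothetical column names
df = spark.createDataFrame(
    [("a", 1.0), ("a", 2.0), ("a", 9.0), ("b", 4.0), ("b", 6.0)],
    ["group", "value"],
)

# Approximate median (0.5 quantile) per group via the SQL function percentile_approx.
# On Spark 3.1+ you can also call F.percentile_approx("value", 0.5) directly.
grouped_median = df.groupBy("group").agg(
    F.expr("percentile_approx(value, 0.5)").alias("approx_median")
)
grouped_median.show()

# For a single ungrouped column, DataFrame.approxQuantile does the same job;
# the last argument is the relative error tolerance (0.0 = exact, but slower).
overall_median = df.approxQuantile("value", [0.5], 0.01)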


2 REPLIES


Explorer

Thanks! Yes, percent_rank() together with a window function did the trick. Another way is to sort the column within each group and pick the middle value. The results are close.
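For anyone landing here later, a rough sketch of the percent_rank() + window approach mentioned above (PySpark assumed; group and value are placeholder column names):

from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("grouped-median-window").getOrCreate()

df = spark.createDataFrame(
    [("a", 1.0), ("a", 2.0), ("a", 9.0), ("b", 4.0), ("b", 6.0)],
    ["group", "value"],
)

# percent_rank over a window partitioned by group: 0.0 for the smallest value,
# 1.0 for the largest, so the row closest to 0.5 is (approximately) the median.
w = Window.partitionBy("group").orderBy("value")
ranked = df.withColumn("pct", F.percent_rank().over(w))

# Keep, per group, the single row whose percent_rank is closest to 0.5.
# For groups with an even number of rows this picks one of the two central
# values rather than averaging them, which is why the results are only "close".
closest = Window.partitionBy("group").orderBy(F.abs(F.col("pct") - 0.5))
median_per_group = (
    ranked.withColumn("rn", F.row_number().over(closest))
          .filter(F.col("rn") == 1)
          .select("group", F.col("value").alias("median"))
)
median_per_group.show()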