
Get multiple filter counts from Spark Scala Dataframe


What is the most efficient way to get multiple counts from a Spark Scala DataFrame without doing multiple scans?

ex:   val count1 = myDF.filter("col1 = 'val1'").count()

      val count2 = myDF.filter("col2 = 'val2'").filter("col2 = 'val22'").count()

      val count3 = myDF.filter("colx = 'valx'").count()

      etc.

 

I'm looking for a way to get all of these counts in a single scan of the DataFrame.

Note: I need 100+ such counts.
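
For context, here is a minimal sketch (not from the original post) of one common single-pass approach: express each filter as a conditional aggregate using when and count, so Spark evaluates every predicate in one scan of myDF. The column names and values are just the placeholders from the example above, and the predicate list could be generated programmatically for 100+ counts.

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{col, count, when}

// One (name, predicate) pair per required count. For 100+ counts this list
// could be built programmatically instead of written by hand.
val predicates: Seq[(String, Column)] = Seq(
  "count1" -> (col("col1") === "val1"),
  "count2" -> (col("col2") === "val2" && col("col2") === "val22"), // mirrors the chained filters above
  "count3" -> (col("colx") === "valx")
)

// count(when(pred, 1)) counts only rows where the predicate is true,
// because when(...) without otherwise(...) is null for non-matching rows.
val aggExprs = predicates.map { case (name, pred) => count(when(pred, 1)).as(name) }

// Single action, single scan: one Row with one column per count.
val result = myDF.agg(aggExprs.head, aggExprs.tail: _*).first()
val count1 = result.getAs[Long]("count1")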