Support Questions
Find answers, ask questions, and share your expertise

How to sort each line of an RDD in Spark using Scala?


I have a text file with the below data.


I want to sort each row in descending order. I have tried the code below:

val file = sc.textFile("Maximum values").map(x=>x.split(","))
val sorted = file.sortBy(x=> -x(2).toInt)

I got the below output

[[55, 56, 70, 78, 53], [10, 14, 16, 19, 52], [08, 09, 12, 20, 45]]

The above result shows that the rows of the list have been sorted in descending order (by their third value), but I want the values within each row sorted in descending order. E.g.

[10,14,16,19,52],[08,09,12,20,45],[55,56,70,78,53] should be [52,19,16,14,10],[45,20,12,09,08],[78,70,56,55,53]

Please spare some time to answer this. Thanks in advance.


Hi @Abdul Rahim

This is not exactly a Dataset, but it should give you the idea.

I am also not entirely sure what you want to accomplish, but try this:

val l = List(
  List("10", "14", "16", "19", "52"),
  List("08", "09", "12", "20", "45"),
  List("55", "56", "70", "78", "53")
)
// Sort the values inside each inner list numerically, then reverse for descending order
l.map(i => i.sortBy(_.toInt).reverse)
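Applied back to your RDD, the key change is to map the sort over each row instead of calling sortBy on the RDD itself (which reorders whole rows). A minimal sketch, with the per-row logic pulled into a plain function so it works on ordinary collections too (the file path is the one from your question; adjust as needed):

```scala
// Sort the values within one row in descending numeric order.
// Works on any split line of comma-separated integers.
def sortRowDesc(row: Array[String]): Array[Int] =
  row.map(_.trim.toInt).sorted(Ordering[Int].reverse)

// With Spark, assuming an existing SparkContext `sc`:
// val sorted = sc.textFile("Maximum values")
//   .map(_.split(","))
//   .map(sortRowDesc)
```

Note that `sorted(Ordering[Int].reverse)` and `sortBy(_.toInt).reverse` give the same ordering; the former avoids building the ascending list first.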