scala - Read files to compare the lines



I want to compare two HDFS files in functional-style Scala. To start, I wrote some code (adapted from web examples for file closing and exception handling) to read a single file. So far I can read the first line, but the code does not loop to read the following lines. Any help, please? I do not want to use Spark.

import java.io.{BufferedReader, InputStreamReader}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import scala.util.{Failure, Success, Try}

object DRCompareHDFSFiles {
  def main(args: Array[String]): Unit = {
    val hdfs = FileSystem.get(new Configuration())
    val path1 = new Path(args(0))
    val path2 = new Path(args(1))
    readHDFSFile(hdfs, path1, path2)
  }

  // Accept a parameter which implements a close method
  def using[A <: { def close(): Unit }, B](resource: A)(f: A => B): B =
    try {
      f(resource)
    } finally {
      resource.close()
    }

  def readHDFSFile(hdfs: FileSystem, path1: Path, path2: Path): Option[Stream[(String, String)]] = {
    Try(using(new BufferedReader(new InputStreamReader(hdfs.open(path1))))(readFileStream)) match {
      case Success(result) =>
        // I am expecting a collection of strings here but get only a single string
        None
      case Failure(ex) =>
        println(s"Could not read file $path1, detail ${ex.getClass.getName}:${ex.getMessage}")
        None
    }
  }

  // This only ever reads one line - the for comprehension runs over a single Try
  def readFileStream(br: BufferedReader) = {
    for {
      line <- Try(br.readLine())
      if line != null
    } yield line
  }
}




I got it working. I used Streams. Thanks
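For anyone landing here later, a minimal self-contained sketch of the Stream-based approach (names like `readFileStream` mirror the question; `StringReader` stands in for the HDFS input stream so the example runs anywhere - with HDFS you would wrap `hdfs.open(path)` in an `InputStreamReader` instead):

```scala
import java.io.{BufferedReader, StringReader}

object ReadAllLines {
  // Loan pattern: accept any resource that has a close() method
  def using[A <: { def close(): Unit }, B](resource: A)(f: A => B): B =
    try f(resource) finally resource.close()

  // Stream.continually keeps calling readLine() until it returns null;
  // toList materializes the lines before the reader is closed
  def readFileStream(br: BufferedReader): List[String] =
    Stream.continually(br.readLine()).takeWhile(_ != null).toList

  def main(args: Array[String]): Unit = {
    val file1 = using(new BufferedReader(new StringReader("a\nb\nc")))(readFileStream)
    val file2 = using(new BufferedReader(new StringReader("a\nx\nc")))(readFileStream)
    // Pair the files line by line and keep only the lines that differ
    val diffs = file1.zip(file2).filter { case (l, r) => l != r }
    println(diffs) // List((b,x))
  }
}
```

The key difference from the question's code: `Try(br.readLine())` wraps a single call, so the for comprehension yields at most one line, whereas `Stream.continually` re-invokes `readLine()` lazily until the end-of-file `null` appears.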
