Support Questions


How to get the input file name of a record in a Spark DataFrame?

New Contributor

I am creating a DataFrame in Spark by loading tab-separated files from S3. I need the input file name of each record in the DataFrame for further processing. I tried input_file_name(), but I am getting a null value for it.


Master Mentor

@Amal Babu this is my take; I'm sure there are better ways:

import sqlContext.implicits._
val data = sc.wholeTextFiles("hdfs://")
val dataDF = data.toDF()
dataDF.select("_1").show()


The spark-shell output:

import sqlContext.implicits._
data: org.apache.spark.rdd.RDD[(String, String)] = hdfs:// MapPartitionsRDD[64] at wholeTextFiles at <console>:68
dataDF: org.apache.spark.sql.DataFrame = [_1: string, _2: string]
|                  _1|

As long as you use wholeTextFiles you should be able to maintain the filenames. From the documentation: SparkContext.wholeTextFiles lets you read a directory containing multiple small text files, and returns each of them as (filename, content) pairs. This is in contrast with textFile, which would return one record per line in each file.
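To make the filename easier to work with, the pairs from wholeTextFiles can be given explicit column names when converting to a DataFrame. A minimal spark-shell sketch (the directory path is a placeholder, not from the original post):

```scala
import sqlContext.implicits._

// wholeTextFiles returns an RDD of (filename, content) pairs
val data = sc.wholeTextFiles("hdfs:///tmp/demo-input-data")

// name the columns instead of keeping the default _1/_2
val dataDF = data.toDF("file_name", "content")

// show the full paths without truncation
dataDF.select("file_name").show(false)
```

Note that wholeTextFiles loads each file whole, so it is only suitable for many small files, not for large ones.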


@Amal Babu See this Stack Overflow question; I would follow that approach and create a case class like they show:

case class Person(inputPath: String, name: String, age: Int)
val inputPath = "hdfs://localhost:9000/tmp/demo-input-data/persons.txt"
val rdd = sc.textFile(inputPath).map {
  l =>
    val tokens = l.split(",")
    Person(inputPath, tokens(0), tokens(1).trim().toInt)
}

// and then convert the RDD to a DataFrame

import sqlContext.implicits._
val df = rdd.toDF()

New Contributor

input_file_name() will return a column containing the file location info for each record in the current DataFrame.

Don't use it with select.

New Contributor

Try the inputFiles function; it returns an array.

For a DataFrame df:


var locationInfo = df.inputFiles(0) // may throw ArrayIndexOutOfBoundsException if the DataFrame has no input files

// locationInfo holds the complete path, like "/FileStore/tables/xyz.csv"; split it to get the name of the file
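The split itself can be sketched in plain Scala. Assuming a path of the "/FileStore/tables/xyz.csv" form (the path here is just an example value):

```scala
val locationInfo = "/FileStore/tables/xyz.csv" // example path

// take the last "/"-separated segment as the file name
val fileName = locationInfo.split("/").last // "xyz.csv"
```

Using .last rather than a fixed index keeps the split working regardless of how deep the file sits in the directory tree.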


To add a column to DataFrame df with the file name:

import org.apache.spark.sql.functions.{input_file_name, udf}

var df2 = df.withColumn("file_name", input_file_name()) // adds a column with the complete path of the file

// create a UDF if you need to transform the file name
def funFileName: (String => String) = { s => s.split("/")(3) } // index 3 picks "xyz.csv" out of "/FileStore/tables/xyz.csv"

val myFileName = udf(funFileName)

var df3 = df.withColumn("file_name", myFileName(input_file_name()))
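A more robust variant of the same UDF, assuming only that the path uses "/" separators (df4 is a hypothetical name, not from the original post), takes the last segment instead of a fixed index:

```scala
import org.apache.spark.sql.functions.{input_file_name, udf}

// take the last path segment so the UDF works for any directory depth
val baseName = udf((path: String) => path.split("/").last)

var df4 = df.withColumn("file_name", baseName(input_file_name()))
```

This avoids the hard-coded index 3, which only works when the file sits exactly three directories deep.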