
Handling decimal values in Spark Scala

Hi

I have data in a data file as shown below:

7373743343333444.
7373743343333432.

This data should be converted to decimal values in an 8.7 format, where 8 is the number of digits before the decimal point and 7 is the number of digits after it.

I am trying to read the data file like this:

val readDataFile = Initialize.spark.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .option("delimiter", "|")
  .schema(***SCHEMA*****)
  .load(****DATA FILE PATH******)
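For reference, a schema entry for the decimal column might look roughly like the sketch below. The column name "value" is only a placeholder (not the real column name), and DecimalType(15, 7) assumes that an 8.7 format means 15 total digits of precision with a scale of 7 digits after the decimal point.

import org.apache.spark.sql.types.{DecimalType, StructField, StructType}

// Hypothetical schema for illustration: one decimal column named "value",
// precision 15 (total digits) and scale 7 (digits after the decimal point).
val schema = StructType(Seq(
  StructField("value", DecimalType(15, 7), nullable = true)
))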


I have tried this

val changed = dataFileWithSchema.withColumn("COLUMN NAME", dataFileWithSchema.col("COLUMN NAME").cast(DecimalType(38, 3)))

changed.show(5)

but it only gives me zeros after the decimal point, like this:

7373743343333444.0000
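For what it's worth, here is a minimal self-contained sketch of what I am doing, assuming the column is read in as a string (the column name "value" is again just a placeholder):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.DecimalType

val spark = SparkSession.builder().appName("decimal-cast-test").master("local[*]").getOrCreate()
import spark.implicits._

// Sample values as strings, matching the data file above.
val df = Seq("7373743343333444.", "7373743343333432.").toDF("value")

// The cast keeps the integer part and pads the fraction with zeros,
// because the source values have no digits after the decimal point.
val changed = df.withColumn("value", df.col("value").cast(DecimalType(38, 3)))
changed.show(5, truncate = false)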

But I want the actual digits after the decimal point. How can I achieve this?

Can you please help me?
