
spark.read.json error with dataframe


Hi,

When I run it in spark-shell it executes fine, but when I run the same statement in a Scala program it fails with an "overloaded method value" error.

In spark-shell: spark.read.json(dataframe.select("col_name").as[String]).schema
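For reference, here is a minimal standalone sketch of the same call (the object name, app name, and input source are only placeholders; it assumes Spark 2.2 or later, where spark.read.json accepts a Dataset[String], and import spark.implicits._ is needed outside spark-shell for .as[String]):

    import org.apache.spark.sql.SparkSession

    object JsonColumnSchema {                         // hypothetical object name
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("json-column-schema")              // placeholder app name
          .getOrCreate()
        import spark.implicits._                      // auto-imported in spark-shell, needed here for .as[String]

        val dataframe = spark.read.parquet("/path/to/input")   // placeholder input source

        // Same statement as in spark-shell: infer the schema of the JSON column.
        val jsonSchema = spark.read.json(dataframe.select("col_name").as[String]).schema
        println(jsonSchema.treeString)

        spark.stop()
      }
    }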

 

After some research I found that in a Scala program we need to pass an RDD as the argument instead of a DataFrame column.
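This is the RDD-based variant I came across (a sketch, assuming the same spark session and dataframe as above):

    // Convert the JSON column to an RDD[String] first, then infer the schema.
    val jsonRdd = dataframe.select("col_name").rdd.map(_.getString(0))
    val schemaFromRdd = spark.read.json(jsonRdd).schema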

 

Is there a way to make the spark-shell statement run in a Scala program without converting the column to an RDD?

Note: my dataframe has both plain column values and JSON string columns, so the statement above is given only the JSON column as input.
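To illustrate, a small made-up example of such a dataframe, with one plain column and one JSON column (only the JSON column is passed to spark.read.json):

    // Made-up rows: "id" is a plain column, "col_name" holds JSON strings.
    val dataframe = Seq(
      (1, """{"name":"a","score":10}"""),
      (2, """{"name":"b","score":20}""")
    ).toDF("id", "col_name")

    // Schema inference uses only the JSON column.
    val inferredSchema = spark.read.json(dataframe.select("col_name").as[String]).schema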

 

Help is much appreciated.

 

Thanks,

Waseem