Support Questions
Find answers, ask questions, and share your expertise
spark.read.json error with dataframe


Hi,

When I run it in spark-shell it executes fine, but when I run the same statement in a compiled Scala program it fails with an "overloaded method value" error.

 

Spark-shell: `spark.read.json(dataframe.select("col_name").as[String]).schema`

 

After some research I found that in a compiled Scala program we need to pass an RDD as the argument instead of a DataFrame.

 

Is there a way to make the spark-shell statement run in Scala without converting the dataframe to an RDD?

 

Note: my dataframe has both plain column values and JSON string columns; the statement above takes only the JSON column as input.
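For context, the shell-versus-program difference usually comes down to implicits: spark-shell auto-imports `spark.implicits._`, which supplies the `Encoder[String]` that `.as[String]` needs, while a compiled program must import it explicitly (and `spark.read.json(Dataset[String])` only exists on Spark 2.2+). A minimal sketch of what I believe should work, with a made-up two-column dataframe for illustration:

```scala
import org.apache.spark.sql.SparkSession

object JsonSchemaFromColumn {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("json-column-schema")
      .master("local[*]")
      .getOrCreate()

    // spark-shell does this import automatically; in a compiled program
    // it must be explicit, otherwise .as[String] has no Encoder and the
    // call no longer matches the Dataset[String] overload of json().
    import spark.implicits._

    // Hypothetical dataframe: one plain column, one JSON string column.
    val df = Seq(
      (1, """{"a": 1, "b": "x"}"""),
      (2, """{"a": 2, "b": "y"}""")
    ).toDF("id", "col_name")

    // Spark 2.2+: json(jsonDataset: Dataset[String]) overload.
    val schema = spark.read.json(df.select("col_name").as[String]).schema
    println(schema)

    // Pre-2.2 fallback: pass an RDD[String] instead.
    // spark.read.json(df.select("col_name").rdd.map(_.getString(0))).schema

    spark.stop()
  }
}
```

If the explicit import is present and the error persists, the Spark version is worth checking, since older versions accept only a path or an `RDD[String]` in `json()`.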

 

Help is much appreciated.

 

Thanks,

Waseem

 
