Data Type Conversion Issue

Expert Contributor

Hi,

I'm trying to finalize Lab 4: Risk Factor Analysis with Spark.

When executing the final instruction, I got the following error:

%spark
hiveContext.sql("create table riskfactor as select * from finalresults")

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 7, vds001.databridge.tn):
java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.DoubleWritable cannot be cast to org.apache.hadoop.io.LongWritable

I've attached the full log.

Any ideas, please?

spark-error.txt

4 Replies

Super Guru

@Wael Horchani

Earlier in the tutorial, you created the finalresults table using:

%spark
hiveContext.sql("create table finalresults( driverid String, occurance bigint,totmiles bigint,riskfactor double) stored as orc").toDF()

As you can see, riskfactor is a double, and the error you are getting is a conversion error from double to long. You can try altering the table, or recreating it with that column as a long instead of a double.
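
For example, something along these lines should recreate the table (a rough sketch; adjust the types to what you actually need, and re-run the tutorial's insert step afterwards so the data is rewritten to match):

%spark
// Drop the old managed table; this also removes its ORC data files
hiveContext.sql("drop table if exists finalresults")
// Recreate it with riskfactor as bigint (Hive's 8-byte integer, i.e. a Java long)
hiveContext.sql("create table finalresults(driverid string, occurance bigint, totmiles bigint, riskfactor bigint) stored as orc")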

Expert Contributor

@Michael Young

Hi,

Why does it try to convert the data type? The statement is a "create table as select", so the result table's column types should follow the source table. Why is it converting this column's type? Also, the double data type is supported by Hive, and the same query works fine when run directly in Hive (create table riskfactor as select * from finalresults;).
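
To narrow down whether the problem is the "create table as select" itself or simply reading finalresults from Spark, a plain select is a quick check (if this fails with the same ClassCastException, the stored ORC data does not match the declared schema):

%spark
// Read the source table directly; if this throws the same
// DoubleWritable -> LongWritable error, the problem is in the stored
// data, not in the "create table as select" statement
hiveContext.sql("select * from finalresults").show()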

Expert Contributor

@Michael Young

Hi,

The long type is not permitted in Hive, so I tried using float for the riskfactor column, but the error persists.

Any ideas, please?

log-error.txt
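
A quick sanity check to confirm what column types Hive actually has after the change:

%spark
// Print the schema Hive has registered for the table
hiveContext.sql("describe finalresults").show()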

Super Guru

@Wael Horchani

You can try setting riskfactor to a float. I've seen a newer version of the tutorial that uses float for that table definition.
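
Something along these lines (a sketch; the exact DDL in that newer tutorial may differ), making sure to drop and repopulate the table so no old ORC files written with the previous types are left behind:

%spark
// Drop the old managed table so stale ORC files with the old types are removed
hiveContext.sql("drop table if exists finalresults")
// Recreate with riskfactor as float, then re-run the tutorial's insert step
hiveContext.sql("create table finalresults(driverid string, occurance bigint, totmiles bigint, riskfactor float) stored as orc")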