New Contributor
Posts: 2
Registered: ‎10-07-2018

Simple Scala code not working in Spark-shell repl...


I have CDH 5.1 installed on a 5-node cluster. I am building up a Spark program from a series of REPL commands but am experiencing unexpected behaviour.

The commands are as follows:

 

case class Reading(accountId: String, senderId: String, sensorId: String, metricName: String, readTime: Long, value: Double)

var r2 = Reading("sss", "fff", "FGGF", "hjjj", 232L, 22.3)

import scala.collection.mutable.ListBuffer

var readings = ListBuffer[Reading]()

readings.append(r2)

 

The last line throws a type mismatch error:

 

<console>:18: error: type mismatch;
found : Reading
required: Reading
readings.append(r2)
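
One possible workaround, assuming your spark-shell supports the standard :paste REPL command, is to enter the whole snippet as a single block so the case class and the code that uses it are compiled in the same wrapper. A minimal sketch:

// Entered via :paste so the case class and its usage share one compilation unit
import scala.collection.mutable.ListBuffer

case class Reading(accountId: String, senderId: String, sensorId: String, metricName: String, readTime: Long, value: Double)

var r2 = Reading("sss", "fff", "FGGF", "hjjj", 232L, 22.3)
var readings = ListBuffer[Reading]()
readings.append(r2)  // should append without the type mismatch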

 

The original commands work as expected on a standalone instance of Spark and Scala on an Ubuntu box. CDH is installed on CentOS, where the Java version differs: CentOS uses the Oracle JDK, whilst Ubuntu uses OpenJDK. I have tried the code on several CDH instances that we have here locally and the issue is the same.

 

I would expect the spark-shell REPL to have full Scala functionality; would this be a valid assumption?

 

If someone else could try the same commands on a CDH instance, I would be grateful to know whether it works as expected or not.

 

vr

Hugh McBride

Master
Posts: 326
Registered: ‎07-01-2015

Re: Simple Scala code not working in Spark-shell repl...

Using Scala version 2.11.8, no error is thrown:

scala> readings
res1: scala.collection.mutable.ListBuffer[Reading] = ListBuffer(Reading(sss,fff,FGGF,hjjj,232,22.3))
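
If it helps to narrow down the difference between environments, the Scala version can be checked from inside the shell itself; scala.util.Properties is part of the standard Scala library, so this should work the same on CDH:

scala> scala.util.Properties.versionString  // e.g. "version 2.11.8" here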