Unable to run Spark project in eclipse


I have configured my Scala project in Eclipse to run a Spark application, but I am getting the two errors below:

1. Symbol 'term org.apache.hadoop' is missing from the classpath. This symbol is required by 'type org.apache.spark.rdd.RDD._$12'. Make sure that term hadoop is in your classpath and check for conflicting dependencies with `-Ylog-classpath`. A full rebuild may help if 'RDD.class' was compiled against an incompatible version of org.apache.

2. Symbol 'term <none>.hadoop.io' is missing from the classpath. This symbol is required by 'type org.apache.spark.rdd.RDD._$12'. Make sure that term io is in your classpath and check for conflicting dependencies with `-Ylog-classpath`. A full rebuild may help if 'RDD.class' was compiled against an incompatible version of <none>.hadoop.

Please let me know if there is any fix available.
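Errors like these usually mean the Hadoop client libraries are not on the project's build path, even though spark-core's RDD signatures reference Hadoop types such as `org.apache.hadoop.io`. One possible fix is to declare `hadoop-client` explicitly in the build definition. A minimal sbt sketch follows; the Scala, Spark, and Hadoop versions shown are assumptions and should be replaced with the versions your project actually targets:

```scala
// build.sbt — minimal sketch; all version numbers are assumptions,
// match them to your cluster/distribution.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // Spark core; its RDD API references Hadoop classes at compile time
  "org.apache.spark" %% "spark-core" % "2.4.0",
  // hadoop-client supplies org.apache.hadoop and org.apache.hadoop.io,
  // the symbols the compiler reports as missing
  "org.apache.hadoop" % "hadoop-client" % "2.7.3"
)
```

After updating the build file, refreshing the Eclipse project (e.g. regenerating the Eclipse files or re-importing the sbt/Maven project) and doing a full rebuild, as the error message suggests, should clear the stale `RDD.class` state.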