Unable to use the Spark Context Variable in the Spark-Shell (Scala) in CDH5.1
Created on 08-03-2014 02:48 AM - edited 09-16-2022 02:04 AM
Hi,
I am using Spark 1.0.0 in a CDH 5.1 VM and trying to use the Spark-Shell (Scala). But I keep getting the error below and am not able to use the SparkContext variable.
Please assist.
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_55)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.

scala> val text = sc.textfile("/user/cloudera/test1")
<console>:12: error: value textfile is not a member of org.apache.spark.SparkContext
       val text = sc.textfile("/user/cloudera/test1")
Created 08-03-2014 02:54 AM
The method is "textFile", not "textfile".
https://spark.apache.org/docs/1.0.0/api/scala/index.html#org.apache.spark.SparkContext
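For reference, a minimal sketch of the corrected call in the spark-shell (Scala), assuming the HDFS path from the original post exists:

```scala
// The spark-shell already provides the SparkContext as `sc`.
// Note the capital "F" in textFile.
val text = sc.textFile("/user/cloudera/test1")

// Quick check that the RDD reads correctly: print the first few lines.
text.take(5).foreach(println)
```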
