reduceByKey method not being found in Scala Spark


I am attempting to run the standalone-app example from http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala.

This line:

val wordCounts = textFile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)

is throwing this error:

value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)]
  val wordCounts = logData.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)

logData.flatMap(line => line.split(" ")).map(word => (word, 1)) returns a MappedRDD, but I cannot find that type in http://spark.apache.org/docs/0.9.1/api/core/index.html#org.apache.spark.rdd.RDD

I'm running this code from the Spark source. Is it a classpath problem? I do have the required dependencies on my classpath.

You should import the implicit conversions from SparkContext:

import org.apache.spark.SparkContext._

They use the "pimp my library" pattern to add methods to RDDs of specific types. If curious, see SparkContext:1296
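For example, here is a minimal sketch of the word-count app with the import in place. The "local" master and the "README.md" input path are placeholders for illustration, not values from the question:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._  // brings the implicit conversion to PairRDDFunctions into scope

object WordCount {
  def main(args: Array[String]) {
    // "local" master and the README path are illustrative placeholders
    val sc = new SparkContext("local", "Word Count")
    val textFile = sc.textFile("README.md")

    // With SparkContext._ imported, an RDD[(String, Int)] gains reduceByKey
    val wordCounts = textFile
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey((a, b) => a + b)

    wordCounts.collect().foreach(println)
  }
}

The implicit conversion (rddToPairRDDFunctions in SparkContext) wraps an RDD of key-value pairs in PairRDDFunctions, which is where reduceByKey is actually defined; without the import, the compiler never applies the wrapper, hence the "not a member" error.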

