NoSuchMethodError for "list.toMap" in spark-submit


When running spark-submit on the following simple Spark program:

import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.SparkContext
import org.apache.spark._
import SparkContext._

object test2 {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("test")
    val sc = new SparkContext(conf)

    val list = List(("aa", 1), ("bb", 2), ("cc", 3))
    val maps = list.toMap
  }
}

I got a java.lang.NoSuchMethodError at the line "val maps = list.toMap". In spark-shell or plain Scala there is no problem:

scala> val list = List(("aa", 1), ("bb", 2), ("cc", 3))
list: List[(String, Int)] = List((aa,1), (bb,2), (cc,3))

scala> val maps = list.toMap
maps: scala.collection.immutable.Map[String,Int] = Map(aa -> 1, bb -> 2, cc -> 3)

So is the "toMap" method somehow missing when running under spark-submit? I used "sbt package" to compile the program, and the compilation itself finished without any problem. Thanks!

P.S.: The build.sbt file is:

name := "test2"

version := "1.0"

scalaVersion := "2.11.6"

sbt package creates a jar containing only your project's classes, not its dependencies.

When using spark-submit, you need a jar that contains your project and its dependencies, except for the Spark libraries themselves (which are already provided by the Spark installation on the cluster).

This uber-jar can be created using the sbt-assembly plugin (a complete build.sbt sketch follows the list):

  • In the project folder, create the file project/assembly.sbt with the line:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")

  • In build.sbt, the Spark dependencies should be tagged as "provided":

    libraryDependencies ++= Seq(
      ...
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
      ...
    )
  • Run sbt assembly, which creates the uber-jar target/scala-2.11/test2-1.0.jar
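Putting the pieces together, a complete build.sbt for this project might look like the sketch below. This is only a sketch: the sparkVersion value ("1.4.1") is an assumption, so substitute whatever Spark version is actually installed on your cluster.

// Sketch of a complete build.sbt for the test2 project.
name := "test2"

version := "1.0"

scalaVersion := "2.11.6"

// Assumption: pick the Spark version that matches your cluster installation.
val sparkVersion = "1.4.1"

libraryDependencies ++= Seq(
  // "provided": compiled against, but not packaged into the uber-jar,
  // because the cluster's Spark installation supplies these classes at runtime.
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
)

The resulting jar can then be passed to spark-submit together with the main class, for example spark-submit --class test2 followed by the path to the assembled jar; the exact jar file name depends on your assembly settings.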

