Failed to load class for data source: com.databricks.spark.csv


My build.sbt file has this:

    scalaVersion := "2.10.3"

    libraryDependencies += "com.databricks" % "spark-csv_2.10" % "1.1.0"

I am running Spark in standalone cluster mode, with my SparkConf set up as new SparkConf().setMaster("spark://ec2-[ip].compute-1.amazonaws.com:7077").setAppName("Simple Application") (I am not using the setJars method; I am not sure whether I need it).
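For context, a minimal sketch of what the driver code presumably looks like (the question does not show the read path, so the object name, file path, and options below are assumptions); the read call is the kind of operation that fails when the spark-csv classes are missing at runtime:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object SimpleApplication {
      def main(args: Array[String]): Unit = {
        // Standalone cluster master, as described in the question
        val conf = new SparkConf()
          .setMaster("spark://ec2-[ip].compute-1.amazonaws.com:7077")
          .setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc)

        // Hypothetical CSV read via the spark-csv data source (Spark 1.4+ API);
        // this is where "Failed to load class for data source" is thrown when
        // com.databricks.spark.csv is not on the classpath
        val df = sqlContext.read
          .format("com.databricks.spark.csv")
          .option("header", "true")
          .load("/path/to/file.csv") // placeholder path

        df.show()
      }
    }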

I package the jar using the command sbt package. The command I use to run the application is ./bin/spark-submit --master spark://ec2-[ip].compute-1.amazonaws.com:7077 --class "[classname]" target/scala-2.10/[jarname]_2.10-1.0.jar.
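Note that sbt package builds only the project's own classes; it does not bundle the com.databricks:spark-csv dependency into the jar, so the data source is not automatically shipped to the cluster. One common way to supply it (a sketch, not necessarily the only fix) is the --packages flag of spark-submit, which resolves the Maven coordinates at submit time:

    ./bin/spark-submit \
      --master spark://ec2-[ip].compute-1.amazonaws.com:7077 \
      --packages com.databricks:spark-csv_2.10:1.1.0 \
      --class "[classname]" \
      target/scala-2.10/[jarname]_2.10-1.0.jar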

On running this, I get the following error:

java.lang.RuntimeException: Failed to load class for data source: com.databricks.spark.csv

What is the issue?

Use the dependencies accordingly. For example:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.6.1</version>
    </dependency>

    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-csv_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
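Since the question builds with sbt rather than Maven, a rough sbt equivalent of the dependencies above (same coordinates and versions as the Maven snippet; adjust to your own Spark version) would be:

    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.10" % "1.6.1",
      "org.apache.spark" % "spark-sql_2.10"  % "1.6.1",
      "com.databricks"   % "spark-csv_2.10"  % "1.4.0"
    )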
