Spark Unable to find JDBC Driver


So I've been using sbt with assembly to package all of my dependencies into a single jar for my Spark jobs. I've also got several jobs where I was using c3p0 to set up connection pool information, broadcast that out, and then use foreachPartition on the RDD to grab a connection and insert the data into the database. In my sbt build script, I include

"mysql" % "mysql-connector-java" % "5.1.33" 

This makes sure the JDBC connector is packaged up with the job. Everything works great.
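Roughly, the pattern looks like this (a simplified sketch of what I described above; the URL, table, and column names are just placeholders):

import java.sql.PreparedStatement
import com.mchange.v2.c3p0.ComboPooledDataSource
import org.apache.spark.{SparkConf, SparkContext}

object InsertJob {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("insert-job"))

    // Broadcast only the connection settings; the pool itself is not serializable.
    val dbConfig = sc.broadcast(Map(
      "url"      -> "jdbc:mysql://some.domain.com/myschema", // placeholder
      "user"     -> "user",
      "password" -> "password"))

    val rows = sc.parallelize(Seq(("a", 1), ("b", 2)))

    rows.foreachPartition { partition =>
      // Build a c3p0 pool per partition from the broadcast settings.
      val ds = new ComboPooledDataSource()
      ds.setJdbcUrl(dbConfig.value("url"))
      ds.setUser(dbConfig.value("user"))
      ds.setPassword(dbConfig.value("password"))

      val conn = ds.getConnection
      try {
        val stmt: PreparedStatement =
          conn.prepareStatement("INSERT INTO mytable (name, value) VALUES (?, ?)")
        partition.foreach { case (name, value) =>
          stmt.setString(1, name)
          stmt.setInt(2, value)
          stmt.executeUpdate()
        }
      } finally {
        conn.close()
        ds.close()
      }
    }
    sc.stop()
  }
}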

So recently I started playing around with SparkSQL and realized it's much easier to simply take a dataframe and save it to a JDBC source with the new features in 1.3.0.
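For reference, the save itself is just something like this (a sketch against the 1.3.0 DataFrame API; df and the table name are placeholders):

// Saving an existing DataFrame "df" to MySQL with the 1.3.0 API.
val url = "jdbc:mysql://some.domain.com/myschema?user=user&password=password"

// Create the table and write the rows; this is where the driver lookup happens.
df.createJDBCTable(url, "mytable", allowExisting = false)

// Or append to an existing table instead:
// df.insertIntoJDBC(url, "mytable", overwrite = false)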

However, I'm getting the following exception:

java.sql.SQLException: No suitable driver found for jdbc:mysql://some.domain.com/myschema?user=user&password=password
    at java.sql.DriverManager.getConnection(DriverManager.java:596)
    at java.sql.DriverManager.getConnection(DriverManager.java:233)

When I was running it locally I got around it by setting

SPARK_CLASSPATH=/path/where/mysql-connector-is.jar

Ultimately what I'm wanting to know is, why is the job not capable of finding the driver when it should be packaged up with it? My other jobs never had this problem. From what I can tell, both c3p0 and the dataframe code make use of java.sql.DriverManager (which handles importing everything for you, as far as I can tell), so it should work just fine? If there is something that prevents the assembly method from working, what do I need to do to make this work?

This person seems to have been having a similar issue: http://apache-spark-user-list.1001560.n3.nabble.com/how-to-use-dataframe-with-mysql-td22178.html

Have you updated your connector drivers to the most recent version? Also, did you specify the driver class when you called load()?

Map<String, String> options = new HashMap<String, String>();
options.put("url", "jdbc:mysql://localhost:3306/video_rcmd?user=root&password=123456");
options.put("dbtable", "video");
options.put("driver", "com.mysql.jdbc.Driver"); // here
DataFrame jdbcDF = sqlContext.load("jdbc", options);
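In Scala (which your sbt setup suggests you're using), the same thing would look roughly like this (connection details are placeholders):

// Scala equivalent of the options-based load, with the driver class spelled out.
val jdbcDF = sqlContext.load("jdbc", Map(
  "url"     -> "jdbc:mysql://localhost:3306/video_rcmd?user=root&password=123456",
  "dbtable" -> "video",
  "driver"  -> "com.mysql.jdbc.Driver"))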

In spark/conf/spark-defaults.conf, you can also set spark.driver.extraClassPath and spark.executor.extraClassPath to the path of your MySQL driver .jar.
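For example (the jar path is just a placeholder for wherever the connector lives on your machines):

spark.driver.extraClassPath   /path/to/mysql-connector-java-5.1.33.jar
spark.executor.extraClassPath /path/to/mysql-connector-java-5.1.33.jar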

