I want to use the JDBC driver to import a table from SQL Server into local Spark and run Spark SQL on it. I downloaded sqljdbc for SQL Server and added this line to spark-env.sh under the conf directory:
SPARK_CLASSPATH= "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/sqljdbc4.jar" ./bin/spark-shell
as shown here,
and used this line to load the data:
df = sqlContext.load(source="jdbc", url="jdbc:sqlserver:dd", dbtable="Reporting.dbo.datatable")
However, it throws an error:
Py4JJavaError: An error occurred while calling o28.load.
: java.sql.SQLException: No suitable driver found for jdbc:sqlserver:PC-BFS2
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:118)
at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:128)
at org.apache.spark.sql.jdbc.DefaultSource.createRelation(JDBCRelation.scala:113)
at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:269)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:259)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Unknown Source)
Earlier versions of Spark can have this problem when the jar is not also on the driver classpath, so try adding it there as well (it can be specified with --driver-class-path). I'm not very familiar with deploying on Windows, but you may also want to set the property in conf/spark-env.cmd, following http://spark.apache.org/docs/latest/configuration.html. Add the following line to spark-defaults.conf:
spark.driver.extraClassPath "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/*"
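When the job runs against a cluster rather than local mode, the executors are the ones opening the JDBC connections, so the jar has to be visible to them too. A minimal sketch of both entries in spark-defaults.conf, assuming the same install location as above:
spark.driver.extraClassPath "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/*"
spark.executor.extraClassPath "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/*"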
Another approach is to pass the driver's location when running spark-submit, like this (it worked for me):
./bin/spark-submit --driver-class-path "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/sqljdbc4.jar" --master spark://ip:7077 mycode.py
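For reference, a minimal sketch of what mycode.py could look like, using the Spark 1.3/1.4-era API that matches the sqlContext.load call above. The host PC-BFS2 comes from the error message; the port 1433 is an assumption (the SQL Server default), and the explicit driver option is there because it registers the class with DriverManager when the "No suitable driver found" lookup fails on its own:
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="sqlserver-import")
sqlContext = SQLContext(sc)

# Load the table over JDBC; passing "driver" explicitly registers
# com.microsoft.sqlserver.jdbc.SQLServerDriver with DriverManager.
df = sqlContext.load(source="jdbc",
                     url="jdbc:sqlserver://PC-BFS2:1433",
                     dbtable="Reporting.dbo.datatable",
                     driver="com.microsoft.sqlserver.jdbc.SQLServerDriver")

# Run Spark SQL on the imported table.
df.registerTempTable("datatable")
sqlContext.sql("SELECT COUNT(*) FROM datatable").show()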
Also make sure the spark.jars property in $SPARK_HOME/conf/spark-defaults.conf lists the driver jar.
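A sketch of that entry, assuming the same jar location as above (spark.jars takes a comma-separated list of jars to ship with the application):
spark.jars C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/sqljdbc4.jar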