PySpark DataFrame to AWS MySQL: requirement failed: The driver could not open a JDBC connection



I want to write a PySpark DataFrame to a MySQL table in AWS RDS, but I keep getting the error:

pyspark.sql.utils.IllegalArgumentException: requirement failed: The driver could not open a JDBC connection. Check the URL: jdbc:mysql:mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com

My code is as follows:

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName('test-app') \
    .config('spark.jars.packages', 'mysql:mysql-connector-java:8.0.28') \
    .getOrCreate()

properties = {'user': 'admin', 'password': 'password', 'driver': 'com.mysql.cj.jdbc.Driver'}
resultDF.write.jdbc(url='jdbc:mysql:mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com',
                    table='mcm_objects',
                    mode='append',
                    properties=properties)

I also tried the URL 'jdbc:mysql://mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com', but then I got the error:

java.sql.SQLException: No database selected

No idea what I'm doing wrong. Any help would be greatly appreciated.

The table should be {dbName}.{dbtable}:

resultDF.write.jdbc(url='jdbc:mysql://mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com',
                    table='{dbname}.mcm_objects',
                    mode='append',
                    properties=properties)
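Alternatively, the database can be named in the URL path itself, which also resolves "No database selected". A minimal sketch of building such a URL (the database name `mydb` and the `mysql_jdbc_url` helper are placeholders for illustration, not from the original post):

```python
def mysql_jdbc_url(host, database, port=3306):
    # A well-formed MySQL JDBC URL needs "//" after the "jdbc:mysql:"
    # scheme, and the database either in the URL path (as here) or
    # qualified in the table name as "{dbname}.{table}".
    return f"jdbc:mysql://{host}:{port}/{database}"

url = mysql_jdbc_url(
    "mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com", "mydb")
print(url)
# → jdbc:mysql://mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com:3306/mydb
```

With the database in the URL, the write call can then use the bare table name, e.g. `resultDF.write.jdbc(url=url, table='mcm_objects', mode='append', properties=properties)`.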
