I'm getting this warning when running my PySpark code. I'm writing from S3 to Snowflake. My Snowflake Spark packages are

net.snowflake:snowflake-jdbc:3.13.10,
net.snowflake:spark-snowflake_2.12:2.9.2-spark_3.1

My local PySpark versions are

Spark version 3.2.1
Hadoop version 3.3.1

The warning:

WARN SnowflakeConnectorUtils$: Query pushdown is not supported because you are using Spark 3.2.1 with a connector designed to support Spark 3.1. Either use the version of Spark supported by the connector or install a version of the connector that supports your version of Spark.

Are these the right packages, or should I be using different ones? My program works as expected: it reads from S3 and stores the results in Snowflake. How can I get rid of this warning?
For Spark 3.2, you need to use version 2.10 of the Snowflake Spark connector:
- for Scala 2.12:
https://search.maven.org/search?q=a:spark-snowflake_2.12
- for Scala 2.13:
https://search.maven.org/search?q=a:spark-snowflake_2.13
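In practice, this means swapping the `--packages` coordinates so the connector's `-spark_3.2` suffix matches the running Spark version. A minimal sketch of the launch command, assuming Scala 2.12; the exact version numbers below are examples, so check the Maven Central links above for the current releases:

```shell
# Launch PySpark with a Snowflake connector built for Spark 3.2.
# Version numbers are illustrative — pick the latest 2.10.x artifact
# whose suffix (-spark_3.2) matches your Spark version.
pyspark \
  --packages net.snowflake:snowflake-jdbc:3.13.14,net.snowflake:spark-snowflake_2.12:2.10.0-spark_3.2
```

With matching versions, the connector no longer falls back to the non-pushdown path, so the warning should disappear and query pushdown is enabled again.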