I created a PySpark script that works fine when I execute it with spark-submit:
spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.6 --conf spark.cassandra.connection.host=12.34.56.68 test_cassandra.py
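For reference, the script simply writes a DataFrame through the Cassandra data source. A minimal sketch of what test_cassandra.py does (keyspace, table, and sample data below are placeholders, not the actual values from my script):

from pyspark.sql import SparkSession

# Build the Spark session; the Cassandra host is supplied at submit time
# via --conf spark.cassandra.connection.host.
spark = SparkSession.builder.appName("test_cassandra").getOrCreate()

# Placeholder data, keyspace, and table for illustration only.
df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])

# The org.apache.spark.sql.cassandra format is provided by the
# spark-cassandra-connector package passed with --packages; without it
# Spark fails with the ClassNotFoundException shown below.
df.write \
    .format("org.apache.spark.sql.cassandra") \
    .mode("append") \
    .options(table="mytable", keyspace="mykeyspace") \
    .save()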
Since I am working with Azure Data Factory, I would also like to run this job from ADF. I created the following activity:
{
    "name": "spark write to cassandra",
    "type": "HDInsightSpark",
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false
    },
    "typeProperties": {
        "rootPath": "dev/apps/spikes",
        "entryFilePath": "test_cassandra.py",
        "sparkConfig": {
            "packages": "datastax:spark-cassandra-connector:2.0.7-s_2.10",
            "conf": "spark.cassandra.connection.host=12.34.56.78"
        },
        "sparkJobLinkedService": {
            "referenceName": "linkedServiceStorageBlobHDI",
            "type": "LinkedServiceReference"
        }
    },
    "linkedServiceName": {
        "referenceName": "linkedServiceHDI",
        "type": "LinkedServiceReference"
    }
}
I thought this would be enough, but there is clearly a problem with the packages. I get the error:
java.lang.ClassNotFoundException: Failed to find data source: org.apache.spark.sql.cassandra. Please find packages at https://cwiki.apache.org/confluence/display/SPARK/Third+Party+Projects
Can you help me configure my activity correctly so that it runs?
In ADF, the options are named slightly differently than in spark-submit:
--packages becomes spark.jars.packages, and
--conf spark.cassandra.connection.host=12.34.56.78 becomes "spark.cassandra.connection.host": "12.34.56.78".
The final code is:
{
    "name": "spark write to cassandra",
    "type": "HDInsightSpark",
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false
    },
    "typeProperties": {
        "rootPath": "dev/apps/spikes",
        "entryFilePath": "test_cassandra.py",
        "sparkConfig": {
            "spark.jars.packages": "datastax:spark-cassandra-connector:2.0.7-s_2.10",
            "spark.cassandra.connection.host": "12.34.56.78"
        },
        "sparkJobLinkedService": {
            "referenceName": "linkedServiceStorageBlobHDI",
            "type": "LinkedServiceReference"
        }
    },
    "linkedServiceName": {
        "referenceName": "linkedServiceHDI",
        "type": "LinkedServiceReference"
    }
}
The failure is caused by the dependency package not being defined correctly. You can include all dependency jars in a "jars" subfolder under the activity's rootPath, following the documentation below:
https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-spark#folder-structure
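Assuming the layout described in that document, the rootPath would then look roughly like this (the jar file name below is derived from the coordinates used with spark-submit; place any transitive dependency jars alongside it):

dev/apps/spikes
    test_cassandra.py
    jars/
        spark-cassandra-connector_2.11-2.0.6.jar
        (other dependency jars)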
Alternatively, you can use Spark's built-in mechanism to manage dependencies through a custom Spark configuration. For details, see: https://spark.apache.org/docs/latest/configuration.html#available-properties