ValueError: root_directory must be an absolute path — accessing an ADLS directory from a Synapse Workspace



When I try to access an ADLS directory from Apache Spark with the PySpark code below, I get this error:

ValueError: root_directory must be an absolute path. Got abfss://root@adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/ instead.
Traceback (most recent call last):
File "/home/trusted-service-user/cluster-env/env/lib/python3.6/site-packages/great_expectations/core/usage_statistics/usage_statistics.py", line 262, in usage_statistics_wrapped_method
result = func(*args, **kwargs)

The code that raises the error above when I try to access the directory is:

from great_expectations.data_context import BaseDataContext
from great_expectations.data_context.types.base import (
    DataContextConfig,
    FilesystemStoreBackendDefaults,
)

data_context_config = DataContextConfig(
    datasources={"my_spark_datasource": my_spark_datasource_config},
    store_backend_defaults=FilesystemStoreBackendDefaults(root_directory='abfss://root@adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/'),
)
context = BaseDataContext(project_config=data_context_config)

When I change the code to

data_context_config = DataContextConfig(
datasources={"my_spark_datasource": my_spark_datasource_config},
store_backend_defaults=FilesystemStoreBackendDefaults(root_directory='/abfss://root@adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/'),
)

I get the following error message:

PermissionError: [Errno 13] Permission denied: '/abfss:'
Traceback (most recent call last):

And when I run the following code

data_context_config = DataContextConfig(
datasources={"my_spark_datasource": my_spark_datasource_config},
store_backend_defaults=FilesystemStoreBackendDefaults(root_directory='/'),
)
context = BaseDataContext(project_config=data_context_config)

I get the error message:

PermissionError: [Errno 13] Permission denied: '/expectations'
Traceback (most recent call last):

However, I do not have a directory named '/expectations'.
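A plausible reading of all three failures (my own interpretation, not confirmed by the Great Expectations docs): `FilesystemStoreBackendDefaults` validates `root_directory` as a *local* absolute path and then creates its stores on the local filesystem. An `abfss://` URI does not start with `/`, so it fails the absolute-path check; prefixing it with `/` passes the check, but the first local path component then becomes the literal directory `/abfss:`, which cannot be created; and `root_directory='/'` makes it try to create `/expectations` at the filesystem root, which a non-root user is not permitted to do. A small sketch of the path logic:

```python
import os.path

# The three root_directory values tried above
uri = "abfss://root@adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/"

print(os.path.isabs(uri))         # False -> "must be an absolute path" ValueError
print(os.path.isabs("/" + uri))   # True  -> passes validation, but the first
                                  #          local path component is now bogus:
print(("/" + uri).split("/")[1])  # abfss: -> mkdir('/abfss:') gives Errno 13
print(os.path.isabs("/"))         # True  -> GE then tries to create
                                  #          '/expectations' under the root
```

So the setting cannot point at ADLS directly; it needs a writable local (or locally mounted) directory.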

Incidentally, what I am trying to run here is Great Expectations.

The Great Expectations developers have told me that this error will be fixed in a newer release of Great Expectations.

Latest update