External tables on DELTA format files in ADLS Gen 1



We have created a number of Databricks DELTA tables on ADLS Gen1. In addition, in one of the Databricks workspaces, an external table has been built on top of each of those tables.

Similarly, I am trying to create the same kind of external tables on the same DELTA format files, but in a different workspace.
I do have read-only access to ADLS Gen1 through a service principal, so I can read the DELTA files through a Spark DataFrame, as shown below:

read_data_df = spark.read.format("delta").load('dbfs:/mnt/data/<foldername>')
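For context, the folder is mounted through the service principal credentials, roughly along these lines (the account name, application id, secret scope/key and directory id below are placeholders, not the real values):

configs = {
  "fs.adl.oauth2.access.token.provider.type": "ClientCredential",
  "fs.adl.oauth2.client.id": "<application-id>",
  "fs.adl.oauth2.credential": dbutils.secrets.get(scope="<scope-name>", key="<key-name>"),
  "fs.adl.oauth2.refresh.url": "https://login.microsoftonline.com/<directory-id>/oauth2/token"
}

# Mount the ADLS Gen1 folder that contains the DELTA tables under /mnt/data
dbutils.fs.mount(
  source="adl://<accountname>.azuredatalakestore.net/<path-to-data>",
  mount_point="/mnt/data",
  extra_configs=configs
)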

I can even create a Hive external table, but when reading data from that table I see the following error:


Error in SQL statement: AnalysisException: Incompatible format detected.
A transaction log for Databricks Delta was found at `dbfs:/mnt/data/<foldername>/_delta_log`,
but you are trying to read from `dbfs:/mnt/data/<foldername>` using format("hive"). You must use
'format("delta")' when reading and writing to a delta table.
To disable this check, SET spark.databricks.delta.formatCheck.enabled=false
To learn more about Delta, see https://learn.microsoft.com/azure/databricks/delta/index
;
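The message points at registering the table with Delta as its data source rather than as a plain Hive table, i.e. something along these lines (database and table names are placeholders):

spark.sql("""
  CREATE TABLE IF NOT EXISTS <database>.<tablename>
  USING DELTA
  LOCATION 'dbfs:/mnt/data/<foldername>'
""")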

If I create the external table "USING DELTA", then I see a different access error, such as:

Caused by: org.apache.hadoop.security.AccessControlException: 
OPEN failed with error 0x83090aa2 (Forbidden. ACL verification failed. 
Either the resource does not exist or the user is not authorized to perform the requested operation.). 
failed with error 0x83090aa2 (Forbidden. ACL verification failed.
Either the resource does not exist or the user is not authorized to perform the requested operation.). 

Does this mean I need full access to the underlying file system, and not just read-only?
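To check whether this is really an ACL problem on the transaction log rather than something else, the _delta_log folder can be listed directly through the same mount (same placeholder path as above):

# If read-only access is sufficient, this should list the JSON/checkpoint files of the transaction log.
display(dbutils.fs.ls('dbfs:/mnt/data/<foldername>/_delta_log'))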

Thanks

Resolved after upgrading the Databricks runtime environment to runtime version DBR 7.3.
