I built this temporary SQL function from the Databricks reference, but I'm getting an apache/hive error. It's about as simple as a SQL function gets, so why the error?
%sql
create or replace temporary function fn_infinity()
returns timestamp
return
select to_timestamp('2099-12-31T23:59:59.999+0000')
Error: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
---------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
File /databricks/spark/python/pyspark/sql/utils.py:209, in capture_sql_exception.<locals>.deco(*a, **kw)
208 try:
--> 209 return f(*a, **kw)
210 except Py4JJavaError as e:
File /databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
325 if answer[1] == REFERENCE_TYPE:
--> 326 raise Py4JJavaError(
    327         "An error occurred while calling {0}{1}{2}.\n".
    328         format(target_id, ".", name), value)
329 else:
Py4JJavaError: An error occurred while calling o363.sql.
: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
I have already tried reattaching the cluster. Could it be related to Unity Catalog? Thanks in advance for any suggestions.
Answer:
"So in Databricks, users can write to different 'catalogs', and the default one is 'hive_metastore'. It looks like you may need to specify or set the catalog before running it."
%sql
--USE CATALOG 'development';
use catalog '${env.catalog}';
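Putting the answer's suggestion together with the original function, a minimal sketch of the fix might look like the cell below. The catalog name `development` and schema `default` are assumptions for illustration; substitute whatever Unity Catalog and schema exist in your workspace. Note also that the `SELECT` wrapper inside `RETURN` is not needed for a scalar expression.

```sql
%sql
-- Assumed catalog/schema names; replace with ones that exist in your workspace.
USE CATALOG development;
USE SCHEMA default;

-- Recreate the function; RETURN can take the scalar expression directly.
CREATE OR REPLACE TEMPORARY FUNCTION fn_infinity()
RETURNS TIMESTAMP
RETURN to_timestamp('2099-12-31T23:59:59.999+0000');

-- Quick sanity check.
SELECT fn_infinity();
```

If the cluster is not Unity Catalog enabled, the session falls back to `hive_metastore`, and the `Unable to instantiate HiveMetaStoreClient` error usually points to a metastore connectivity or configuration problem on the cluster rather than to the function definition itself.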