Nested CASE with Spark SQL in Databricks



I am trying to use a nested CASE expression in Spark SQL, in the query below:

%sql SELECT CASE WHEN 1 > 0 THEN 
CAST(CASE WHEN 2 > 0 THEN 2.0 ELSE 1.2 END AS INT)
ELSE "NOT FOUND "

However, I am getting this error:

Error in SQL statement: ParseException: 
mismatched input '1' expecting {<EOF>, ';'}(line 1, pos 17)
== SQL ==
SELECT CASE WHEN 1 > 0 THEN 
-----------------^^^
CAST(CASE WHEN 2 > 0 THEN 2.0 ELSE 1.2 END AS INT)
ELSE "NOT FOUND "

Does Databricks support nested CASE statements? If so, what is wrong with the code above?

spark.sql("""
SELECT  CASE WHEN 1 > 0 THEN CAST(CASE WHEN 2 > 0 THEN 2.0 ELSE 1.2 END AS INT)
ELSE 'NOT FOUND' END AS select_case
""").show(10)
+-----------+
|select_case|
+-----------+
|          2|
+-----------+
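
For completeness, the same nested logic can also be written with the PySpark DataFrame API instead of raw SQL. This is a minimal sketch, not from the original post; it assumes an active SparkSession named spark and uses the standard when/otherwise functions from pyspark.sql.functions:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Inner CASE: WHEN 2 > 0 THEN 2.0 ELSE 1.2, cast to int like CAST(... AS INT)
inner = F.when(F.lit(2) > 0, F.lit(2.0)).otherwise(F.lit(1.2)).cast("int")

# Outer CASE: WHEN 1 > 0 THEN <inner> ELSE 'NOT FOUND'
outer = F.when(F.lit(1) > 0, inner).otherwise(F.lit("NOT FOUND"))

# Evaluate against a one-row dummy DataFrame, mirroring the constant SELECT
spark.range(1).select(outer.alias("select_case")).show()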
