I am converting my original R code to run on Spark via the sparklyr package. I use the lubridate package to calculate the number of days between two dates. In R this yields a duration data type, which can then be converted to a numeric data type, as shown in the example below.
# Load packages
library(sparklyr)
library(dplyr)
library(lubridate)
# Create dataframe with start and end date
df <- tibble(start = ymd("20210101"),
             end = ymd("20210105"))
df
---
# A tibble: 1 x 2
  start      end
  <date>     <date>
1 2021-01-01 2021-01-05
---
# Calculate duration and convert to numeric using R dataframe
df %>%
  mutate(dur = end - start,
         dur_num = as.numeric(dur))
---
# A tibble: 1 x 4
  start      end        dur    dur_num
  <date>     <date>     <drtn>   <dbl>
1 2021-01-01 2021-01-05 4 days       4
---
Performing exactly the same transformation on a Spark dataframe with sparklyr raises an error instead, because the duration data type is automatically converted to a string data type. The code and the resulting error are shown below. Please ignore the shift in the dates, which is caused by timezone differences when the data is copied from local R to Spark.
# Connect to local Spark cluster
sc <- spark_connect(master = "local", version = "3.0")
# Copy dataframe to Spark
df_spark <- copy_to(sc, df)
# Calculate duration using Spark dataframe
df_spark %>%
  mutate(dur = end - start)
---
# Source: spark<?> [?? x 3]
  start      end        dur
  <date>     <date>     <chr>
1 2020-12-31 2021-01-04 4 days
---
# Calculate duration and convert to numeric using Spark dataframe
df_spark %>%
  mutate(dur = end - start,
         dur_num = as.numeric(dur))
---
Error: org.apache.spark.sql.AnalysisException: cannot resolve 'CAST(q01.`dur` AS DOUBLE)' due to data type
mismatch: cannot cast interval to double; line 1 pos 30;
'Project [start#58, end#59, dur#280, cast(dur#280 as double) AS dur_num#281]
+- SubqueryAlias q01
   +- Project [start#58, end#59, subtractdates(end#59, start#58) AS dur#280]
      +- SubqueryAlias df
         +- LogicalRDD [start#58, end#59], false
---
Is it possible to use the lubridate::duration data type in Spark with sparklyr? If not, is there any way to obtain the number of days as a double without the detour through a string? Thanks for any help.
You can use the built-in Hive function datediff():
df_spark %>%
  mutate(dur = datediff(end, start))
# Source: spark<?> [?? x 3]
  start      end          dur
  <date>     <date>     <int>
1 2021-01-01 2021-01-05     4
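Since datediff() returns an integer, one more cast gives the double asked for in the question. A minimal sketch of that extra step, assuming the same df_spark as above (as.numeric() is translated to a CAST to DOUBLE, which is valid on an integer column):

# Day count as a double: cast the integer result of datediff()
df_spark %>%
  mutate(dur = datediff(end, start),
         dur_num = as.numeric(dur))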
If it is a datetime object, convert the datetime to numeric before taking the difference, e.g.
df <- tibble(start = ymd_hms("20210101 00:00:00"),
             end = ymd_hms("20210105 00:00:00"))

# overwrite = TRUE because a table named "df" was already registered above
df_spark <- copy_to(sc, df, overwrite = TRUE)

df_spark %>%
  mutate(dur = (as.numeric(end) - as.numeric(start)) / (3600*24))
# Source: spark<?> [?? x 3]
  start               end                   dur
  <dttm>              <dttm>              <dbl>
1 2021-01-01 00:00:00 2021-01-05 00:00:00     4
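The division works because as.numeric() on a timestamp column is translated to a cast to double, i.e. seconds since the Unix epoch, so dividing by 3600*24 yields days. The same calculation can also be spelled with Hive's unix_timestamp(), which sparklyr passes through to Spark SQL unchanged; a sketch under the same assumptions:

# Equivalent day count via Hive's unix_timestamp() (seconds since epoch)
df_spark %>%
  mutate(dur = (unix_timestamp(end) - unix_timestamp(start)) / (3600*24))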