Get start date and end date from a range of timestamps



I have a dataframe in Spark (Scala) built from a large csv file.

The dataframe looks like this:

key | col1 | timestamp             |
------------------------------------
1   | aa   | 2019-01-01 08:02:05.1 |
1   | aa   | 2019-09-02 08:02:05.2 |
1   | cc   | 2019-12-24 08:02:05.3 |
2   | dd   | 2013-01-22 08:02:05.4 |

I need to add two columns, start_date & end_date, like this:

key | col1 | timestamp             | start_date            | end_date              |
------------------------------------------------------------------------------------
1   | aa   | 2019-01-01 08:02:05.1 | 2019-01-01 08:02:05.1 | 2019-09-02 08:02:05.2 |
1   | aa  | 2019-09-02 08:02:05.2 | 2019-09-02 08:02:05.2 | 2019-12-24 08:02:05.3 |
1   | cc  | 2019-12-24 08:02:05.3 | 2019-12-24 08:02:05.3 | NULL                  |
2   | dd  | 2013-01-22 08:02:05.4 | 2013-01-22 08:02:05.4 | NULL                  |

Here, for each "key", end_date is the next timestamp for the same key, and "end_date" for the most recent timestamp should be NULL.

What I have tried so far

I tried using a window function to compute the rank within each partition,

like this:


import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

var df = read_csv()
// copy timestamp to start_date
df = df.withColumn("start_date", df.col("timestamp"))
// initialize end_date with null
df = df.withColumn("end_date", typedLit[Option[String]](None))

// partition by the key column from the sample data
val windowSpec = Window.partitionBy("key").orderBy("start_date")

df
  .withColumn("rank", dense_rank().over(windowSpec))
  .withColumn("max", max("rank").over(Window.partitionBy("key")))
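For reference, the rank idea above could in principle be completed: assign a rank per key, then join rank n with rank n + 1 of the same key to pull in the next timestamp. A plain-Scala sketch of that logic (not the Spark API; the tuple layout and names are illustrative), using the sample rows:

```scala
val rows = List(
  (1, "aa", "2019-01-01 08:02:05.1"),
  (1, "aa", "2019-09-02 08:02:05.2"),
  (1, "cc", "2019-12-24 08:02:05.3"),
  (2, "dd", "2013-01-22 08:02:05.4"))

// rank each row within its key, like dense_rank over partitionBy("key").orderBy(timestamp)
val ranked = rows.groupBy(_._1).values.flatMap { part =>
  part.sortBy(_._3).zipWithIndex.map { case ((k, c, ts), r) => (k, c, ts, r) }
}.toList

// "self-join": look up the row of the same key with rank + 1
val byKeyRank = ranked.map { case (k, _, ts, r) => (k, r) -> ts }.toMap
val result = ranked.map { case (k, c, ts, r) =>
  (k, c, ts, ts, byKeyRank.get((k, r + 1))) // (key, col1, timestamp, start_date, end_date)
}
```

In Spark itself this would mean a self-join on key and rank + 1, which is considerably more work than a single window function, as the answer below shows.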

So far, I haven't been able to get the desired output.

Use the window lead function in this case.

Example:

import org.apache.spark.sql.expressions._
import org.apache.spark.sql.functions._
import org.apache.spark.sql._

val df = Seq(
  (1, "aa", "2019-01-01 08:02:05.1"),
  (1, "aa", "2019-09-02 08:02:05.2"),
  (1, "cc", "2019-12-24 08:02:05.3"),
  (2, "dd", "2013-01-22 08:02:05.4")
).toDF("key", "col1", "timestamp")

val df1 = df.withColumn("start_date", col("timestamp"))
val windowSpec = Window.partitionBy("key").orderBy("start_date")

df1.withColumn("end_date", lead(col("start_date"), 1).over(windowSpec)).show(10, false)
//+---+----+---------------------+---------------------+---------------------+
//|key|col1|timestamp            |start_date           |end_date             |
//+---+----+---------------------+---------------------+---------------------+
//|1  |aa  |2019-01-01 08:02:05.1|2019-01-01 08:02:05.1|2019-09-02 08:02:05.2|
//|1  |aa  |2019-09-02 08:02:05.2|2019-09-02 08:02:05.2|2019-12-24 08:02:05.3|
//|1  |cc  |2019-12-24 08:02:05.3|2019-12-24 08:02:05.3|null                 |
//|2  |dd  |2013-01-22 08:02:05.4|2013-01-22 08:02:05.4|null                 |
//+---+----+---------------------+---------------------+---------------------+
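To see why lead produces exactly the output above: lead(col, 1) over Window.partitionBy("key").orderBy("start_date") pairs every row with the next row's value inside its key group (sorted by timestamp), and yields null for the last row of each group. A plain-Scala model of that semantics (the Rec case class is illustrative, not part of the Spark API):

```scala
case class Rec(key: Int, col1: String, ts: String)

val recs = List(
  Rec(1, "aa", "2019-01-01 08:02:05.1"),
  Rec(1, "aa", "2019-09-02 08:02:05.2"),
  Rec(1, "cc", "2019-12-24 08:02:05.3"),
  Rec(2, "dd", "2013-01-22 08:02:05.4"))

val withEndDate: Map[Rec, Option[String]] =
  recs.groupBy(_.key).values.flatMap { group =>
    val sorted = group.sortBy(_.ts)                           // orderBy("start_date")
    val nextTs = sorted.drop(1).map(r => Some(r.ts)) :+ None  // lead(..., 1): None on the last row
    sorted.zip(nextTs)
  }.toMap
```

Each row's end_date is the next timestamp of its own key, and the most recent row per key gets None, matching the NULLs in the desired output.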
