I'm trying to create an event counter, plus a days-since-first-event counter, for some log data. The DataFrame below tracks whether an event occurred for each group on each day. For each group, I need to count the number of events that occurred on or before any given date. I also need to count the number of days since the first event in each group.
Starting DF:
group date event
A 2020-07-16 0
A 2020-07-17 1
A 2020-07-18 0
A 2020-07-19 1
A 2020-07-20 0
A 2020-07-21 0
A 2020-07-22 1
B 2020-07-16 1
B 2020-07-17 1
B 2020-07-18 0
B 2020-07-19 1
B 2020-07-20 0
B 2020-07-21 1
B 2020-07-22 1
Code to generate the DF:
import pandas as pd
import datetime

# Pin the base date so the frame matches the dates shown above.
base = datetime.datetime(2020, 7, 22)
numdays = 7
date_list = [(base - datetime.timedelta(days=x)).date() for x in range(numdays)]

# DataFrame.append was removed in pandas 2.0; collect frames and concat instead.
frames = []
for group in ['A', 'B']:
    frames.append(pd.DataFrame({'group': group, 'date': date_list}))
df = pd.concat(frames, ignore_index=True)
df = df.sort_values(['group', 'date'])

groupA_events = [0, 1, 0, 1, 0, 0, 1]
groupB_events = [1, 1, 0, 1, 0, 1, 1]
df['event'] = groupA_events + groupB_events
Desired ending DF:
group date event counter since_first
A 2020-07-16 0 0 0
A 2020-07-17 1 1 0
A 2020-07-18 0 1 1
A 2020-07-19 1 2 2
A 2020-07-20 0 2 3
A 2020-07-21 0 2 4
A 2020-07-22 1 3 5
B 2020-07-16 1 1 0
B 2020-07-17 1 2 1
B 2020-07-18 0 2 2
B 2020-07-19 1 3 3
B 2020-07-20 0 3 4
B 2020-07-21 1 4 5
B 2020-07-22 1 5 6
My data is about 800k rows (and growing). I found a solution that works (sort of), but its execution time is very long.
You can do a groupby with cumsum + cumcount:
df['counter'] = df.groupby('group')['event'].cumsum()
df['since_first'] = df[df['counter'].ne(0)].groupby('group')['counter'].cumcount()
df['since_first'] = df['since_first'].fillna(0).astype(int)
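A minimal runnable sketch of the cumsum/cumcount approach above, using the sample data from the question (the fillna/astype step is my addition to keep the column integer-typed):

```python
import pandas as pd

# Sample data: the two groups from the question.
df = pd.DataFrame({
    'group': ['A'] * 7 + ['B'] * 7,
    'date': pd.to_datetime(['2020-07-16', '2020-07-17', '2020-07-18',
                            '2020-07-19', '2020-07-20', '2020-07-21',
                            '2020-07-22'] * 2),
    'event': [0, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1],
})

# Running count of events per group.
df['counter'] = df.groupby('group')['event'].cumsum()

# Rows from the first event onward get 0, 1, 2, ...; rows before the
# first event fall out of the filter, become NaN, and are filled with 0.
df['since_first'] = (df[df['counter'].ne(0)]
                     .groupby('group')['counter']
                     .cumcount())
df['since_first'] = df['since_first'].fillna(0).astype(int)

print(df[df['group'] == 'A'])
```

Note that cumcount counts rows, not calendar days, so it only equals "days since first event" when every day has exactly one row per group, as in this sample.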
Use cumsum to get the counter. The days since the first event can be obtained by masking and transforming to the first day with an event in each group. This approach is useful because even if your dates aren't consecutive, it still calculates the time difference correctly. (clip so anything before the first event is treated as 0.)
df['counter'] = df.groupby('group')['event'].cumsum()
df['date'] = pd.to_datetime(df['date'])
s_first = df['date'].where(df['event'].eq(1)).groupby(df['group']).transform('first')
df['days_since'] = (df['date'] - s_first).dt.days.clip(lower=0)
group date event counter days_since
6 A 2020-07-16 0 0 0
5 A 2020-07-17 1 1 0
4 A 2020-07-18 0 1 1
3 A 2020-07-19 1 2 2
2 A 2020-07-20 0 2 3
1 A 2020-07-21 0 2 4
0 A 2020-07-22 1 3 5
6 B 2020-07-16 1 1 0
5 B 2020-07-17 1 2 1
4 B 2020-07-18 0 2 2
3 B 2020-07-19 1 3 3
2 B 2020-07-20 0 3 4
1 B 2020-07-21 1 4 5
0 B 2020-07-22 1 5 6
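To illustrate the non-consecutive-dates point, here's a minimal sketch of the mask + transform('first') + clip approach on a hypothetical four-row sample with gaps between the dates:

```python
import pandas as pd

# Hypothetical sample: one group with deliberate gaps in the dates,
# to show the day count stays calendar-based rather than row-based.
df = pd.DataFrame({
    'group': ['A'] * 4,
    'date': pd.to_datetime(['2020-07-16', '2020-07-17',
                            '2020-07-19', '2020-07-22']),
    'event': [0, 1, 0, 1],
})

df['counter'] = df.groupby('group')['event'].cumsum()

# Mask non-event rows to NaT, then let the groupwise transform pick the
# first valid (event) date for every row of the group.
s_first = df['date'].where(df['event'].eq(1)).groupby(df['group']).transform('first')

# Calendar-day difference; clip so rows before the first event read 0.
df['days_since'] = (df['date'] - s_first).dt.days.clip(lower=0)

print(df)
```

Here the first event is on 2020-07-17, so 2020-07-19 reads 2 and 2020-07-22 reads 5, even though they are only the third and fourth rows.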