I have the following two dfs:
df1
|company|company_id| date | max_date |
| A21 | 5 |2021-02-04| 2021-02-11|
| A21 | 10 |2020-10-04| 2020-10-11|
| B52 | 8 |2021-03-04| 2021-04-11|
| B52 | 6 |2020-04-04| 2020-04-11|
-------------------------------------------
and df2:
|company|company_id| date_df2 |
| A21 | 5 |2021-02-05|
| A21 | 5 |2021-02-08|
| A21 | 5 |2021-02-12|
| A21 | 5 |2021-02-11|
| A21 | 10 |2020-10-07|
| B52 | 8 |2021-03-07|
| B52 | 6 |2020-04-08|
| B52 | 6 |2020-04-12|
| B52 | 6 |2020-04-05|
-------------------------------
Logic: for each company and company_id in df1, I want to count how many rows in df2 have date_df2 between date and max_date from df1 (inclusive).
Expected result:
|company|company_id| date | max_date |count|
| A21 | 5 |2021-02-04| 2021-02-11| 3 |
| A21 | 10 |2020-10-04| 2020-10-11| 1 |
| B52 | 8 |2021-03-04| 2021-04-11| 1 |
| B52 | 6 |2020-04-04| 2020-04-11| 2 |
------------------------------------------------
How can I achieve this in pandas?
Code to reproduce the dfs:
import pandas as pd

#df1
list_columns = ['company','company_id','date','max_date']
list_data = [
['A21',5,'2021-02-04','2021-02-11'],
['A21',10,'2020-10-04','2020-10-11'],
['B52',8,'2021-03-04','2021-04-11'],
['B52',6,'2020-04-04','2020-04-11']
]
df1 = pd.DataFrame(columns=list_columns, data=list_data)
#df2
list_columns = ['company','company_id','date']
list_data = [
['A21',5,'2021-02-05'],
['A21',5,'2021-02-08'],
['A21',5,'2021-02-12'],
['A21',5,'2021-02-11'],
['A21',10,'2020-10-07'],
['B52',8,'2021-03-07'],
['B52',6,'2020-04-08'],
['B52',6,'2020-04-12'],
['B52',6,'2020-04-05']
]
df2 = pd.DataFrame(columns=list_columns, data=list_data)
Use DataFrame.merge with the default inner join, then filter the matching values with Series.between, aggregate counts with GroupBy.size, and join the result back as a new column, replacing missing values with 0 where a group has no match:
df1['date'] = pd.to_datetime(df1['date'])
df1['max_date'] = pd.to_datetime(df1['max_date'])
df2['date'] = pd.to_datetime(df2['date'])
df = df1.merge(df2, on=['company','company_id'], suffixes=('','_'))
s = (df[df['date_'].between(df['date'], df['max_date'])]
.groupby(['company','company_id'])
.size())
df1 = df1.join(s.rename('count'), on=['company','company_id']).fillna({'count':0})
print(df1)
company company_id date max_date count
0 A21 5 2021-02-04 2021-02-11 3
1 A21 10 2020-10-04 2020-10-11 1
2 B52 8 2021-03-04 2021-04-11 1
3 B52 6 2020-04-04 2020-04-11 2
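As an alternative sketch (not the answer's method): a left merge keeps every df1 row even when df2 has no matching key at all, and summing the boolean mask per original row index gives the count directly, so no fillna step is needed. The column name date_df2 below comes from the chosen suffix, which is an assumption of this sketch:

```python
import pandas as pd

# Rebuild the example frames from the question.
df1 = pd.DataFrame(
    [['A21', 5, '2021-02-04', '2021-02-11'],
     ['A21', 10, '2020-10-04', '2020-10-11'],
     ['B52', 8, '2021-03-04', '2021-04-11'],
     ['B52', 6, '2020-04-04', '2020-04-11']],
    columns=['company', 'company_id', 'date', 'max_date'])
df2 = pd.DataFrame(
    [['A21', 5, '2021-02-05'], ['A21', 5, '2021-02-08'],
     ['A21', 5, '2021-02-12'], ['A21', 5, '2021-02-11'],
     ['A21', 10, '2020-10-07'], ['B52', 8, '2021-03-07'],
     ['B52', 6, '2020-04-08'], ['B52', 6, '2020-04-12'],
     ['B52', 6, '2020-04-05']],
    columns=['company', 'company_id', 'date'])

for c in ('date', 'max_date'):
    df1[c] = pd.to_datetime(df1[c])
df2['date'] = pd.to_datetime(df2['date'])

# Left merge: every df1 row survives; df2's 'date' becomes 'date_df2'.
m = df1.reset_index().merge(df2, on=['company', 'company_id'],
                            how='left', suffixes=('', '_df2'))
# True where the df2 date falls in the [date, max_date] window.
mask = m['date_df2'].between(m['date'], m['max_date'])
# Sum the mask per original df1 row; unmatched rows contribute 0.
df1['count'] = mask.groupby(m['index']).sum().astype(int)
print(df1)
```

The trade-off versus the inner-join version is the same row explosion in the intermediate merge, but the zero counts fall out of the boolean sum for free instead of via fillna.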