I have a dataset with multiple IDs and dates, and I have created a cumulative-supply column in Python.
My data looks like this:
SKU Date Demand Supply Cum_Supply
1 20160207 6 2 2
1 20160214 5 0 2
1 20160221 1 0 2
1 20160228 6 0 2
1 20160306 1 0 2
1 20160313 101 0 2
1 20160320 1 0 2
1 20160327 1 0 2
2 20160207 0 0 0
2 20160214 0 0 0
2 20160221 2 0 0
2 20160228 2 0 0
2 20160306 2 0 0
2 20160313 1 0 0
2 20160320 1 0 0
2 20160327 1 0 0
where Cum_Supply is calculated as:
import numpy as np
import pandas as pd

# Fill the full Date x SKU grid with zeros, then take per-SKU cumulative sums.
idx = pd.MultiIndex.from_product([np.unique(data.Date), data.SKU.unique()])
data2 = data.set_index(['Date', 'SKU']).reindex(idx).fillna(0)
data2 = pd.concat([data2, data2.groupby(level=1).cumsum().add_prefix('Cum_')], axis=1).sort_index(level=1).reset_index()
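(As an aside: if every SKU already has a row for every Date, as in this sample, the reindexing is unnecessary and a plain per-SKU cumulative sum gives the same Cum_Supply; a minimal sketch, assuming rows are sorted by Date within each SKU:)

# Same Cum_Supply without building the MultiIndex grid.
data['Cum_Supply'] = data.groupby('SKU')['Supply'].cumsum()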
I want to create a column "True_Demand", which is the maximum unmet demand up to that date, max(Demand - Supply), plus Cum_Supply.
So my output would look like this:
SKU Date Demand Supply Cum_Supply True_Demand
1 20160207 6 2 2 6
1 20160214 5 0 2 7
1 20160221 1 0 2 7
1 20160228 6 0 2 8
1 20160306 1 0 2 8
1 20160313 101 0 2 103
1 20160320 1 0 2 103
1 20160327 1 0 2 103
2 20160207 0 0 0 0
2 20160214 0 0 0 0
2 20160221 2 0 0 2
2 20160228 2 0 0 2
2 20160306 2 0 0 2
2 20160313 1 0 0 2
2 20160320 1 0 0 2
2 20160327 1 0 0 2
So for the 3rd record (20160221), the maximum unmet demand on any date up to 20160221 is 5. The True_Demand is therefore 5 + 2 = 7, even though the unmet demand on that date itself would give only 1 + 2.
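As a quick sanity check of that arithmetic (plain Python, just restating the numbers above):

# Demand - Supply for the first three rows of SKU 1.
unmet = [6 - 2, 5 - 0, 1 - 0]
# Max unmet demand so far, plus Cum_Supply (2) on 20160221.
assert max(unmet) + 2 == 7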
Code for the DataFrame:
data = pd.DataFrame({'SKU': [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2],
                     'Date': [20160207, 20160214, 20160221, 20160228, 20160306, 20160313, 20160320, 20160327,
                              20160207, 20160214, 20160221, 20160228, 20160306, 20160313, 20160320, 20160327],
                     'Demand': [6, 5, 1, 6, 1, 101, 1, 1, 0, 0, 2, 2, 2, 1, 1, 1],
                     'Supply': [2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]},
                    columns=['Date', 'SKU', 'Demand', 'Supply'])
Would you try this pretty interesting one-liner?
(data.groupby('SKU', as_index=False, group_keys=False)
     # Per-SKU running supply, then the running max of (unmet demand + cumulative supply).
     .apply(lambda x: x.assign(Cum_Supply=x.Supply.cumsum())
                       .pipe(lambda x: x.assign(True_Demand=(x.Demand - x.Supply + x.Cum_Supply).cummax()))))
Output:
Date SKU Demand Supply Cum_Supply True_Demand
0 20160207 1 6 2 2 6
1 20160214 1 5 0 2 7
2 20160221 1 1 0 2 7
3 20160228 1 6 0 2 8
4 20160306 1 1 0 2 8
5 20160313 1 101 0 2 103
6 20160320 1 1 0 2 103
7 20160327 1 1 0 2 103
8 20160207 2 0 0 0 0
9 20160214 2 0 0 0 0
10 20160221 2 2 0 0 2
11 20160228 2 2 0 0 2
12 20160306 2 2 0 0 2
13 20160313 2 1 0 0 2
14 20160320 2 1 0 0 2
15 20160327 2 1 0 0 2
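If you'd rather avoid the nested apply/pipe, two plain groupby operations give the same result; a minimal sketch, assuming rows are already sorted by Date within each SKU:

out = data.copy()
# Running supply per SKU.
out['Cum_Supply'] = out.groupby('SKU')['Supply'].cumsum()
# Running maximum of (Demand - Supply + Cum_Supply) per SKU.
out['True_Demand'] = (out.Demand - out.Supply + out.Cum_Supply).groupby(out.SKU).cummax()

This stays in vectorized pandas operations, so it is generally faster than groupby.apply on larger frames.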