Update a DataFrame with rows from another DataFrame



Thanks in advance for any help. What I am trying to do is take a DataFrame full of zeros with a datetime index (my trade DataFrame) and update it, on matching dates, with the rows of another DataFrame (indexed_orders). My code is as follows:

import pandas as pd
import numpy as np
import os
import csv

orders = pd.read_csv('./orders/orders.csv', parse_dates=True, sep=',', dayfirst=True) #initiate orders data frame from csv data file
indexed_orders = orders.set_index(['Date']) #set Date as index for orders
print(indexed_orders)
symbol_list = orders['Symbol'].tolist() #creates list of symbols
symbols = list(set(symbol_list)) #gets rid of duplicates in list

dates_list = orders['Date'].tolist() #creates list of order dates
dates_orders = list(set(dates_list)) #gets rid of duplicates in list

start_date = '2011-01-05' #establish date range
end_date = '2011-01-20'
dates = pd.date_range(start_date, end_date) #establish dates from start_date and end_date
trade = pd.DataFrame(0, index = dates, columns = symbols) #establish trade data frame
trade['Cash'] = 0 #add column for future calculations
print(trade)

This gives the following output for indexed_orders:

Date         Symbol Order  Shares
2011-01-10   AAPL   BUY    1500
2011-01-13   AAPL  SELL    1500
2011-01-13    IBM   BUY    4000
2011-01-26   GOOG   BUY    1000
2011-02-02    XOM  SELL    4000
2011-02-10    XOM   BUY    4000
2011-03-03   GOOG  SELL    1000
2011-03-03    IBM  SELL    2200
2011-06-03    IBM  SELL    3300
2011-05-03    IBM   BUY    1500
2011-06-10   AAPL   BUY    1200
2011-08-01   GOOG   BUY      55
2011-08-01   GOOG  SELL      55
2011-12-20   AAPL  SELL    1200

and the following for trade:

            GOOG  AAPL  XOM  IBM  Cash
2011-01-05     0     0    0    0     0
2011-01-06     0     0    0    0     0
2011-01-07     0     0    0    0     0
2011-01-08     0     0    0    0     0
2011-01-09     0     0    0    0     0
2011-01-10     0     0    0    0     0
2011-01-11     0     0    0    0     0
2011-01-12     0     0    0    0     0
2011-01-13     0     0    0    0     0
2011-01-14     0     0    0    0     0
2011-01-15     0     0    0    0     0
2011-01-16     0     0    0    0     0
2011-01-17     0     0    0    0     0
2011-01-18     0     0    0    0     0
2011-01-19     0     0    0    0     0
2011-01-20     0     0    0    0     0

I would like to update my trade DataFrame on the dates that appear in indexed_orders, inserting the "Shares" amount in the column for the matching "Symbol" (i.e. the AAPL, IBM, GOOG and XOM columns in trade). I also want the "Shares" value to be negative whenever the "Order" column in indexed_orders says "SELL". In other words, I am trying to come up with code that updates the trade DataFrame so that print(trade) gives:

            GOOG  AAPL  XOM  IBM  Cash
2011-01-05     0     0    0    0     0
2011-01-06     0     0    0    0     0
2011-01-07     0     0    0    0     0
2011-01-08     0     0    0    0     0
2011-01-09     0     0    0    0     0
2011-01-10     0  1500    0    0     0
2011-01-11     0     0    0    0     0
2011-01-12     0     0    0    0     0
2011-01-13     0 -1500    0 4000     0
2011-01-14     0     0    0    0     0
2011-01-15     0     0    0    0     0
2011-01-16     0     0    0    0     0
2011-01-17     0     0    0    0     0
2011-01-18     0     0    0    0     0
2011-01-19     0     0    0    0     0
2011-01-20     0     0    0    0     0

I think some kind of iteration with nested boolean statements is needed, but I am having a hard time figuring it out. In particular, I am struggling to find a way to loop through the rows and update trade based on its datetime index.
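
For reference, a minimal sketch of that row-by-row idea (assuming the orders and trade frames built above, and skipping any order dates that fall outside trade's range) could look like this:

# Hedged sketch: walk the order rows and write into trade via its datetime index.
for date, row in indexed_orders.iterrows():
    ts = pd.to_datetime(date)                      # the 'Date' index may still be plain strings
    if ts in trade.index:                          # skip orders outside the trade date range
        sign = 1 if row['Order'] == 'BUY' else -1  # SELL becomes a negative share count
        trade.loc[ts, row['Symbol']] += sign * row['Shares']
print(trade)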

First, you can use the Order column to sign the change in shares. Then you can group by Date and Symbol and aggregate the signed shares by summing. This gives you a Series covering every date on which there were orders and the symbols traded on those days. Finally, use unstack to convert that Series into tabular form.

import numpy as np
import pandas as pd
df = pd.read_csv('temp.txt', sep='\t')  # the orders, saved as a tab-separated file
print(df)
'''
        Date Symbol Order  Shares
0    1/10/11   AAPL   BUY    1500
1    1/13/11   AAPL  SELL    1500
2    1/13/11    IBM   BUY    4000
3    1/26/11   GOOG   BUY    1000
4     2/2/11    XOM  SELL    4000
5    2/10/11    XOM   BUY    4000
6     3/3/11   GOOG  SELL    1000
7     3/3/11    IBM  SELL    2200
8     6/3/11    IBM  SELL    3300
9     5/3/11    IBM   BUY    1500
10   6/10/11   AAPL   BUY    1200
11    8/1/11   GOOG   BUY      55
12    8/1/11   GOOG  SELL      55
13  12/20/11   AAPL  SELL    1200
'''
# sign the share counts: BUY stays positive, SELL becomes negative
df['SharesChange'] = df.Shares * df.Order.apply(lambda o: 1 if o == 'BUY' else -1)
# net shares per date/symbol, pivoted to one column per symbol
df = df.groupby(['Date', 'Symbol']).agg({'SharesChange': np.sum}).unstack().fillna(0)
print(df)
'''
         SharesChange
Symbol           AAPL  GOOG   IBM   XOM
Date
1/10/11          1500     0     0     0
1/13/11         -1500     0  4000     0
1/26/11             0  1000     0     0
12/20/11        -1200     0     0     0
2/10/11             0     0     0  4000
2/2/11              0     0     0 -4000
3/3/11              0 -1000 -2200     0
5/3/11              0     0  1500     0
6/10/11          1200     0     0     0
6/3/11              0     0 -3300     0
8/1/11              0     0     0     0
'''
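
The table above still has string dates and a 'SharesChange' column level, and it is not yet aligned with the zero-filled trade frame from the question. A short follow-up sketch (assuming the trade DataFrame and date range defined in the question) could bridge that gap:

# Hedged sketch: align the aggregated orders with the zero-filled trade frame.
df.columns = df.columns.droplevel(0)   # drop the 'SharesChange' level, keeping the symbol names
df.index = pd.to_datetime(df.index)    # parse the string dates into a DatetimeIndex
trade.update(df)                       # overwrite the zeros on matching dates and symbols
print(trade)

trade.update only touches cells present in both frames, so orders outside the chosen date range and the extra Cash column are left at zero.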
