How to convert a text file to a dictionary in Python



I need to convert lines of varying length into a dictionary. The data is about basketball players. The text file is formatted as shown below, and I need to return a dictionary of each player's stats, like this:

{Lebron James:(25,7,1),(34,5,6), Stephen Curry: (25,7,1),(34,5,6), Draymond Green: (25,7,1),(34,5,6)}

Data:

Lebron James
25,7,1
34,5,6
Stephen Curry
25,7,1
34,5,6
Draymond Green
25,7,1
34,5,6

I need help getting the code started. So far I have code that removes the blank lines and turns each line into a list.

myfile = open("stats.txt","r") 
for line in myfile.readlines():  
    if line.rstrip():
         line = line.replace(",","")       
         line = line.split()

I think this should do what you want:

data = {}
with open("myfile.txt","r") as f:
    for line in f:
        # Skip empty lines
        line = line.rstrip()
        if len(line) == 0: continue
        toks = line.split(",")
        if len(toks) == 1:
            # New player, assumed to have no commas in name
            player = toks[0]
            data[player] = []
        elif len(toks) == 3:
            data[player].append(tuple([int(tok) for tok in toks]))
        else: raise ValueError  # or something

The format is a bit ambiguous, so we have to make some assumptions about the names. I assume here that a name cannot contain a comma, but if you need to, you could relax that a little: try to parse int,int,int and, if parsing fails, fall back to treating the line as a name.
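That fallback idea can be sketched like this (a rough sketch, not a definitive implementation; `parse_stats` is just a name I made up, and it assumes the input is well formed, i.e. a name line always comes before its score lines):

```python
def parse_stats(lines):
    """Build {name: [stat tuples]} by trying int,int,int first."""
    data = {}
    player = None
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        try:
            # If every token parses as an int, it's a score line
            stats = tuple(int(tok) for tok in line.split(","))
            data[player].append(stats)
        except ValueError:
            # Not int,int,int, so treat the line as a player name
            player = line
            data[player] = []
    return data

lines = ["Lebron James", "25,7,1", "34,5,6"]
print(parse_stats(lines))  # {'Lebron James': [(25, 7, 1), (34, 5, 6)]}
```

This way a name like "Smith, Jr." still works, since `int("Smith")` fails and the line falls through to the name branch.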

Here is a simple way:

scores = {}
with open('stats.txt', 'r') as infile:
    i = 0
    j = 0
    for line in infile:
        if line.rstrip():
            if i % 3 != 0:
                t = tuple(int(n) for n in line.split(","))
                j = j + 1
                if j == 1:
                    score1 = t  # save for the next step
                if j == 2:
                    score = (score1, t)  # finalize the tuple
                    scores.update({name: score})  # add to the dictionary
            else:
                name = line.rstrip()  # trim the newline and save the key
                j = 0  # start over
            i = i + 1  # increase the counter
print(scores)

You could write it like this:

For Python 2.x

myfile = open("stats.txt","r") 
lines = filter(None, (line.rstrip() for line in myfile))
dictionary = dict(zip(lines[0::3], zip(lines[1::3], lines[2::3])))

For Python 3.x

myfile = open("stats.txt","r") 
lines = list(filter(None, (line.rstrip() for line in myfile)))
dictionary = dict(zip(lines[0::3], zip(lines[1::3], lines[2::3])))
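Note that the slicing approach keeps the score lines as raw strings like `"25,7,1"`. If you want tuples of ints, as in the other answers, a small conversion step can be added; here is a sketch using an inline sample string instead of the file, so the result is easy to see (`to_tuple` is just a helper name I chose):

```python
raw = """Lebron James
25,7,1
34,5,6
Stephen Curry
25,7,1
34,5,6
"""

# Same idea as above: drop blank lines, then slice name/score/score triples
lines = list(filter(None, (line.rstrip() for line in raw.splitlines())))

def to_tuple(s):
    # "25,7,1" -> (25, 7, 1)
    return tuple(int(n) for n in s.split(","))

dictionary = {name: (to_tuple(a), to_tuple(b))
              for name, a, b in zip(lines[0::3], lines[1::3], lines[2::3])}
print(dictionary)
# {'Lebron James': ((25, 7, 1), (34, 5, 6)), 'Stephen Curry': ((25, 7, 1), (34, 5, 6))}
```

To read from the file instead, replace `raw.splitlines()` with the open-file iteration from the answer above.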
