Comparing two files with long lists to get the common elements plus their adjacent information



I have two large files. File A looks like this:

SNP_A-1780270 rs987435 7 78599583 - C G
SNP_A-1780271 rs345783 15 33395779 - C G
SNP_A-1780272 rs955894 1 189807684 - G T
SNP_A-1780274 rs6088791 20 33907909 - A G
SNP_A-1780277 rs11180435 12 75664046 + C T
SNP_A-1780278 rs17571465 1 218890658 - A T
SNP_A-1780283 rs17011450 4 127630276 - C T

It has 950,000 lines.

File B looks like this:

SNP_A-1780274
SNP_A-1780277
SNP_A-1780278
SNP_A-1780283
SNP_A-1780285
SNP_A-1780286
SNP_A-1780287

It has 900,000 lines.

I need to find the elements of file B that also appear in column 1 of file A, and produce an output file like:

SNP_A-1780274 rs6088791 20 33907909 - A G
SNP_A-1780277 rs11180435 12 75664046 + C T
SNP_A-1780278 rs17571465 1 218890658 - A T
SNP_A-1780283 rs17011450 4 127630276 - C T

What is the most efficient way to do this in Python?

I think a dict is ideal for this:

>>> sa = """SNP_A-1780270 rs987435 7 78599583 - C G
SNP_A-1780271 rs345783 15 33395779 - C G
SNP_A-1780272 rs955894 1 189807684 - G T
SNP_A-1780274 rs6088791 20 33907909 - A G
SNP_A-1780277 rs11180435 12 75664046 + C T
SNP_A-1780278 rs17571465 1 218890658 - A T
SNP_A-1780283 rs17011450 4 127630276 - C T"""
>>> dict_lines = {}
>>> for line in sa.split('\n'):
...     dict_lines[line.split()[0]] = line

>>> sb = """SNP_A-1780274
SNP_A-1780277
SNP_A-1780278
SNP_A-1780283
SNP_A-1780285
SNP_A-1780286
SNP_A-1780287"""
>>> for val in sb.split('\n'):
...     line = dict_lines.get(val)
...     if line:
...         print(line)

SNP_A-1780274 rs6088791 20 33907909 - A G
SNP_A-1780277 rs11180435 12 75664046 + C T
SNP_A-1780278 rs17571465 1 218890658 - A T
SNP_A-1780283 rs17011450 4 127630276 - C T
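Applied to the actual files rather than inline strings, the same dict idea could look like the sketch below (the function name `intersect` and the file names are my own illustration, not part of the question):

```python
def intersect(file_a, file_b, out_path):
    """Write the lines of file_a whose first column appears in file_b."""
    # Map column 1 of file A to its full line; ~950k short entries
    # should fit comfortably in memory on a modern machine.
    lines_by_id = {}
    with open(file_a) as fa:
        for line in fa:
            if line.strip():  # skip blank lines defensively
                lines_by_id[line.split()[0]] = line
    # Stream file B and emit the matching lines of file A.
    with open(file_b) as fb, open(out_path, 'w') as out:
        for line in fb:
            match = lines_by_id.get(line.strip())
            if match:
                out.write(match)
```

Because the lookup per line of file B is O(1), the whole run is a single pass over each file.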

If the lines of file A are long compared to the "key" in column 1, you can try this approach instead:

positions = {}
with open('fileA.txt') as fA:
    pos = 0
    for lineA in fA:
        uid = lineA.split(' ')[0]  # gets e.g. SNP_A-1780270
        positions[uid] = pos
        pos += len(lineA)

with open('fileB.txt') as fB, open('fileA.txt') as fA, open('fileC.txt', 'w') as out:
    for lineB in fB:
        pos = positions[lineB.strip()]
        fA.seek(pos)
        out.write(fA.readline())

You should check whether the pos += ... bookkeeping or file.tell() is the more reliable option here. I suspect file.tell() will not work because buffering is involved, but the pos += ... approach may need adjusting too (opening the files in binary mode makes the byte offsets unambiguous).

Compared with the dict version, this needs less memory, but it will likely be slower because file A is processed twice.

If you can call join fileA fileB > fileC from your Python code, it will give you what you need (note that join requires both input files to be sorted on the join field).
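As a sketch of shelling out to the Unix tools (assuming sort and join are on the PATH; the helper name join_files and the .sorted scratch files are my own illustration):

```python
import subprocess

def join_files(file_a, file_b, out_path):
    """Intersect the IDs in file_b with column 1 of file_a via sort/join."""
    # join(1) requires both inputs to be sorted on the join field,
    # so sort each file to a scratch copy first.
    subprocess.run('sort %s > %s.sorted' % (file_a, file_a),
                   shell=True, check=True)
    subprocess.run('sort %s > %s.sorted' % (file_b, file_b),
                   shell=True, check=True)
    # Listing file_b first means each output line is the join field
    # plus file_a's remaining columns, i.e. file_a's matching lines.
    subprocess.run('join %s.sorted %s.sorted > %s'
                   % (file_b, file_a, out_path),
                   shell=True, check=True)
```

The sorting is what dominates the cost here, but sort handles files of this size without loading them fully into memory.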
