So after my Python code runs and I type my input, the program just stops. I can't figure out what the logic error is.



I'm trying to build a simple parser for a CSV file, using the simplest approach I can: have the parser look up a value (a row).

import csv
"""
with open('C:/MAIL07072021180029.csv', 'r') as file:
    proxy = csv.reader(file)
    for row in proxy:
        print(row[0])
"""
identifyingnumber = input("What is the number? ")
with open('C:/MAIL07072021180029.csv', 'r') as file:
    data = csv.reader(file)
    for row in data:
        if row[1] == (identifyingnumber):
            print(row)

After I run the code and enter the proxy number (the identifying number, the data from Excel), the program just stops? It doesn't print the row.

Here is sample data from my CSV file:

Card Identifying Sequence
1873356

I printed out row 0 first, and once that printed successfully I removed it and moved on to the identifying-number lookup.

Is this some kind of logic error? I can't figure it out.

The identifying number may simply not be in your CSV file, in which case your `if` statement means nothing gets printed. You can track whether you found the number: if you did, break (assuming you don't want to check the remaining rows for another instance), and if you didn't, print that the number was never found.

with open('C:/MAIL07072021180029.csv', 'r') as file:
    data = csv.reader(file)
    found = False
    for row in data:
        if row[1] == identifyingnumber:
            print(row)
            found = True
            break    # if you don't want to look for more instances of identifyingnumber
    if not found:   # i.e. found == False
        print('I did not find any row[1] equal to', identifyingnumber)
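As a side note, Python's `for`/`else` is an idiomatic alternative to the `found` flag: the `else` branch runs only when the loop finishes without hitting `break`. A minimal self-contained sketch, using a hypothetical in-memory CSV in place of your file:

```python
import csv
import io

# Hypothetical two-column sample standing in for MAIL07072021180029.csv
csv_text = "Name,Card Identifying Sequence\nAlice,1873356\n"
identifyingnumber = "1873356"  # input() would return a string like this

reader = csv.reader(io.StringIO(csv_text))
for row in reader:
    if row[1] == identifyingnumber:
        print(row)
        break
else:  # runs only if the loop completed without break
    print('I did not find any row[1] equal to', identifyingnumber)
```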

I strongly recommend using pandas. Open the CSV in pandas and check all the rows' values at once instead of iterating over rows: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.read_csv.html

import pandas as pd

df = pd.read_csv('C:/MAIL07072021180029.csv')
col1 = df.columns[1]
# input() returns a string, but read_csv may parse this column as integers,
# so compare as strings
dfi = df[df[col1].astype(str) == identifyingnumber]
dfi = dfi.reset_index()
print(dfi['index'].tolist())        # there could be more than 1 row with the identifier
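To show the pandas lookup end to end without needing the local file, here is a self-contained sketch with hypothetical in-memory data (the column names and values are made up for illustration):

```python
import io

import pandas as pd

# Hypothetical sample standing in for MAIL07072021180029.csv
csv_text = "Name,Card Identifying Sequence\nAlice,1873356\nBob,2000001\n"
df = pd.read_csv(io.StringIO(csv_text))

identifyingnumber = "1873356"  # input() always returns a string

# read_csv parses the numeric column as integers, so compare as strings
col1 = df.columns[1]
dfi = df[df[col1].astype(str) == identifyingnumber]
print(dfi.index.tolist())  # row positions where the identifier matched
```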

Edit, to put together everything we discussed in the comments:

file_list = ['C:/MAIL07072021180029.csv', 'C:/other_file.csv']
for f in file_list:
    with open(f, 'r') as file:
        data = csv.reader(file)
        found = False
        for row in data:
            if found:   # if any column matched the value, stop iterating over rows
                break
            for col in range(1, len(row)):   # start at 1 to skip col 0
                if row[col] == identifyingnumber:
                    print(row)
                    found = True
                    break    # if you don't want to look for more instances of identifyingnumber
        if not found:   # i.e. found == False
            print('I did not find', identifyingnumber, 'in any column of', f)
