How to save the result of a function to a new CSV



The code needs to read addresses from a CSV file and then use a function to calculate the corresponding latitude and longitude. I get the correct latitude and longitude, but I cannot save them to a new CSV file.

import requests
import urllib.parse
import pandas as pd

# Function to get the coordinates:

def lat_long(add):
    url = 'https://nominatim.openstreetmap.org/search/' + urllib.parse.quote(add) + '?format=json'
    response = requests.get(url).json()
    print(response[0]["lat"], response[0]["lon"])
    return

# Get 5 address values from the CSV file and pass each one to the function

df = pd.read_csv(r'C:\Users\Umer Abbas\Desktop\lat_long.csv')
i = 0
print("Latitude", "", "Longitude")
for i in range(0, 5):
    add = df._get_value(i, 'Address')
    lat_long(add)

The output is:

Latitude  Longitude
34.0096961 71.8990106
34.0123846 71.5787458
33.6038766 73.048136
33.6938118 73.0651511
24.8546842 67.0207055

I want to save this output to a new file, but I cannot get it to work.

Just a small modification will help:

def lat_long(add):
    url = 'https://nominatim.openstreetmap.org/search/' + urllib.parse.quote(add) + '?format=json'
    response = requests.get(url).json()
    print(response[0]["lat"], response[0]["lon"])
    Lat = response[0]["lat"]
    Long = response[0]["lon"]
    return Lat, Long

Lat_List = []
Long_List = []
df = pd.read_csv(r'C:\Users\Umer Abbas\Desktop\lat_long.csv')
i = 0
print("Latitude", "", "Longitude")
for i in range(0, 5):
    add = df._get_value(i, 'Address')
    Lat, Long = lat_long(add)   # call once and unpack, so each address is only requested once
    Lat_List.append(Lat)
    Long_List.append(Long)

# build the DataFrame directly from the two lists and write it out
df1 = pd.DataFrame({'Latitude': Lat_List, 'Longitude': Long_List})
df1.to_csv("LatLong.csv")
# one line of change here
def lat_long(add):
    url = 'https://nominatim.openstreetmap.org/search/' + urllib.parse.quote(add) + '?format=json'
    response = requests.get(url).json()
    print(response[0]["lat"], response[0]["lon"])
    return response[0]["lat"], response[0]["lon"]   # return the lat and long

# three lines added here
df = pd.read_csv(r'C:\Users\Umer Abbas\Desktop\lat_long.csv')
i = 0
l = []  # define an empty list
print("Latitude", "", "Longitude")
for i in range(0, 5):
    add = df._get_value(i, 'Address')
    l.append(lat_long(add))   # append the (lat, lon) tuple to the empty list l

# create a DataFrame and output it as CSV; lat_long returns (lat, lon), so Latitude must come first
pd.DataFrame(l, columns=['Latitude', 'Longitude']).to_csv('test.csv', sep=' ')
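Note that to_csv writes the DataFrame's integer index as the first column by default, and with sep=' ' the file is space-separated rather than comma-separated; calling .to_csv('test.csv', index=False) instead gives a conventional comma-separated file containing only the Latitude and Longitude columns.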
