Bulk inserting data from a list of dictionaries into a PostgreSQL database [faster method]

For example:

books = [{'name':'pearson', 'price':60, 'author':'Jesse Pinkman'},{'name':'ah publications', 'price':80, 'author':'Gus Fring'},{'name':'euclidean', 'price':120, 'author':'Skyler White'},{'name':'Nanjial', 'price':260, 'author':'Saul Goodman'}]

I need to insert each of these dictionaries into a table that already exists, taking only the "author" and "price" keys, and I have roughly 100,000 records to insert. What I do right now is loop over the list of dictionaries, pick out the required key/value pairs, and insert them one at a time:

def insert_books(self, val):
    cur = self.con.cursor()
    # build and execute one INSERT statement per record
    sql = """insert into testtable values {}""".format(val)
    cur.execute(sql)
    self.con.commit()
    cur.close()

for i in books:
    result = i['author'], i['price']
    db_g.insert_books(result)   # db_g is an instance of the class that holds the connection

So, is there a faster and simpler way to bulk insert the data in one go, say 10k rows at a time?

I think a bulk insert that dumps the whole dataframe in one go will be much faster; see also: Why is bulk import faster than a bunch of INSERTs?

import pandas as pd
import sqlalchemy

def db_conn():
    # replace the placeholder with your PostgreSQL connection string
    connection = sqlalchemy.create_engine('your connection string')
    return connection

books = [{'name':'pearson', 'price':60, 'author':'Jesse Pinkman'},{'name':'ah publications', 'price':80, 'author':'Gus Fring'},{'name':'euclidean', 'price':120, 'author':'Skyler White'},{'name':'Nanjial', 'price':260, 'author':'Saul Goodman'}]

df_to_ingest = pd.DataFrame(books)
df_to_ingest = df_to_ingest[['author', 'price']]   # keep only the columns to insert
df_to_ingest.to_sql('tablename', db_conn(), if_exists='append', index=False)
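
If sending all 100k rows in a single call turns out to be too heavy, to_sql can also batch the write for you. As a sketch (the chunksize and method arguments exist in pandas 0.24+; 'tablename' is still just a placeholder):

df_to_ingest.to_sql('tablename', db_conn(), if_exists='append', index=False,
                    chunksize=10000, method='multi')   # 10k rows per batch, many rows per INSERT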

Hope this helps!
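
If you would rather keep your existing DB-API connection and skip pandas entirely, here is a minimal sketch using psycopg2's execute_values helper. It assumes self.con is a psycopg2 connection and that testtable has author and price columns in that order; adjust the names to your schema:

from psycopg2.extras import execute_values

def insert_books_bulk(con, books):
    cur = con.cursor()
    # one server round trip per page_size rows instead of one INSERT per record
    rows = [(b['author'], b['price']) for b in books]
    execute_values(cur,
                   "insert into testtable (author, price) values %s",   # column names are assumed
                   rows,
                   page_size=10000)
    con.commit()
    cur.close()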