This is the main code of the method I use to import an Excel file (using Maatwebsite Laravel-Excel 2) into the database:
$data = Excel::selectSheetsByIndex(0)->load($file, function ($reader) {})->get()->toArray();

DB::beginTransaction();
try {
    foreach ($data as $key => $value) {
        $med    = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        DB::table('medidores')->insert([
            "med"           => $med,
            "serial_number" => $serial,
        ]);
    }
    DB::commit();
} catch (Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}
This works fine when I have "little" data (say, fewer than 5,000 rows in the Excel file). But I need to process a large Excel file with about 1.4 million rows, split across more than one sheet. How can I make my method faster? Any tips?
EDIT: updating the question with the code from the link in one of the answer's comments:
$data = Excel::selectSheetsByIndex(0)->load($file, function ($reader) {})->get()->toArray();

DB::beginTransaction();
try {
    $bulk_data = [];
    foreach ($data as $key => $value) {
        $med    = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        $bulk_data[] = ["med" => $med, "serial_number" => $serial];
    }

    $collection = collect($bulk_data); // turn the data into a collection
    $chunks = $collection->chunk(100); // split it into chunks of 100 rows

    // loop through the chunks, one multi-row insert per chunk:
    foreach ($chunks as $chunk) {
        DB::table('medidores')->insert($chunk->toArray());
    }
    DB::commit();
} catch (Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}
The chunking approach is what worked for me. (Note that the linked snippet also contained a standalone `$chunks->toArray();` statement before the loop; it discards its result and does nothing, so it is omitted above.)
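A likely reason a single insert of all rows fails while chunks of 100 work: MySQL's prepared-statement protocol caps one statement at 65,535 bound placeholders (and very large statements can also exceed `max_allowed_packet`), so with two columns per row you cannot bind more than roughly 32,000 rows in one statement. The sketch below shows how chunking keeps each statement under such limits; `buildChunkedInserts` is a hypothetical helper written for illustration, not Laravel API, and the chunk size is arbitrary:

```php
<?php
// Hypothetical helper: split rows into chunks and build one multi-row
// INSERT per chunk, so no single statement accumulates too many
// placeholders. Table and column names match the question's schema.
function buildChunkedInserts(array $rows, int $chunkSize): array
{
    $statements = [];
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        // one "(?, ?)" placeholder group per row in this chunk
        $groups = implode(', ', array_fill(0, count($chunk), '(?, ?)'));
        $sql = "INSERT INTO medidores (med, serial_number) VALUES {$groups}";

        // flatten the chunk's values into a single bindings array
        $bindings = [];
        foreach ($chunk as $row) {
            $bindings[] = $row['med'];
            $bindings[] = $row['serial_number'];
        }
        $statements[] = [$sql, $bindings];
    }
    return $statements;
}

// 250 rows chunked by 100 -> 3 statements (100, 100, and 50 rows)
$rows = [];
for ($i = 0; $i < 250; $i++) {
    $rows[] = ['med' => "med{$i}", 'serial_number' => "sn{$i}"];
}
$statements = buildChunkedInserts($rows, 100);
echo count($statements) . "\n";       // 3
echo count($statements[2][1]) . "\n"; // 100 bindings = 50 rows * 2 columns
```

Each `[$sql, $bindings]` pair could then be executed with `DB::insert($sql, $bindings)`; Laravel's `DB::table(...)->insert($chunk)` builds essentially the same multi-row statement internally.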
Yes: instead of issuing X (rows per sheet) * N (number of sheets) database requests, try a single bulk insert. It only costs you the loop that collects the data and saves you the X * N database round trips. Here is an example:
$data = Excel::selectSheetsByIndex(0)->load($file, function ($reader) {})->get()->toArray();

DB::beginTransaction();
try {
    $bulk_data = [];
    foreach ($data as $key => $value) {
        $med    = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        $bulk_data[] = ["med" => $med, "serial_number" => $serial];
    }
    DB::table('medidores')->insert($bulk_data);
    DB::commit();
} catch (Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}
You can refer to this answer for more explanation about db requests: https://stackoverflow.com/a/1793209/8008456
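For 1.4 million rows there is a second bottleneck besides the insert count: `->get()->toArray()` loads the whole spreadsheet into memory before anything is written. Laravel-Excel 2.x ships a chunk filter that reads the file in slices instead. This is only a sketch under that assumption; the chunk sizes are illustrative and should be tuned, and error handling is omitted:

```php
// Sketch, assuming Laravel-Excel 2.x's chunk filter. Reads 1000 rows
// at a time instead of loading all 1.4M rows, then bulk-inserts each
// slice so no single statement grows too large.
Excel::filter('chunk')->load($file)->chunk(1000, function ($results) {
    $bulk_data = [];
    foreach ($results as $row) {
        $bulk_data[] = [
            "med"           => trim($row["med"]),
            "serial_number" => trim($row["nro.seriemedidor"]),
        ];
    }
    // one multi-row insert per 500 rows of the current slice
    foreach (array_chunk($bulk_data, 500) as $chunk) {
        DB::table('medidores')->insert($chunk);
    }
});
```

This keeps memory roughly constant regardless of file size, at the cost of re-opening the spreadsheet for each slice; combined with the bulk insert above it addresses both the request count and the memory footprint.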