I need to insert millions of records from one table to another in MySQL, in batches of 500k



Hi, can anyone help me write a procedure that loops and inserts 7 million records (a few million records per year) in MySQL?

I need to insert them in batches of 500,000 rows each.

The data is in the table "Archive_data" and needs to be inserted into "Stg_table".

The archive data holds yearly sales, and I want to write a loop in MySQL that inserts one batch of 500K rows at a time.

This is what I tried:

insert into SDL_Stg_Bill_Details
select SDL_Id, Rec_Is_Processed, concat(Bill_Header_Key, '_', Row_Num), Bill_Header_Key, Row_Num
from (
    SELECT SDL_Id, Rec_Is_Processed, Bill_Details_Key, Bill_Header_Key,
           ROW_NUMBER() OVER (PARTITION BY Bill_Header_Key ORDER BY SDL_Id) AS Row_Num
    FROM PANTALOONS_SOLUS_PROD.SDL_Stg_Bill_Details_Archive
    WHERE EXTRACT(YEAR_MONTH FROM Bill_Date) IN ('201406', '201407', '201408', '201409', '201410')
) ff;

If I try all 7M records in one statement, I immediately get a "Lock wait timeout exceeded" error.
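The only split I can picture so far is running the same INSERT once per month (the WHERE clause already filters on YEAR_MONTH), but that gives uneven chunks rather than fixed 500K batches, for example:

insert into SDL_Stg_Bill_Details
select SDL_Id, Rec_Is_Processed, concat(Bill_Header_Key, '_', Row_Num), Bill_Header_Key, Row_Num
from (
    SELECT SDL_Id, Rec_Is_Processed, Bill_Details_Key, Bill_Header_Key,
           ROW_NUMBER() OVER (PARTITION BY Bill_Header_Key ORDER BY SDL_Id) AS Row_Num
    FROM PANTALOONS_SOLUS_PROD.SDL_Stg_Bill_Details_Archive
    WHERE EXTRACT(YEAR_MONTH FROM Bill_Date) = '201406'   /* repeat once per month */
) ff;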

Thanks in advance.

Not sure if this is exactly what you are looking for, but I hope that with a few adjustments it may help you:

DELIMITER $
CREATE PROCEDURE CopyDataInBatch()
BEGIN
    DECLARE x INT;   /* offset of the current batch */
    DECLARE y INT;   /* batch size */
    SET x = 0;
    SET y = 10000;
    the_loop: LOOP
        IF x >= 1000000 THEN   /* Use the number of rows you want to copy */
            LEAVE the_loop;
        END IF;
        INSERT INTO table_in_which_copying (col_name_1, col_name_2)
        SELECT col_name_1, col_name_2 FROM table_from_which_copying
        ORDER BY col_name_1
        LIMIT x, y;   /* LIMIT offset, row_count */
        SET x = x + y;
        SELECT 'Copied 10000 rows';   /* Only for testing... better remove it(?) */
    END LOOP;
END $
DELIMITER ;
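For the original 7M-row case two adjustments may matter: LIMIT offset, count gets slower as the offset grows, and keeping each batch as its own autocommitted INSERT is what avoids the lock wait timeout. Below is only a sketch adapted to the tables from the question that walks SDL_Id ranges of 500K instead of using an offset; it assumes SDL_Id is a numeric, indexed key on the archive table (not stated in the question), and note that if rows sharing a Bill_Header_Key fall into two different SDL_Id ranges, the ROW_NUMBER values will restart per batch:

DELIMITER $
CREATE PROCEDURE CopyArchiveByIdRange()
BEGIN
    DECLARE v_from  BIGINT;
    DECLARE v_max   BIGINT;
    DECLARE v_batch BIGINT DEFAULT 500000;   /* requested batch size */

    /* Assumption: SDL_Id is numeric and indexed on the archive table */
    SELECT MIN(SDL_Id), MAX(SDL_Id) INTO v_from, v_max
    FROM PANTALOONS_SOLUS_PROD.SDL_Stg_Bill_Details_Archive;

    WHILE v_from <= v_max DO
        /* Each INSERT commits on its own (autocommit), keeping locks short */
        INSERT INTO SDL_Stg_Bill_Details
        SELECT SDL_Id, Rec_Is_Processed,
               concat(Bill_Header_Key, '_', Row_Num), Bill_Header_Key, Row_Num
        FROM (
            SELECT SDL_Id, Rec_Is_Processed, Bill_Details_Key, Bill_Header_Key,
                   ROW_NUMBER() OVER (PARTITION BY Bill_Header_Key ORDER BY SDL_Id) AS Row_Num
            FROM PANTALOONS_SOLUS_PROD.SDL_Stg_Bill_Details_Archive
            WHERE SDL_Id >= v_from
              AND SDL_Id <  v_from + v_batch
              AND EXTRACT(YEAR_MONTH FROM Bill_Date)
                  IN ('201406', '201407', '201408', '201409', '201410')
        ) ff;
        SET v_from = v_from + v_batch;
    END WHILE;
END $
DELIMITER ;

CALL CopyArchiveByIdRange();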
