php - Inserting large CSV with multiple SQL calls too slow


At my company, we use a program that reads CSV files and inserts the data into a database. I'm having trouble because it needs to be able to insert a large amount of data (10,000 rows) at a time. At first I had it loop through and insert each record one at a time. That was slow because it called the insert function 10,000 times... Next I tried grouping the inserts, concatenating 50 rows at a time into a single SQL call. I have also tried grouping the SQL calls 1,000 rows at a time, but it is still too slow.
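For reference, this is roughly the batching approach I tried (a simplified sketch; the table and column names are placeholders, and I'm assuming MySQL accessed through PDO):

<?php
// Rough sketch of the batched insert: one multi-row INSERT per chunk
// instead of one statement per record. Table and column names are
// placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$rows = array_map(
    'str_getcsv',
    file('users.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
);

foreach (array_chunk($rows, 1000) as $chunk) {
    $placeholders = [];
    $values = [];
    foreach ($chunk as $row) {
        $placeholders[] = '(?, ?)';
        $values[] = $row[0]; // username
        $values[] = $row[1]; // password
    }
    $sql = 'INSERT INTO users (username, password) VALUES '
         . implode(', ', $placeholders);
    $pdo->prepare($sql)->execute($values);
}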

Another thing is that I have to change the data. The client gives us a spreadsheet with data such as usernames and passwords. If usernames are the same, I change them by adding a number at the end, i.e. jodoe, jodoe1. And in the case where there is no password or username, I have to generate one. The reason I bring this up is that I have read about using LOAD DATA INFILE, which reads the file very fast and puts it into a table, but I need to edit the data before it goes into the table.
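The fix-up rule itself is simple; roughly this (the function name is made up, just to show the logic):

<?php
// Make a username unique by appending an incrementing number:
// jodoe -> jodoe, jodoe1, jodoe2, ...
function makeUnique($username, array &$seen) {
    $candidate = $username;
    $n = 1;
    while (isset($seen[$candidate])) {
        $candidate = $username . $n++;
    }
    $seen[$candidate] = true;
    return $candidate;
}

$seen = [];
echo makeUnique('jodoe', $seen), "\n"; // jodoe
echo makeUnique('jodoe', $seen), "\n"; // jodoe1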

It times out after 120 seconds, and for the rows it didn't finish in time it inserted 0's. I need to speed this up so it doesn't take that long. I do not want to change the time limit, because it's a company setting. What is the most efficient way to insert this many rows from a CSV file into the database?

LOAD DATA INFILE can perform numerous preprocessing operations as it loads the data. That might be enough on its own. If it is not, run a PHP script that processes one CSV file into another, temporary, CSV file, making your edits as it goes. Then use LOAD DATA INFILE on the newly created file.
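A minimal sketch of that second approach, assuming MySQL accessed through PDO, a users (username, password) table, and LOCAL INFILE enabled on both the client and the server; the file names, column layout, and credential generation are placeholders:

<?php
// Preprocess the client's CSV into a temporary CSV, fixing the data
// as we go, then bulk-load the result with LOAD DATA LOCAL INFILE.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb',
    'user',
    'pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true]
);

$in  = fopen('client.csv', 'r');
$tmp = tempnam(sys_get_temp_dir(), 'csv');
$out = fopen($tmp, 'w');

$seen = [];
while (($row = fgetcsv($in)) !== false) {
    list($username, $password) = $row;

    // Generate missing credentials (example generation only).
    if ($username === '') $username = 'user' . bin2hex(random_bytes(3));
    if ($password === '') $password = bin2hex(random_bytes(8));

    // De-duplicate usernames: jodoe, jodoe1, jodoe2, ...
    $candidate = $username;
    $n = 1;
    while (isset($seen[$candidate])) {
        $candidate = $username . $n++;
    }
    $seen[$candidate] = true;

    fputcsv($out, [$candidate, $password]);
}
fclose($in);
fclose($out);

// One bulk load instead of thousands of INSERT statements.
$sql = "LOAD DATA LOCAL INFILE " . $pdo->quote($tmp) . "
        INTO TABLE users
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        (username, password)";
$pdo->exec($sql);
unlink($tmp);

The preprocessing pass is a single sequential read and write, so it stays well under the 120-second limit, and the one LOAD DATA statement replaces the thousands of inserts that were timing out.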

