What’s the best way to update several (200k) rows in a database?

Question :

I’ve got a database in which I need to update a column in some rows. I’ve got the data from my logs, and I wrote a PHP script that processes each log and writes an UPDATE statement to a file for each record that needs it.

I broke it down into one file per month, but running these is taking forever. Is there a better way to do it?

The file is basically a .sql file filled with 205k statements like

update table set ended_at = '2019-01-31 23:59:30' where uuid = '304ad6ea-b949-42f5-96a0-a7a550e6126d';

Is there a better way?

Answer :

I would suggest that the schema is poorly designed if you have 205K rows with the identical uuid.

Furthermore, if you need all of them to have the same ended_at value, then there should be some other table that has, for example, one row with uuid and ended_at. The JOIN to get the data during a SELECT will be more than compensated for by having removed ended_at from the big table.
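To make the normalization concrete: the idea is to store ended_at once per uuid in a small side table and join it in at SELECT time. A minimal sketch, using SQLite via Python's sqlite3 for illustration (the table names big and sessions are hypothetical; the same JOIN works in MySQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The big table keeps only the per-row data; ended_at lives in a side table.
conn.execute("CREATE TABLE big (id INTEGER PRIMARY KEY, uuid TEXT)")
conn.execute("CREATE TABLE sessions (uuid TEXT PRIMARY KEY, ended_at TEXT)")

conn.executemany("INSERT INTO big (uuid) VALUES (?)",
                 [("u1",)] * 3 + [("u2",)] * 2)
conn.execute("INSERT INTO sessions VALUES ('u1', '2019-01-31 23:59:30')")
conn.execute("INSERT INTO sessions VALUES ('u2', '2019-02-28 23:59:30')")

# One JOIN at read time replaces 205k writes to the big table.
rows = conn.execute("""
    SELECT b.id, s.ended_at
    FROM big b
    JOIN sessions s ON s.uuid = b.uuid
""").fetchall()
print(len(rows))  # 5
```

Updating an ended_at value then touches one row in sessions instead of every matching row in the big table.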

Perhaps I am being overly critical. If so, please provide some more insight into what the columns mean, and where the uuids come from, etc.

The best way to update “many” rows in a “huge” table is to walk through the table by the PRIMARY KEY, and update a thousand rows at a time. More details on chunking: http://mysql.rjweb.org/doc.php/deletebig#deleting_in_chunks
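The chunking pattern from that link can be sketched as follows, again using SQLite via sqlite3 for a runnable illustration (assuming an AUTO_INCREMENT-style integer PRIMARY KEY named id; in MySQL you would walk the key the same way):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, ended_at TEXT)")
conn.executemany("INSERT INTO t (id) VALUES (?)",
                 [(i,) for i in range(1, 10001)])
conn.commit()

CHUNK = 1000
max_id = conn.execute("SELECT MAX(id) FROM t").fetchone()[0]
last_id = 0
while last_id < max_id:
    # Update one PRIMARY KEY range at a time instead of the whole table.
    conn.execute(
        "UPDATE t SET ended_at = '2019-01-31 23:59:30' "
        "WHERE id > ? AND id <= ?",
        (last_id, last_id + CHUNK),
    )
    conn.commit()  # commit each chunk: transactions and row locks stay short
    last_id += CHUNK

updated = conn.execute(
    "SELECT COUNT(*) FROM t WHERE ended_at IS NOT NULL").fetchone()[0]
print(updated)  # 10000
```

Short per-chunk transactions avoid one giant undo log and keep other queries from blocking on the table for the whole run.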

The best technique to update from another table is an inner join:

UPDATE tblPro T1
JOIN otherTable T2 ON T2.Doc_ID = T1.Doc_ID
SET T1.dept_ID = T2.dept_ID;
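Applied to the original question, this means bulk-loading the (uuid, ended_at) pairs from the logs into a staging table once (e.g. with LOAD DATA INFILE in MySQL) and issuing a single join update instead of 205k individual statements. A sketch with hypothetical table names, using SQLite via sqlite3 (SQLite lacks MySQL's multi-table UPDATE, so the join is expressed as a correlated subquery):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (uuid TEXT PRIMARY KEY, ended_at TEXT)")
conn.execute("CREATE TABLE staging (uuid TEXT PRIMARY KEY, ended_at TEXT)")
conn.executemany("INSERT INTO big (uuid) VALUES (?)",
                 [("u%04d" % i,) for i in range(5000)])

# One bulk insert of the log-derived pairs instead of 205k UPDATEs.
pairs = [("u%04d" % i, "2019-01-31 23:59:30") for i in range(0, 5000, 2)]
conn.executemany("INSERT INTO staging VALUES (?, ?)", pairs)

# A single statement updates every matching row via the staging table.
conn.execute("""
    UPDATE big
    SET ended_at = (SELECT s.ended_at FROM staging s WHERE s.uuid = big.uuid)
    WHERE uuid IN (SELECT uuid FROM staging)
""")
conn.commit()

filled = conn.execute(
    "SELECT COUNT(*) FROM big WHERE ended_at IS NOT NULL").fetchone()[0]
print(filled)  # 2500
```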

If you don’t have another table, just create a view or function, then join it with T1 and update as you like.
Tariq Jamil
