Entry tags:
Batch processing of huge tables
So, I just committed a code merge for an issue we found via LJ, where comment properties weren't being deleted when the comment was.
That takes care of new properties, but we still have old ones hanging around! How do we handle this? I'm assuming that due to the size of talkprop2, doing:
delete talkprop2 from talkprop2, talk2
  where talkprop2.journalid = talk2.journalid
    and talkprop2.jtalkid = talk2.jtalkid
    and talk2.state = "D";
would be a horrible idea, but what can we do to make it less horrible? Should we even try to delete?
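One way to make it less horrible is to not run the delete as a single giant statement, but to chunk it: select a small batch of orphaned rows, delete just those, commit, and repeat until nothing matches. Below is a minimal sketch of that pattern using sqlite3 as a stand-in for MySQL (the real tables live in MySQL, and the tiny batch size and seed data here are purely illustrative); the table and column names mirror the query above.

```python
import sqlite3

# Sketch of batched deletion of orphaned talkprop2 rows.
# sqlite3 stands in for MySQL; schema is reduced to the columns used here.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE talk2 (journalid INTEGER, jtalkid INTEGER, state TEXT)")
cur.execute("CREATE TABLE talkprop2 (journalid INTEGER, jtalkid INTEGER, value TEXT)")

# Seed data: five comments, the even-numbered ones marked deleted ("D").
for j in range(1, 6):
    cur.execute("INSERT INTO talk2 VALUES (1, ?, ?)", (j, "D" if j % 2 == 0 else "A"))
    cur.execute("INSERT INTO talkprop2 VALUES (1, ?, 'x')", (j,))

BATCH = 1  # tiny so the loop iterates in this demo; a real run would use ~1000
while True:
    # Grab a small batch of property rows whose comment is deleted.
    rows = cur.execute(
        """SELECT tp.journalid, tp.jtalkid
             FROM talkprop2 tp
             JOIN talk2 t ON tp.journalid = t.journalid
                         AND tp.jtalkid = t.jtalkid
            WHERE t.state = 'D'
            LIMIT ?""",
        (BATCH,),
    ).fetchall()
    if not rows:
        break
    for journalid, jtalkid in rows:
        cur.execute(
            "DELETE FROM talkprop2 WHERE journalid = ? AND jtalkid = ?",
            (journalid, jtalkid),
        )
    conn.commit()  # committing per batch keeps each transaction short

remaining = cur.execute("SELECT COUNT(*) FROM talkprop2").fetchone()[0]
print(remaining)  # props for the three non-deleted comments survive
```

Keeping each transaction small means the table is never locked for long, and the loop can sleep between batches (or be paused entirely) during peak load.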
However, I'm not opposed to purging the data. We can run this on the inactive database first to see how slow the operation is, and then run it at non-peak times on the active database.
I suppose the answer is yes, we should try to delete it.