
Parsing and updating a very large table

Posted: Wed 15 Mar 2006 22:43
by thornad
What would be the best way to parse a 100-million-record table, record by record, and update certain fields on some of those records?

Basically I have a table of 100 mil records and a text file of 100 mil lines.
Both are sorted by product_id code.
All I need to do is walk the table and look up the matching product_id in the text file. If the 'price' column differs, I have to update the database table.
The problem is that it takes forever if I run 'update table set price=xxx where product_id=yyy' for each of the 100 million lines in the text file.

It takes about 2.5 days because the program processes about 500 lines per second (100,000,000 lines / 500 per second ≈ 200,000 seconds). I need it to run 10-20 times faster than that.
Any suggestions are appreciated. Thank you,
thor

Posted: Thu 16 Mar 2006 11:23
by Antaeus
Please specify some details:
how many records are usually updated;
which machine should bear the heavier load: the server or the client.

Posted: Thu 16 Mar 2006 15:05
by jfpicard
How about loading your text file into a temporary table and then writing some SQL to update only the fields that differ (with a JOIN, for example)?
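
Something along these lines, assuming a MySQL-style server and guessing at your table, column, and file names (products, prices_import, product_id, price):

-- scratch table for the file contents (names and file format are assumptions)
CREATE TEMPORARY TABLE prices_import (
    product_id INT NOT NULL,
    price      DECIMAL(10,2) NOT NULL,
    PRIMARY KEY (product_id)
);

-- bulk-load the text file in one pass
LOAD DATA INFILE '/path/to/prices.txt'
INTO TABLE prices_import
FIELDS TERMINATED BY '\t'
(product_id, price);

-- one set-based statement instead of millions of single-row updates;
-- the WHERE clause touches only rows whose price actually changed
UPDATE products p
JOIN prices_import i ON i.product_id = p.product_id
SET p.price = i.price
WHERE p.price <> i.price;

That way the load and the update each run as a single server-side pass, which is usually far faster than issuing row-by-row statements from the client.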

Posted: Thu 23 Mar 2006 21:23
by thornad
about 18 million records in the database
up to 500,000 records in the file

I am trying the bulk insert into a temp table followed by the update.
I did a manual test and it looks like it will go about 10 times faster (2 minutes instead of 20).

Thank you.