Parsing and updating a very large table

Discussion of open issues, suggestions and bugs regarding SDAC (SQL Server Data Access Components) for Delphi, C++Builder, Lazarus (and FPC)
thornad
Posts: 2
Joined: Wed 15 Mar 2006 22:31

Parsing and updating a very large table

Post by thornad » Wed 15 Mar 2006 22:43

What would be the best way to parse a 100-million-record table, record by record, and update certain fields on some of those records?

Basically I have a table of 100 million records and a text file of 100 million lines.
Both are sorted by product_id code.
All I need to do is walk the table and look up the matching line in the text file. If the 'price' column differs, I have to update the database table.
The problem is that it takes forever if I use 'update table set price=xxx where product_id=yyy' for each of the 100 million lines in the text file.
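For reference, that is one statement, and one client/server round trip, per line of the file (a minimal sketch; 'products' is a placeholder table name and @id/@new_price are placeholder parameters):

    -- Executed once per line of the text file: roughly 100 million
    -- separate round trips, which is why throughput stays around
    -- 500 rows per second.
    UPDATE products SET price = @new_price WHERE product_id = @id;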

It takes about 2.5 days because the program processes about 500 lines per second. I need to get it 10-20 times faster than that.
Any suggestion is appreciated, thank you,
thor

Antaeus
Posts: 2098
Joined: Tue 14 Feb 2006 10:14

Post by Antaeus » Thu 16 Mar 2006 11:23

Please specify some details:
how many records are usually updated;
which machine should take the heavier load: the server or the client?

jfpicard
Posts: 24
Joined: Tue 06 Dec 2005 21:17

Post by jfpicard » Thu 16 Mar 2006 15:05

How about loading your text file into a temporary table and then using SQL to update only the fields that differ (with a JOIN, for example)?
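A minimal T-SQL sketch of that idea (the table, column, and type names are placeholders, not from this thread):

    -- Stage the file's contents in a temporary table.
    CREATE TABLE #staging (
        product_id INT   NOT NULL PRIMARY KEY,
        price      MONEY NOT NULL
    );

    -- One set-based statement replaces the per-row updates: join on
    -- product_id and touch only the rows whose price actually changed.
    UPDATE p
    SET    p.price = s.price
    FROM   products AS p
           JOIN #staging AS s ON s.product_id = p.product_id
    WHERE  p.price <> s.price;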

thornad
Posts: 2
Joined: Wed 15 Mar 2006 22:31

Post by thornad » Thu 23 Mar 2006 21:23

about 18 million records in the database
up to 500,000 records in the file

I am trying the bulk insert into a temp table followed by the update.
I did a manual test and it looks like it will run about 10 times faster (2 minutes instead of 20).
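For the loading step, BULK INSERT is one option (a sketch assuming a comma-delimited file on a path the server can read, and the #staging table from the sketch above; bcp or a client-side loader such as SDAC's TMSLoader would also work):

    -- Load the whole text file into the staging table in one pass.
    BULK INSERT #staging
    FROM 'C:\data\prices.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');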

Thank you.
