
Really slow db update

Posted: Sun 11 Apr 2010 09:04
by Raptor
Hi, I'm trying UniDAC with an SQLite DB (around 1 GB database size). I have a table with 600,000 records and I'm trying to update every record to a new value. I'm using a TUniTable to load all the data (CachedUpdates = True). Updating all the rows is OK (a few seconds), but calling CommitUpdates() takes far too long (more than 12 hours!!!!).

There is no grid attached to the table/datasource and there are no event handlers to slow it down.
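Roughly what my code looks like (simplified; the table name, the field name and the Encrypt call are just placeholders for my real code):

---

UniTable1.TableName := 'mytable';     // placeholder table name
UniTable1.CachedUpdates := True;
UniTable1.Open;

UniTable1.First;
while not UniTable1.Eof do
begin
  UniTable1.Edit;
  // Encrypt stands in for my crypto step
  UniTable1.FieldByName('data').AsString := Encrypt(UniTable1.FieldByName('data').AsString);
  UniTable1.Post;
  UniTable1.Next;
end;

UniTable1.CommitUpdates;   // <-- this is the call that runs for 12+ hours

---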

Where is the problem? Can anyone help me? What is wrong?

Thanx

Posted: Sun 11 Apr 2010 09:15
by hughespa
600000 records is a lot to bring to the client, update and send back. Are you not able to execute an SQL statement on the server side to update these records? Do you need them on the client to do some sort of test on them before updating them and sending them all back to the server?

Regards, Paul.

Posted: Sun 11 Apr 2010 09:20
by Raptor
This is a one-off update for the whole table, something like a small batch crypto operation. There is no easy way to do it with a SQL UPDATE, so it is better to compute the new values on the client side. But I have no idea why updating every record takes 30+ seconds. It seems like every single update runs in its own transaction, or something like that.

Posted: Sun 11 Apr 2010 09:34
by hughespa
If you need them all on the client to do the processing, then I think I would do it this way:

1. Back up the table on the server (or get an empty copy of it to put the results in).

2. Select the records and process them on the client as you do now, but write the amended records to a copy of the table held on the client, e.g. in a VirtualTable (or you could actually load them into the virtual table and update them there).

3. Empty the table on the server (or work with your empty copy).

4. Add all your new records from the virtual table using UniLoader (this will load them the fastest way); see the sketch below.
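Something along these lines, just an untested sketch: 'mytable' is a placeholder, and I'm assuming your UniDAC version has TUniLoader.LoadFromDataSet (otherwise the loader can be filled through its OnPutData event):

---

// 1. copy the selected records (data + structure) to a client-side TVirtualTable
VirtualTable1.Assign(UniQuery1);
// ... loop over VirtualTable1 here and apply your changes ...

// 2. empty the table on the server
UniConnection1.ExecSQL('DELETE FROM mytable', []);

// 3. bulk-load the processed rows back with TUniLoader
UniLoader1.Connection := UniConnection1;
UniLoader1.TableName := 'mytable';
UniLoader1.LoadFromDataSet(VirtualTable1);

---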

I do something similar with SQL Server. What DB are you using?

Regards, Paul.

Posted: Sun 11 Apr 2010 09:39
by Raptor
It is a single-file SQLite database. I'll test UniLoader. This is the first time I'm trying UniDAC components on an SQLite DB, so I just have to check all the possibilities. And because this really is a one-use application, I don't care about the amount of data loaded to the client. I'm just looking for a solution to that slow update.

Posted: Sun 11 Apr 2010 09:52
by hughespa
I'm not really familiar with SQLite. If that is running on the same PC as your application (not a remote DB server) then I'm not sure why there should be such a long delay in updating those records.

If it is local data, I would think cached updates will be slower than direct updates and it may be using up all your RAM and thrashing the swap file too much (just guessing).

Maybe when the developers can reply on Monday they will give you a better idea of what to look for.

Regards, Paul.

Posted: Sun 11 Apr 2010 09:56
by Raptor
Yes, the file is on my PC and there is no other user or application using it. A direct update without cached mode had the same problem; updating every row was just as slow. Well, I will wait till Monday.

Posted: Sun 11 Apr 2010 14:35
by Joao Cadilhe
1 - Verify whether your SQLite database file is being monitored by an antivirus. I had problems in the past because the antivirus was scanning database files on every update or insert, slowing down all operations. Maybe this is your problem.
2 - You can try to open and commit a transaction for each group of 1000 records in your update routine (see the sketch below).
3 - Use a unidirectional dataset to loop through all the records in your update routine.
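A rough sketch of points 2 and 3 (the table name and the crypt step are placeholders, and I'm assuming the table has a primary key so the UPDATE statements can be generated):

---

var
  i: Integer;
begin
  UniQuery1.SQL.Text := 'select * from mytable';   // 'mytable' is a placeholder
  UniQuery1.UniDirectional := True;                // fetch forward-only, uses less memory
  UniQuery1.Open;

  UniConnection1.StartTransaction;
  i := 0;
  while not UniQuery1.Eof do
  begin
    UniQuery1.Edit;
    // ... compute and assign the new (encrypted) field values here ...
    UniQuery1.Post;

    Inc(i);
    if i mod 1000 = 0 then            // commit every 1000 updated records
    begin
      UniConnection1.Commit;
      UniConnection1.StartTransaction;
    end;

    UniQuery1.Next;
  end;
  UniConnection1.Commit;              // commit the last partial group
end;

---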

Joao Cadilhe.

Posted: Mon 12 Apr 2010 14:54
by bork
Hello

You are trying to load all 600,000 records to the client, update them, and save them back to the database. This is a huge amount of data for network traffic (if you are using a remote database) and for memory on the client computer, so it is not the right method if you want good performance.

I fully agree with the previous suggestions.

Try updating just the first 100 records and tell us how long it takes:

---

var
  start_transaction: TDateTime;
begin
  UniQuery1.CachedUpdates := False;   // post each change directly
  UniQuery1.UniDirectional := True;   // forward-only fetch, uses less memory
  UniQuery1.FetchRows := 50;

  UniQuery1.Open;
  UniQuery1.First;

  UniConnection1.StartTransaction;    // one transaction for the whole batch
  start_transaction := Now;

  while not UniQuery1.Eof do
  begin
    UniQuery1.Edit;
    CryptRecord;                      // your crypt procedure
    UniQuery1.Post;

    UniQuery1.Next;
    if UniQuery1.RecNo = 101 then     // stop after the first 100 records
      break;
  end;

  UniConnection1.Commit;

  ShowMessage(FloatToStr((Now - start_transaction) * 24 * 60 * 60) + ' sec');
end;

---

CryptRecord is your crypt procedure.


Then try updating the first 100 records again, but without the call to your crypt procedure. Execute the same code with that one line commented out:
// CryptRecord;

And tell us how long it takes this time.

Posted: Wed 21 Apr 2010 19:52
by Raptor
Hi, sorry for the late reply.

I tried the update with a UniQuery instead of a UniTable and found another strange thing :-) With the example from the post above it was OK: the full update of all records was done after 1-2 minutes, which is fine for me. BUT it only works if there is a transaction on the UniConnection. Without a transaction it is slow again, like before.

Just want to know - why?

Posted: Fri 23 Apr 2010 07:36
by bork
Hello

If you update the database without an explicit transaction, the database performs a commit after each of your updates. If you update the database inside a transaction, the database performs a commit only when the transaction is finished. Each commit takes some time and decreases performance.

Posted: Sat 22 May 2010 11:22
by sandy771
I am having similar problems with speed.

I create an in-memory SQLite database:
UniSQL -> UniConnection
I DO NOT specify a database name in the UniConnection Database property.

I can load 350,000 fields in 6 minutes using "insert into mytable ....."

If I specify a filename in the UniConnection Database property, the load time looks like it would be hours.
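Roughly what the load loop looks like (simplified; the table, the columns and the Data array are just placeholders for my real code):

---

var
  i: Integer;
begin
  UniSQL1.Connection := UniConnection1;
  UniSQL1.SQL.Text := 'insert into mytable (id, value) values (:id, :value)';

  for i := 0 to High(Data) do          // Data stands in for where the values come from
  begin
    UniSQL1.ParamByName('id').AsInteger := i;
    UniSQL1.ParamByName('value').AsFloat := Data[i];
    UniSQL1.Execute;
  end;
end;

---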

Any ideas?

Posted: Sat 22 May 2010 15:30
by sandy771
Raptor wrote: Hi, sorry for the late reply.

I tried the update with a UniQuery instead of a UniTable and found another strange thing :-) With the example from the post above it was OK: the full update of all records was done after 1-2 minutes, which is fine for me. BUT it only works if there is a transaction on the UniConnection. Without a transaction it is slow again, like before.

Just want to know - why?
What do you mean by "BUT it only works if there is a transaction on the UniConnection"?

I am having a similar problem.

Posted: Tue 25 May 2010 09:56
by bork
Hello

Try calling UniConnection1.StartTransaction before inserting each group of 100 records and UniConnection1.Commit after each group.

For example:

---

  UniConnection1.StartTransaction;

  for i := 0 to InsertRecordCount - 1 do
  begin
    UniQuery1.Insert;
    // ... here fill new record
    UniQuery1.Post;

    if i mod 100 = 99 then
    begin
      UniConnection1.Commit;
      UniConnection1.StartTransaction;
    end;
  end;

  UniConnection1.Commit;

---