Really slow db update
Hi, I'm trying UniDAC with an SQLite database (around 1 GB in size). I have a table with 600,000 records, and I'm trying to update every record to a new value. I'm using TUniTable to load all the data (CachedUpdates = True). Updating all the rows is fast (a few seconds), but calling CommitUpdates() takes extremely long (more than 12 hours!).
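A minimal sketch of what the code does (the actual field assignments are omitted):
---
UniTable1.CachedUpdates := True;
UniTable1.Open;
while not UniTable1.Eof do
begin
  UniTable1.Edit;
  // ... assign the new value(s) to the current record ...
  UniTable1.Post;         // fast: the change only goes into the client-side cache
  UniTable1.Next;
end;
UniTable1.CommitUpdates;  // slow: this is where the 12+ hours are spent
---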
There is no grid attached to the table/datasource, and there is no event handler slowing it down.
Where is the problem? Can anyone help me? What is wrong?
Thanx
This is a one-off update for the whole table, something like a small batch crypto operation. There is no easy way to do it with an SQL UPDATE, so it is better to compute the new values on the client side. But I have no idea why the update of each record takes 30+ seconds. It seems like every part of the update runs in its own transaction, etc.
If you need them all on the client to do the processing, then I think I would do it this way:
- back up the table on the server (or get an empty copy of it to put the results in)
- select the records and process them as you do now on the client, but write the amended records to a copy of the table held on the client, e.g. in a TVirtualTable (or you could actually load them into a virtual table and update them in that table)
- empty the table on the server (or work with your empty copy)
- add all your new records from the virtual table using UniLoader; this will load them the fastest way (see the sketch after this list)
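A rough sketch of the last two steps with TVirtualTable and TUniLoader (untested; MYTABLE and the component names are placeholders, and you should check the exact OnGetColumnData signature against your UniDAC version):
---
procedure TForm1.BulkLoad;
begin
  UniLoader1.Connection := UniConnection1;
  UniLoader1.TableName := 'MYTABLE';  // placeholder: the emptied server table
  UniLoader1.CreateColumns;           // build the column list from MYTABLE
  VirtualTable1.First;                // VirtualTable1 holds the processed rows
  UniLoader1.Load;                    // rows are supplied via OnGetColumnData
end;

procedure TForm1.UniLoader1GetColumnData(Sender: TObject; Column: TDAColumn;
  Row: Integer; var Value: Variant; var IsEOF: Boolean);
begin
  // Assumes the loader asks for column 0 first on each new row (Row is 1-based).
  if (Column.Index = 0) and (Row > 1) then
    VirtualTable1.Next;
  IsEOF := VirtualTable1.Eof;
  if not IsEOF then
    Value := VirtualTable1.FieldByName(Column.Name).Value;
end;
---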
I do something similar with SQL Server. What DB are you using?
Regards, Paul.
It is an SQLite single-file database. I'll test UniLoader. This is the first time I'm trying UniDAC components on an SQLite DB, so I just have to check all the possibilities. And because this is really a single-use application, I don't care about the amount of data loaded to the client. I'm just looking for a solution to that slow update.
I'm not really familiar with SQLite. If that is running on the same PC as your application (not a remote DB server) then I'm not sure why there should be such a long delay in updating those records.
If it is local data, I would think cached updates will be slower than direct updates and it may be using up all your RAM and thrashing the swap file too much (just guessing).
Maybe when the developers reply on Monday they will give you a better idea of what to look for.
Regards, Paul.
1 - Check whether your SQLite database file is monitored by an antivirus. I had problems in the past because the antivirus was scanning database files during updates and inserts, slowing down all operations. Maybe this is your problem.
2 - You can try to open and commit a transaction for each group of 1000 records in your update routine.
3 - Use a unidirectional dataset to loop through all records in your update routine (see the sketch below).
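A rough sketch of points 2 and 3 combined (untested; MYTABLE and the component names are placeholders):
---
UniQuery1.UniDirectional := True;               // point 3: unidirectional loop
UniQuery1.SQL.Text := 'SELECT * FROM MYTABLE';  // placeholder table name
UniQuery1.Open;
UniConnection1.StartTransaction;
while not UniQuery1.Eof do
begin
  UniQuery1.Edit;
  // ... compute and assign the new values here ...
  UniQuery1.Post;
  UniQuery1.Next;
  // Point 2: commit each group of 1000 records, then start a new transaction.
  if UniQuery1.RecNo mod 1000 = 0 then
  begin
    UniConnection1.Commit;
    UniConnection1.StartTransaction;
  end;
end;
UniConnection1.Commit; // commit the final partial group
---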
Joao Cadilhe.
Hello
You are trying to load all 600,000 records to the client, update them, and save them back to the database. This is a huge amount of data for network traffic (if you are using a remote database) and for the memory of the client computer, so it is not the right method if you want good performance.
I fully agree with the previous suggestion.
Try to update the first 100 records and tell us how long it takes:
---
var
  start_transaction: TDateTime;
begin
  UniQuery1.CachedUpdates := False;
  UniQuery1.UniDirectional := True;  // fetch rows one block at a time
  UniQuery1.FetchRows := 50;
  UniQuery1.Open;
  UniQuery1.First;
  UniConnection1.StartTransaction;
  start_transaction := Now;
  while not UniQuery1.Eof do
  begin
    UniQuery1.Edit;
    CryptRecord;                     // your crypt procedure (see below)
    UniQuery1.Post;
    UniQuery1.Next;
    if UniQuery1.RecNo = 101 then    // stop after the first 100 records
      Break;
  end;
  UniConnection1.Commit;
  ShowMessage(FloatToStr((Now - start_transaction) * 24 * 60 * 60) + ' sec');
end;
---
CryptRecord is your crypt procedure.
Then try updating the first 100 records again, but remove the call to your crypt procedure. Execute the same code with one line commented out:
// CryptRecord;
And tell us how long it takes this time.
Hi, sorry for the late reply.
I tried the update with UniQuery instead of UniTable and found another strange thing.
With the example in the post above it was OK - the full update of all records was done after 1-2 minutes. That is fine for me. BUT it works only if there is a transaction on the UniConnection. Without a transaction it is slow again, like before.
I just want to know - why?
Hello
If you update a database without a transaction, the database performs a commit after each of your updates. If you update the database inside a transaction, the database performs a commit only when the transaction is finished. Each commit takes some time and decreases performance.
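For example, here is the difference in miniature (MYTABLE and the statements are illustrative):
---
// Without an explicit transaction, each statement is committed on its own:
UniConnection1.ExecSQL('UPDATE MYTABLE SET F1 = 0 WHERE ID = 1', []); // implicit commit
UniConnection1.ExecSQL('UPDATE MYTABLE SET F1 = 0 WHERE ID = 2', []); // implicit commit

// With an explicit transaction, there is a single commit for the whole batch:
UniConnection1.StartTransaction;
UniConnection1.ExecSQL('UPDATE MYTABLE SET F1 = 0 WHERE ID = 1', []);
UniConnection1.ExecSQL('UPDATE MYTABLE SET F1 = 0 WHERE ID = 2', []);
UniConnection1.Commit; // one commit for both updates
---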
I am having similar problems with speed.
I create an in-memory SQLite database:
UniSQL -> UniConnection
I DO NOT specify a database name in the UniConnection Database property.
I can load 350,000 fields in 6 minutes using "insert into mytable ....."
If I specify a filename in the UniConnection Database property, the load time looks like it would be hours.
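Roughly, the two setups look like this (a sketch; the file path is illustrative):
---
// In-memory database (what I do now - nothing is synced to disk):
UniConnection1.ProviderName := 'SQLite';
UniConnection1.Database := '';  // empty; I believe ':memory:' also works
UniConnection1.Open;

// File database (the slow case - every commit syncs to the file):
UniConnection1.ProviderName := 'SQLite';
UniConnection1.Database := 'C:\data\mydb.db3';  // illustrative path
UniConnection1.Open;
---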
Any ideas?
Raptor wrote: Hi, sorry for the late reply. I tried the update with UniQuery instead of UniTable and found another strange thing. With the example in the post above it was OK - the full update of all records was done after 1-2 minutes. That is fine for me. BUT it works only if there is a transaction on the UniConnection. Without a transaction it is slow again, like before. I just want to know - why?
What do you mean by "BUT it works only if there is a transaction on the UniConnection"? I am having a similar problem.
Hello
Try calling UniConnection1.StartTransaction before inserting each group of 100 records and UniConnection1.Commit after inserting each group of 100 records.
For example:
---
UniConnection1.StartTransaction;
for i := 0 to InsertRecordCount - 1 do
begin
  UniQuery1.Insert;
  // ... fill in the new record here ...
  UniQuery1.Post;
  // Commit every 100 inserts, then start the next transaction.
  if i mod 100 = 99 then
  begin
    UniConnection1.Commit;
    UniConnection1.StartTransaction;
  end;
end;
UniConnection1.Commit; // commit the final partial batch
---