TMyDump for huge table
Posted: Tue 18 Jan 2011 08:42
by easyblue
Hello
MyDAC v6.00.0.3 + BDS 2006
When dumping a huge table, memory is used up very quickly and an "out of memory" AV is raised.
The table has 1 million records and 2000 columns.
Is it possible to have TMyDump support huge tables?
Posted: Tue 18 Jan 2011 13:06
by AndreyZ
Hello,
You should dump huge tables to a file instead of into memory. You can use the following code for this:
Code:
MyDump.BackupToFile('your_filename');
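A fuller setup sketch, in case it helps: the connection settings, table name, and file name below are only placeholders (not values from this thread), and the unit and property names are how I remember MyDAC, so please adjust them to your project:
Code:
uses
  MyAccess, MyDump; // MyDAC units (TMyConnection, TMyDump), as I recall them

procedure DumpHugeTable;
var
  Con: TMyConnection;
  Dump: TMyDump;
begin
  Con := TMyConnection.Create(nil);
  Dump := TMyDump.Create(nil);
  try
    // Placeholder connection settings
    Con.Server   := 'localhost';
    Con.Username := 'user';
    Con.Password := 'password';
    Con.Database := 'mydb';
    Con.Connect;

    Dump.Connection := Con;
    // Assumption: TableNames accepts a semicolon-separated list of tables
    Dump.TableNames := 'huge_table';

    // BackupToFile writes the dump to disk instead of keeping it in memory
    Dump.BackupToFile('c:\backup\huge_table.sql');
  finally
    Dump.Free;
    Con.Free;
  end;
end;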
Posted: Tue 18 Jan 2011 13:46
by easyblue
Hello
It does not work.
I used BackupToFile as suggested, and memory was quickly eaten up until I got the "out of memory" AV.
Reading the source code, if I understand it correctly, BackupToFile first dumps into a TStrings object and then writes this object to the file, so for a huge table this will not work, since there will never be enough memory.
Would you please check this problem?
Posted: Tue 18 Jan 2011 15:50
by AndreyZ
When TMyDump backs up to a file, it writes the data directly to the file. I have tried backing up a 4GB table; the project used only 8MB of memory. I can send you a sample for testing. Please send an e-mail address that can receive attachments to andreyz*devart*com.
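If it helps to verify that the data really goes straight to disk, a variant that writes through your own file stream might look like the sketch below. I am assuming here that TMyDump exposes a BackupToStream method (as other TDADump descendants do); if your version does not, BackupToFile already behaves as described above:
Code:
uses
  Classes, MyDump; // unit names as I recall them for Delphi/MyDAC

procedure BackupThroughStream(Dump: TMyDump; const FileName: string);
var
  FS: TFileStream;
begin
  FS := TFileStream.Create(FileName, fmCreate);
  try
    // Assumption: BackupToStream streams the dump directly into FS,
    // so memory usage stays low even for very large tables
    Dump.BackupToStream(FS);
  finally
    FS.Free;
  end;
end;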
Posted: Wed 19 Jan 2011 00:53
by easyblue
Hello
Maybe this problem is linked to the following post, i.e. the memory is used up not by the dumping procedure itself but by the fact that all the data are read into memory?
http://www.devart.com/forums/viewtopic. ... highlight=
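In case the culprit really is the dataset fetching every row at once, a common MyDAC-side mitigation is to fetch rows on demand. This is only a sketch based on my understanding of MyDAC's FetchAll/FetchRows options, and the query text is a placeholder:
Code:
uses
  MyAccess; // MyDAC unit, as I recall it

procedure OpenWithoutFetchAll(Con: TMyConnection);
var
  Q: TMyQuery;
begin
  Q := TMyQuery.Create(nil);
  try
    Q.Connection := Con;
    Q.SQL.Text := 'SELECT * FROM huge_table'; // placeholder query
    // Assumption: with FetchAll disabled, rows are fetched from the server
    // in blocks of FetchRows instead of being loaded into memory at once
    Q.Options.FetchAll := False;
    Q.FetchRows := 100;
    Q.Open;
    // ... process rows here ...
  finally
    Q.Free;
  end;
end;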
Posted: Wed 19 Jan 2011 13:46
by AndreyZ
You are right, these problems are connected. We have answered here:
http://www.devart.com/forums/viewtopic.php?t=20020