Hello
MyDAC v6.00.0.3 + BDS2006.
When dumping a huge table, memory is used up very quickly and an "out of memory" AV is raised. The table has 1 million records with 2000 columns.
Is it possible to have TMyDump support huge tables?
TMyDump for huge table
-
AndreyZ
Hello,
You should dump huge tables to files instead of memory. You can use the following code for this:
Code: Select all
MyDump.BackupToFile('your_filename');
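For reference, a minimal C++Builder sketch of this approach in a complete routine. The connection settings, unit names, and table name are illustrative assumptions, not part of the original reply:
Code: Select all
// Minimal sketch (assumptions: MyDAC units MyAccess.hpp / MyDump.hpp,
// a MySQL server on localhost, and a table named "big_table").
#include <vcl.h>
#include "MyAccess.hpp"   // TMyConnection
#include "MyDump.hpp"     // TMyDump

void __fastcall DumpTableToFile(void)
{
    TMyConnection *con = new TMyConnection(NULL);
    TMyDump *dump = new TMyDump(NULL);
    try
    {
        con->Server   = "localhost";      // assumed connection settings
        con->Username = "root";
        con->Password = "secret";
        con->Database = "test";
        con->Connect();

        dump->Connection = con;
        dump->TableNames = "big_table";   // table(s) to back up
        // Write the backup script to the given file
        dump->BackupToFile("C:\\dump\\big_table.sql");
    }
    __finally
    {
        delete dump;
        delete con;
    }
}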
-
Hello
It does not work. I used
Code: Select all
MyDump->BackupToFile(MyFileName);
and memory is quickly eaten up until I get an "out of memory" AV.
Reading the source code, if I understand correctly, BackupToFile first dumps everything into a TStrings object and then writes that object to the file, so for a huge table this will not work, because there will never be enough memory.
Would you please check this problem?
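As a stop-gap while this is investigated, one way to keep memory flat is to bypass TMyDump and export the table in fixed-size batches, writing each batch straight to a file stream. This is only a rough sketch under assumptions: the integer primary key column "id", the table name "big_table", the batch size, and the tab-separated output format are all invented for illustration, and it does not produce a real MySQL dump script:
Code: Select all
// Rough batching sketch (assumptions: table "big_table" with integer
// primary key "id"; output is tab-separated text, not a SQL dump).
#include <vcl.h>
#include "MyAccess.hpp"   // TMyConnection, TMyQuery

void __fastcall DumpInBatches(TMyConnection *con, const AnsiString fileName)
{
    const int BatchSize = 10000;                        // rows per round trip
    int lastId = 0;
    TFileStream *fs = new TFileStream(fileName, fmCreate);
    TMyQuery *q = new TMyQuery(NULL);
    try
    {
        q->Connection = con;
        q->SQL->Text = AnsiString("SELECT * FROM big_table WHERE id > :last "
                                  "ORDER BY id LIMIT ") + IntToStr(BatchSize);
        for (;;)
        {
            q->Close();
            q->ParamByName("last")->AsInteger = lastId;
            q->Open();
            if (q->Eof) break;                          // no rows left
            while (!q->Eof)
            {
                // Build one tab-separated line per row and append it to the file
                AnsiString line;
                for (int i = 0; i < q->FieldCount; i++)
                    line = line + q->Fields->Fields[i]->AsString + "\t";
                line = line + "\r\n";
                fs->Write(line.c_str(), line.Length());
                lastId = q->FieldByName("id")->AsInteger;
                q->Next();
            }
        }
    }
    __finally
    {
        delete q;
        delete fs;
    }
}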
-
Hello
Maybe this problem is linked to the following post, i.e. that memory is used up not by the dumping procedure itself but by the fact that all the data are read into memory?
http://www.devart.com/forums/viewtopic. ... highlight=
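If the root cause is indeed that the dataset fetches the whole result set into client memory before the dump starts, a quick way to test that hypothesis is to read the table with on-demand, forward-only fetching. A small sketch; the FetchAll, UniDirectional, and FetchRows property names are my assumptions about the MyDAC dataset API and should be checked against your version:
Code: Select all
// Sketch: scan a huge table without materializing the whole result set.
// FetchAll / UniDirectional / FetchRows are assumed MyDAC dataset properties.
#include <vcl.h>
#include "MyAccess.hpp"   // TMyConnection, TMyQuery

void __fastcall ScanHugeTable(TMyConnection *con)
{
    TMyQuery *q = new TMyQuery(NULL);
    try
    {
        q->Connection = con;
        q->SQL->Text = "SELECT * FROM big_table";
        q->FetchAll = false;        // fetch in blocks instead of all rows at once
        q->UniDirectional = true;   // forward-only cursor, rows are not cached
        q->FetchRows = 1000;        // rows per fetch block
        q->Open();
        while (!q->Eof)
        {
            // process the current row here
            q->Next();
        }
    }
    __finally
    {
        delete q;
    }
}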
-
AndreyZ
You are right, these problems are connected. We have answered here: http://www.devart.com/forums/viewtopic.php?t=20020