OutOfMemory error inserting a very large file in a blob

Discussion of open issues, suggestions and bugs regarding SDAC (SQL Server Data Access Components) for Delphi, C++Builder, Lazarus (and FPC)
brace
Posts: 213
Joined: Wed 14 Feb 2007 08:26

OutOfMemory error inserting a very large file in a blob

Post by brace » Wed 14 Jul 2010 17:14

Hello, I am getting errors when trying to insert very large files into the database. This is a BLOCKING ERROR FOR ME; please check and confirm whether it is an SDAC bug.
I am using FILESTREAM. Initially I thought this was a FILESTREAM problem, but then I tried the same insert without FILESTREAM and the problem is still there.

I hit this problem while trying to insert a big file into a varbinary(max) field. I tried different file sizes:

with 465 MB it works
with 560 MB it fails with an error

When I try to insert the document, after a while I get the following exception:

---------------------------
Debugger Exception Notification
---------------------------
Project Project1.exe raised exception class EOutOfMemory with message 'Out of memory'.
---------------------------
Break Continue Help
---------------------------


The exception comes from CLRClasses.pas:

class function Marshal.ReallocHGlobal(pv: pointer; cb: pointer): pointer;
begin
  Result := pv;
  ReallocMem(Result, Integer(cb)); // EXCEPTION COMES FROM HERE!!!
end;

I can send you a sample, but it is very easy to reproduce by following the instructions below.

TO REPRODUCE IT ADD THIS TABLE TO A NEW DATABASE:

Code: Select all

 CREATE TABLE [dbo].[DOC_FILES](
  [ID_DOC_FILE] [int] NOT NULL,
  [DOCUMENT] [varbinary](MAX) NULL,
  CONSTRAINT [PK_DOC_FILES] PRIMARY KEY CLUSTERED
  (
   [ID_DOC_FILE] ASC
  ) WITH (PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF, FILLFACTOR = 90) ON [PRIMARY]
 ) ON [PRIMARY]
Create a new Delphi project, and on a form drop a TMSConnection and a TMSQuery with the following SQL:

Code: Select all

INSERT INTO DOC_FILES(ID_DOC_FILE, DOCUMENT) 
VALUES(:ID_DOC_FILE, :DOCUMENT)
Then drop a button with this code in its OnClick event:

Code: Select all

procedure TForm1.Button1Click(Sender: TObject);
var
  sFilePath: String;
begin
  // sqlInsertDoc is the TMSQuery dropped on the form above
  sFilePath := 'D:\Test\VeryBigFile.dat';
  sqlInsertDoc.ParamByName('ID_DOC_FILE').AsInteger := 1;
  sqlInsertDoc.ParamByName('DOCUMENT').LoadFromFile(sFilePath, ftBlob);
  sqlInsertDoc.Execute;
  sqlInsertDoc.Close;
end;
Of course, try with a file larger than 560 MB.
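For anyone reproducing this: a test file of arbitrary size can be generated chunk by chunk, so the generator itself never needs much memory. A minimal Python sketch (the function name, path and sizes are placeholders, not part of the original repro):

```python
def make_test_file(path, size_bytes, chunk_size=1024 * 1024):
    """Write `size_bytes` of zero bytes to `path` in fixed-size chunks."""
    chunk = b"\x00" * chunk_size
    with open(path, "wb") as f:
        written = 0
        while written < size_bytes:
            # Write at most chunk_size bytes per iteration, trimming the
            # final chunk so the file ends up exactly size_bytes long.
            n = min(chunk_size, size_bytes - written)
            f.write(chunk[:n])
            written += n
```

For the repro above, something like `make_test_file(r'D:\Test\VeryBigFile.dat', 600 * 1024 * 1024)` would produce a 600 MB file.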

I am using SDAC 4.80.0.59 with Delphi 2009. I reproduced the problem on Windows XP, Vista and 7.

Dimon
Devart Team
Posts: 2885
Joined: Mon 05 Mar 2007 16:32

Post by Dimon » Mon 19 Jul 2010 10:07

The point is that to pass a big blob to SQL Server we have to allocate a single memory block and pass its address to the SQL provider. When such a large block is requested, Delphi cannot allocate it and raises the OutOfMemory error.
We cannot avoid this, because it is dictated by the way the SQL provider works with blobs.

brace
Posts: 213
Joined: Wed 14 Feb 2007 08:26

Post by brace » Mon 19 Jul 2010 14:18

Can you suggest a workaround? My code inserts blobs from 1 KB up to any size (of course 500 MB is not a sensible size, but I need to support it anyway).

On Stack Overflow I asked the same question and they replied suggesting the following (.NET code). Can you suggest something similar, or any workaround that lets me put a document of any size into a blob?

Taken from: http://stackoverflow.com/questions/3247 ... as-filestr

Code: Select all

'Open the FILESTREAM data file for writing
Dim fs As New SqlFileStream(filePath, txContext, FileAccess.Write)

'Open the source file for reading
Dim localFile As New FileStream("C:\temp\microsoftmouse.jpg",
                                FileMode.Open,
                                FileAccess.Read)

'Start transferring data from the source file to FILESTREAM data file
Dim bw As New BinaryWriter(fs)
Const bufferSize As Integer = 4096
Dim buffer As Byte() = New Byte(bufferSize - 1) {} 'VB array bounds are inclusive: bufferSize - 1 yields bufferSize bytes
Dim bytes As Integer = localFile.Read(buffer, 0, bufferSize)

While bytes > 0
    bw.Write(buffer, 0, bytes)
    bw.Flush()
    bytes = localFile.Read(buffer, 0, bufferSize)
End While

'Close the files
bw.Close()
localFile.Close()
fs.Close()
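The chunked pattern in the snippet above is language-independent: read the source in fixed-size buffers and append each buffer to the destination, so no single allocation ever holds the whole file. A minimal Python equivalent for illustration (writing to an ordinary file, since SqlFileStream is .NET-specific; the function name is made up):

```python
def chunked_copy(src_path, dst_path, buffer_size=4096):
    """Copy src_path to dst_path in buffer_size pieces, keeping
    memory usage bounded by buffer_size regardless of file size."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(buffer_size)
            if not chunk:  # empty read means end of file
                break
            dst.write(chunk)
```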

Dimon
Devart Team
Posts: 2885
Joined: Mon 05 Mar 2007 16:32

Post by Dimon » Wed 21 Jul 2010 13:44

Unfortunately, SDAC does not currently support FILESTREAM, so you cannot use this approach. We will investigate the possibility of adding this functionality in the future.

brace
Posts: 213
Joined: Wed 14 Feb 2007 08:26

Post by brace » Thu 22 Jul 2010 10:38

OK, so at present it is not possible to insert a blob larger than about 500 MB with SDAC, even though SQL Server itself doesn't have this limitation, and even without FILESTREAM.

brace
Posts: 213
Joined: Wed 14 Feb 2007 08:26

Post by brace » Thu 22 Jul 2010 13:28

For now I will not allow storing big documents and will set 400 MB as the limit. But is this OutOfMemory threshold (i.e. the minimum file size that triggers the error) fixed, or does it depend on the operating system or system resources?

I use SQL Server 2008 only.

I mean: if I set 400 MB as the limit, is that reliable? Or could some machines raise the OutOfMemory error for smaller files too?

Dimon
Devart Team
Posts: 2885
Joined: Mon 05 Mar 2007 16:32

Post by Dimon » Fri 23 Jul 2010 11:01

It seems that 400 MB is a limitation of the Delphi memory manager, which generates the OutOfMemory error.
