Storing large amounts of data in PostgreSQL

Discussion of open issues, suggestions, and bugs regarding the use of dbExpress drivers for PostgreSQL in Delphi and C++Builder
Post Reply
T_MANN
Posts: 7
Joined: Sat 07 Jul 2012 07:29

Storing large amounts of data in PostgreSQL

Post by T_MANN » Sat 07 Jul 2012 08:42

Hello!
I need to store large files in PostgreSQL (version 8.4). For this I use a field of type bytea.
But when I try to save 100-200 MB of data to the DB, I get an 'Out of memory' exception.
I wrote a simple Delphi 7 application that simulates inserting large data into the blob field. Here is the script for table creation:

Code: Select all

create table test_blobs
(
	id numeric(10) primary key,
	bfile bytea default null
);

create sequence seq_test_blobs_id start with 1 increment 1 cache 10;
For the connection I use the dbExpress driver for PostgreSQL, version 1.10.0.9.
Here is the code I use for inserting a large amount of data into the DB:

Code: Select all

//this function initializes the connection to the DB
//the Open parameter indicates whether to open a new connection or close the existing one

function TMainForm.InitConnection(Open : boolean = true) : boolean;
begin
   if not Open then
   begin
      if SQLConnection.Connected then
         SQLConnection.Close;

      MessageBox(Application.Handle, PChar('Disconnected!'), PChar('Information'), MB_OK or MB_ICONINFORMATION);

      memLog.Lines.Add('------------------------------');
      Result := true;
      Exit;
   end;

   Result := SQLConnection.Connected;

   if not Result then
   begin
      SQLConnection.Params.Values['HOSTNAME'] := edHost.Text;
      SQLConnection.Params.Values['SERVER PORT'] := edPort.Text;
      SQLConnection.Params.Values['DATABASE'] := edDB.Text;
      SQLConnection.Params.Values['USER_NAME'] := edUser.Text;
      SQLConnection.Params.Values['PASSWORD'] := edPass.Text;

      try
        SQLConnection.Open;
        Result := true;

        memLog.Lines.Add(Format('HOST %s (%s)', [edHost.Text, edPort.Text]));
        memLog.Lines.Add(Format('DATABASE %s', [edDB.Text]));
        memLog.Lines.Add('------------------------------');

        MessageBox(Application.Handle, PChar('Connected!'), PChar('Information'), MB_OK or MB_ICONINFORMATION);
      except
        on E: Exception do
        MessageBox(Application.Handle, PChar('Connection failed!' + #13#10 + E.Message), PChar('Error'), MB_OK or MB_ICONERROR);
      end;
   end;
end;

//this procedure simulates loading large data to the DB
//the size of the data is measured in megabytes
procedure TMainForm.btnLoadClick(Sender: TObject);
var
  tmpQ : TSQLQuery;
  ms : TMemoryStream;
  tmpStr : string;
begin
  if edSize.Text = EmptyStr then //if the user forgot to enter the size of the loaded data,
     edSize.Text := '1'; //the size defaults to 1 MB

  if not InitConnection then
     Exit; //exit if connection initialization failed

  tmpQ := TSQLQuery.Create(nil);
  tmpQ.SQLConnection := SQLConnection;
  tmpQ.SQL.Add('insert into test_blobs values (nextval(''seq_test_blobs_id''), cast(:bfile as bytea));');

  //simulate large data using DupeString (requires the StrUtils unit)
  tmpStr := DupeString('a', StrToInt(edSize.Text) * 1024 * 1024);
  
  //load the data into a MemoryStream;
  //this roughly doubles the amount of memory in use (string + stream copy)
  ms := TMemoryStream.Create;
  ms.WriteBuffer(Pointer(tmpStr)^, Length(tmpStr));
  ms.Position := 0;
  
  //load the data into the parameter using LoadFromStream
  tmpQ.ParamByName('bfile').LoadFromStream(ms, ftBlob);

  try
     tmpQ.ExecSQL();
     memLog.Lines.Add(Format('size %s Mb : OK', [edSize.Text]));
  except
     on E : Exception do
      memLog.Lines.Add(Format('size %s Mb : FAILED. %s', [edSize.Text, E.Message]));
  end;

  ms.Free;
  tmpQ.Free;
end;
I tested this application on different machines, connecting to local and remote PostgreSQL servers. The results confused me: the amount of data I can save in the DB differs from machine to machine. On some of them I got 'Out of memory' when trying to load a 127 MB string, on others at 280 MB.
I posted the source code of my test application on the Russian forum SQL.ru (http://www.sql.ru/forum/actualthread.as ... tid=952893). A user named DEVART suggested using the Large Object data type (lo) and setting the OIDAsLargeObject and DetectParamTypes options to true, but that did not help me.
Then I tried the answer from this post: http://forums.devart.com/viewtopic.php? ... tea#p55869, also without results. When I try to insert data into a field of type OID, I get a type cast error: a field of type OID is cast as ftInteger.
Questions:
1. Is there an example of inserting data into a field of the Large Object (LO) data type?
2. What causes the OID data type to be cast as ftInteger?
P.S.
I did not find a way to attach a file to the message, but I can send the whole Delphi 7 project of my test application if it helps.

AlexP
Devart Team
Posts: 5530
Joined: Tue 10 Aug 2010 11:35

Re: Storing large amounts of data in PostgreSQL

Post by AlexP » Tue 10 Jul 2012 15:21

Hello,

1) Below is your modified sample for loading data into an OID field.

Code: Select all

procedure TForm1.SQLConnection1AfterConnect(Sender: TObject);
const
  coOIDAsLargeObject = TSQLConnectionOption(501);
  coDetectParamTypes = TSQLConnectionOption(510);
begin
  SQLConnection1.SQLConnection.SetOption(coOIDAsLargeObject, Integer(True));
  SQLConnection1.SQLConnection.SetOption(coDetectParamTypes, Integer(True));
end;

procedure TForm1.Button1Click(Sender: TObject);
var
  tmpQ : TSQLQuery;
  ms : TMemoryStream;
  tmpStr : string;
  TD: TTransactionDesc;
  MBSize: integer;
begin
  MBSize:= 500;
  tmpQ := TSQLQuery.Create(nil);
  tmpQ.SQLConnection := SQLConnection1;
  tmpQ.SQL.Add('insert into test_blobs values (nextval(''seq_test_blobs_id''), :bfile);');
  tmpStr := StringOfChar('a', MBSize * 1024 * 1024);
  ms := TMemoryStream.Create;
  ms.WriteBuffer(Pointer(tmpStr)^, Length(tmpStr));
  tmpStr := ''; //release the temporary string as early as possible to reduce peak memory usage
  ms.Position := 0;
  tmpQ.ParamByName('bfile').LoadFromStream(ms, ftBlob);
  ms.Free; //the stream copy is no longer needed once the parameter is loaded
  try
    if not SQLConnection1.InTransaction then
    begin
      TD.TransactionID := 1;
      TD.IsolationLevel := xilREADCOMMITTED;
      SQLConnection1.StartTransaction(TD);
      tmpQ.ExecSQL();
      SQLConnection1.Commit(TD);
      memLog.Lines.Add(Format('size %d Mb : OK', [MBSize]));
    end;
  except
    on E : Exception do
    begin
      SQLConnection1.Rollback(TD);
      memLog.Lines.Add(Format('size %d Mb : FAILED. %s', [MBSize, E.Message]));
    end;
  end;
  tmpQ.Free;
end;
2) An OID field stores an integer number that is the identifier of a saved large object. Unless the OIDAsLargeObject option is enabled, such a field is therefore identified as ftInteger.
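
Below is a minimal sketch of reading the stored data back, assuming the same test_blobs table and SQLConnection1 setup from the sample above (the ReadBlobBack helper name is hypothetical). With coOIDAsLargeObject enabled, the OID column surfaces as a blob field rather than ftInteger, so it can be streamed:

Code: Select all

procedure TForm1.ReadBlobBack(Id: Integer; const FileName: string);
var
  tmpQ: TSQLQuery;
  ms: TMemoryStream;
begin
  tmpQ := TSQLQuery.Create(nil);
  try
    tmpQ.SQLConnection := SQLConnection1;
    tmpQ.SQL.Text := 'select bfile from test_blobs where id = :id';
    tmpQ.ParamByName('id').AsInteger := Id;
    tmpQ.Open;
    ms := TMemoryStream.Create;
    try
      //with coOIDAsLargeObject enabled, the field arrives as a blob, not ftInteger
      TBlobField(tmpQ.FieldByName('bfile')).SaveToStream(ms);
      ms.SaveToFile(FileName);
    finally
      ms.Free;
    end;
    tmpQ.Close;
  finally
    tmpQ.Free;
  end;
end;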

T_MANN
Posts: 7
Joined: Sat 07 Jul 2012 07:29

Re: Storing large amounts of data in PostgreSQL

Post by T_MANN » Thu 12 Jul 2012 09:12

I tried the modified code. Casting OID as a BlobField works now, but I still receive an 'Out of memory' error when trying to load more than 127 MB of data. Maybe it depends on the PostgreSQL server settings; I am not sure. Here are my computer's characteristics:
  • Microsoft Windows 7 Professional (build 7600), 64-bit
  • Intel(R) Core(TM) i5-2500K CPU @ 3.30GHz
  • RAM 8173 Mb
Are there any ideas or suggestions on how I can avoid this "wonderful" error?

AlexP
Devart Team
Posts: 5530
Joined: Tue 10 Aug 2010 11:35

Re: Storing large amounts of data in PostgreSQL

Post by AlexP » Fri 13 Jul 2012 08:25

Hello,

Using the code I provided, an upload of 500 MB to the server completed with no errors on a less powerful PC. Therefore, most probably, this problem is due to the specific settings of the PostgreSQL server.

T_MANN
Posts: 7
Joined: Sat 07 Jul 2012 07:29

Re: Storing large amounts of data in PostgreSQL

Post by T_MANN » Thu 16 Aug 2012 11:46

It was not a problem with the PostgreSQL server settings: there was not enough free physical memory on my machine. I was able to save up to 590 MB of data after I rebooted my computer and stopped some processes.
But I noticed that the postgres.exe process does not free memory after data saving finishes.
The process used about 300 MB of memory after four or five attempts at loading data.
The source code of my test application is available from this link:
https://docs.google.com/open?id=0B5OiX_ ... URoQTNILUE

AlexP
Devart Team
Posts: 5530
Joined: Tue 10 Aug 2010 11:35

Re: Storing large amounts of data in PostgreSQL

Post by AlexP » Thu 16 Aug 2012 12:45

Hello,

This problem occurred when working in transactions in old versions of our dbExpress driver. In the latest version, the problem with memory not being released has been resolved. Please download the latest version of our driver to solve the problem.

T_MANN
Posts: 7
Joined: Sat 07 Jul 2012 07:29

Re: Storing large amounts of data in PostgreSQL

Post by T_MANN » Thu 16 Aug 2012 15:07

Thanks for your responses. They helped me a lot. And I hope that this topic will be very useful for other Devart dbExpress users.

RGyaznoff
Posts: 1
Joined: Thu 15 Nov 2012 13:34
Location: Moscow, Russia
Contact:

Re: Storing large amounts of data in PostgreSQL

Post by RGyaznoff » Thu 15 Nov 2012 13:38

T_MANN wrote: But when I try to save 100-200 MB of data to the DB, I get an 'Out of memory' exception. I wrote a simple Delphi 7 application that simulates inserting large data into the blob field.

AlexP
Devart Team
Posts: 5530
Joined: Tue 10 Aug 2010 11:35

Re: Storing large amounts of data in PostgreSQL

Post by AlexP » Thu 15 Nov 2012 15:22

Hello,

As was said above, the problem with memory releasing has already been solved. Try downloading the latest driver version and checking your application with it. If the problem persists, please send your application demonstrating the problem to alexp*devart*com .

NKjulanoff
Posts: 1
Joined: Fri 22 Feb 2013 04:47
Location: Moscow, Russia
Contact:

Re: Storing large amounts of data in PostgreSQL

Post by NKjulanoff » Fri 22 Feb 2013 04:47

AlexP wrote: Therefore, most probably, this problem is due to the specific settings of the PostgreSQL server.

pavshinAN
Posts: 1
Joined: Tue 26 Feb 2013 14:18
Location: Moscow, Russia
Contact:

Re: Storing large amounts of data in PostgreSQL

Post by pavshinAN » Tue 26 Feb 2013 14:19

T_MANN wrote: But I noticed that the postgres.exe process does not free memory after data saving finishes. The process used about 300 MB of memory after four or five attempts at loading data.

DemetrionQ
Devart Team
Posts: 271
Joined: Wed 23 Jan 2013 11:21

Re: Storing large amounts of data in PostgreSQL

Post by DemetrionQ » Thu 06 Jun 2013 15:05

Hello.

   When creating very large strings and attempting to transfer them to a server, it is rather easy to get the "Out of memory" error on the client side, especially when you use functions like StringOfChar to create the strings. The point is that calling StringOfChar allocates memory that cannot be freed until the procedure that calls StringOfChar finishes. Taking into account that the value returned by StringOfChar must be copied to a variable for further use, and then copied again to a parameter or a dataset, your PC may run out of memory.
   When transferring large amounts of data, the easiest way to avoid the "Out of memory" error is to execute the INSERT query with a parameter and load the data into the parameter directly from a file. For example:

Code: Select all

  SQLQuery1.SQL.Text := 'insert into TableForFile (FiledForFile) values(:data)';
  SQLQuery1.Params.ParamByName('data').LoadFromFile('C:\MyFiles.zip', ftTypedBinary);
  SQLQuery1.ExecSQL;
Note: due to the internal specifics of the dbExpress interface (developed by Embarcadero), even when using an INSERT query, on data transfer a client application allocates memory 2-3 times greater than the volume of the transferred data. So, if you try to transfer a file whose size is, for example, half of your RAM, you will most probably get the "Out of memory" error.
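
Applied to the test application from this thread, the same advice means not building the payload as one huge in-memory string at all. Below is a minimal sketch that stages the test data on disk in small chunks and then loads the parameter from that file (the WriteTestFile helper and the file path are hypothetical):

Code: Select all

procedure WriteTestFile(const FileName: string; SizeMB: Integer);
var
  fs: TFileStream;
  buf: array[0..65535] of AnsiChar;
  total, chunk: Integer;
begin
  //fill a small 64 KB buffer once and reuse it for every chunk
  FillChar(buf, SizeOf(buf), Ord('a'));
  fs := TFileStream.Create(FileName, fmCreate);
  try
    total := SizeMB * 1024 * 1024;
    while total > 0 do
    begin
      if total > SizeOf(buf) then
        chunk := SizeOf(buf)
      else
        chunk := total;
      fs.WriteBuffer(buf, chunk);
      Dec(total, chunk);
    end;
  finally
    fs.Free;
  end;
end;

//usage: the client never holds more than 64 KB of generated data at once
//WriteTestFile('C:\test_payload.bin', 500);
//SQLQuery1.Params.ParamByName('data').LoadFromFile('C:\test_payload.bin', ftBlob);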

UZmosk
Posts: 1
Joined: Fri 16 Jan 2015 10:14
Location: Moscow, Russia
Contact:

Re: Storing large amounts of data in PostgreSQL

Post by UZmosk » Fri 16 Jan 2015 10:16

T_MANN wrote: For the connection I use the dbExpress driver for PostgreSQL, version 1.10.0.9.

azyk
Devart Team
Posts: 1119
Joined: Fri 11 Apr 2014 11:47
Location: Alpha Centauri A

Re: Storing large amounts of data in PostgreSQL

Post by azyk » Wed 21 Jan 2015 10:04

Please clarify the issue. If you have a simple test sample reproducing the problem, please send it to andreyz*devart*com .

Post Reply