Out of memory error when inserting many rows

Discussion of open issues, suggestions and bugs regarding ODAC (Oracle Data Access Components) for Delphi, C++Builder, Lazarus (and FPC)
genvox
Posts: 2
Joined: Mon 06 Sep 2021 05:17

Out of memory error when inserting many rows

Post by genvox » Mon 06 Sep 2021 06:55

I have tried several ways to transfer a large number of records to Oracle for subsequent insertion (the target is 1 million rows) at maximum speed, but I cannot transfer even 100k (Win10, Embarcadero Delphi 10.2, ODAC 11.2.5):
1) DML ARRAY (https://www.devart.com/odac/docs/work_dml.htm)

Code:

create global temporary table tbl_gtt
(
    f1  varchar2(19 char),
    f2 varchar2(64 char)
)
on commit preserve rows;

procedure TfmTest.ODACdmlarray(Sender: TObject);
var
  OraQuery: TOraQuery;
  RowQty, i: Integer;
begin
  RowQty := 72735;
  OraQuery := TOraQuery.Create(nil);
  OraQuery.SQL.Text := 'INSERT INTO tbl_gtt VALUES (:f1, :f2)';
  OraQuery.ParamByName('f1').DataType := ftString;
  OraQuery.ParamByName('f1').Length := RowQty;
  OraQuery.ParamByName('f2').DataType := ftString;
  OraQuery.ParamByName('f2').Length := RowQty;
  OraQuery.Prepare;
  for i := 1 to RowQty do
    begin
      OraQuery.ParamByName('f1').ItemAsString[i] := '1234567890123456789';
      OraQuery.ParamByName('f2').ItemAsString[i] := '1234567890123456789';
    end;
  OraQuery.Execute(RowQty);
end;
Result: "Out of memory" on the line "OraQuery.Prepare;".
OK, let's try Batch Operations.

2) Batch Operations (https://www.devart.com/odac/docs/batchops.htm)

Code:

create global temporary table tbl_gtt
(
    f1  varchar2(19 char),
    f2 varchar2(64 char)
)
on commit preserve rows;

procedure TfmTest.ODACbatch(Sender: TObject);
var
  OraQuery: TOraQuery;
  RowQty, i: Integer;
begin
  RowQty := 72735;
  OraQuery := TOraQuery.Create(nil);
  OraQuery.SQL.Text := 'INSERT INTO tbl_gtt VALUES (:f1, :f2)';
  OraQuery.Params[0].DataType := ftString;
  OraQuery.Params[1].DataType := ftString;
  OraQuery.Params.ValueCount := RowQty;
  for i := 1 to RowQty do
    begin
      OraQuery.Params[0][i-1].AsString := '1234567890123456789';
      OraQuery.Params[1][i-1].AsString := '1234567890123456789';
    end;
  OraQuery.Execute(40000);
  OraQuery.Execute(32735, 40000);
end;
Result: "Out of memory" on the line "OraQuery.Execute(40000);".
OK, let's try passing an array to a procedure and inserting via FORALL.

3) Array in Procedure

Code:

create or replace package pkg_test is
type type_test is table of varchar2(19 char) index by pls_integer;
procedure proc
(
    p_arr in type_test
);
end pkg_test;

create or replace package body pkg_test is
procedure proc
(
    p_arr in type_test
)
is
begin
  raise_application_error(-20999, 'Y');
end;
end pkg_test;

procedure TfmTest.ODACarrproc(Sender: TObject);
var
  i, RowQty: Integer;
  arr: array of Variant;
  oProc: TOraStoredProc;
begin
  RowQty := 65536;
  SetLength(arr, RowQty);
  for i := 1 to RowQty do
    begin
      arr[i-1] := '1234567890123456789';
    end;
  oProc := TOraStoredProc.Create(nil);
  oProc.StoredProcName := 'pkg_test.proc';
  oProc.Prepare;
  oProc.Params[0].Value := arr;
  oProc.Execute;
end;
Result: endless execution on the line "oProc.Execute". If RowQty < 65536, the expected ORA-20999 is raised very quickly.
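
For reference, the production body of pkg_test.proc that the stub above stands in for would do a FORALL bulk insert, something like this (a sketch; inserting into tbl_gtt.f1 is an assumption, since the stub only raises an error):

Code:

create or replace package body pkg_test is
procedure proc
(
    p_arr in type_test
)
is
begin
  -- bulk-insert the whole PL/SQL table in a single SQL engine call
  forall i in 1 .. p_arr.count
    insert into tbl_gtt (f1) values (p_arr(i));
end;
end pkg_test;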

4) TOraLoader works, but is much slower than the other methods (tested with 10k rows).
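
For reference, a TOraLoader version based on the OnGetColumnData pattern in the ODAC data-loading documentation looks roughly like this (a sketch from the documented pattern; verify property names and the event signature against your ODAC version, and OraSession is assumed to be an existing connected TOraSession):

Code:

procedure TfmTest.OraLoaderGetColumnData(Sender: TObject; Column: TDAColumn;
  Row: Integer; var Value: Variant; var IsEOF: Boolean);
begin
  // called once per column per row; stop after 10k rows
  Value := '1234567890123456789';
  IsEOF := Row > 10000;
end;

procedure TfmTest.ODACloader(Sender: TObject);
var
  OraLoader: TOraLoader;
begin
  OraLoader := TOraLoader.Create(nil);
  try
    OraLoader.Session := OraSession;
    OraLoader.TableName := 'tbl_gtt';
    OraLoader.CreateColumns;            // build the column list from the table definition
    OraLoader.OnGetColumnData := OraLoaderGetColumnData;
    OraLoader.Load;                     // pulls rows via OnGetColumnData until IsEOF
  finally
    OraLoader.Free;
  end;
end;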

Please tell me what I am doing wrong. How can I load at least 500k rows into the database at maximum speed?

jdorlon
Posts: 202
Joined: Fri 05 Jan 2007 22:07

Re: Out of memory error when inserting many rows

Post by jdorlon » Tue 07 Sep 2021 14:45

In my experience, inserting about 500 rows at a time is ideal.
There is very little speed gain going higher than that.

-John

MaximG
Devart Team
Posts: 1822
Joined: Mon 06 Jul 2015 11:34

Re: Out of memory error when inserting many rows

Post by MaximG » Thu 09 Sep 2021 11:50

In your samples, the "Out of memory" error occurs while filling the array of parameters and does not depend on the operation of our components. To insert values in the case you described, we recommend using batch insertion in blocks. For example, inserting a million records in blocks of 50,000 records would look like this:

Code:

      ...
      RowQty, i: Integer;
      BlockCount, j: Integer;
    begin
      BlockCount := 20;
      RowQty := 50000;
      ...

      OraQuery.SQL.Text := 'INSERT INTO tbl_gtt VALUES (:f1, :f2)';
      OraQuery.Prepare;
      OraQuery.Params[0].DataType := ftString;
      OraQuery.Params[1].DataType := ftString;
      OraQuery.Params.ValueCount := RowQty;
      OraQuery.Session.StartTransaction;
      for j := 1 to BlockCount do begin
        for i := 1 to RowQty do
        begin
          OraQuery.Params[0][i-1].AsString := '1234567890123456789';
          OraQuery.Params[1][i-1].AsString := '1234567890123456789';
        end;
        OraQuery.Execute(RowQty);
      end;
      OraQuery.Session.Commit;

      ...
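
A self-contained version of the snippet above, with the elided declarations filled in, might look like this (a sketch; OraSession is assumed to be an existing connected TOraSession, and the procedure name is illustrative):

Code:

procedure TfmTest.ODACblockInsert(Sender: TObject);
var
  OraQuery: TOraQuery;
  RowQty, i: Integer;
  BlockCount, j: Integer;
begin
  BlockCount := 20;    // 20 blocks x 50,000 rows = 1,000,000 rows
  RowQty := 50000;
  OraQuery := TOraQuery.Create(nil);
  try
    OraQuery.Session := OraSession;
    OraQuery.SQL.Text := 'INSERT INTO tbl_gtt VALUES (:f1, :f2)';
    OraQuery.Params[0].DataType := ftString;
    OraQuery.Params[1].DataType := ftString;
    OraQuery.Prepare;
    OraQuery.Params.ValueCount := RowQty;
    OraQuery.Session.StartTransaction;
    for j := 1 to BlockCount do
    begin
      for i := 1 to RowQty do
      begin
        OraQuery.Params[0][i - 1].AsString := '1234567890123456789';
        OraQuery.Params[1][i - 1].AsString := '1234567890123456789';
      end;
      OraQuery.Execute(RowQty);   // one batch round trip per 50,000-row block
    end;
    OraQuery.Session.Commit;
  finally
    OraQuery.Free;
  end;
end;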
Let us know if you have any questions!

genvox
Posts: 2
Joined: Mon 06 Sep 2021 05:17

Re: Out of memory error when inserting many rows

Post by genvox » Tue 14 Sep 2021 05:02

Thanks, MaximG.
I compared two options:
1) Batch insert into the GTT, 10k rows per block (and then a MERGE on the database side)
2) Calling a procedure, passing it an array of 10k rows per block (and then an UPDATE on the database side)
Oddly enough, the results are about the same, although I expected the array-to-procedure approach to be faster, since an UPDATE should be faster than an INSERT + MERGE.
In the end I chose option 1, since it looks better architecturally (besides the insertion, some additional logic is needed on the database side); but if there were no problems with passing a large array to the procedure, I would have chosen option 2.
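
For reference, the database-side merge step in option 1 might look like this (a sketch; tbl_target and its key column are hypothetical, since the target table is not shown in the thread):

Code:

merge into tbl_target t
using tbl_gtt g
  on (t.f1 = g.f1)
when matched then
  update set t.f2 = g.f2
when not matched then
  insert (f1, f2) values (g.f1, g.f2);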

MaximG
Devart Team
Posts: 1822
Joined: Mon 06 Jul 2015 11:34

Re: Out of memory error when inserting many rows

Post by MaximG » Tue 12 Oct 2021 20:05

The absolute insertion times for either option depend on the settings of your server or environment.
Feel free to contact us if you need any assistance.
