Batch Insert and memory usage
Posted: Tue 31 Oct 2017 23:57
I am using the latest UniDAC and XE7 with Firebird 3. I am inserting 335K records into a particular table. The records are fairly large, and I find I can only insert about 200 at a time; if I set the batch size to 1000, I get an error that I have not been able to trap - I think it has to do with running out of memory. In any case, if I drop the batch size down to about 200, it all works.
What happens is that the line Query1.Execute(cnt); fails and execution leaves the procedure - standard exception behaviour, except that I do not seem to be able to trap it.
I am thinking that a large batch size is better than a small one - correct? I am also worried that this 200 limit was found on my development system; there is no way of knowing whether the same limit would apply on a customer's machine, which is a concern. It would be nice if the batch insert raised a catchable error so that I could let the user do something about it. A problem in the SQL itself is picked up by my error trapping, but this out-of-memory failure is not.
I am basically using the code as supplied in the documentation:
cnt := 200;
Query1.SQL.Add('INSERT INTO tablename');
Query1.SQL.Add(' (fieldname1, fieldname2, etc...)');
Query1.SQL.Add('VALUES (');
Query1.SQL.Add(':fieldname1,');
Query1.SQL.Add(':fieldname2'); // etc.
Query1.SQL.Add(')');
Query1.Params[0].DataType := ftFloat;
Query1.Params[1].DataType := ftFloat; // etc.
Query1.Params.ValueCount := cnt; // specify the array dimension
// populate the array with parameter values:
for i := 0 to cnt - 1 do
begin
  Query1.Params[0][i].AsFloat := some value;
  Query1.Params[1][i].AsFloat := some value;
end;
try
  Query1.Execute(cnt);
except
  on E: Exception do ShowMessage(E.Message);
end;
Any thoughts?