
Memory problem with LoadFromFile

Posted: Tue 10 Mar 2015 09:40
by dehorter
Hello

I use the TIBCStoredProc component and its SaveToXML method to save the result set into an XML file.
It works :)

This XML file is later loaded into a TVirtualTable with the LoadFromFile method. It works fine unless the file is large (27 MB); in that case I receive an "EOutOfMemory" exception.

Is there a size limit?
And how can I solve this?

kind regards

Re: Memory problem with LoadFromFile

Posted: Wed 11 Mar 2015 06:57
by AlexP
Hello,

This error means the application is running out of memory (it consumes too much). Run Task Manager and check the application's memory consumption.

P.S. If you are using both components in the same application, you can load the data directly into the VirtualTable with the Assign method, passing the required DataSet as the data source.
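A minimal sketch of that approach, assuming the components are named IBCStoredProc1 and VirtualTable1 (the names are placeholders):

Code: Select all

// Copy the stored procedure's result set straight into the
// virtual table, without writing an intermediate XML file.
IBCStoredProc1.Open;
VirtualTable1.Assign(IBCStoredProc1);

This avoids the XML round trip entirely, though the whole result set still has to fit in memory.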

Re: Memory problem with LoadFromFile

Posted: Wed 11 Mar 2015 09:56
by dehorter
Thanks for your answer

Yes, the memory is full.

The file is used by the same application, but at a different site: I send the file to other installations so that they can complete their own databases.

Is it possible to split the exported file into small pieces during the SaveToXML call?
Or do you have another idea to solve this?

kind regards

olivier

Re: Memory problem with LoadFromFile

Posted: Wed 11 Mar 2015 11:49
by AlexP
You can use the Filter property of the DataSet to save a definite range of records to XML. However, loading from several XML files, or loading by a condition, is not supported in VirtualTable. Even if such functionality were implemented, you would get the same error on reaching the maximum memory volume.

Re: Memory problem with LoadFromFile

Posted: Wed 11 Mar 2015 13:04
by dehorter
Thanks again ;)

The "filter" is already in place, and the SQL syntax offers no way to reduce the amount of data further (except using ROWS).
I am wondering whether the SaveToXML method (of the TIBCStoredProc component in my case) could split the exported file into several parts (part1, part2, ...) according to a maximum size per file.
I do not need to merge them back into one memory table, because I can process them separately.

Is that clear? :oops:

Re: Memory problem with LoadFromFile

Posted: Thu 12 Mar 2015 06:12
by AlexP
To filter data, you can use local filtering:

Code: Select all

SET TERM ^ ;

create or alter procedure SEL_FROM_EMP
returns (
    EMPNO integer,
    ENAME varchar(10),
    JOB varchar(9),
    MGR integer,
    SAL integer,
    COMM integer,
    DEPTNO integer)
as
BEGIN
  FOR  SELECT EMPNO, ENAME, JOB, MGR, SAL, COMM, DEPTNO FROM emp
    INTO :EMPNO, :ENAME, :JOB, :MGR, :SAL, :COMM, :DEPTNO
  DO
    suspend;
END^

SET TERM ; ^

Code: Select all

  IBCStoredProc1.IsQuery := True;
  IBCStoredProc1.StoredProcName := 'SEL_FROM_EMP';
  // open the dataset so the filter can be applied before saving
  IBCStoredProc1.Open;
  IBCStoredProc1.Filter := 'DEPTNO=10';
  IBCStoredProc1.Filtered := True;
  IBCStoredProc1.SaveToXML('d:\ibcrecordset.xml');

Re: Memory problem with LoadFromFile

Posted: Thu 12 Mar 2015 07:34
by dehorter
Hi

This solution will be efficient in some situations, but surely not in others (for example, a table whose field contents are always different) :?

regards

olivier

Re: Memory problem with LoadFromFile

Posted: Sun 15 Mar 2015 15:48
by dehorter
Hi

I build my queries to extract as few rows as possible.
However, the problem remains :(

It would be great to have this possibility:
IBCStoredProc.SaveToXML(fileName, rowNumber), where rowNumber sets the maximum number of rows per output file.
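Until such an overload exists, a workaround could be sketched on the application side using the Firebird ROWS clause mentioned earlier, with a separate TIBCQuery selecting from the stored procedure. This is only an illustration: the procedure name SEL_FROM_EMP is taken from the earlier example, and the 1000-row chunk size, file names, and component name IBCQuery1 are assumptions.

Code: Select all

// Hypothetical sketch: export the result set in chunks of
// 1000 rows each, one XML file per chunk, using ROWS m TO n.
var
  First, Last, Part: Integer;
  Done: Boolean;
begin
  Part := 0;
  First := 1;
  repeat
    Inc(Part);
    Last := First + 999;
    IBCQuery1.SQL.Text := Format(
      'SELECT * FROM SEL_FROM_EMP ROWS %d TO %d', [First, Last]);
    IBCQuery1.Open;
    // a short chunk means we have reached the end of the data
    Done := IBCQuery1.RecordCount < 1000;
    if not IBCQuery1.IsEmpty then
      IBCQuery1.SaveToXML(Format('d:\export_part%d.xml', [Part]));
    IBCQuery1.Close;
    First := Last + 1;
  until Done;
end;

Each resulting file stays small enough for LoadFromFile on the receiving side, and since the files do not need to be merged, they can be loaded and processed one at a time.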

kind regards

olivier

Re: Memory problem with LoadFromFile

Posted: Mon 16 Mar 2015 07:33
by AlexP
You can leave your suggestion on our uservoice page. If it gets enough user votes, we will implement it.