Exception when using ArrayBinding

Discussion of open issues, suggestions and bugs regarding ADO.NET provider for Oracle
Johannes
Posts: 2
Joined: Tue 07 Dec 2010 15:36

Exception when using ArrayBinding

Post by Johannes » Tue 07 Dec 2010 16:33

Hello,

I hope someone can help me with this problem. I'm using dotConnect for Oracle version 6 in a .NET 4 application, with oci.dll from Oracle client 11.2.0.1.2 (32-bit).

So here a short description of what I'm doing:

My application has to read data from multiple SQL servers (MSSQL, Oracle, MySQL, Progress, ...) and put the data into tables in an Oracle DB.

For that, the program reads a config file that contains connection strings and SELECT statements for the source databases. It contains connection strings and table names for the Oracle destination tables, too.

First, I use a DbConnection, DbCommand, and DbDataReader to execute the given SELECT statement on any of the source systems.

The second step is to use DbDataReader.GetFieldType() and .FullName to create an INSERT INTO statement and to build an OracleParameterCollection (the destination table has the same column names and column types as the source table).

In the last step, I open a second connection to the Oracle instance (with an OracleConnection object). Then I create an OracleCommand and set .CommandText to the dynamically created INSERT statement from step 2. After that, I fill OracleCommand.Parameters with my OracleParameterCollection, and a loop sets the .Value of each Parameters[] entry to an array of values (fetched from the DbDataReader mentioned above).

To execute the statement, I call OracleCommand.ExecuteArray().
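Put together, the steps above look roughly like this. This is a minimal sketch only: the table name, column, and connection string are made up, the helper FetchColumnFromSourceReader() is hypothetical, and in the real code the parameter list would be built dynamically from DbDataReader.GetFieldType().

```csharp
// Sketch only: assumes the Devart.Data.Oracle assembly is referenced;
// names and the connection string are illustrative.
using Devart.Data.Oracle;

string[] col1Values = FetchColumnFromSourceReader();  // hypothetical: filled from the DbDataReader

using (var conn = new OracleConnection("Server=orcl;User Id=scott;Password=...;"))
using (var cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandText = "INSERT INTO DEST_TABLE (COL1) VALUES (:p1)";

    var p1 = new OracleParameter("p1", OracleDbType.VarChar);
    p1.Value = col1Values;               // assigning an array as Value enables array binding
    cmd.Parameters.Add(p1);

    cmd.ExecuteArray(col1Values.Length); // inserts all rows in one round trip
}
```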

This works just fine for an array size of up to 200-300, but if I use array sizes of 1000 or greater, I get an OutOfMemoryException or an OverflowException.

The SELECT statement produces about 180 columns, of which about 150 are VARCHAR2 columns with an average length of about 40 characters. There are no LOB or LONG columns in the result.

Before using the Devart assembly, I used the .NET 2 assembly from the Oracle ODAC 11g client. It worked fine with the same SQL statement and an ArrayBindCount of up to 50000 (I never tested greater numbers, because performance peaked at an array size of 5000-10000; larger numbers did not increase performance).

Does anyone know a way for me to use Devart with really big arrays and ExecuteArray()?

Shalex
Site Admin
Posts: 9543
Joined: Thu 14 Aug 2008 12:44

Post by Shalex » Fri 10 Dec 2010 11:21

Thank you for your report. We are investigating the issue. Please contact us via our contact form and provide the following information:
1) the DDL script of the table against which you run ExecuteArray();
2) your dotConnect for Oracle version is 6.0.58, isn't it? You can find the version via the Tools > Oracle > About menu of Visual Studio;
3) the version of your Oracle Server;
4) your connection string settings (at least roughly, without credentials);
5) have you tried Direct mode? Does the problem persist in this mode as well?

Johannes
Posts: 2
Joined: Tue 07 Dec 2010 15:36

Post by Johannes » Fri 10 Dec 2010 16:59

Hi Shalex,

thanks for your input. I have opened a support case with the information you requested. Your recommendation to use Direct mode works fine for me: I was able to insert 177 columns with ExecuteArray(20000) without any problems.

But this is only a workaround for me, because we use tnsnames.ora as the master file for all our applications, so we only have to change one file to change the connection settings for every application. With Direct mode, I have to change every application's settings after a change on the DB server. Or is there any way to use the settings from tnsnames.ora without using oci.dll (a direct connection driven by the settings from tnsnames.ora)?

Shalex
Site Admin
Posts: 9543
Joined: Thu 14 Aug 2008 12:44

Post by Shalex » Mon 13 Dec 2010 16:02

1. We have sent you a test project concerning the OutOfMemoryException problem.

2. The tnsnames.ora file is used only in OCI mode. In Direct mode, we recommend a general approach with the ConnectionStringManager class to store your connection string in one place for many projects. Here are some ideas:
http://social.msdn.microsoft.com/forums ... 66F92C0DB2
http://www.codeproject.com/KB/database/ ... tring.aspx.
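The "one place for many applications" idea can be as simple as a shared file that every application reads at startup, mirroring how a single tnsnames.ora serves all apps. This is a hand-rolled sketch of that idea, not the ConnectionStringManager API itself, and the UNC path is made up:

```csharp
using System.IO;

// Every application reads its Oracle connection string from one central file;
// changing that file repoints every app, like editing a shared tnsnames.ora.
static class SharedConnectionString
{
    // Hypothetical central location.
    private const string Path = @"\\fileserver\config\oracle.connstr";

    public static string Load() => File.ReadAllText(Path).Trim();
}
```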

HCRoman
Posts: 55
Joined: Wed 12 Aug 2009 05:47

Same problem, why is this not fixed in the latest version?

Post by HCRoman » Mon 26 Sep 2011 15:58

Hi devart team,

I have the same problem: in OCI mode, the Devart API throws an OutOfMemoryException after about 64,000 records.

This looks like a memory leak (or fragmentation of the large object heap?).

With two VARCHAR2(4000) columns, the exception occurs after about 64,000 rows. If I shrink these two columns to 400, the exception only occurs later...

Version:
5.0.284.0 / 6.30.185.0

In Direct mode all works fine.

Why is this error not fixed? :?:

Direct mode also has some limitations.

We have to use Windows authentication for our database connections, and that is one limitation of Devart's Direct mode. :cry:

A second limitation: the RETURNING clause does not work either!

So Direct mode is not a good workaround for me.

(TOAD makes this kind of direct connection as an external user with Windows authentication!)

:idea:


Is any help available?

Thanks

Roman

HCRoman
Posts: 55
Joined: Wed 12 Aug 2009 05:47

workaround found..

Post by HCRoman » Thu 29 Sep 2011 09:28

I have found a usable workaround for this. :)

(But it would be better if the problem could be fixed.) :wink:

Code:

using (var tran = new TransactionScope(TransactionScopeOption.Required, TimeSpan.FromMinutes(25)))
{
  var moreChunks = true;
  while (moreChunks)
  {
    // create a fresh ObjectContext

    // build a chunk of data small enough not to trigger the OutOfMemoryException

    // do the ArrayInsert

    // commit the context, if needed

    // dispose the ObjectContext AND ObjectContext.Connection
    // (the connection holds the data causing the OutOfMemoryException)

    moreChunks = /* more data to insert? */ false;
  }
  tran.Complete();
}

So I can insert millions of records via Entity Framework in a couple of hours. 8)
Roman

Shalex
Site Admin
Posts: 9543
Joined: Thu 14 Aug 2008 12:44

Post by Shalex » Thu 29 Sep 2011 15:50

The reason for the OutOfMemoryException is that in OCI mode the size of OracleDbType.VarChar parameters defaults to 4000. It is necessary to set the size of such parameters manually.
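To see why this matters: with roughly 180 parameters, a 4000-byte default per parameter, and an array size of 1000, OCI mode reserves on the order of 180 × 4000 × 1000 ≈ 720 MB of buffer space, which easily exhausts a 32-bit process. Setting Size explicitly keeps the buffers proportional to the real data. A minimal sketch (the parameter name, length, and the nameValues array are illustrative):

```csharp
// Without an explicit Size, OCI mode buffers 4000 bytes per row for each
// VarChar parameter, regardless of the actual column length.
var p = new OracleParameter("name", OracleDbType.VarChar);
p.Size = 40;            // the real maximum length of the column's data
p.Value = nameValues;   // string[] used for array binding
cmd.Parameters.Add(p);
```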

If you are using Entity Framework, be aware of the Batch Updates functionality. Note, however, that Batch Updates has lower performance than Array Binding.

HCRoman
Posts: 55
Joined: Wed 12 Aug 2009 05:47

This is not the reason...

Post by HCRoman » Thu 29 Sep 2011 16:10

Hi Shalex,

this is not really the reason...

I always set the correct maximum length on VARCHAR parameters. If the parameter length is less than 4000, the error just occurs later...

There is definitely a memory leak in the code, though it may not be in the Devart DLL. The memory is freed after closing and disposing the connection!
See my previous post.

Devart's Batch Updates are not a good solution as of this release. They do not support the RETURNING clause, so if you have generated keys or computed values, you cannot get by without pain.

My ObjectContext extension for Entity Framework can insert, update, and delete using ExecuteArray. Insert and update work with correct RETURNING clauses for all identity and computed columns, so all references are updated correctly after AcceptChanges on the entity's EntityState.

You only have to ensure the correct order of the operations.

Update and delete can also optionally use different logical keys for the operation...

Hooked into the SavingChanges event, it is mostly transparent for the user but a hundred times faster than SaveChanges.


Roman

Shalex
Site Admin
Posts: 9543
Joined: Thu 14 Aug 2008 12:44

Post by Shalex » Mon 03 Oct 2011 14:27

Please try another version of the Oracle client. If the problem persists, send us a small test project with the corresponding DDL/DML script so that we can reproduce the issue in our environment.

HCRoman
Posts: 55
Joined: Wed 12 Aug 2009 05:47

Test project was sent by email...

Post by HCRoman » Wed 05 Oct 2011 16:31

Hi devart team,

the test project was sent to your email address...

Roman

Shalex
Site Admin
Posts: 9543
Joined: Thu 14 Aug 2008 12:44

Post by Shalex » Thu 20 Oct 2011 08:49

The bug where memory was not freed after command execution has been fixed. We will post here when the corresponding build of dotConnect for Oracle is available for download.

Shalex
Site Admin
Posts: 9543
Joined: Thu 14 Aug 2008 12:44

Post by Shalex » Fri 11 Nov 2011 15:58

New build of dotConnect for Oracle 6.50.244 is available for download now!
It can be downloaded from http://www.devart.com/dotconnect/oracle/download.html (trial version) or from Registered Users' Area (for users with valid subscription only).
For more information, please refer to http://www.devart.com/forums/viewtopic.php?t=22547 .

Partizan
Posts: 36
Joined: Fri 13 Nov 2009 10:18

Re: This is not the reason...

Post by Partizan » Thu 27 Feb 2014 18:37

HCRoman wrote:Hi Shalex,

Devart's Batch Updates are not a good solution as of this release. They do not support the RETURNING clause, so if you have generated keys or computed values, you cannot get by without pain.

My ObjectContext extension for Entity Framework can insert, update, and delete using ExecuteArray. Insert and update work with correct RETURNING clauses for all identity and computed columns, so all references are updated correctly after AcceptChanges on the entity's EntityState.

You only have to ensure the correct order of the operations.

Update and delete can also optionally use different logical keys for the operation...

Hooked into the SavingChanges event, it is mostly transparent for the user but a hundred times faster than SaveChanges.

Roman
Hi Roman,

Do you have this solution available anywhere?
I'm really interested in figuring out how you achieved this.

Best regards,
Alexander Efimov.
