
OutOfMemoryException during SaveChanges() to Oracle

Posted: Thu 08 Oct 2009 11:24
by HCRoman
Hi experts,

How much memory should be expected when writing about 100,000 EntityObjects to an Oracle 10 database in a single SaveChanges() call?

We have a migration framework that creates an object tree of about 100,000 objects. When I save the context, memory usage grows rapidly from 250 MB to 1.6 GB, and because my machine has only 2 GB, an OutOfMemoryException is thrown after about one minute.

If I call SaveChanges() after migrating every 1,000 master objects, everything works fine and memory usage stays at about 400 to 500 MB.
But then the work is not done in one transaction, and I cannot roll back all the data on errors.

(SaveChanges() reports about 35,000 objects, including the associations, for 1,000 master objects.)
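
Roughly, the batched variant looks like this (MasterObject, MigrateNextMaster and AddToMasterObjects are simplified placeholder names, not the real project code):

Code:

// Flush the context every 1,000 master objects instead of one huge SaveChanges().
const int batchSize = 1000;
for (int i = 1; i <= masterCount; i++)
{
    MasterObject master = MigrateNextMaster();   // builds one master object with its subtree
    context.AddToMasterObjects(master);
    if (i % batchSize == 0)
    {
        context.SaveChanges();   // memory stays at 400-500 MB, but each batch commits separately
    }
}
context.SaveChanges();           // save the remainder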

I tried TransactionScope, but (perhaps because of the identity columns?) the transaction crashes with an "open cursor" error from Oracle at the beginning of the second batch of 1,000 records. I am also not sure that this would avoid the large memory consumption...

Is there a way to handle such a moderate number of objects in one transaction?

Tnx

Roman

Posted: Thu 08 Oct 2009 15:20
by AndreyR
The problem lies in the ORM architecture itself. A similar problem was discussed here, for example:
http://social.msdn.microsoft.com/Forums ... 2722de991a
There are a number of ways to overcome common performance problems.
Please refer to the article about EF performance here:
http://msdn.microsoft.com/en-us/library/bb896240.aspx
If the model is considerably large, time-consuming work is performed before the database is queried for the first time.
This problem can be partially solved using view pregeneration, as described here:
http://msdn.microsoft.com/en-us/library/bb896240.aspx
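
For example, the views can be pregenerated at build time with the EdmGen.exe tool (the model file names below are placeholders for your own model):

Code:

EdmGen.exe /mode:ViewGeneration /language:CSharp ^
    /inssdl:Model.ssdl /incsdl:Model.csdl /inmsl:Model.msl ^
    /outviews:Model.Views.cs

The generated Model.Views.cs file is then compiled into the project, so the cost of view generation is paid at build time instead of on the first query.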

Trouble with Oracle and TransactionScope

Posted: Mon 12 Oct 2009 13:44
by HCRoman
Hi devart team,

I am having a lot of trouble using TransactionScope, ending up with an

OptimisticConcurrencyException for zero affected rows (0)!

As you suggested in a different post, I use:

TransactionOptions tso = new TransactionOptions();
tso.Timeout = new TimeSpan(10, 0, 0);                 // 10-hour timeout for the long-running migration
tso.IsolationLevel = IsolationLevel.ReadUncommitted;

using (TransactionScope tran = new TransactionScope(TransactionScopeOption.Required, tso))
{
    result = p.StartProgram();
    if (result)
    {
        result = p.ResetMigration() && p.RunMigration();
    }
    tran.Complete();   // note: completes the scope even if result is false
}

What happens inside: after every 1,000 master records I call context.SaveChanges(true).
This saves me from the OutOfMemoryException,
but after 10 minutes I get the concurrency exception.
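
For reference, the bool parameter is the acceptChangesDuringSave flag of the EF 3.5 overload; a short annotated sketch:

Code:

// EF 3.5 overload: ObjectContext.SaveChanges(bool acceptChangesDuringSave)
// true  -> entities are marked Unchanged right after the save (like SaveChanges()),
//          even though the ambient TransactionScope has not committed yet
// false -> entities keep their Added/Modified state until AcceptAllChanges() is called
context.SaveChanges(true);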

Nobody else is writing to or working on this database!

I have played around with a lot of different "loops", ending up with even worse exceptions.

Your small example from the other post seems to work, but have you stress-tested TransactionScope with Oracle under heavy load and large data volumes?

Are there more "options" and "switches" that could help?

Without the TransactionScope, everything works fine!


Translated from German, the error is:

"Store update, insert, or delete statement affected an unexpected number of rows (0). Entities may have been modified or deleted since entities were loaded. Refresh ObjectStateManager entries."


Tnx

Roman

Posted: Wed 14 Oct 2009 11:56
by AndreyR
Could you please send us (support * devart * com, subject EF: TScope problem) the test project reproducing the problem?
Is the problem reproducible with both Pooling = true and Pooling = false?
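
For example, pooling can be switched off in the connection string (credentials and server name below are placeholders):

Code:

// Placeholder credentials; only the Pooling keyword is relevant here.
var connection = new Devart.Data.Oracle.OracleConnection(
    "User Id=scott;Password=tiger;Server=orcl;Pooling=false;");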

I will test it

Posted: Thu 15 Oct 2009 08:18
by HCRoman
It is not so easy to isolate the problem from the application, because I need the data from two schemas plus some logic. I hope to find time in the next few days...

So far I have tried Pooling=false, and it does not work: now I am getting index errors for duplicate keys,
although I only do inserts and all keys are generated via sequences in insert triggers.

But now I have found the explicit BeginTransaction() method on the ObjectContext connection.

Because I do not need multiple connections, this is sufficient for our application;
I do not need TransactionScope anymore.

Combined with a SaveChanges() after every 1,000 new records to avoid OutOfMemoryExceptions, everything works now.
The code writes 180,000 objects to the database in about 13 minutes.
The calculation (reading and conversion) needs less than one minute; the rest is SaveChanges().

Code:

// ...

this.saveContext.Entity.Connection.Open();
bool result = StartProgram();
if (result)
{
   // Explicit local transaction on the ObjectContext connection
   // instead of a distributed TransactionScope.
   var dbtran = this.saveContext.Entity.Connection.BeginTransaction();
   try
   {
      result = ResetMigration() && RunMigration();
   }
   catch (Exception ex)
   {
      // Log ex
      result = false;
   }
   finally
   {
      try
      {
         if (result)
         {
            dbtran.Commit();
         }
         else
         {
            dbtran.Rollback();
         }
         saveContext.Entity.Connection.Close();
      }
      finally
      {
         // Always release the transaction, even if Commit/Rollback throws.
         dbtran.Dispose();
      }
   }
}
Roman

Posted: Mon 19 Oct 2009 07:23
by AndreyR
Glad to hear you have solved the problem.
Could you please make a test project and send it to us?
Such a project would help us locate the problem and improve our product.