Issue with connection pool. Getting Underlying provider failed error after max pool size reached

Discussion of open issues, suggestions and bugs regarding ADO.NET provider for Oracle
Post Reply
sandeshkoli
Posts: 1
Joined: Fri 05 Oct 2012 16:01

Issue with connection pool. Getting Underlying provider failed error after max pool size reached

Post by sandeshkoli » Fri 05 Oct 2012 16:40

Hi,
I am using dotConnect for Oracle. I am getting an "Underlying provider failed" error after my connection pool reaches 100 connections (the default Max Pool Size is 100). The connections from the pool are not being reused. Below is my sample code. Can somebody please tell me what is wrong here?

Connection string -
<add name="MyNamespace.EntityFrameWorkConnectionString"
connectionString="metadata=res://*/MyNamespace.EntityFrameWork.csdl|res://*/MyNamespace.EntityFrameWork.ssdl|res://*/MyNamespace.EntityFrameWork.msl;provider=Devart.Data.Oracle;provider connection string='User Id=db_user;Password=db_pwd;Server=ora_server;Persist Security Info=True';"
providerName="System.Data.EntityClient" />

using (DBEntity dbEntity = new DBEntity())
{
    MyTableRec tableRec = new MyTableRec()
    {
        ResponsibilitieId = GenericHelper.GetNextSequenceValue(GenericHelper.TABLE_NAME),
        Name = "Some Name",
        CreatedBy = "User1"
    };
    try
    {
        dbEntity.AddToMyTable(tableRec);
        dbEntity.SaveChanges();
    }
    catch
    {
        isSuccess = GenericHelper.isNotSucessfull;
    }
}
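
Editor's note: if connections are not being returned to the pool, raising Max Pool Size only delays the error, but the pooling behavior itself can be tuned through keywords in the provider connection string. A sketch (Pooling, Max Pool Size, and Connection Lifetime are the usual ADO.NET pooling keywords; verify them against your dotConnect version, and the values below are only illustrative):

Code: Select all

User Id=db_user;Password=db_pwd;Server=ora_server;Persist Security Info=True;Pooling=True;Max Pool Size=100;Connection Lifetime=120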

Shalex
Site Admin
Posts: 9543
Joined: Thu 14 Aug 2008 12:44

Re: Issue with connection pool. Getting Underlying provider failed error after max pool size reached

Post by Shalex » Thu 11 Oct 2012 14:58

1. One possible reason for opening (and not closing) too many connections is incorrect usage of TransactionScope. We recommend using TransactionScope only for saving data (not for multiple reads and saves):
a) use

Code: Select all

using (var firstContext = new FirstContext()) {
    using (var secondContext = new SecondContext()) {
        // ... multiple reading 1
        // ... multiple reading 2
        using (var transactionScope = new TransactionScope()) {
            // ... saving 1
            // ... saving 2
            transactionScope.Complete();
        }
    }
}
instead of

Code: Select all

using (var transactionScope = new TransactionScope()) {
    using (var firstContext = new FirstContext()) {
        // ... multiple reading 1
        // ... saving 1
    }
    using (var secondContext = new SecondContext()) {
        // ... multiple reading 2
        // ... saving 2
    }
    transactionScope.Complete();
}
b) Also take into account the peculiarities of a particular ORM. For example, if you perform multiple reads/saves before calling transactionScope.Complete(), all of the connections used are accumulated (they are released only after Complete()). It therefore makes sense to open the connection manually, so that EF does not open/"close" a pooled connection for each database operation; in that case the EF context uses only this single connection.
Use

Code: Select all

using (var transactionScope = new TransactionScope()) {
    using (var firstContext = new FirstContext()) {
        firstContext.Connection.Open();
        // ... multiple reading 1
        // ... saving 1
    }
    using (var secondContext = new SecondContext()) {
        secondContext.Connection.Open();
        // ... multiple reading 2
        // ... saving 2
    }
    transactionScope.Complete();
}
instead of

Code: Select all

using (var transactionScope = new TransactionScope()) {
    using (var firstContext = new FirstContext()) {
        // ... multiple reading 1
        // ... saving 1
    }
    using (var secondContext = new SecondContext()) {
        // ... multiple reading 2
        // ... saving 2
    }
    transactionScope.Complete();
}
This is a sample for ObjectContext.

In case of DbContext, it has to be cast to IObjectContextAdapter:

Code: Select all

((IObjectContextAdapter)context).ObjectContext.Connection.Open();
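
Putting the pieces together, a minimal sketch of the same pattern for a DbContext-based model (MyDbContext is a hypothetical context name; assumes EF 4.1 or later, where DbContext is available):

Code: Select all

using System.Data.Entity.Infrastructure; // IObjectContextAdapter
using System.Transactions;

using (var transactionScope = new TransactionScope()) {
    using (var context = new MyDbContext()) {
        // Open the underlying store connection once, so EF reuses it
        // for every query and SaveChanges() inside this block instead
        // of taking a pooled connection per operation.
        ((IObjectContextAdapter)context).ObjectContext.Connection.Open();
        // ... multiple reading
        // ... saving
        context.SaveChanges();
    }
    transactionScope.Complete();
}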
2. You are employing XML mapping and obtaining sequence.nextval directly from the database. Note that this can instead be done by the EF provider, which optimizes the usage of connections: http://blogs.devart.com/dotconnect/set- ... ggers.html

3. If these recommendations don't help, send us a test project so that we can reproduce the problem in our environment.
