Connection Life Time

Posted: Mon 09 Dec 2019 17:42
by Ekki
Hi...

I have a special scenario where I have to open 100+ different DB connections sequentially. This runs into the max connection limit (default: 100) of the PostgreSQL server. Of course, I could raise this limit to, say, 200, but it's only a matter of time before I need to open 200+ connections in that use case.

Is there anything I can do to force a connection to be released/closed explicitly? THX IA, Ekki

Re: Connection Life Time

Posted: Mon 09 Dec 2019 21:20
by Ekki
On my dev machine, ClearPool(<conn>) does exactly what I was looking for. But when I deployed the executable to the production machine, nothing changed. The connection count rises to 120 (in the meantime I have raised the server's connection limit to 200) and only after some minutes drops back to where it started (~7).

Does this make sense? What could cause the ClearPool() calls to be ignored on the production machine?

Re: Connection Life Time

Posted: Fri 13 Dec 2019 19:02
by Shalex
1. Please use the following overload to clear the pool immediately: https://www.devart.com/dotconnect/postg ... lean).html.
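A minimal sketch of the idea (assuming the overload is ClearPool(PgSqlConnection, bool) with true meaning "clear immediately"; connection-string values here are placeholders, not from the thread):

```csharp
using Devart.Data.PostgreSql;

using (var conn = new PgSqlConnection("Host=myserver;User Id=myuser;Password=...;Database=mydb;"))
{
    conn.Open();
    // ... do work with the connection ...
    conn.Close();  // returns the physical connection to the pool

    // Assumed overload: the boolean forces the pool to be cleared
    // immediately instead of waiting for the pooler's cleanup cycle.
    PgSqlConnection.ClearPool(conn, true);
}
```

This would explain the behavior Ekki saw: the single-argument ClearPool() marks pooled connections for removal, but the pooler may only release them on its next cleanup pass, which matches the "after some minutes" delay on the production machine.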

2. You can control the max number of connections in the pool with the Max Pool Size connection string parameter.
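For example, a connection string capping the pool (parameter name per the dotConnect docs; all other values are illustrative placeholders):

```
Host=myserver;Port=5432;User Id=myuser;Password=...;Database=mydb;Max Pool Size=20;
```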

3. If you do not need pooling, turn it off with "Pooling=false;" in the connection string.
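With pooling disabled, each Close() should close the physical server connection right away instead of returning it to a pool (illustrative connection string, placeholder values):

```
Host=myserver;Port=5432;User Id=myuser;Password=...;Database=mydb;Pooling=false;
```

The trade-off is that every Open() then establishes a fresh server connection, which is slower but keeps the server-side connection count tightly bound to what the application actually has open.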

Refer to https://www.devart.com/dotconnect/postg ... tring.html.

Re: Connection Life Time

Posted: Fri 13 Dec 2019 22:57
by Ekki
#3 solved my problem. Thx!