Hi...
I have a special scenario where I have to open 100+ different DB connections sequentially. This runs into the max connection limit (default: 100) of the PostgreSQL server. Of course, I could raise this limit to, say, 200, but it's only a matter of time until I have to open 200+ connections in that special use case.
Is there anything I can do to force the release/closing of a connection explicitly? Thanks in advance, Ekki
Connection Life Time
Re: Connection Life Time
On my dev machine, ClearPool(&lt;conn&gt;) does exactly what I was looking for. But when I deployed the executable to the production machine, nothing changed. The connections rise to 120 (meanwhile I have increased the connection limit to 200) and after a few minutes drop back to where they started (~7).
Does this make sense? What could be the reason for the ClearPool() calls being ignored on the production machine?
Re: Connection Life Time
1. Please use the following overload to clear the pool immediately: https://www.devart.com/dotconnect/postg ... lean).html.
2. You can control the max number of connections in the pool with the Max Pool Size connection string parameter.
3. If you do not need pooling, turn it off by adding "Pooling=false;" to the connection string.
Refer to https://www.devart.com/dotconnect/postg ... tring.html.
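For illustration, connection strings using the options from points 2 and 3 might look like the following (host, database, and credentials are placeholders; Max Pool Size and Pooling are documented dotConnect connection string parameters):

```
-- Cap the pool so no more than 20 pooled connections are kept:
Host=myserver;Database=mydb;User Id=me;Password=secret;Max Pool Size=20;

-- Disable pooling entirely, so closing a connection really closes
-- the underlying server connection instead of returning it to the pool:
Host=myserver;Database=mydb;User Id=me;Password=secret;Pooling=false;
```

Note that with pooling disabled, every open/close pair pays the full cost of establishing a new server connection, so this trades throughput for predictable connection release.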
Re: Connection Life Time
#3 solved my problem. Thx!