Hi,
I would like to update more than 2,000 records, and I want to avoid opening ~2,000 connections (one per property change). Using the unit of work approach, I would like to be able to update everything and commit once at the end.
Is this supported by your framework?
What are your recommended best practices for such operations? Can I avoid multiple connections? What about timeouts?
I am using the latest dotConnect for MySQL.
Thanks
update large amount of records best practice question
Do you want to update 2000 records with a single command, or execute 2000 different commands?
- If you use a single command that updates all 2000 records, only one connection is used: our provider passes the command to the server, where it is executed.
- If you need 2000 different commands to update records in the same database, you can execute them on the same connection object using a local transaction.
- dotConnect for MySQL supports distributed transactions. You can use this feature by calling MySqlConnection.Open() within the scope of a TransactionScope, by using the "Enlist" connection string parameter, or by calling the MySqlConnection.EnlistTransaction() method:
http://www.devart.com/dotconnect/mysql/ ... ction.html.
Usage of dotConnect for MySQL with distributed transactions is similar to using System.Data.SqlClient (only change class prefixes from "Sql" to "MySql"): http://msdn.microsoft.com/en-us/library ... scope.aspx.
- Additionally, we support the "Transaction Scope Local" connection string option, which enables support for non-distributed transactions in the TransactionScope implementation. This parameter turns on one-phase commit functionality: if there are several connections with the same connection string (all including "Transaction Scope Local=true;") within the scope of a TransactionScope, our provider uses only one connection internally. Here is a case where this feature can be useful: http://www.devart.com/forums/viewtopic.php?t=18905.
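For the first case (one command updating all matching rows), a minimal C# sketch could look like this. The connection string, table, and column names are hypothetical placeholders; only the single round trip over a single connection is the point:

```csharp
using Devart.Data.MySql;

// One UPDATE statement changes all matching rows on the server side,
// so only one connection and one command execution are needed.
// Table and column names below are illustrative only.
using (var connection = new MySqlConnection(
    "server=localhost;user id=root;password=...;database=test;"))
{
    connection.Open();
    using (var command = new MySqlCommand(
        "UPDATE products SET price = price * 1.1 WHERE category_id = @category",
        connection))
    {
        command.Parameters.AddWithValue("category", 42);
        int affected = command.ExecuteNonQuery(); // number of rows updated
    }
}
```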
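For the second case (many commands on one connection), the local-transaction approach might be sketched as follows. The `changedItems` collection and the table schema are hypothetical; the idea is that all 2000 updates share one connection and one commit:

```csharp
using Devart.Data.MySql;

// All updates run over a single connection inside one local transaction,
// so they are committed (or rolled back) together, unit-of-work style.
// "changedItems" stands for your collection of pending changes (hypothetical).
using (var connection = new MySqlConnection(
    "server=localhost;user id=root;password=...;database=test;"))
{
    connection.Open();
    MySqlTransaction transaction = connection.BeginTransaction();
    try
    {
        using (var command = new MySqlCommand(
            "UPDATE products SET price = @price WHERE id = @id", connection))
        {
            command.Transaction = transaction;
            foreach (var item in changedItems)
            {
                command.Parameters.Clear();
                command.Parameters.AddWithValue("price", item.Price);
                command.Parameters.AddWithValue("id", item.Id);
                command.ExecuteNonQuery();
            }
        }
        transaction.Commit();   // single commit at the end
    }
    catch
    {
        transaction.Rollback(); // nothing is applied on failure
        throw;
    }
}
```

Reusing one prepared command and clearing its parameters per row keeps the loop cheap; only the final Commit() makes the changes visible.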
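And for the "Transaction Scope Local" option described above, a sketch under the same hypothetical connection details — note that both connections use the identical connection string, which is what lets the provider reuse one physical connection with a one-phase commit:

```csharp
using System.Transactions;
using Devart.Data.MySql;

// With "Transaction Scope Local=true;" in the connection string, connections
// opened inside one TransactionScope with the same connection string share
// a single physical connection and commit in one phase (non-distributed).
const string connStr =
    "server=localhost;user id=root;password=...;database=test;" +
    "Transaction Scope Local=true;";

using (var scope = new TransactionScope())
{
    using (var connection = new MySqlConnection(connStr))
    {
        connection.Open(); // enlists in the ambient transaction
        new MySqlCommand("UPDATE orders SET status = 'shipped' WHERE id = 1",
            connection).ExecuteNonQuery();
    }
    using (var connection = new MySqlConnection(connStr))
    {
        connection.Open(); // same connection string -> same physical connection
        new MySqlCommand("UPDATE orders SET status = 'shipped' WHERE id = 2",
            connection).ExecuteNonQuery();
    }
    scope.Complete(); // both updates commit together
}
```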