Use of UniLoader

Discussion of open issues, suggestions and bugs regarding UniDAC (Universal Data Access Components) for Delphi, C++Builder, Lazarus (and FPC)
hughespa

Use of UniLoader

Post by hughespa » Mon 01 Feb 2010 05:34

Hi,

I'm using D2010 and SQL Server 2008.

I've used UniLoader to bulk load data (via a BDE dataset) from our existing application, which is being ported to SQL Server 2008. It works very well and is very fast compared to the usual methods.

I'm considering using UniLoader to populate SQL Server data from a VirtualTable dataset created and populated on the client, rather than populating, say, a TUniTable or TUniQuery and having it write individual record updates to the server. A typical run might involve between 10,000 and 40,000 records.

Can I use UniLoader like this? Are there any issues to be aware of?

Also, will this still work within the context of an existing (previously started) transaction such that if the transaction is ultimately rolled back, any data appended via the UniLoader is also rolled back correctly?
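
For reference, here's roughly the pattern I have in mind (just a sketch; the procedure, table and field names are placeholders of mine):

Code:

uses
  DB, Uni, UniLoader, VirtualTable;

procedure LoadClientData(Connection: TUniConnection);
var
  VT: TVirtualTable;
  Loader: TUniLoader;
begin
  VT := TVirtualTable.Create(nil);
  Loader := TUniLoader.Create(nil);
  try
    // Build and populate the client-side dataset
    VT.AddField('ID', ftInteger);
    VT.AddField('Name', ftString, 50);
    VT.Open;
    VT.Append;
    VT.FieldByName('ID').AsInteger := 1;
    VT.FieldByName('Name').AsString := 'Example';
    VT.Post;
    // ... typically 10,000 to 40,000 records in our case

    // Bulk load into the server table, inside a transaction
    Loader.Connection := Connection;
    Loader.TableName := 'MyTable'; // placeholder target table
    Connection.StartTransaction;
    try
      Loader.LoadFromDataSet(VT);
      Connection.Commit;
    except
      Connection.Rollback;
      raise;
    end;
  finally
    Loader.Free;
    VT.Free;
  end;
end;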

Thanks in advance.

Regards, Paul.

Dimon
Devart Team

Re: Use of UniLoader

Post by Dimon » Mon 01 Feb 2010 14:34

Yes, you can use it this way.
But it may be easier to use CachedUpdates mode. Setting the TUniQuery.CachedUpdates property to True causes updates to a dataset to be stored in an internal cache on the client side instead of being written directly to the dataset's underlying database tables. When changes are complete, the application writes all cached changes to the database in the context of a single transaction.
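
For example (a minimal sketch; UniConnection1 and UniQuery1 are placeholder component names):

Code:

UniQuery1.CachedUpdates := True;
UniQuery1.Open;
// ... Insert/Edit/Delete records; all changes accumulate in the client-side cache

UniConnection1.StartTransaction;
try
  UniQuery1.ApplyUpdates;   // writes all cached changes to the server
  UniConnection1.Commit;
  UniQuery1.CommitUpdates;  // clears the cache after a successful commit
except
  UniConnection1.Rollback;
  UniQuery1.RestoreUpdates; // keeps the changes pending for another attempt
  raise;
end;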

hughespa wrote: Also, will this still work within the context of an existing (previously started) transaction such that if the transaction is ultimately rolled back, any data appended via the UniLoader is also rolled back correctly?
Yes, you can use TUniLoader within transactions.

hughespa

Post by hughespa » Mon 01 Feb 2010 23:16

Hi Dimon,

Thanks for the information.

Do you think cached updates would generally be faster than UniLoader?

I'm not sure what mechanism each method uses to actually apply the updates to the DB.

Regards, Paul.

Dimon
Devart Team

Post by Dimon » Tue 02 Feb 2010 08:22

In these cases performance depends on many factors, so you need to test both approaches.

hughespa

Post by hughespa » Tue 02 Feb 2010 08:40

OK, thanks Dimon

Regards, Paul.

hughespa

Post by hughespa » Tue 02 Feb 2010 14:07

Hi,

I thought you might be interested in this error and the timings.

D2010 and SQL Server 2008

The application test run writes 42,400 records to a fairly simple table.

TUniTable (no cached updates): 109 secs
TUniTable (with cached updates): 85 secs
VirtualTable (populated instead of TUniTable) with UniLoader.LoadFromDataSet: 10 secs

I tried setting the UniTable.Options.UpdateBatchSize to 1000 but got the following error (the highest setting I tried that worked was 125):

class EMSError with message 'The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Too many parameters were provided in this RPC request. The maximum is 2100.'

Not sure if this is an error or just a limitation.

The VirtualTable and UniLoader combination seems to be very significantly faster though.

Regards, Paul.

Dimon
Devart Team

Post by Dimon » Tue 02 Feb 2010 14:55

Thank you for the information.

hughespa wrote: I tried setting the UniTable.Options.UpdateBatchSize to 1000 but got the following error (the highest setting I tried that worked was 125):
class EMSError with message 'The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Too many parameters were provided in this RPC request. The maximum is 2100.'
Not sure if this is an error or just a limitation.
This is a limitation of MS SQL Server.

hughespa

Post by hughespa » Wed 03 Feb 2010 00:21

Thanks Dimon,

Is there a way to calculate the optimal setting for the batch size? It looks like it may be related to the number of fields in the dataset (or perhaps the number of fields being updated on each record)?

Regards, Paul.

Dimon
Devart Team

Post by Dimon » Thu 04 Feb 2010 09:57

The maximum batch size depends on the number of fields updated in each record and on the number of records updated per batch.
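
As a rough rule of thumb, if each batched record contributes one parameter per updated field, the batch size must keep the total parameter count under the server limit of 2100. The helper below is just an illustration, not part of UniDAC:

Code:

// Hypothetical helper: estimate the largest batch size that stays
// under the 2100-parameter limit of a single RPC request
function MaxBatchSize(ParamsPerRecord: Integer): Integer;
begin
  Result := 2100 div ParamsPerRecord;
end;

// For example, an INSERT with 16 parameters per record gives
// 2100 div 16 = 131, which is consistent with a setting of 125 working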

hughespa

Post by hughespa » Thu 04 Feb 2010 10:38

OK Thanks Dimon,

Regards, Paul.
