How to optimize the usage of TMSQuery on remote connection?
Hello,
I am hitting a performance wall when using my client/server application over a remote connection. The typical scenario: the user has the client EXE on a laptop, uses a mobile connection, and connects via VPN to the company LAN to access SQL Server and the document folders.
(The same scenario can occur with a direct connection to SQL Server through an external IP instead of a VPN; nothing changes for my example.)
The problem is that every time I run a query there is latency, and the latency depends on the connection speed. To give you an idea: with a normal internet connection I experience something like 200 ms of latency, while with a slower connection the latency is more than 1000 ms.
By latency I mean that even a query that returns a single value, for example the simple query
SELECT CUSTOMER_NAME
FROM CUSTOMERS_TABLE
WHERE CUSTOMER_ID = 10
takes a long time to execute: about 200 ms.
A query that returns much more data, like the following 100-record query:
SELECT CUSTOMER_ID, CUSTOMER_NAME, CUSTOMER_ADDRESS, CUSTOMER_PHONE, CUSTOMER_LONG_DESCRIPTION
FROM CUSTOMERS_TABLE
WHERE CUSTOMER_ID <= 100
takes 240 ms. So returning far more data (more than 1000 times as many values) makes only a minimal difference in time.
This is why I speak about latency.
Anyway, it seems this is SQL Server behaviour; I would like to understand where all this latency comes from. I tried the same queries with Management Studio over the same remote connection and the execution times are the same.
So, the point is, my application, greatly simplified, works like this:
1) after login there is a TMSConnection.Connect to the DB;
2) during execution there are many blocks of code like this:
procedure DummyProcedure;
begin
  MSQuery1.Open;
  MSQuery2.Open;
  MSQuery3.Open;
  MSQuery4.Open;
  MSQuery5.Open;
  MSQuery6.Open;
  MSQuery7.Open;
end;
MSQuery1 to MSQuery7 typically return very little data, but since every Open pays the latency, with 200 ms of latency the DummyProcedure above takes 200 ms * 7 = 1.4 seconds to execute, while on the LAN it is instantaneous...
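To make the per-Open cost measurable, here is a minimal timing sketch, assuming the same queries are in scope as in DummyProcedure above; it only uses GetTickCount, OutputDebugString and Format from the standard Windows/SysUtils units, and TimedDummyProcedure/TimedOpen are hypothetical names, not anything from SDAC itself:
// Times each individual Open; over a remote link every call should show
// roughly one full network round trip, which is the latency discussed above.
// Requires Windows (GetTickCount, OutputDebugString) and SysUtils (Format)
// in the unit's uses clause, plus the SDAC unit that declares TMSQuery.
procedure TimedDummyProcedure;
var
  Start: Cardinal;

  procedure TimedOpen(const AName: string; AQuery: TMSQuery);
  var
    T0: Cardinal;
  begin
    T0 := GetTickCount;
    AQuery.Open;
    OutputDebugString(PChar(Format('%s opened in %d ms', [AName, GetTickCount - T0])));
  end;

begin
  Start := GetTickCount;
  TimedOpen('MSQuery1', MSQuery1);
  TimedOpen('MSQuery2', MSQuery2);
  TimedOpen('MSQuery3', MSQuery3);
  TimedOpen('MSQuery4', MSQuery4);
  TimedOpen('MSQuery5', MSQuery5);
  TimedOpen('MSQuery6', MSQuery6);
  TimedOpen('MSQuery7', MSQuery7);
  OutputDebugString(PChar(Format('DummyProcedure total: %d ms', [GetTickCount - Start])));
end;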
Can you comment on this?
Is there a way to avoid that latency I am speaking about?
I also run queries to set the visibility of some popup menu items, so even the time it takes to display some popup menus is sometimes not acceptable.
Thanks.
Dimon wrote: This problem is connected with SQL Server work, not with SDAC. Please learn SQL Server Books Online.
Ok, but do you have an SDAC sample? I mean, do you have a demo that demonstrates good behaviour of a GUI application that interacts with a remote server?
(TMSConnection.Options.DisconnectedMode is false; let me also point out that this property is not documented in the SDAC help.)
I would also VERY much appreciate some optimization possibility for such batches of queries - anything like TMSConnection.OpenMoreDatasets(const d: array of TMSDataset); it could internally concatenate the SQL commands of the queries into one helper query, send them as a single batch, and then redistribute the results back to the original dataset components. I know it has some pitfalls with more complicated queries, but in 99% of cases it would be really helpful.
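To make the idea concrete, here is a minimal sketch of a simplified variation (not an existing SDAC API): it concatenates the SQL of several statements into one TMSQuery, opens it in a single round trip, and hands each result set to a callback instead of redistributing the rows back into the original components. It assumes an SDAC version whose TMSQuery accepts a multi-statement batch and exposes OpenNext to move to the next result set; OpenBatch and TResultSetProc are hypothetical names.
// Requires the SDAC unit that declares TMSConnection/TMSQuery in the uses clause.
type
  // Hypothetical callback: receives the zero-based statement index and the
  // query currently positioned on the matching result set.
  TResultSetProc = procedure(Index: Integer; ResultSet: TMSQuery);

// Open several SELECT statements with a single server round trip and walk
// through the returned result sets one by one.
procedure OpenBatch(AConnection: TMSConnection; const AStatements: array of string;
  AOnResultSet: TResultSetProc);
var
  Batch: TMSQuery;
  I: Integer;
begin
  Batch := TMSQuery.Create(nil);
  try
    Batch.Connection := AConnection;
    for I := Low(AStatements) to High(AStatements) do
      Batch.SQL.Add(AStatements[I] + ';');   // all statements are sent together
    Batch.Open;                              // one round trip, first result set
    for I := Low(AStatements) to High(AStatements) do
    begin
      AOnResultSet(I, Batch);                // let the caller consume or copy the rows
      if I < High(AStatements) then
        Batch.OpenNext;                      // advance to the next result set
    end;
  finally
    Batch.Free;
  end;
end;
With something like this, the seven Opens in DummyProcedure would collapse into one round trip, which is exactly where the 200 ms per query is currently being spent.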
Yes this is definitely a great idea that would probably solve 85% of my problems.
On Embarcadero Delphi's forum (https://forums.embarcadero.com/message. ... eID=158883) about the same problem I was told that:
"MS SQL Server allows you to send multiple queries as a 'single' SQL request and returns all the results together. You may want to check if the components that you are using support this mode."
Could you comment on this?
If this is true, I could run all the queries I need in a single request. This would be very useful, for example, when I open a customer in my application: instead of sending 10 queries to load all the customer data (customer personal info, list of documents, list of projects, list of contacts, ...), I could do it in a single step (see the sketch below), minimizing the need for more "dramatic" changes like moving from a 2-tier to a 3-tier architecture.
Thanks.
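For the customer case just described, a single batched TMSQuery could look like the sketch below. This again assumes that multi-statement SQL and OpenNext are supported by the SDAC version in use; LoadCustomerScreen and every table/column name except CUSTOMERS_TABLE and CUSTOMER_ID are invented placeholders.
// Load everything needed for the customer screen in one round trip.
// The T-SQL variable @id lets one :CUSTOMER_ID parameter feed all SELECTs.
procedure LoadCustomerScreen(AConnection: TMSConnection; ACustomerID: Integer);
var
  Batch: TMSQuery;
begin
  Batch := TMSQuery.Create(nil);
  try
    Batch.Connection := AConnection;
    Batch.SQL.Text :=
      'DECLARE @id int; SET @id = :CUSTOMER_ID; ' +                  // bind the id once
      'SELECT * FROM CUSTOMERS_TABLE WHERE CUSTOMER_ID = @id; ' +    // 1: personal info
      'SELECT * FROM CUSTOMER_DOCUMENTS WHERE CUSTOMER_ID = @id; ' + // 2: documents
      'SELECT * FROM CUSTOMER_PROJECTS WHERE CUSTOMER_ID = @id; ' +  // 3: projects
      'SELECT * FROM CUSTOMER_CONTACTS WHERE CUSTOMER_ID = @id;';    // 4: contacts
    Batch.ParamByName('CUSTOMER_ID').AsInteger := ACustomerID;
    Batch.Open;       // result set 1: personal info
    // ... read the customer fields here ...
    Batch.OpenNext;   // result set 2: documents
    // ... fill the documents list here ...
    Batch.OpenNext;   // result set 3: projects
    Batch.OpenNext;   // result set 4: contacts
  finally
    Batch.Free;
  end;
end;
One round trip instead of ten is exactly the kind of saving measured in the test below.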
I also made this simple test using Management Studio: running this single batch query
"
select * from TestTable -- 1
select * from TestTable -- 2
select * from TestTable -- 3
[...]
select * from TestTable -- 100
" (100 times the same select statement)
and running this multi-batch one:
"
select * from TestTable -- 1
GO
select * from TestTable -- 2
GO
select * from TestTable -- 3
GO
[...]
select * from TestTable -- 100
GO
" (100 times the same SELECT statement + GO)
===================================
The first one runs all the queries together as a single batch; the second one is closer to what we do with SDAC when doing
query1.Open;
query2.Open;
query3.Open;
...
query100.Open;
====================================
I made the tests by connecting to a remote DB and I noticed that the single batch was quite fast. These are the results:
LOCAL DB: 4 seconds without GO / 5 seconds with GO (basically no significant difference)
REMOTE DB: 8 seconds without GO / 28 seconds with GO (a dramatic difference, about 350%... and the gap grows with the slowness of the connection, which means that the slower the connection, the greater the benefit of the single-batch approach!)
How is it possible to achieve this? Is what Ludek suggests (or a variation of his idea) doable?
Thanks!
"
select * from TestTable -- 1
select * from TestTable -- 2
select * from TestTable -- 3
[...]
select * from TestTable -- 100
" (100 times the same select statement)
and running this multi batch one:
"
select * from TestTable -- 1
GO
select * from TestTable -- 2
GO
select * from TestTable -- 3
GO
[...]
select * from TestTable -- 100
GO
" (100 times the same SELECT statement + GO)
===================================
The first one is about running more queries together, the second one is more similar to what we do with SDAC when doing
query1.Open;
query2.Open;
query3.Open;
...
query100.Open;
====================================
I made tests by connecting to a remote DB and i noticed that the single batch was quite fast, I had this results:
LOCAL DB: 4 seconds without GO / 5 seconds with GO (basically no significant difference)
REMOTE DB: 8 seconds without GO / 28 seconds with GO (basically dramatic difference: 350% ... and this number is proportional to the slowliness of connection: This means that the slowest the connection the greatest the benefit of using this approach!)
How is it possible to achieve this? Is what Ludek suggests (or a variation of his idea) doable?
Thanks!
Calling first TMSConnection.OpenDatasets([qry1, qry2]) and then TMSConnection.OpenDatasets([qry3, qry4]) raises the exception "Query does not return next result set". It seems the internal FDataSet used to launch the compound SQL must be moved back to the beginning before each call.
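For reference, a minimal sketch of the call pattern that triggers it, assuming TMSConnection.OpenDatasets as used above and four TMSQuery components with simple SELECT statements already assigned (ReproduceOpenDatasetsIssue is just an illustrative name):
// Reproduces the reported behaviour: the first compound open works, the
// second one raises "Query does not return next result set".
procedure ReproduceOpenDatasetsIssue(AConnection: TMSConnection;
  qry1, qry2, qry3, qry4: TMSQuery);
begin
  AConnection.OpenDatasets([qry1, qry2]);  // first call: OK
  AConnection.OpenDatasets([qry3, qry4]);  // second call: exception raised here
end;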
Another point: an optional parameter could be added to skip datasets that are already open (similar to how MSQuery.Open does nothing if the dataset is already open).
best regards
Toni