Our 2-core web server with 2 GB RAM worked fine. Then I decided to use LINQ entity classes to access the database.
After the installation, the web server was immediately overloaded. So I increased the number of CPU cores to 8 and the RAM to 12 GB. That didn't solve it: CPU load stayed between 95 and 100%, and 12 GB of RAM wasn't enough.
After a little profiling, I found that LINQ queries run between 2 and 10 times more slowly than plain-text SQL. Furthermore, the second line in this code took 120 times longer:
Table<PurchaseOrder> po = dataContext.GetTable<PurchaseOrder>();
PurchaseOrder single = po.Single(c => c.Key == key);
After metagenerating the code and changing the second line, the performance penalty dropped to 2 times that of a plain SQL SELECT.
Now I have 8 mostly idle cores, seldom running above 10%, and 10 GB of RAM too much.
Is that normal?
Performance
I thought the actual read would be deferred until the second statement, because the table isn't used in the first one.
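For reference, this is the deferred-execution behavior I expected. A minimal LINQ-to-Objects sketch (no database or provider involved, just an in-memory array standing in for the table) showing that building a query reads nothing until it is enumerated:

```csharp
using System;
using System.Linq;

class DeferredExecutionDemo
{
    static void Main()
    {
        int[] keys = { 1, 2, 3 };

        // Building the query does not enumerate the source yet;
        // with a database provider, no SQL would be issued here either.
        var query = keys.Where(k => k > 1);

        // Execution is deferred until the query is consumed,
        // e.g. by Single(), ToList(), or foreach.
        int single = query.Single(k => k == 3);

        Console.WriteLine(single); // prints 3
    }
}
```

By analogy, GetTable itself should issue no query; the SELECT should only be generated when Single() enumerates the table.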
After using the code you proposed, I got similar results. For a different table, 200 LINQ reads took 2.308 sec. while plain SQL took 0.078 sec., a 30-fold difference.
Do LINQ queries really have to take 2-10 times longer?
This code is absolutely identical to the following:
Only one row is retrieved in both cases. You can check this with the DataContext.Log property (it specifies the destination to write the generated SQL query or command to):
The delay is caused by the compilation of the LINQ query. Our compiler is slow. If you need to call the query 200 times, maybe CompiledQuery will be a better solution for you. Please try the following sample:
Code: Select all
Table<PurchaseOrder> po = dataContext.GetTable<PurchaseOrder>();
PurchaseOrder single = po.Single(c => c.Key == key);
Code: Select all
PurchaseOrder single = dataContext.PurchaseOrders.Single(c => c.Key == key);
Code: Select all
StringBuilder sb = new StringBuilder();
db.Log = new System.IO.StringWriter(sb);
Code: Select all
// Compile the query once...
Func<TESTDataContext, int, PurchaseOrder> GetPurchaseOrder =
    Devart.Data.Linq.CompiledQuery.Compile(
        (TESTDataContext ctx, int id) =>
            ctx.PurchaseOrders.Single(obj => obj.Key == id));

// ...then execute it repeatedly without recompiling the LINQ expression.
for (int i = 0; i < 200; i++)
{
    PurchaseOrder single = GetPurchaseOrder(dataContext, i);
}
The compiled query is much faster: for my example, 0.391 sec. (LINQ) compared to 0.086 sec. (plain SQL). However, that is still 4.5 times slower, and writing the delegates is awkward. A metagenerated data tier is significantly faster.
200 row fetches in one procedure are seldom needed; that was only for testing purposes. My point is that data access that is 2, 5, or 10 times slower may not be acceptable. For my application, it isn't.