What I'm seeing is that inserting 100,000 records is taking between 50 and 100 seconds, and using around 400 MB of memory. Needless to say, at that rate the system runs out of memory fairly quickly. Since I don't see other memory usage complaints in the forum, this must be something I'm doing wrong, but I haven't been able to find my problem.
I've tried wrapping the SubmitChanges() call in a transaction, but that made no difference. I also tried closing and disposing the context after each call to SubmitChanges(), but that made no difference either (roughly as sketched below).
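For reference, here is roughly what those two attempts looked like. This is only a sketch from memory: the TransactionScope (from System.Transactions) and the per-batch rebuild of MainDataContext approximate what I actually ran, not the exact code.

Code:

// Attempt 1: wrap the batch flush in an explicit transaction.
using (TransactionScope scope = new TransactionScope())
{
    context.SubmitChanges();
    scope.Complete();
}

// Attempt 2: after each batch, flush, dispose the context, and create a fresh
// one so the previous batch's change tracker can be garbage collected.
context.SubmitChanges();
context.Dispose();
context = new MainDataContext(connection.ConnectionString);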
The DataContext is created from an existing database, and for each run I create a new database by changing the file name in the connection string.
Here's pseudocode for what I'm doing.
Code:
using (SQLiteConnection connection = dbUtil.CreateConnection(dbFile))
{
    dbUtil.CreateDatabase(connection);

    // Use LinqConnect to add records to the database.
    context = new MainDataContext(connection.ConnectionString);
    while (there are still records)
    {
        if (count == limit)
        {
            // Time each batch of insertions.
            DateTime start = DateTime.Now;
            context.SubmitChanges();
            DateTime end = DateTime.Now;
            TimeSpan timer = end.Subtract(start);
            Debug.WriteLine(String.Format("Total time for insert of {0} records: {1} ms",
                recordsDone, timer.TotalMilliseconds));
            count = 0;
        }
        get a record
        fill in the fields
        call InsertOnSubmit(rec) on the appropriate table
        count++;        // these increments were omitted from my first post
        recordsDone++;
    }
    context.SubmitChanges();   // flush the final partial batch
}
I have put breakpoints before and after the SubmitChanges() call to confirm that the memory is being allocated there. I also use the same parsing code to generate equivalent records and write them to a binary file (roughly as sketched below). I can write a 2.6-million-record binary file in about a minute, so I expected the database population to be reasonably close to that speed.
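For completeness, the binary-file baseline looks roughly like this; the binFile name and the individual field writes are placeholders, not my real code:

Code:

using (FileStream stream = new FileStream(binFile, FileMode.Create))
using (BinaryWriter writer = new BinaryWriter(stream))
{
    while (there are still records)   // same parsing loop as above
    {
        // get a record and fill in the fields, then write them directly:
        writer.Write(rec.SomeIntField);
        writer.Write(rec.SomeStringField);
    }
}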
This must be some mistake I'm making. Where should I start looking?
Thanks,
Dennis