SQL Server DTS / SSIS and large tables "out of memory"

Posted: Thu 28 Jun 2007 03:44
by bcfletcher
I am trying to use the PostgreSQLDirect driver to migrate data from Postgres to SQL Server via DTS / SSIS. Most of the tables migrate fine, but larger tables (in excess of 10 million rows) fall over with an out-of-memory exception. Any ideas how I can get around this, or is this a known bug with the driver? The server has 2 GB of RAM and plenty of swap, and monitoring shows plenty of memory still available when it falls over.

SSIS Error

- Pre-execute (Error)
Messages
Error 0xc0047062: Data Flow Task: System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
at CoreLab.PostgreSql.PgSqlDataReader.e(Int32 A_0)
at CoreLab.PostgreSql.PgSqlCommand.a(CommandBehavior A_0, IDisposable A_1, Int32 A_2, Int32 A_3)
at CoreLab.Common.DbCommandBase.ExecuteDbDataReader(CommandBehavior behavior)
at System.Data.Common.DbCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior)
at Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter.PreExecute()
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper90 wrapper) (SQL Server Import and Export Wizard)

Error 0xc004701a: Data Flow Task: component "Source - Query" (1) failed the pre-execute phase and returned error code 0x8007000E.
(SQL Server Import and Export Wizard)

Posted: Mon 02 Jul 2007 07:39
by Alexey
Could you send me your SSIS project?
Also I need the structure of the table on which migration fails and the number of records in it.

Posted: Wed 04 Jul 2007 23:59
by bcfletcher
Didn't get as far as an SSIS package - it fails in the Data Import Wizard, though I did try directly from Business Intelligence Development Studio and had the same problem.

The table structure is:

column1 - bigint (primary key)
column2 - bigint
column3 - smallint
column4 - text
column5 - bigint
column6 - text
column7 - smallint
column8 - smallint
column9 - text
column10 - text
column11 - smallint
column12 - bigint

I also tried changing the text data types on the postgres db to varchar(255) - no luck there either.

So, to replicate: create the above table and populate it with 13,000,000 rows. Then create a database on SQL Server 2005 and try to import using the .NET Postgres provider with a query (select * from tablename). It coped fine if I limited the query to below 8 million rows, e.g. (select * from tablename LIMIT 7000000).
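A quick way to generate the test data, assuming the table is named as in the CREATE TABLE posted below (the filler values are arbitrary; md5() just gives the text columns some bulk):

```sql
-- Sketch: populate the 12-column test table with 13M rows of filler
INSERT INTO bigtable
SELECT g,                        -- column1 (primary key)
       g, 1,                     -- column2, column3
       md5(g::text),             -- column4
       g, md5(g::text), 1, 1,    -- column5..column8
       md5(g::text), md5(g::text), 1, g  -- column9..column12
FROM generate_series(1, 13000000) AS g;
```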

thanks.

Posted: Thu 05 Jul 2007 00:04
by bcfletcher
btw - I finally got this working using the Postgres ODBC driver v. 8.02.0400. It failed as well with an out of memory error at the transformation stage though. I managed to get it working by checking the "use Declare/Fetch" option in the ODBC driver settings. Is there an equivalent setting on the Core Lab driver?
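For reference, the ODBC "use Declare/Fetch" option makes the driver read the result set through a server-side cursor instead of buffering every row on the client. In plain SQL the equivalent pattern looks roughly like this (cursor name and batch size are arbitrary):

```sql
BEGIN;
DECLARE big_cur CURSOR FOR SELECT * FROM bigtable;
-- Fetch in batches; repeat until FETCH returns no rows
FETCH FORWARD 10000 FROM big_cur;
CLOSE big_cur;
COMMIT;
```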

Posted: Thu 05 Jul 2007 10:35
by Alexey
Please provide me with the CREATE TABLE statement, because I get an error saying that the data type for column4 is invalid.

Posted: Fri 06 Jul 2007 03:37
by bcfletcher
CREATE TABLE bigtable
(
column1 bigint NOT NULL,
column2 bigint,
column3 smallint,
column4 text,
column5 bigint,
column6 text,
column7 smallint,
column8 smallint,
column9 text,
column10 text,
column11 smallint,
column12 bigint,
CONSTRAINT column1_pk PRIMARY KEY (column1)
)
WITH OIDS;

Posted: Fri 06 Jul 2007 10:51
by Alexey
The problem does not seem to pertain to our provider.
Could you describe your configuration? Are MS SQL Server, the PostgreSQL server, and the MS SQL client all on one machine? If so, please try to find out which of the three components raises the out-of-memory error.

Posted: Tue 10 Jul 2007 00:32
by bcfletcher
Server 1 - Redhat ES4 Postgres 8.03
Server 2 - Windows 2003 Server, SQL 2005

Trying to import using Server 2, connecting to Server 1 via the CoreLab driver using the Data Import Wizard in SQL 2005.

Posted: Thu 12 Jul 2007 09:41
by Alexey
Server 1 - Redhat ES4 Postgres 8.03
Server 2 - Windows 2003 Server, SQL 2005
So which one has 2 Gig of RAM? Both?

Actually, I've reproduced this problem. Now we are trying to investigate it.

Posted: Thu 12 Jul 2007 16:08
by Alexey
We have analyzed the problem and it seems to have nothing to do with PostgreSQLDirect .NET. I think that exception is raised by MS SQL Server. You should upload the data to it in chunks.
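One way to chunk the transfer is keyset pagination on the primary key of the source query; the batch size and the :last_key parameter below are illustrative:

```sql
-- First chunk:
SELECT * FROM bigtable ORDER BY column1 LIMIT 1000000;
-- Each subsequent chunk starts after the highest column1 value already copied:
SELECT * FROM bigtable WHERE column1 > :last_key ORDER BY column1 LIMIT 1000000;
```

Keying on column1 rather than using OFFSET keeps each chunk query cheap, since the scan can start from the index instead of skipping already-copied rows.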

Same problem

Posted: Wed 01 Oct 2008 17:06
by mnowill
I am having the same issue. I am attempting to pull a table of 3 million rows from Postgres into SQL via SSIS and am getting the same out-of-memory error. Looking into it, I do not believe it is a SQL Server issue: if you watch the SSIS process in BIDS, you will notice that the data reader task is not sending any rows through the pipeline. It appears to wait until the entire SQL statement has executed. Yet if I limit the select to the top 10,000 rows, I have no issue and can see the data being sent down the pipeline.

Posted: Tue 07 Oct 2008 07:22
by Shalex
We are investigating this issue. You will be notified of the results as soon as possible.

Posted: Wed 08 Oct 2008 15:57
by Shalex
The problem is fixed. Look for the fix in the next build of PostgreSQLDirect .NET.

Posted: Mon 20 Oct 2008 13:52
by mnowill
Great news... any idea when the next build may be released?

Posted: Tue 21 Oct 2008 07:44
by Shalex
The new release of PostgreSQLDirect .NET 4.00 beta version will be available tomorrow.

The final version will be available in 1-2 weeks.