SQL Server DTS / SSIS and large tables "out of memory"
-
- Posts: 5
- Joined: Thu 28 Jun 2007 03:34
I am trying to use the PostgreSQLDirect driver to migrate data from Postgres to SQL Server via DTS / SSIS. Most of the tables migrate fine; however, larger tables (in excess of 10 million rows) fall over with an out-of-memory exception. Any ideas how I can get around this, or is this a known bug with the driver? The server has 2 GB of RAM and plenty of swap, and monitoring memory while it fails shows plenty still available.
SSIS Error
- Pre-execute (Error)
Messages
Error 0xc0047062: Data Flow Task: System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
at CoreLab.PostgreSql.PgSqlDataReader.e(Int32 A_0)
at CoreLab.PostgreSql.PgSqlCommand.a(CommandBehavior A_0, IDisposable A_1, Int32 A_2, Int32 A_3)
at CoreLab.Common.DbCommandBase.ExecuteDbDataReader(CommandBehavior behavior)
at System.Data.Common.DbCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior)
at Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter.PreExecute()
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper90 wrapper) (SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task: component "Source - Query" (1) failed the pre-execute phase and returned error code 0x8007000E.
(SQL Server Import and Export Wizard)
-
- Posts: 5
- Joined: Thu 28 Jun 2007 03:34
Didn't get as far as an SSIS package - it fails in the Data Import Wizard, though I did try directly from Business Intelligence Development Studio (BIDS) and had the same problem.
The table structure is;
column1 - bigint (primary key)
column2 - bigint
column3 - smallint
column4 - text
column5 - bigint
column6 - text
column7 - smallint
column8 - smallint
column9 - text
column10 - text
column11 - smallint
column12 - bigint
I also tried changing the text data types on the postgres db to varchar(255) - no luck there either.
So, to replicate: create the above table, populate it with 13,000,000 rows, then create a database on SQL Server 2005 and try to import using the .NET Postgres provider with a query (select * from tablename). It seemed to cope fine if I limited the query to below 8 million rows, e.g. (select * from tablename LIMIT 7000000).
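A workaround I'd expect to help while this gets looked at is chunking the extract on the bigint primary key ("keyset pagination") so no single query materializes all 13 million rows. This is only a sketch: the table/column names come from the post above, and fetch_chunk is a hypothetical stand-in for whatever data-access call actually runs the query.

```python
# Sketch: pull the table in primary-key-ordered chunks instead of one
# giant SELECT, so the client never holds the full result set in memory.
# "tablename" and "column1" (the bigint PK) are from the post;
# fetch_chunk is a placeholder for the real data-access call.

CHUNK = 500_000  # rows per round trip; tune to available memory

def chunk_query(last_pk, chunk=CHUNK):
    # column1 is the primary key, so ORDER BY column1 uses its index.
    return (f"SELECT * FROM tablename WHERE column1 > {last_pk} "
            f"ORDER BY column1 LIMIT {chunk}")

def extract_all(fetch_chunk):
    """fetch_chunk(sql) -> list of row tuples, PK in position 0."""
    last_pk = -1
    while True:
        rows = fetch_chunk(chunk_query(last_pk))
        if not rows:
            return
        yield from rows
        last_pk = rows[-1][0]  # resume after the last key seen
```

Keying on the PK rather than using OFFSET keeps each chunk cheap: OFFSET still forces Postgres to scan and discard all skipped rows, while `WHERE column1 > last_pk` seeks straight to the next chunk.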
thanks.
-
- Posts: 5
- Joined: Thu 28 Jun 2007 03:34
btw - I finally got this working using the Postgres ODBC driver v8.02.0400. It too failed with an out-of-memory error at the transformation stage at first; I got it working by checking the "Use Declare/Fetch" option in the ODBC driver settings. Is there an equivalent setting on the CoreLab driver?
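For anyone landing here later: "Use Declare/Fetch" makes the ODBC driver open a server-side cursor and pull rows in batches instead of buffering the entire result set client-side. Roughly, the statement sequence on the PostgreSQL side looks like the sketch below; the cursor name and batch size are illustrative, not the driver's actual internals.

```python
# Sketch of the server-side-cursor statement sequence that the ODBC
# "Use Declare/Fetch" option roughly corresponds to. Cursor name and
# batch size are illustrative placeholders.

def declare_fetch_plan(query, batch=10_000, cursor="mig_cur"):
    yield "BEGIN"                                 # cursors live inside a transaction
    yield f"DECLARE {cursor} CURSOR FOR {query}"
    yield f"FETCH FORWARD {batch} FROM {cursor}"  # repeated until 0 rows come back
    yield f"CLOSE {cursor}"
    yield "COMMIT"
```

Because each FETCH returns only one batch, peak memory on the client is bounded by the batch size rather than the table size, which is why the same import succeeded once the option was enabled.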
-
- Posts: 5
- Joined: Thu 28 Jun 2007 03:34
Same problem
I am having the same issue. I am attempting to pull a table of 3 million rows from Postgres into SQL Server via SSIS and am getting the same out-of-memory error. Looking into it, I do not believe it is a SQL Server issue: if you watch the SSIS process in BIDS, you will notice that the DataReader source is not sending any rows through the pipeline. It appears to wait until the entire SQL statement has executed. Yet if I limit the select to the top 10,000 rows, I have no issue and can see the data being sent down the pipeline.