UNICODE & .NET w/Enterprise Library

Discussion of open issues, suggestions and bugs regarding ADO.NET provider for MySQL
Post Reply
cparke
Posts: 2
Joined: Mon 26 Feb 2007 21:53

UNICODE & .NET w/Enterprise Library

Post by cparke » Mon 26 Feb 2007 22:27

I'm having trouble getting my application to read & write UNICODE text to the MySQL database. I'm using the MySQLDirect .NET2 Data Provider (v3.55.18.0) in conjunction with Microsoft Enterprise Library Data Access Application Block in C# with .NET Framework 2.0.

The same code reads/writes UNICODE characters just fine with the Microsoft SQL Server built-in provider for SQLEXPRESS, so I'm pretty sure the problem is with the MySQL Direct .NET provider, or in my MySQL database definitions or server configuration. I'm going to give a description of my environment and the problems below, and maybe somebody has an idea what I'm not doing or doing wrong.

MySQL database table definition (abbreviated):

CREATE TABLE `testtable` (
`id` int NOT NULL,
`Slug` varchar(100) character set utf8 NOT NULL default '',
`Text` mediumtext character set utf8 NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;


MySQL Direct connect string entry in .config file:
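(The connection string itself did not survive the forum software, which strips angle-bracketed XML. A hedged reconstruction of a typical MySQLDirect .NET entry — the name, host, and credential values below are assumptions; the Unicode=true attribute is the setting discussed later in the post:)

```xml
<!-- Hedged sketch: entry name, host, and credentials are assumed.
     Unicode=true tells the provider to send string data as Unicode. -->
<connectionStrings>
  <add name="MySqlConnection"
       providerName="CoreLab.MySql"
       connectionString="User Id=test;Password=test;Host=localhost;Database=testdb;Unicode=true" />
</connectionStrings>
```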





My sample text: "Hello Привет"


MySQL server is configured to use 'latin1' as the default character set. But as I understand it, the UNICODE=true setting in the MySQL Direct connection string and the table definition should override that.

I am using stored procedures in MySQL to do the actual insert/update operations. I am using the CoreLab.MySql.MySqlAdapter (obtained via the Enterprise Library factory functions) to do an ADO.NET Update() call to change an existing record with the sample text.
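(For context, the update path described above — an adapter update routed through the Enterprise Library Data Access Application Block — can be sketched roughly as follows. This is a hedged illustration, not the poster's actual code: the database entry name, procedure name, and parameter names are assumptions.)

```csharp
// Hedged sketch (names assumed): pushing DataSet changes to MySQL
// through the Enterprise Library Data Access Application Block.
using System.Data;
using System.Data.Common;
using Microsoft.Practices.EnterpriseLibrary.Data;

Database db = DatabaseFactory.CreateDatabase("MySqlConnection");

// Build the update command from a stored procedure, mapping each
// parameter to the corresponding DataTable column.
DbCommand update = db.GetStoredProcCommand("UpdateTestTable");
db.AddInParameter(update, "pId",   DbType.Int32,  "id",   DataRowVersion.Current);
db.AddInParameter(update, "pSlug", DbType.String, "Slug", DataRowVersion.Current);
db.AddInParameter(update, "pText", DbType.String, "Text", DataRowVersion.Current);

// Apply the changed rows; 'ds' holds the modified "testtable" rows.
db.UpdateDataSet(ds, "testtable", null, update, null, UpdateBehavior.Standard);
```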

When I make the update call, I get the CoreLab.MySql.MySqlException:

Data too long for column 'Slug' at row 1.

Obviously, however, my text is nowhere near too long (the column length is 100), so the error must mean something else. It makes no difference if I try to update the mediumtext column instead (yes, it actually tells me the string is too long for a mediumtext!).

If I don't have the Russian characters in the update string, the update succeeds. If I don't use the UNICODE=true in the connection string, then the update call also succeeds, but the Russian characters change to ?????? (obviously the data isn't being sent right to the server). No difference if I choose to use UCS2 instead of UTF8 as the character set.

Any ideas why I'm having such difficulty doing such a trivial operation? Again, this works just fine with SQL Server into a nvarchar() and ntext field, but with MySQL and the unicode character sets, just bad results.

Appreciate any advice or assistance you can provide to get MySQL Direct .NET2 to do what I need.


CP

Alexey
Posts: 2756
Joined: Mon 13 Mar 2006 07:43

Post by Alexey » Thu 01 Mar 2007 09:35

Does this error occur if you don't use the Microsoft Enterprise Library Data Access Application Block?
What are the contents of your MySQL ini file?
What is the version of your MySQL server?
What are your Windows regional settings?
Send us a small test project that reproduces the problem.
Use the e-mail address provided in the Readme file.
Do not use third-party components.

cparke
Posts: 2
Joined: Mon 26 Feb 2007 21:53

Post by cparke » Thu 01 Mar 2007 21:35

Alexey wrote:Send us small test project to reproduce the problem.
Hi,

Thanks for your reply.

In building the test app for you, I found that the error "Data too long for column 'Slug' at row 1" was caused by the insert/update stored procedure's definition not specifying the Unicode character set 'utf8' for the input parameters that carry the Unicode text. Either issuing direct INSERT/UPDATE statements or fixing the stored procedure definition to declare the 'utf8' character set resolved the problem.
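(For anyone hitting the same error, the fix can be sketched like this — the procedure and parameter names below are assumptions, only the table matches the post; the essential part is the CHARACTER SET utf8 clause on each string parameter:)

```sql
-- Hedged sketch: procedure and parameter names are assumed.
-- Without CHARACTER SET utf8, the parameters default to the server's
-- charset (latin1 here), so the utf8 bytes from the client are
-- converted and rejected as "Data too long".
DELIMITER //
CREATE PROCEDURE UpdateTestTable(
    IN pId   INT,
    IN pSlug VARCHAR(100) CHARACTER SET utf8,
    IN pText MEDIUMTEXT   CHARACTER SET utf8
)
BEGIN
    UPDATE testtable
       SET Slug = pSlug, Text = pText
     WHERE id = pId;
END//
DELIMITER ;
```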

Thank you for your help!
CP

Alexey
Posts: 2756
Joined: Mon 13 Mar 2006 07:43

Post by Alexey » Fri 02 Mar 2007 11:21

Not at all.
