unidac - oracle - behaviour changes in direct mode
Re: unidac - oracle - behaviour changes in direct mode
May I ask what you mean by "shortly"? The issue is still not fixed in the 7.3.10 release, and it has now been outstanding for a very long time.
Re: unidac - oracle - behaviour changes in direct mode
We thoroughly investigated the behavior of ODAC in Direct Mode according to your description and did not find an error. When using parameters to work with data of the Oracle CLOB type, you should describe the parameters as follows:

    UniQuery->Params->Items[1]->ParamType = ptInput;
    UniQuery->Params->Items[1]->DataType = ftOraClob;

To pass small values to these parameters, use the AsString method:

    UniQuery->Params->Items[1]->AsString = L"éé";

To pass values of large length, the assignment should look like this:

    UniQuery->Params->Items[1]->AsBlobRef->AsString = L"éé";

You cannot use the AsMemo method in either case when working with Oracle CLOB, because the passed parameter would then be of the ftMemo type, which is used for ORACLE LONG values.

A detailed description of working with parameters of the BLOB and CLOB types is given in our documentation: https://www.devart.com/odac/docs/work_lob.htm
Re: unidac - oracle - behaviour changes in direct mode
In that case, please define explicitly what "large length" and "small length" mean, in terms of AnsiString length, for each database version and each character set!
As the example shows, there is no single method that always works: we have to use one method for small values and another for large ones. Furthermore, the behaviour changes with the database character set: it differs between UTF8 and WE8ISO8859P1 databases, and for more than a year our only workaround has been to re-read the data after every insert and retry with the other method whenever what was inserted differs from what was read back. With some character sets we cannot even insert CLOBs larger than 32 KB.
Re: unidac - oracle - behaviour changes in direct mode
Indeed, the length of the transmitted string depends on various factors, including the database character set. Therefore, if you cannot choose between the two methods described above, use the second one (AsBlobRef->AsString).
It will give the correct result in all cases when working with CLOB fields.