Problem reading values from number columns with high scale
I create a table and insert values with this script:
CREATE TABLE TJOTR
(
  N NUMBER,
  V VARCHAR2(10 BYTE)
);

INSERT INTO TJOTR (N, V)
VALUES (0.00632958471747583928374627378, 'FUN');
The inserted number has 29 decimal digits.
When I try to fill an OracleDataTable (select * from tjotr) or a DataSet with an OracleDataAdapter, I get the following error:
Specified argument was out of the range of valid values.
Parameter name: Decimal's scale value must be between 0 and 28, inclusive.
The default precision for NUMBER columns in Oracle is 38, I think. If I understand this problem correctly, the scale of a NUMBER column must not exceed 28 for the data to be readable. We can't redefine all the NUMBER columns in an external data warehouse just to be able to read the values.
Bug? Clues? Hints? Suggestions?
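If it helps to pin the error down: the limit comes from .NET's System.Decimal type itself, not from a provider setting. A minimal sketch (no Oracle involved) that reproduces the same exception via the Decimal constructor's scale argument:

```csharp
using System;

class DecimalScaleDemo
{
    static void Main()
    {
        // Scale 28 is the maximum a System.Decimal can carry.
        decimal ok = new decimal(1, 0, 0, false, 28);   // 1E-28
        Console.WriteLine(ok);

        try
        {
            // Scale 29 is rejected by the Decimal constructor itself,
            // producing the same message the data provider surfaces.
            new decimal(1, 0, 0, false, 29);
        }
        catch (ArgumentOutOfRangeException e)
        {
            Console.WriteLine(e.Message);
        }
    }
}
```

So any NUMBER value whose fractional part needs more than 28 digits cannot be materialized as a .NET decimal at all; the provider has to round, convert to another type, or fail.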
We are definitely considering an upgrade to version 4.20 when it is released, because of the performance improvements and other changes.
But as of today, the current 4.20 build is not stable enough (some of my tests fail). The one error I remember is a NullReferenceException during some DDL queries.
To be honest, I can't say we tested 4.20 a lot. When we got this exception, the only solution was to return to 3.55.23, which is currently in our production environment and considered more or less stable (except for some hard-to-report memory corruption errors on Vista + Instant Client machines)...
We'll upgrade to 4.20 when it is officially released.
Generally, we do not beta test third-party products, for lack of time, and we are not allowed to use beta versions, so as to keep the problem surface of our own beta software to a minimum.
(The test run I made with 4.20 was my private initiative.)
But generally speaking, the complete loss of some numeric field values is a serious problem that I cannot explain to our customers.
Code:
while (reader.Read())
{
    object[] record = result.Add();
    try
    {
        // Fast path: fetch the whole row in one call.
        reader.GetValues(record);
    }
    catch
    {
        // Fall back to per-field reads when any value fails to convert.
        FillSlow(reader, record);
    }
}

private void FillSlow(IDataReader reader, object[] record)
{
    for (int i = 0; i < reader.FieldCount; i++)
    {
        try
        {
            record[i] = reader.GetValue(i);
        }
        catch
        {
            // The unconvertible value is lost and replaced with DBNull.
            record[i] = DBNull.Value;
        }
    }
}
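If losing the value entirely is unacceptable, one workaround is to select the high-scale column as text (e.g. with TO_CHAR(N) in the query) and convert it client-side, truncating the fraction to the 28 digits a System.Decimal can hold. This is only a sketch of the idea, not provider-specific code; the helper name and the choice to truncate rather than round are my own assumptions:

```csharp
using System;
using System.Globalization;

class HighScaleParse
{
    // Parse an Oracle NUMBER rendered as plain decimal text
    // (no exponent notation assumed), truncating the fractional
    // part to the 28 digits System.Decimal supports.
    internal static decimal ParseTruncated(string text)
    {
        int dot = text.IndexOf('.');
        if (dot >= 0 && text.Length - dot - 1 > 28)
            text = text.Substring(0, dot + 1 + 28);
        return decimal.Parse(text, CultureInfo.InvariantCulture);
    }

    static void Main()
    {
        // The 29-decimal value from the example above loses only
        // its last digit instead of the whole value.
        Console.WriteLine(ParseTruncated("0.00632958471747583928374627378"));
    }
}
```

A digit is still dropped, but the row keeps a usable approximation instead of DBNull.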