When I update an existing large object with new content smaller than the currently stored content, the written data ends up with trailing garbage, because the previous large object size is kept (the large object is not truncated to the new size).
For example, I store a large object 1000 bytes long. Then I update this same large object with content of 900 bytes. The stored data is corrupted because its length is still 1000 bytes: the last 100 bytes of the old content were not truncated.
I tried multiple ways to store the updated large object, but everything fails with the same result: using TPgLargeObject directly, then using a TPgTable with a TPgLargeObjectField, and in both cases with and without large object caching.
Here is a code sample used to write the large object:
Code:
pgConnection.StartTransaction;
ms.LoadFromFile(FilePath);
lobj := TPgLargeObject.Create(pgConnection);
lobj.Cached := False; // Setting it to True won't resolve the issue
lobj.OID := OID; // For this example, we assume that the large object already exists
lobj.ReadBlob;
lobj.Clear; // With Cached = False, the size is not reset to 0 here; I also tried Truncate(0)
ms.Position := 0;
lobj.LoadFromStream(ms); // The large object size is not set to the ms size
lobj.WriteBlob; // Here, the large object seems to be written on top of the previous object, without truncating data to the new size
lobj.CloseObject;
pgConnection.Commit;
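In case it helps the discussion, here is the workaround I am considering: write the new content first, then explicitly truncate the object down to the stream's length afterwards (instead of truncating to 0 before writing, as above). This is an untested sketch; it assumes Truncate() accepts a target byte count, which I have only seen used as Truncate(0) so far:

```
// Untested workaround sketch: overwrite, then truncate to the new size.
pgConnection.StartTransaction;
try
  ms.LoadFromFile(FilePath);
  lobj := TPgLargeObject.Create(pgConnection);
  try
    lobj.Cached := False;
    lobj.OID := OID;         // existing large object
    ms.Position := 0;
    lobj.LoadFromStream(ms);
    lobj.WriteBlob;          // overwrites from offset 0; the old tail may survive
    lobj.Truncate(ms.Size);  // assumption: cuts the stored object to ms.Size bytes
    lobj.CloseObject;
  finally
    lobj.Free;
  end;
  pgConnection.Commit;
except
  pgConnection.Rollback;
  raise;
end;
```

If Truncate has no effect either, calling the server-side lo_truncate() function through a plain SQL query inside the same transaction might work as a last resort.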
I am using PgDAC version 6.3.2 with Delphi 10.3 Update 2.
I need a workaround, as this is for a commercial product.
Thank you