TSQLProduct = class(TSQLRecord)
private
  ...
  fPreview: TByteDynArray;
published
  ...
  property preview: TByteDynArray read fPreview write fPreview;
procedure FillPreview;
var
  prd, p: TSQLProduct;
  ids: TIntegerDynArray;
  picBin: TByteDynArray;
  ms: TMemoryStream;
begin
  p := TSQLProduct.CreateAndFillPrepare(Database, ''); // retrieve all rows
  try
    if Database.TransactionBegin(TSQLProduct) then
    try
      Database.BatchStart(TSQLProduct);
      while p.FillOne do
      begin
        if p.preview <> nil then
          Continue; // preview already filled
        // True = lock the record for update
        prd := TSQLProduct.Create(Database, p.ID, True);
        try
          ms := TMemoryStream.Create;
          try
            ms.LoadFromFile(pathDir + p.fPath);
            SetLength(picBin, ms.Size);
            if ms.Size > 0 then // guard Move against an empty file
              Move(ms.Memory^, picBin[0], ms.Size);
            prd.preview := picBin;
          finally
            ms.Free;
          end;
          Database.BatchUpdate(prd); // queue the update
        finally
          FreeAndNil(prd);
        end;
      end;
      Database.BatchSend(ids); // send all queued updates at once
      Database.Commit;
    except
      Database.RollBack;
    end;
  finally
    FreeAndNil(p);
  end;
end;
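Note that each `TSQLProduct.Create(Database, p.ID, True)` call pins a lock on the record, so a long loop can exhaust the per-table lock list. One way to avoid that (a sketch, assuming the REST instance exposes an `UnLock` method as mORMot's `TSQLRest` does) is to release each lock once the update has been queued:

```pascal
// inside the while loop, after queuing the update:
Database.BatchUpdate(prd);
Database.UnLock(prd); // release the lock taken by Create(..., True)
```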
The directory contains about 3000 JPG files, but only 512 records were updated. I found the MAX_SQLLOCKS constant in the source, changed it to 1024, re-ran the test, and 1024 records were updated:
/// maximum number of the locked record in a Table (used in TSQLLocks)
// - code is somewhat faster and easier with a fixed cache size
// - 512 seems big enough on practice
MAX_SQLLOCKS = 1024;
How can I update all records regardless of the MAX_SQLLOCKS constant?
Last edited by noobies (2014-04-09 14:44:23)
As a workaround (if your business logic allows it), commit every 512 records. In any case, this is faster than one long transaction.
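Applied to the loop above, that suggestion could look like this (a sketch; the counter variable `i` is added for illustration, and the batch/transaction is simply restarted after each partial commit):

```pascal
i := 0;
while p.FillOne do
begin
  // ... load the picture and call Database.BatchUpdate(prd) as before ...
  Inc(i);
  if i mod 512 = 0 then
  begin
    Database.BatchSend(ids);   // flush the queued updates
    Database.Commit;           // releases the locks taken so far
    Database.TransactionBegin(TSQLProduct);
    Database.BatchStart(TSQLProduct);
  end;
end;
```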
I tried that, but it had no effect.
We have just removed the MAX_SQLLOCKS constant nonsense (replaced by two dynamic arrays).
It should now work as you expected. :)
Thanks, it works pretty well now.