Hi, I use the TSynBigTable class to store a table which has a unique field used as the primary key.
I want to flush every record change to disk.
I use the UpdateToFile(True, false) function, which works fine after inserting new records (with a new key field value). However, something goes wrong after updating existing records (using an already used primary key value).
The main symptom that something has gone wrong is that when I try to reload the saved file I get an error stating 'TFileBufferReader: invalid content'.
But even before reloading the file, if I perform multiple update operations, at a certain point the number of records in the dataset becomes wrong: for instance, if I continually resave records with keys ranging from 1 to 1000, at some point the record count is no longer 1000 but starts growing (1001, 1002, 1005, 1007 or something like that, and so on).
If instead of UpdateToFile I call the Pack() function after every block of inserts/updates, it works like a charm.
Here is the relevant piece of code:
recordId := GetRecordId(AKeyFieldName, keyFieldValue); // get record ID by searching the primary key in the table
rec.Init(FSynTable.Table, recordId);
PopulateFieldsValues();
if recordId > 0 then
begin // key found: update the existing record
  FSynTable.RecordUpdate(rec);
end
else // key not found: add a new record
  FSynTable.RecordAdd(rec);
Is this code correct? Am I doing anything wrong?
Thanks in advance.
Cocce
Do you have any code to reproduce the issue?
Did you retrieve the latest version from http://synopse.info/fossil?
Do you have any code to reproduce the issue?
I need to extract the piece of code... I'll submit it soon.
Did you retrieve the latest version from http://synopse.info/fossil?
Yes, I tried it, but the internal "TestBigTable" function crashes... so I rolled back to the last stable version.
ab wrote:Do you have any code to reproduce the issue?
You can find the example at this link: http://dl.dropbox.com/u/10759134/BigTableTest.7z
If you press the add button several times, you'll see the wrong record count.
ab wrote:Did you retrieve the latest version from http://synopse.info/fossil?
Yes, I tried it, but the internal "TestBigTable" function crashes... so I rolled back to the last stable version.
Did you try the TestBigTable function inside SynBigTable.pas? I use D2007 and it crashes.
Thanks in advance
Cocce
I think there was an issue introduced by
http://synopse.info/fossil/fdiff?v1=cd4 … 9767cfae9e
I'll take a look at this!
Thanks for the report.
Here we are...
I've added a new sbtBeforeWrite step (e.g. to safely update indexes).
See http://synopse.info/fossil/info/cf26b333d0
This fixes the regression issue introduced by http://synopse.info/fossil/info/747cf5317c
Thanks for the report!
Cool, and what about my piece of code?
Thanks
ab wrote:Do you have any code to reproduce the issue?
You can find the example at this link: http://dl.dropbox.com/u/10759134/BigTableTest.7z
If you press the add button several times, you'll see the wrong record count.
I think your code looks correct.
Did you try the attached example?
So, is my code all correct?
Thanks
Last edited by cocce (2012-03-02 11:37:07)
I can't download it now (site blocked by proxy).
I'll check later.
Hi, did you take a look?
Thanks
Hi ab, I'm waiting for your answer...
I'm stuck on this problem; I apologize for my insistence.
Thank you very much!
I already tried the new version, but the problem is still present.
The link is active and correct; otherwise, how can I send you the zip file?
Thanks
I got the files.
There are indeed some problems here.
I'll investigate ASAP, but it may not be finished before the end of next week. Thanks for the report.
Hi ab, have you got any news?
Thanks
Hi, I'm waiting for your answer...
Thanks
Hi ab, I discovered your lib a while ago and started to use it for a project. I can confirm that there is indeed an issue with updating records.
I use the TSynBigTableRecord class and the UpdateToFile() method to write the database to disk. When I do a RecordAdd(..) everything's fine, but if I use RecordUpdate(..) I'm no longer able to retrieve, update, or even delete that record. I can still list it with GetAllIDs, but when I search for it, it's as if its ID is no longer linked to the actual data...
Do you have a fix for this bug yet? That kind of data corruption won't be acceptable once the database grows out of the testing phase!
Thank you!
Hi, is there any news about this bug? I'm blocked... please give me an answer!
I was able to reproduce the issue (thanks to your files), but did not succeed in finding out the cause.
I suspect it is a weird corner-case side effect of the current implementation.
But I was not able to find out where it comes from.
I'll retry today.
Thank you very much.
Did you discover the cause?
No, sorry, I'm still stuck on this issue.
May I help you in some way?
@mpv
I sent you an email!
@cocce
No e-mail from you
I'm not able to send you an email...
Try to download this
https://dl.dropbox.com/u/10759134/BigTableTest.zip
Thanks in advance
Cocce
Hello ab,
sorry to open this issue again, but TSynBigTableRecord has some issues with saving updated tables that have indexed fields.
Consider the following:
1. create a TSynBigTableRecord with at least 1 indexed field
2. add 10 records
3. update any record (the first one, for example)
4. destroy the TSynBigTableRecord
5. try to load the TSynBigTableRecord (this will fail)
If I Pack the TSynBigTableRecord before destruction, everything is OK.
Cause:
When we modify a record, TSynBigTableRecord adds a hidden record and updates the OrderedIndex array.
When the TSynBigTableRecord is saved to file, OrderedIndex still contains the new physical index {11,1,2,3,4,5,6,7,8,9,10}.
On the next load of the TSynBigTableRecord from file, TSynTableFieldProperties.CreateFrom will read the OrderedIndex array with the new physical index {11,1,2,3,4,5,6,7,8,9,10},
but will fail to create OrderedIndexReverse (actually on the sanity check "if OrderedIndex[ i ]>=OrderedIndexCount then RD.ErrorInvalidContent").
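The failing check can be illustrated standalone. This is only a sketch with plain arrays, not the actual TSynTableFieldProperties internals; the values mirror the {11,1,...,10} example above:

```pascal
program IndexCheckSketch;
// Illustration only: mimics the load-time sanity check described above.
// After one update of a 10-record table, the saved OrderedIndex can hold
// a physical index (11) that fails the >= OrderedIndexCount check applied
// while rebuilding OrderedIndexReverse, so loading aborts.
var
  OrderedIndex: array[0..10] of Integer;
  OrderedIndexCount, i: Integer;
begin
  OrderedIndex[0] := 11; // the hidden/alias record appended by the update
  for i := 1 to 10 do
    OrderedIndex[i] := i;
  OrderedIndexCount := 11;
  for i := 0 to High(OrderedIndex) do
    if OrderedIndex[i] >= OrderedIndexCount then
      Writeln('ErrorInvalidContent for OrderedIndex[', i, ']=', OrderedIndex[i]);
end.
```

Packing before destruction rewrites the physical indexes into a contiguous range, which is presumably why the check then passes on reload.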
Fix: either
Pack the TSynBigTableRecord before destruction (it would be nice to add this to TSynBigTableRecord.Destroy so we can forget about it;
if a TSynBigTableRecord has indexed fields and has been updated, the table is unusable anyway),
or
take some speed penalty and ditch the OrderedIndexReverse array for reverse lookup:
make the reverse-lookup procedure more complicated, so it can handle e.g. {11,1,2,3,4,5,6,7,8,9,10}.
The current implementation cannot handle OrderedIndexReverse[ 11 ].
Maybe use TDynArray.IndexOf on OrderedIndex to find indexes.
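Until the library handles this itself, the first workaround can be wrapped in a small helper. SafeFreeBigTable below is a hypothetical name, not part of SynBigTable.pas; it simply forces a Pack before Destroy:

```pascal
// Hypothetical helper (not part of SynBigTable.pas): always Pack an
// updated TSynBigTableRecord before freeing it, so the file on disk is
// rewritten with an index layout the next load will accept.
procedure SafeFreeBigTable(var aTable: TSynBigTableRecord);
begin
  if aTable <> nil then
  begin
    aTable.Pack;        // rewrites the file, normalizing OrderedIndex
    FreeAndNil(aTable); // Destroy then has nothing inconsistent to save
  end;
end;
```

Replacing the FreeAndNil(Tbl) calls in test code with SafeFreeBigTable(Tbl) would make the load-after-update sequence succeed, at the cost of a full rewrite on every close.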
In procedure TSynBigTable.Pack, the test
if (self=nil) or ((fDeletedCount=0) and (fAliasCount=0)) or (fCount=0) or
   (fFile=0) then
should be:
if (self=nil) or ((fDeletedCount=0) and (fAliasCount=0) and (fInMemoryCount=0)) or (fCount=0) or
   (fFile=0) then
because if you try to pack a table with no changed records, TSynTable.GetData will raise an exception at
if PByte(result)^<=$7f then
Does http://synopse.info/fossil/info/f402c03383 work now?
I have added the corresponding regression tests, which seem to fix the issue.
Thanks for the quick update. At first I thought the issue was fixed, but then I tried this:
1. Create TSynBigTableRecord
2. Add 65537+ records
3. Destroy TSynBigTableRecord
4. Open TSynBigTableRecord
5. Modify first and last record
6. Destroy TSynBigTableRecord without packing
7. Open TSynBigTableRecord
The same ErrorInvalidContent message appears.
With Pack() before Destroy, everything is fine.
Here is the code I tested:
procedure TForm1.BtnABClick(Sender: TObject);
var
  Tbl: TSynBigTableRecord;
  rec: TSynTableData;
  Fld1, Fld2: TSynTableFieldProperties;
  aID, i, l, j: integer;
begin
  l := 65537;
  if FileExists('Test.big') then
    if not DeleteFile('Test.big') then
      exit;
  // CREATE TABLE
  Tbl := TSynBigTableRecord.Create('Test.big','Test1');
  try
    if not Tbl.AddField('Int',tftInt32,[tfoIndex,tfoUnique]) then
      exit;
    if not Tbl.AddField('BigInt',tftInt64,[tfoIndex]) then
      exit;
    Tbl.AddFieldUpdate;
    // ADD RECORDS
    Fld1 := Tbl.Table['Int'];
    Fld2 := Tbl.Table['BigInt'];
    for i := 1 to l do
    begin
      rec.Init(Tbl.Table);
      rec.SetFieldValue(Fld1,i);
      rec.SetFieldValue(Fld2,i);
      aID := Tbl.RecordAdd(rec);
      if aID=0 then
      begin
        ShowMessage('Error adding record');
        exit;
      end;
    end;
  finally
    FreeAndNil(Tbl);
  end;
  // MODIFY DATA
  Tbl := TSynBigTableRecord.Create('Test.big','Test1');
  try
    Fld2 := Tbl.Table['BigInt'];
    rec := Tbl.RecordGet(1);
    if rec.ID > 0 then
    begin
      j := rec.GetFieldValue(Fld2);
      rec.SetFieldValue(Fld2, j+1);
      Tbl.RecordUpdate(rec);
    end;
    rec := Tbl.RecordGet(Tbl.Count);
    if rec.ID > 0 then
    begin
      j := rec.GetFieldValue(Fld2);
      rec.SetFieldValue(Fld2, j+1);
      Tbl.RecordUpdate(rec);
    end;
    //Tbl.Pack();
  finally
    FreeAndNil(Tbl);
  end;
  Tbl := TSynBigTableRecord.Create('Test.big','Test1');
  try
    ShowMessage('Voila'); // never shown
  finally
    FreeAndNil(Tbl);
  end;
end;