Hi Arnaud,
I'm back at work today and I've reviewed your changes.
It works fine now!
I think that the bug I reported is now fixed.
Sorry for posting such long source code here in the forum; next time I will share it in a different way.
Thank you very much for this bug fix!
Best regards from Germany
Hi Arnaud,
we identified a breaking change between v.2.2.6584 (works) and the current v.2.2.8711 (broken).
The following example reproduces it:
program Project1;

{$APPTYPE CONSOLE}
{$R *.res}

uses
  System.SysUtils,
  mormot.core.base,
  mormot.core.collections;

type
  TMyValue = record
    Field01: TGUID;
    Field02: TGUID;
    Field03: TGUID;
  end;

  TCachedValues<TIdx, TRec> = class
  public type
    TRtcKey = record
      CustomId: Word;
      Key     : TIdx;
    end;
    TRtcValue = record
      CustomId: Integer;
      Rec     : TRec;
    end;
  private
    FDict: IKeyValue<TRtcKey,TRtcValue>;
    function HashKey(const AKey): Cardinal;
    function CompareKey(const A, B): Integer;
  public
    constructor Create;
    function TryGet(const AKey: TIdx; out Rec: TRec): Boolean;
  end;

{ ======================================================================= }
{ TCachedValues<TIdx, TRec> }
{ ======================================================================= }

constructor TCachedValues<TIdx, TRec>.Create;
begin
  FDict:=Collections.NewPlainKeyValue<TRtcKey,TRtcValue>;
  // FDict.Data.Keys.EventCompare:=CompareKey;
  // FDict.Data.Keys.EventHash:=HashKey;
end;

{ ----------------------------------------------------------------------- }

function TCachedValues<TIdx, TRec>.HashKey(const AKey): Cardinal;
begin
  Result:=DefaultHasher(0,@AKey,SizeOf(TRtcKey));
end;

{ ----------------------------------------------------------------------- }

function TCachedValues<TIdx, TRec>.CompareKey(const A, B): Integer;
begin
  Result:=MemCmp(@A,@B,SizeOf(TRtcKey));
end;

{ ----------------------------------------------------------------------- }

function TCachedValues<TIdx, TRec>.TryGet(const AKey: TIdx; out Rec: TRec): Boolean;
var
  Key: TRtcKey;
  Res: TRtcValue;
begin
  Key:=Default(TRtcKey);
  Key.Key:=AKey;
  Result:=FDict.TryGetValue(Key,Res);
  if Result
    then Rec:=Res.Rec;
end;

{ ======================================================================= }

var
  CachedValues: TCachedValues<TGUID,TMyValue>;
  OneValue    : TMyValue;

begin
  try
    CachedValues:=TCachedValues<TGUID,TMyValue>.Create;
    try
      if not CachedValues.TryGet(TGUID.NewGuid,OneValue)
        then raise Exception.Create('Not found');
    finally
      CachedValues.Free;
    end;
  except
    on E: Exception do
      Writeln(E.ClassName, ': ', E.Message);
  end;
  readln;
end.
This example raises the expected exception "Not found" in v.2.2.6584, but an "EDynArray: TDynArrayHasher.FindOrNew fatal collision: aHashCode=00000000..." in the current version.
We investigated the cause and found a solution: we have to define an EventCompare and EventHash (see the commented lines in TCachedValues<TIdx, TRec>.Create). But we are also not sure which way we should go in the future, and whether this is a bug or not.
Currently we prefer to stay on the stable v.2.2.6584.
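For reference, the workaround on our side is simply to enable the two commented assignments in the constructor. A sketch:

```pascal
constructor TCachedValues<TIdx, TRec>.Create;
begin
  FDict:=Collections.NewPlainKeyValue<TRtcKey,TRtcValue>;
  // workaround: with our own callbacks assigned, the lookup works again
  FDict.Data.Keys.EventCompare:=CompareKey;
  FDict.Data.Keys.EventHash:=HashKey;
end;
```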
Thank you in advance for your clarification.
Best regards
This works for us.
Thank you so much for your consistently fast and reliable support!
Notice:
Previously the check was "if fHasher<>nil", and it was changed to "if fHasher = nil" in
commit 51627ba2a2d2b70c9604115cf36cd367c59e1b29.
Maybe it was correct previously?
Hi ab!
We recently upgraded from mORMot 2.0.5027 to the latest stable 2.2.6584. Since then, we've encountered an exception in some parts of our code: "x.AddSorted() is not allowed with loCreateUniqueIndex: use Add()". This occurs when using generic lists with AddSorted and a custom compare function.
Here is an example of how to reproduce the exception:
program Project1;

{$APPTYPE CONSOLE}
{$R *.res}

uses
  System.SysUtils,
  System.Math,
  mormot.core.collections;

type
  TTestRecord = record
    Test: Integer;
  end;

function SomeComparer(const Item1, Item2): Integer;
begin
  Result:=CompareValue(TTestRecord(Item1).Test,TTestRecord(Item2).Test);
end;

var
  ListUnderTest: IList<TTestRecord>;

begin
  try
    ListUnderTest:=Collections.NewPlainList<TTestRecord>([]);
    ListUnderTest.Comparer:=SomeComparer;
    var RecA:=Default(TTestRecord);
    RecA.Test:=321;
    var RecB:=Default(TTestRecord);
    RecB.Test:=123;
    ListUnderTest.AddSorted(RecA);
    ListUnderTest.AddSorted(RecB);
    for var LoopRec in ListUnderTest do
    begin
      Writeln(LoopRec.Test);
    end;
  except
    on E: Exception do
      Writeln(E.ClassName, ': ', E.Message);
  end;
  readln;
end.
The output of this console example is:
EIList: TIList<Project1.TTestRecord>.AddSorted() is not allowed with loCreateUniqueIndex: use Add()
We analyzed the cause and believe the check in mormot.core.collections (specifically in TIListParent.DoAddSorted) is incomplete:
function TIListParent.DoAddSorted(const value; wasadded: PBoolean): integer;
begin
  if fHasher = nil then // << this check doesn't consider the compare function
    raise EIList.CreateUtf8('%.AddSorted() is not allowed with ' +
      'loCreateUniqueIndex: use Add()', [self]);
  result := fDynArray.FastLocateOrAddSorted(value, wasadded);
end;

Instead, the check should be:

function TIListParent.DoAddSorted(const value; wasadded: PBoolean): integer;
begin
  if (fHasher = nil) and (@fDynArray.compare = nil) then // << also consider the compare function
    raise EIList.CreateUtf8('%.AddSorted() is not allowed with ' +
      'loCreateUniqueIndex: use Add()', [self]);
  result := fDynArray.FastLocateOrAddSorted(value, wasadded);
end;
Many thanks for your time and consideration!
Best regards from Germany
Hi ab,
I had no intention of attacking you! I just wanted to share the results.
Note that iterating over a dictionary is not what a dictionary is meant for, so it seems a bit of a weird test.
Iterating over all keys of a dictionary to test the speed of TryGetValue is not a weird test to me. Here it is obvious that TryGetValue of the Spring dictionaries is over 90% faster (286 msec. vs. 557 msec.) in this case.
You also need to compare the memory consumption and the executable size, to be fair.
I know that the DCUs are much smaller when we use generic lists/dicts with the mormot collections. This is a big plus! But in these benchmarks we want to see other metrics; this has nothing to do with "fair", it's simply a comparison.
Hi ab,
we have done some benchmarking here, and I was surprised that the Spring v2 collections were faster than the mormot v2 collections in almost every aspect. So I want to share the benchmark code here, so that you can make your mormot even faster. We used Delphi 11.3.
Here is the code:
program SpringVsMormotCollections;

{$APPTYPE CONSOLE}
{$R *.res}

uses
  System.SysUtils,
  System.Diagnostics,
  mormot.core.collections,
  Spring,
  Spring.Collections;

{ ----------------------------------------------------------------------- }

type
  TBenchmarks = class
  private
    function CreateTestListString(ACount: Integer; const APrefix: String): TArray<String>;
    procedure RunBenchmark(const AProc: TProc; ABenchmarkRounds: Integer);
    procedure ListBenchmark;
    procedure DictBenchmark;
  public
    procedure Run;
  end;

{ ======================================================================= }
{ TBenchmarks }
{ ======================================================================= }

function TBenchmarks.CreateTestListString(ACount: Integer; const APrefix: String): TArray<String>;
begin
  SetLength(Result, ACount);
  for var Loop:=0 to ACount-1
    do Result[Loop]:=APrefix+' #'+IntToStr(Loop+1);
end;

{ ----------------------------------------------------------------------- }

procedure TBenchmarks.DictBenchmark;
const
  BenchRounds = 100;
begin
  var TestKeys:=CreateTestListString(100000,'Key');
  var TestValues:=CreateTestListString(100000,'Value');
  var MormotDict:=Collections.NewKeyValue<String,String>;
  var SpringDict:=TCollections.CreateDictionary<String,String>;
  Writeln(Format('Populate MormotDict with %d items...',[Length(TestKeys)]));
  RunBenchmark(
    procedure
    begin
      MormotDict.Clear;
      for var Loop:=0 to Length(TestKeys)-1
        do MormotDict.Add(TestKeys[Loop],TestValues[Loop]);
    end,BenchRounds);
  Writeln(Format('Populate SpringDict with %d items...',[Length(TestKeys)]));
  RunBenchmark(
    procedure
    begin
      SpringDict.Clear;
      for var Loop:=0 to Length(TestKeys)-1
        do SpringDict.Add(TestKeys[Loop],TestValues[Loop]);
    end,BenchRounds);
  Writeln('Compare MormotDict with SpringDict...');
  RunBenchmark(
    procedure
    var
      CurKey        : String;
      CurMormotValue: String;
      CurSpringValue: String;
    begin
      for var Loop:=0 to Length(TestKeys)-1 do
      begin
        CurKey:=TestKeys[Loop];
        if not
          (
            MormotDict.TryGetValue(CurKey,CurMormotValue) and
            SpringDict.TryGetValue(CurKey,CurSpringValue) and
            (CurMormotValue=CurSpringValue)
          )
          then raise Exception.Create('Different content!');
      end;
    end,BenchRounds);
  Writeln('Iterate MormotDict by key (TryGetValue)...');
  RunBenchmark(
    procedure
    var
      CurKey  : String;
      CurValue: String;
    begin
      for var Loop:=0 to Length(TestKeys)-1 do
      begin
        CurKey:=TestKeys[Loop];
        if not MormotDict.TryGetValue(CurKey,CurValue)
          then raise Exception.Create('Something is wrong!');
      end;
    end,BenchRounds);
  Writeln('Iterate SpringDict by key (TryGetValue)...');
  RunBenchmark(
    procedure
    var
      CurKey  : String;
      CurValue: String;
    begin
      for var Loop:=0 to Length(TestKeys)-1 do
      begin
        CurKey:=TestKeys[Loop];
        if not SpringDict.TryGetValue(CurKey,CurValue)
          then raise Exception.Create('Something is wrong!');
      end;
    end,BenchRounds);
  Writeln('Iterate (for..in) MormotDict ...');
  RunBenchmark(
    procedure
    var
      CurKey  : String;
      CurValue: String;
    begin
      for var Entry in MormotDict do
      begin
        CurKey:=Entry.Key;
        CurValue:=Entry.Value;
      end;
    end,BenchRounds);
  Writeln('Iterate (for..in) SpringDict ...');
  RunBenchmark(
    procedure
    var
      CurKey  : String;
      CurValue: String;
    begin
      for var Entry in SpringDict do
      begin
        CurKey:=Entry.Key;
        CurValue:=Entry.Value;
      end;
    end,BenchRounds);
end;

{ ----------------------------------------------------------------------- }

procedure TBenchmarks.ListBenchmark;
const
  BenchRounds = 100;
begin
  var TestData:=CreateTestListString(100000,'Value');
  var MormotList:=Collections.NewList<String>;
  var SpringList:=TCollections.CreateList<String>;
  Writeln(Format('Populate MormotList with %d items...',[Length(TestData)]));
  RunBenchmark(
    procedure
    begin
      MormotList.Clear;
      for var Entry in TestData do
      begin
        MormotList.Add(Entry);
      end;
    end,BenchRounds);
  Writeln(Format('Populate SpringList with %d items...',[Length(TestData)]));
  RunBenchmark(
    procedure
    begin
      SpringList.Clear;
      for var Entry in TestData do
      begin
        SpringList.Add(Entry);
      end;
    end,BenchRounds);
  Writeln('Compare MormotList with SpringList...');
  RunBenchmark(
    procedure
    begin
      for var Loop:=0 to SpringList.Count-1 do
      begin
        if MormotList[Loop]<>SpringList[Loop]
          then raise Exception.Create('Different content!');
      end;
    end,BenchRounds);
  Writeln('Iterate (for..in) through MormotList...');
  RunBenchmark(
    procedure
    begin
      var LastEntry: String;
      for var Entry in MormotList do
      begin
        LastEntry:=Entry;
      end;
    end,BenchRounds);
  Writeln('Iterate (for..in) through SpringList...');
  RunBenchmark(
    procedure
    begin
      var LastEntry: String;
      for var Entry in SpringList do
      begin
        LastEntry:=Entry;
      end;
    end,BenchRounds);
  Writeln('Iterate (for..to) through MormotList...');
  RunBenchmark(
    procedure
    begin
      var LastEntry: String;
      for var Loop:=0 to MormotList.Count-1 do
      begin
        LastEntry:=MormotList[Loop];
      end;
    end,BenchRounds);
  Writeln('Iterate (for..to) through SpringList...');
  RunBenchmark(
    procedure
    begin
      var LastEntry: String;
      for var Loop:=0 to SpringList.Count-1 do
      begin
        LastEntry:=SpringList[Loop];
      end;
    end,BenchRounds);
end;

{ ----------------------------------------------------------------------- }

procedure TBenchmarks.RunBenchmark(const AProc: TProc; ABenchmarkRounds: Integer);
begin
  var Stopper:=TStopwatch.StartNew;
  for var Loop:=1 to ABenchmarkRounds do
  begin
    AProc;
  end;
  Stopper.Stop;
  Writeln(Format('- (%d runs in %d msec.)',[ABenchmarkRounds,Stopper.ElapsedMilliseconds]));
  Writeln;
end;

{ ----------------------------------------------------------------------- }

procedure TBenchmarks.Run;
begin
  Writeln('---ListBenchmark---');
  ListBenchmark;
  Writeln('---DictBenchmark---');
  DictBenchmark;
end;

{ ======================================================================= }

begin
  try
    var Benchmarks:=TBenchmarks.Create;
    try
      Benchmarks.Run;
    finally
      Benchmarks.Free;
    end;
  except
    on E: Exception do
      Writeln(E.ClassName, ': ', E.Message);
  end;
  Writeln('Done. Press Enter for exit.');
  Readln;
end.
And here are some results:
---ListBenchmark---
Populate MormotList with 100000 items...
- (100 runs in 260 msec.)
Populate SpringList with 100000 items...
- (100 runs in 228 msec.)
Compare MormotList with SpringList...
- (100 runs in 212 msec.)
Iterate (for..in) through MormotList...
- (100 runs in 353 msec.)
Iterate (for..in) through SpringList...
- (100 runs in 192 msec.)
Iterate (for..to) through MormotList...
- (100 runs in 100 msec.)
Iterate (for..to) through SpringList...
- (100 runs in 102 msec.)
---DictBenchmark---
Populate MormotDict with 100000 items...
- (100 runs in 1198 msec.)
Populate SpringDict with 100000 items...
- (100 runs in 926 msec.)
Compare MormotDict with SpringDict...
- (100 runs in 846 msec.)
Iterate MormotDict by key (TryGetValue)...
- (100 runs in 557 msec.)
Iterate SpringDict by key (TryGetValue)...
- (100 runs in 286 msec.)
Iterate (for..in) MormotDict ...
- (100 runs in 641 msec.)
Iterate (for..in) SpringDict ...
- (100 runs in 372 msec.)
I have just updated to f53ad31ef60a34e9, and also the sqlite obj files, and hope that this issue will be gone.
I know that it's not the proper way to use pending code, but I have no other way out... you remember the problem with Currency properties.
Your Virtual Tables solution sounds really cool; I'll explore it, maybe I'll need it in the near future.
Again, thank you VERY MUCH!
Hi Arnaud,
this morning I received a bug report (captured by madExcept) from one user of Lauge. The worst part is that I can't figure out what happened, and I also can't reproduce it.
Here is the call stack:
007330f0 +000 Lauge.exe SynCommons 14366 +0 SortDynArrayAnsiString
00733873 +013 Lauge.exe SynCommons 15157 +2 TDynArray.Equals
00733f2f +04b Lauge.exe SynCommons 15583 +8 TDynArrayHashed.HashFind
00733c04 +01c Lauge.exe SynCommons 15414 +2 TDynArrayHashed.FindHashedForAdding
007497d0 +018 Lauge.exe SQLite3 3322 +1 TSQLRestServerDB.PrepareStatement
007499e7 +1cf Lauge.exe SQLite3 3380 +42 TSQLRestServerDB.GetAndPrepareStatement
0074a4e4 +068 Lauge.exe SQLite3 3640 +8 TSQLRestServerDB.EngineList
007437cb +38b Lauge.exe SQLite3Commons 11699 +79 TSQLRestServer.URI
0074b06b +073 Lauge.exe SQLite3 3957 +4 TSQLRestClientDB.URI
007415ef +047 Lauge.exe SQLite3Commons 10643 +3 TSQLRestClientURI.ExecuteList
00741fd7 +017 Lauge.exe SQLite3Commons 10890 +1 TSQLRestClientURI.InternalListJSON
00740dfb +06f Lauge.exe SQLite3Commons 10279 +3 TSQLRest.Retrieve
0073e98e +056 Lauge.exe SQLite3Commons 8784 +3 TSQLRecord.Create
007e3275 +075 Lauge.exe LaugeCDB 543 +4 TCategoriesDB.GetCategoryFromDB
...
and here is my usage counterpart:
function TCategoriesDB.GetCategoryFromDB(CategoryID:Int64;
  AdditionalWhereCondition:String):TSQLRecordCategory;
begin
  if AdditionalWhereCondition <> '' then
    AdditionalWhereCondition:=' AND ' + AdditionalWhereCondition;
  Result:=TSQLRecordCategory.Create(GLF.LDB.RestClient,
    '(SiteID = :(%):) AND (CategoryID = :(%):)%',
    [SiteID, CategoryID, AdditionalWhereCondition]);
  if Result.ID = 0 then
    FreeAndNil(Result);
end;
I use your whole repository checkout cd47cf4439b92723 with the sqlite3.obj (3.7.5) from your v.1.12.
Maybe this is a known issue and you have already fixed it?
BTW: I think we need a code freeze (new release), so that all users of your SQLite3 framework use the same source.
Best regards!
Hi Arnaud,
I found a small issue in TSQLRestServerStats.DebugMessage:
resourcestring
  sRestServerStatsDebugMessage =
    'Clients now=%d max=%d'#13+
    'Data in=%s out=%s'#13+
    'Requests=%d Updates=%d Invalid=%d'
    {$ifdef MSWINDOWS} // NOTE: this conditional should also be WITHSTATPROCESS
    +#13'Time spent in requests = %s'
    {$endif};

function TSQLRestServerStats.DebugMessage: string;
begin
  result := format(sRestServerStatsDebugMessage,
    [ClientsCurrent,ClientsMax,kb(IncomingBytes),kb(OutcomingBytes),
     Responses,Modified,Invalid{$ifdef WITHSTATPROCESS},ProcessTimeString{$endif}]);
end;
The Format call fails if WITHSTATPROCESS is not defined, because the resource string then doesn't match the parameters.
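A sketch of the consistent fix, as suggested in the NOTE above: use the same conditional for the format string as for the argument list (assuming WITHSTATPROCESS is the intended switch):

```pascal
resourcestring
  sRestServerStatsDebugMessage =
    'Clients now=%d max=%d'#13+
    'Data in=%s out=%s'#13+
    'Requests=%d Updates=%d Invalid=%d'
    {$ifdef WITHSTATPROCESS} // now matches the argument list of DebugMessage
    +#13'Time spent in requests = %s'
    {$endif};
```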
Best regards!
Thank you for the fast bug fix!
I'm looking forward to the new 1.13
P.S.: A couple of weeks ago I sent you a personal message through this forum. Did you receive it?
Hi Arnaud,
today I noticed that published Currency properties of TSQLRecord descendants aren't serialized properly in TSQLRecord.GetJSONValues:
procedure TSQLRecord.GetJSONValues(W: TJSONWriter);
begin
  ...
  sftCurrency:
    W.AddCurr64(Fields[i]^.GetInt64Value(self)); // NOTE: take a look at GetInt64Value...
  ...
end;

function TPropInfo.GetInt64Value(Instance: TObject): Int64;
begin
  if (Instance<>nil) and (@self<>nil) then
    // NOTE: the Kind of a Currency property is tkFloat, but there is no handling for it here!
    case PropType^^.Kind of
      tkInteger,tkEnumeration,tkSet,{$ifdef FPC}tkBool,{$endif}tkClass:
        result := GetOrdProp(Instance,pointer(@self));
      tkInt64:
        result := GetInt64Prop(Instance,pointer(@self));
      else result := 0;
    end else
    result := 0;
end;
With my quick fix, it works well again:
procedure TSQLRecord.GetJSONValues(W: TJSONWriter);
begin
  ...
  sftCurrency:
    W.AddCurr64(GetInt64Prop(self, Fields[i]));
  ...
end;
If you need more info, let me know and I'll try to create a small test case...
Best regards!
Wow, it sounds very interesting!
I just pulled all files from trunk and saw that you have already fixed this case in AfterDeleteForceCoherency.
It's time for 1.13
Ok, here are some odd things:
TSQLRestServerDB.EngineExecuteAll never calls TSQLRestServerDB.GetAndPrepareStatement.
But TSQLRestServer.AfterDeleteForceCoherency can build a batch of prepared UPDATE queries and pass them to TSQLRestServerDB.EngineExecuteAll.
I think that's it, and I hope you can fix it.
Hi Arnaud,
I think I found a bug when prepared queries are used. Today I noticed some entries like this in my error log:
2011-04-04 15:24:45 TSQLRestServerDB.EngineExecuteAll Error: unrecognized token: ":" UPDATE ItemFolderProperty SET ItemFolderID=0 WHERE ItemFolderID=:(8):;
This query was generated in the following stack:
SQLite3.sqlite3_check(38752936,1)
SQLite3.TSQLRequest.Prepare(???,???)
SQLite3.TSQLRequest.ExecuteAll(???,???)
SQLite3.TSQLDatabase.ExecuteAll('UPDATE ItemFolderProperty SET ItemFolderID=0 WHERE ItemFolderID=:(8):; ')
SQLite3.TSQLRestServerDB.EngineExecuteAll('UPDATE ItemFolderProperty SET ItemFolderID=0 WHERE ItemFolderID=:(8):; ')
SQLite3Commons.TSQLRestServer.AfterDeleteForceCoherency(TSQLRecordItemFolder,8)
SQLite3Commons.TSQLRestServer.URI('root/ItemFolder/8','DELETE','','','',$892920)
SQLite3.TSQLRestClientDB.URI('root/ItemFolder/8','DELETE',???,???,$18FC98 {''})
SQLite3Commons.TSQLRestClientURI.Delete(TSQLRecordItemFolder,8)
LaugeIDB.TItemsDB.DeleteFolder($25CB670)
My DeleteFolder method is very simple and looks like this:
procedure TItemsDB.DeleteFolder(Folder:TSQLRecordItemFolder);
begin
  GLF.LDB.RestClient.Delete(TSQLRecordItemFolder, Folder.ID);
  DoFolderDeleted(Folder);
end;
It seems that the replacement of ':(' and '):' with '?' failed, or that the cache (your pool of prepared statements) became invalid.
I'll try to localize the bug and post more details here.
I use your official 1.12 version.
The collaboration begins...
I just installed it to have the new method TSQLRestServer.CreateSQLMultiIndex.
This and all the other code I use works very well!
You are a real Open Source Fighter!
Thank you! (I think three exclamation marks are enough )
Oh, ok, thank you for the clarification!
These are things which I have to learn step by step. I saw the ancestor TSQLRestClientURI of TSQLRestClientDB and thought that this is an additional client/server concept.
Thank you for the fast answer!
Hi ab!
I was busy with other stuff in the past weeks, and today I can continue working on Lauge, so I upgraded to the new release 1.11. I noticed the changes to the pivot solution and was on the way to complete all related adjustments in my code... but then I ran into an impasse.
I don't need an additional client/server approach; instead I work directly with a TSQLRestServerDB instance. Furthermore, I don't need (at least not in Lauge) any DB independence.
The new methods DestGetJoined/DestGetJoinedTable of TSQLRecordMany require a TSQLRestClient as client, while the older DestGet method (for example) goes the flexible way and accepts any descendant of TSQLRest. There I was able to pass my TSQLRestServerDB instance...
Of course I saw the reason, you use TSQLRestClient.ExecuteList... but wouldn't the same be possible with TSQLRest.InternalListJSON?
Or is the usage of a TSQLRestClient descendant a must?
I hope not...
It sounds to me as if you are offended by my post?
Many apologies if that's so!
I hope for a good further collaboration.
Very cool methods, which allow fast one-liners, but they can also be evil, because they rely on the order and count of the published properties.
Imagine you write a part of an application today with this construct in several places, and in a couple of months you extend your TSQLRecord descendant: the code still compiles successfully, but fails silently (without an exception) at runtime. Even worse would be a change (possibly a mistake) in the properties; in such a case it would happily insert wrong data. This can be very frustrating.
These are just my worries.
But why not replace "aSimpleFields: array of const" with "aFieldsJSON: RawUTF8"?
An example call would look like this:
ItemID1 := RestServer.Add(TSQLRecordItem, '{"OriginID":"123","Title":"FirstItem"}');
Advantages:
* No need to list all values
* Independent of the order
* The field names can be validated and an exception thrown if anything is wrong
Disadvantages:
* Still no compiler checks
* It's just a string
In Delphi I see no perfect solution for this kind of concept.
I'm not sure whether this simple test project is related to the examples, but here it is, as promised:
program CollateFunctionIssue;

{$APPTYPE CONSOLE}

uses
  SysUtils,
  SynCommons,
  SQLite3Commons,
  SQLite3,
  DateUtils;

type
  TSQLRecordItem = class(TSQLRecord)
  private
    FItemID:RawUTF8;
    FTitle:RawUTF8;
    FStartTime:TDateTime;
    FEndTime:TDateTime;
  published
    property OriginItemID:RawUTF8 read FItemID write FItemID stored FALSE;
    property StartTime:TDateTime read FStartTime write FStartTime;
    property EndTime:TDateTime read FEndTime write FEndTime;
    property Title:RawUTF8 read FTitle write FTitle;
  end;

  TSQLRecordItemFolder = class(TSQLRecord); // Isn't really needed for this example, just for ref

  TSQLRecordItemFolderEntry = class(TSQLRecord)
  private
    FItemFolderID:TSQLRecordItemFolder;
    FItemID:TSQLRecordItem;
  published
    property ItemFolderID:TSQLRecordItemFolder read FItemFolderID write FItemFolderID;
    property ItemID:TSQLRecordItem read FItemID write FItemID;
  end;

var
  RestServer:TSQLRestServerDB;

function AddItem(OriginItemID, Title:String; StartTime:TDateTime):Integer;
var
  Item:TSQLRecordItem;
begin
  Item:=TSQLRecordItem.Create;
  try
    Item.OriginItemID:=StringToUTF8(OriginItemID);
    Item.Title:=StringToUTF8(Title);
    Item.StartTime:=StartTime;
    Result:=RestServer.Add(Item, TRUE);
  finally
    Item.Free;
  end;
end;

procedure AddItemToFolder(ItemID, FolderID:Integer);
var
  FolderEntry:TSQLRecordItemFolderEntry;
begin
  FolderEntry:=TSQLRecordItemFolderEntry.Create;
  try
    FolderEntry.ItemFolderID:=Pointer(FolderID);
    FolderEntry.ItemID:=Pointer(ItemID);
    RestServer.Add(FolderEntry, TRUE);
  finally
    FolderEntry.Free;
  end;
end;

procedure AddTestData;
var
  ItemID1, ItemID2, ItemID3:Integer;
begin
  Writeln('Begin AddTestData');
  ItemID1:=AddItem('123', 'First Item', Now);
  ItemID2:=AddItem('456', 'Second Item', IncDay(Now, 1));
  ItemID3:=AddItem('789', 'Third Item', IncDay(Now, 2));
  AddItemToFolder(ItemID1, 1);
  AddItemToFolder(ItemID1, 2);
  AddItemToFolder(ItemID2, 1);
  AddItemToFolder(ItemID2, 2);
  AddItemToFolder(ItemID3, 3);
  Writeln('End AddTestData');
end;

const
  DBFileName = 'Test.sqlite';

var
  DBExists:Boolean;
  CompoundSQL:RawUTF8;
  JSONResult:RawUTF8;
  JSONTable:TSQLTableJSON;
  Item:TSQLRecordItem;

begin
  DBExists:=FileExists(DBFileName);
  RestServer:=TSQLRestServerDB.Create(TSQLModel.Create([TSQLRecordItem, TSQLRecordItemFolder,
    TSQLRecordItemFolderEntry]), DBFileName);
  if not DBExists then
  begin
    RestServer.CreateMissingTables(0);
    AddTestData
  end
  else
    Writeln('DB already exists.');
  {**
   * Folder     1 | 2 | 3 |
   * ----------------------
   * Item   1 | X | X |   |
   *        2 | X | X |   |
   *        3 |   |   | X |
   *
   * @see AddTestData
   *
   * This query should return only the Item #3!
   *}
  CompoundSQL:='SELECT a.* FROM Item AS a JOIN (SELECT DISTINCT ItemID FROM ItemFolderEntry WHERE (ItemFolderID = 1) OR (ItemFolderID = 3)) AS aa ON (aa.ItemID = a.ID)' + #13#10 +
    'EXCEPT ' + #13#10 +
    'SELECT a.* FROM Item AS a JOIN (SELECT DISTINCT ItemID FROM ItemFolderEntry WHERE (ItemFolderID = 2)) AS aa ON (aa.ItemID = a.ID)';
  Writeln; Writeln(CompoundSQL); Writeln;
  JSONResult:=RestServer.DB.ExecuteJSON(CompoundSQL, TRUE);
  if JSONResult = '' then
    Exit;
  JSONTable:=TSQLTableJSON.Create([TSQLRecordItem], CompoundSQL, JSONResult);
  Item:=TSQLRecordItem.Create;
  try
    Item.FillPrepare(JSONTable);
    while Item.FillOne do
      Writeln(Format('Item #%d, "%s"', [Item.ID, UTF8ToString(Item.Title)]));
  finally
    Item.Free;
    JSONTable.Free;
  end;
  Readln;
  RestServer.Free;
end.
Good news, I figured out and fixed the bug!
If a field with your custom collate ISO8601 is empty '' (not NULL), SQLite calls the registered collate function with length 0 for s1len or s2len, but the pointers s1 or s2 still point to the string of the previous call. So no error or AV occurs, but the results were wrong. I extended the ISO8601 collate function as follows, and now it works as it should.
function Utf8SQLDateTime(CollateParam: pointer; s1Len: integer; s1: pointer;
  s2Len: integer; s2: pointer) : integer; {$ifdef USEC}cdecl;{$endif}
var V1,V2: Int64; // faster than Iso8601ToDateTimePChar: uses integer math
begin
  // begin patch
  if s1Len = 0 then
    s1:=PUTF8Char('');
  if s2Len = 0 then
    s2:=PUTF8Char('');
  // end patch
  if (s1=s2) then begin
    result := 0;
    exit;
  end;
  V1 := Iso8601ToSecondsPUTF8Char(s1,s1Len);
  V2 := Iso8601ToSecondsPUTF8Char(s2,s2Len);
  if (V1=0) or (V2=0) then // any invalid date -> compare as UTF-8 strings
    result := UTF8ILComp(s1,s2,s1Len,s2Len) else
  if V1<V2 then
    result := -1 else
  if V1=V2 then
    result := 0 else
    result := +1;
end;
Maybe the same issue exists in SQLite3.Utf8SQLCompNoCase, because there are also no length checks there.
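To illustrate, the same guard could be applied there as well. This is only a sketch: the body of Utf8SQLCompNoCase shown here is hypothetical; only the SQLite collation signature and the UTF8ILComp call are taken from the patch above.

```pascal
// hypothetical sketch -- not the actual framework implementation:
function Utf8SQLCompNoCase(CollateParam: pointer; s1Len: integer; s1: pointer;
  s2Len: integer; s2: pointer): integer; {$ifdef USEC}cdecl;{$endif}
begin
  // same patch as above: guard against SQLite passing a stale pointer
  // together with a zero length for an empty (but not NULL) text value
  if s1Len = 0 then
    s1:=PUTF8Char('');
  if s2Len = 0 then
    s2:=PUTF8Char('');
  result := UTF8ILComp(s1,s2,s1Len,s2Len); // case-insensitive UTF-8 compare
end;
```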
It's time for the next commit
P.S. I can post the announced test project, if you are interested...?
IMHO collations have nothing to do with this. You are just comparing integer values (ItemFolderID and such), which don't use collations at all.
To be sure, put a debugger breakpoint in the collation routines, or comment them out in TSQLDataBase.DBOpen (SQLite3.pas).
You are partially right. The subselects for the right part of the joins (...FROM Item AS a JOIN (SELECT DISTINCT)...) don't need a collation, because there is only a compare/select on integers. But the compound operators (EXCEPT, UNION, ...) require the collate functions if the result contains text fields which have a default collate; see the SQLite doc (section "Compound Select Statements"). Yes, I set a breakpoint in Utf8SQLCompNoCase, and while executing the posted query it stops on each text column with a collation, for every row.
So I suspect your CompoundSQL statement is not transmitted to the SQLite engine.
Step by step, debug the TSQLRequest.Execute method.
Is the retrieved JSON content correct?
The posted query is executed without any errors; the JSON content is well formed, but filled with unexpected items.
Which SQLite3 cli version are you using?
What is the purpose of your ConvertItem function?
The SQLite CLI is 3.7.2.
ConvertItem only converts from TSQLRecordItem to another TObject descendant; there are no high-level actions involved, so I can definitely say that nothing is wrong there.
I'm currently writing a test project to reproduce the behaviour and will post it here later.
I look and look and can't find anything.
The only difference is the custom collate functions. Is it possible that one of them doesn't work correctly, so that SQLite is not able to detect identical rows? The documentation says:
For the purposes of determining duplicate rows for the results of compound SELECT operators, NULL values are considered equal to other NULL values and distinct from all non-NULL values. The collation sequence used to compare two text values is determined as if the columns of the left and right-hand SELECT statements were the left and right-hand operands of the equals (=) operator, except that greater precedence is not assigned to a collation sequence specified with the postfix COLLATE operator. No affinity transformations are applied to any values when comparing rows as part of a compound SELECT.
In the "Item" table I have both framework collations, ISO8601 and SYSTEMNOCASE. Maybe this is the issue.
Hello, sadly it's me again,
but I'm currently completely stuck on a simple task and hope to find the answer through this post.
In a function I build a query (for example):
SELECT a.* FROM Item AS a JOIN (SELECT DISTINCT ItemID FROM ItemFolderEntry WHERE (ItemFolderID = 1) OR (ItemFolderID = 3)) AS aa ON (aa.ItemID = a.ID)
EXCEPT
SELECT a.* FROM Item AS a JOIN (SELECT DISTINCT ItemID FROM ItemFolderEntry WHERE (ItemFolderID = 2)) AS aa ON (aa.ItemID = a.ID)
and execute it like this:
...
JSONResult:=TSQLAccess.RestServer.DB.ExecuteJSON(CompoundSQL);
if JSONResult = '' then
  Exit;
JSONTable:=TSQLTableJSON.Create([TSQLRecordItem], CompoundSQL, JSONResult);
Item:=TSQLRecordItem.Create;
try
  Item.FillPrepare(JSONTable);
  while Item.FillOne do
    TargetList.Add(ConvertItem(Item));
finally
  Item.Free;
  JSONTable.Free;
end;
...
I retrieve results, but not what I expect. The compound operator is not honored; there can be a UNION, INTERSECT, or EXCEPT, but it has no effect: it's always ignored.
When I use the same query directly in SQLite (CLI) on the same tables (copies without the collations SYSTEMNOCASE or ISO8601), I get the result as designed.
Is there any code in the framework (I have searched and found none) which manipulates such compound queries?
Maybe I expressed myself imprecisely. Here follows a more detailed draft of my concept.
TSQLRecord = class(TObject)
protected
  procedure FillInitialize; virtual;
public
  procedure FillRow(Row: integer; Dest: TSQLRecord=nil); // can still be static
end;

implementation

procedure TSQLRecord.FillRow(Row: integer; Dest: TSQLRecord=nil);
var i: integer;
    P: PPropInfo;
    U: PPUTF8CharArray;
begin
  // 0. validate params
  if (self=nil) or (fTable=nil) or (pointer(TableMap)=nil) or
     (cardinal(Row)>cardinal(fTable.RowCount)) then exit;
  U := @fTable.fResults[Row*fTable.FieldCount]; // U=PPUTF8CharArray of this Row
  if Dest=nil then
    Dest := self;
  // 1. get ID field
  if IDMap>=0 then
    Dest.fID := GetInteger(U[IDMap]);
  // 2. update published fields values
  with ClassProp^ do begin
    P := @PropList;
    for i := 0 to PropCount-1 do begin
      if TableMap[i]>=0 then // don't nil missing properties
        P^.SetValue(Dest,U[TableMap[i]]); // update existing value
      P := P^.Next;
    end;
  end;
  Dest.FillInitialize; // <--
end;

procedure TSQLRecord.FillInitialize;
begin
  // Is empty in TSQLRecord, but can be implemented in descendants
end;
Furthermore, there are other locations where a call to FillInitialize would make sense, e.g. in TSQLRecord.CreateCopy.
But at this time it's enough for me if FillRow is overridable.
Perfect "team" work.
Thank you.
Another idea on the same subject: maybe it would be better to introduce a new (dynamic) method which is empty in TSQLRecord, but will be called by FillRow at the right place. For example:
TSQLRecord = class(TObject)
public
  procedure Reset; dynamic;
end;
???
You are the master.
Hello Arnaud,
maybe I found a little bug:
function TSQLRestServer.Retrieve(aID: integer; Value: TSQLRecord;
  ForUpdate: boolean): boolean;
var TableIndex: integer; // used by EngineRetrieve() for SQL statement caching
    Resp: RawUTF8;
    Static: TSQLRestServerStatic;
begin // this version handles locking and use fast EngineRetrieve() method
  // check parameters
  result := false;
  if Value=nil then
    exit; // avoid GPF
  Value.fID := 0;
  if (self=nil) or (aID=0) then
    exit;
  TableIndex := Model.GetTableIndex(Value.RecordClass);
  if TableIndex<0 then
    exit;
  // try to lock before retrieval (if ForUpdate)
  if ForUpdate and not Model.Lock(TableIndex,aID) then
    exit;
  // get JSON object '{...}' in Resp
  Static := fStaticData[TableIndex]; // <-- here, fStaticData may be nil or shorter than TableIndex
  if Static<>nil then
    Resp := Static.EngineRetrieve(TableIndex,aID) else
    Resp := EngineRetrieve(TableIndex,aID);
  // fill Value from JSON if was correctly retrieved
  if Resp<>'' then begin
    Value.FillFrom(Resp);
    result := true;
  end;
end;
It occurs for me with the following code:
TSQLRecordImages = class(TSQLRecordMany);

TSQLRecordItem = class(TSQLRecord)
private
  FImages: TSQLRecordImages;
published
  property Images: TSQLRecordImages read FImages write FImages;
end;

function TSQLRecordItem.RetrievePictureURLs: TStringList;
var
  OneRecord: TSQLRecord;
begin
  if not ((ID > 0) and (Images.FillMany(TSQLAccess.RestServer, ID) > 0)) then
    Exit(nil);
  Result := TStringList.Create;
  while Images.FillOne do
  begin
    OneRecord := TSQLAccess.RestServer.Retrieve(Images.Dest); // <-- from here the code posted above is reached
    try
      if Assigned(OneRecord) and (OneRecord is TSQLRecordImage) then
        Result.Add(UTF8ToString(TSQLRecordImage(OneRecord).URL));
    finally
      OneRecord.Free;
    end;
  end;
end;
I hope I'm not bothering you.
I think it would be very useful if TSQLRecord.FillRow were virtual.
For example, you hold a calculated value in a field (not table related), but it must be recalculated when the same instance is reused for iteration (Instance.FillPrepare; while Instance.FillOne do ...).
In any descendant I could then implement my custom reset/recalculation code in FillRow, and everything would be fine. Or does a better solution already exist?
Super. No AV at the start of Lauge anymore. Thank you!
Synopse ORgyM (Overview of an orgy in gym ORM)
Just fun, but very distinctive.
Hi Arnaud,
I tried to implement your new "foreign keys" semantics, and so far I was partially successful with the following code:
TSQLRecordImage = class;

TSQLRecordImages = class(TSQLRecordMany)
private
  FCreated: TDateTime;
published
  property Source: TRecordReference read FSource;
  property Dest: TRecordReference read FDest;
  property Created: TDateTime read FCreated write FCreated;
end;

TSQLRecordImage = class(TSQLRecord)
private
  FURL: RawUTF8;
  FCreated: TDateTime;
  FImage: TSQLRawBlob;
published
  property URL: RawUTF8 read FURL write FURL stored FALSE;
  property Created: TDateTime read FCreated write FCreated;
  property Image: TSQLRawBlob read FImage write FImage;
end;

TSQLRecordItem = class(TSQLRecord, IInterface, IEBayItemBase, IEBayItemPictures) // I need interfaces on this record
private
  FItemID: RawUTF8;
  ...
  FImages: TSQLRecordImages;
  FPictureURLs: TStringList;
protected
  FRefCount: Integer;
  function QueryInterface(const IID: TGUID; out Obj): HResult; stdcall;
  function _AddRef: Integer; stdcall;
  function _Release: Integer; stdcall;
  function GetItemID: String;
  ...
public
  destructor Destroy; override;
  procedure Assign(const Source: IEBayItemBase); overload;
  procedure Assign(const Source: IEBayItemPictures); overload;
  procedure AfterConstruction; override;
  procedure BeforeDestruction; override;
  class function NewInstance: TObject; override;
published
  property ItemID: RawUTF8 read FItemID write FItemID stored FALSE;
  ...
  property Images: TSQLRecordImages read FImages write FImages;
end;
If the database does not exist, the tables are created as expected; but if it already exists, an AV occurs at the following point:
procedure TSQLRestServerDB.CreateMissingTables(user_version: cardinal);
...
  end else begin
    // this table already exists: check that all fields exist -> create if necessary
    DB.GetFieldNames(Fields,Model.Tables[t].SQLTableName);
    with InternalClassProp(Model.Tables[t])^ do begin
      Field := @PropList;
      for f := 0 to PropCount-1 do begin
        if FindRawUTF8(Fields,Field^.Name,false)<0 then begin
          if not DB.TransactionActive then
            DB.TransactionBegin;
          DB.Execute(Model.GetSQLAddField(t,f)); // <--- HERE an empty query is passed to DB.Execute
          Model.Tables[t].InitializeTable(self,Field^.Name);
        end;
        Field := Field^.Next;
      end;
    end;
It wants to add a field for TSQLRecordItem.Images: TSQLRecordImages, but that is not a native table field ...
I hope this hint helps to find the bug.
Could you release v. 1.11 with these changes? The current contents from fossil somehow do not work for me. Thanks in advance.
I'm currently studying your implementation of "foreign keys" and I think the opposite of TSQLRecordMany.DestAdd is absent.
I mean the possibility to release a connection between A and B without deleting it.
Or have I overlooked/misunderstood something?
Very nice that you reacted so fast and have integrated the sqlite3_busy_timeout and sqlite3_busy_handler functions. I have integrated the changed SQLite3.pas from fossil and now use "DB.BusyTimeout := 1000;".
BigTable is not suitable for me, because the Lauge DB must hold many more things, not only images/thumbs, in future versions. Currently I am implementing a solution for items, with tags and customizable properties for each tag and item.
I just checked out Lauge, which sounds great!
I'm really pleased. Perhaps you could translate the relatively tiny lang file to French sometime, once Lauge is ready?
Nice having feedback about users!
I know how nice that is. Feedback is like a secondary currency for developers.
Thank you very much!
Hello Arnaud Bouchez,
I use your excellent SQLite framework in Lauge. Thank you for this work!
But I am missing the sqlite3_busy_timeout function in SQLite3.pas. I have added it manually, as follows:
...
function sqlite3_busy_timeout(V: TSQLHandle; Milliseconds: Integer): Integer; {$IFDEF USEC}cdecl;{$ENDIF} external;
...
and it is a fine solution for me. But with every new release it gets lost.
Please include it in one of the next releases.
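For context, a minimal sketch of how such a declaration might be used. This assumes a raw SQLite3 connection handle of type TSQLHandle and the SQLITE_OK constant are in scope; the exact names may differ in the unit:

```pascal
var
  Handle: TSQLHandle; // raw SQLite3 connection handle (assumed already opened)
begin
  // ask SQLite to retry for up to 1000 ms on SQLITE_BUSY instead of
  // failing immediately when another connection holds the file lock
  if sqlite3_busy_timeout(Handle, 1000) <> SQLITE_OK then
    WriteLn('could not set busy timeout');
end;
```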
Sorry, my English skills are not the best.
With best regards,
Waldemar Derr