Did some further checking, and it would seem that on the Lazarus side any existing composite primary keys
are not found/used, hence the delay. Creating new composite keys each time produces a fast query, though.
Noticed that long ago there was a bug reported on this very issue with mORMot 1 and
Delphi; could it be that the same bug is now surfacing with Lazarus..
Thanks for pointing me in the right direction, the culprit was
indeed here: the missing index name caused no problems with Delphi,
but with Lazarus, apparently the multi-field index was not being created.
However, the following code is only fast once; the second time,
it acts as if there were no index, until I change the name of the index again.
Am I missing something? Perhaps I have been using CreateSQLMultiIndex wrongly all along...?
lang := TRestClientDB(GG_globalClient).Server.Server.CreateSQLMultiIndex(
TOrmTbl1, Ar_IndexesB,True,'IndexName1'); // quick only if IndexName change for each run
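To verify that the multi-field index really exists after the call, one option is to test the Boolean result and then query SQLite's sqlite_master catalog directly. A sketch, assuming TOrmTbl1 maps to the SQL table Tbl1 as in the query further below:

```pascal
var
  created: Boolean;
  idx: TOrmTable;
begin
  created := TRestClientDB(GG_globalClient).Server.Server.CreateSQLMultiIndex(
    TOrmTbl1, Ar_IndexesB, True, 'IndexName1');
  if not created then
    writeln('CreateSQLMultiIndex returned False (failed, or index already exists)');
  // list every index defined on the table, straight from the SQLite catalog
  idx := GG_globalClient.Orm.ExecuteList([],
    'SELECT name, sql FROM sqlite_master WHERE type=''index'' AND tbl_name=''Tbl1''');
  if idx <> nil then
  try
    writeln(idx.RowCount, ' index(es) found');
  finally
    idx.Free;
  end;
end;
```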
P.S.
Tried using the memory manager mormot.core.fpcx64mm with FPC 3.3.1, it compiled,
but at run time got a mysterious segmentation fault.
Thanks for your help! Indeed, the index problem was the first thing that came to mind,
but I rechecked it, and the Delphi build uses pretty much the same code,
with both tables having that 3-field composite index.
The Lazarus and the D2009 applications both read the same DB3 file.
Unless there is some mysterious error preventing the Lazarus application
from making use of the indexes..
The Lazarus application seems to go into intensive processing during the
query, so it could also be the memory manager, as you suggest; will try that out,
and then finally FPC 3.2 if nothing else helps.
Hello,
The new mORMot 2 runs an SQL query with an outer
join probably slightly faster than the earlier version:
just a few seconds for the following query, which returns
an answer set of about 180 000 rows and 30 columns.
However, the very same query, when run with the same
code (on the same PC) over the same dataset using Lazarus 2.1 and FPC 3.3.1,
takes a whole minute (!) to run...
It would be nice to be able to use Lazarus when needed, but am puzzled here;
would greatly appreciate any pointers...
Sami
Var
GG_Model: TOrmModel;
GG_globalClient: TRestClientDB;
G_Database_FullPath:String;
Ar_IndexesA,Ar_IndexesB:
Array [1..3] of RawUTF8;
lang :Boolean;
sql_str : String; // was missing from the declarations
data1 : TOrmTable; (* was TSQLTable *)
BEGIN
G_Database_FullPath:= ....
GG_Model := TOrmModel.Create(
[TOrmTbl1, TOrmTbl2],'root'); // w/o root slower ?
GG_globalClient := TRestClientDB.Create(GG_Model,nil,
G_Database_FullPath,TRestServerDB, false, '');
GG_globalClient.Server.Server.CreateMissingTables;
Ar_IndexesA[1] := 'ACompPrimKey1';
Ar_IndexesA[2] := 'ACompPrimKey2';
Ar_IndexesA[3] := 'ACompPrimKey3';
Ar_IndexesB[1] := 'BCompPrimKey1';
Ar_IndexesB[2] := 'BCompPrimKey2';
Ar_IndexesB[3] := 'ABCompPrimKey3';
// using a composite primary key for both tables
lang := TRestClientDB(GG_globalClient).Server.Server.CreateSQLMultiIndex(
TOrmTbl1, Ar_IndexesA,True); // Ar_IndexesA for TOrmTbl1 (was Ar_IndexesB)
lang := TRestClientDB(GG_globalClient).Server.Server.CreateSQLMultiIndex(
TOrmTbl2, Ar_IndexesB,True);
sql_str := 'SELECT * FROM Tbl1 T1 LEFT JOIN Tbl2 T2 WHERE T1.FieldX = T2.FieldX';
data1 := GG_globalClient.Orm.ExecuteList([],sql_str); // very slow on Lazarus
END;
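When the same statement is fast in one build and slow in another, SQLite itself can report whether an index is being used: prefixing the query with EXPLAIN QUERY PLAN returns the chosen plan as an ordinary result set. A sketch reusing the variables above (the 'detail' column text shows 'USING INDEX ...' versus 'SCAN TABLE ...'):

```pascal
var
  plan: TOrmTable;
  r: integer;
begin
  plan := GG_globalClient.Orm.ExecuteList([], 'EXPLAIN QUERY PLAN ' + sql_str);
  if plan <> nil then
  try
    // one row per plan step; a full table scan here would explain the slowness
    for r := 1 to plan.RowCount do
      writeln(plan.GetString(r, plan.FieldIndex('detail')));
  finally
    plan.Free;
  end;
end;
```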
Thank you very much for taking the time to make it D2009 compatible!
It's wonderful to be able to make again use of the latest version of this great package!
Hi,
am probably missing something, but under 'core' found an update for the file mormot.core.variants.pas;
unfortunately, could not find one for mormot.core.rtti.pas, where the error appeared,
so am still getting the same error..
jonsafi wrote: Yes, exactly, Lazarus's slowness is in the compilation/linking process, not in the executable itself.
Is this difference really noticeable? For my project, a full rebuild (on Linux) gives
343140 lines compiled, 11.4 sec
and in the case of an incremental build the result is < 1 second.
Or is the fpc compiler slow on Windows?
That is quite fast, I would say; it seems a bit slower on Windows 7 (8 GB RAM), a bit over 15 sec.
But yes, incremental compiling does help considerably :-)
Thanks a lot, again moved one step further; now the error is in file mormot.core.rtti.pas at line 5590:
{$ifndef HASNOSTATICRTTI}
if (not FirstSearchByName) and
(Info = TypeInfo(TGuid)) then // ERROR here, line 5590: E2134 Type 'TGUID' has no type info
Agree that Lazarus's incremental compiling is neat!
I work with an external drive, which currently is not an SSD; should probably get one, although writing
to an SSD is still slow compared to its blazing read speed...
Many thanks for this version! It got past the previous TCallConv problem, but now stops at line 451
in the same unit, complaining about incompatible types between TRttiKind and TTypeKind.
function TRttiInfo.IsQWord: boolean;
begin
if @self = TypeInfo(QWord) then
result := true
else
{$ifdef UNICODE}
if Kind = tkInt64 then //error here: E2010 Incompatible types: TRttiKind and TTypeKind.
If you find the time to come up with a solution for this too, I would be grateful and happy to test it.
Yes, exactly, Lazarus's slowness is in the compilation/linking process, not in the executable itself.
Hello,
Though I was able to compile the brand-new mORMot 2
with Lazarus, I still miss the speed of Delphi.
Am stuck with D2009, and since mORMot 2 works
with D2007, was wondering if there is any way to
make mORMot 2 work with D2009.
I know that the new RTTI was introduced only in D2010,
but even when I set {$define NEWRTTINOTUSED},
I still get an 'unknown identifier' error
in file mormot.core.rtti.delphi.inc:
procedure TGetRttiInterface.AddMethodsFromTypeInfo(aInterface: PTypeInfo);
Var cc: TCallConv; // line 622 ERROR: unknown identifier
Much obliged for your comments,
Sami
Many thanks, works very nicely!
Just had to add .orm in between:
GG_globalClient.Orm.ExecuteList([], sql_str);
Hello,
am trying out the nice brand-new mORMot 2 with Lazarus (2.1) and FPC 3.3.1.
Was able to compile and run the sample 01-StandAloneORM, kindly converted by Martin Doyle.
However, the following will not compile, as ExecuteList is not found for TRestClientDB.
Var
GG_globalClient: TRestClientDB;
sql_str :String;
data1 : TOrmTable;
sql_str := 'SELECT * FROM My_Table WHERE ...';
data1 := GG_globalClient.ExecuteList([], sql_str);
Am I missing something, or is there currently no way to run free
SQL statements in mORMot 2?
Any hints are much appreciated,
Sami
OK, thanks a lot for the tip. Happily, commenting out
{or not BlobField^.IsBlob }
will clear the error, and it seems to be an integrity check that won't affect other parts.
Yes, too bad that while introducing new features in D2009, they also brought in
a whole set of new bugs.
Hello,
The new version now goes much further in compilation with Delphi 2009
than previous ones, but unfortunately,
compiling with D2009 produces an 'incompatible types' error whenever
testing the Boolean
BlobField^.IsBlob
as in the following from file mORMotSQLite3.pas:
function TSQLRestServerDB.MainEngineUpdateBlob(TableModelIndex: integer; aID: TID;
BlobField: PPropInfo; const BlobData: TSQLRawBlob): boolean;
var SQL: RawUTF8;
AffectedField: TSQLFieldBits;
Props: TSQLRecordProperties;
begin
result := false;
if (aID<0) or (TableModelIndex<0)
or not BlobField^.IsBlob then // ERROR occurs here due to this boolean condition
exit;
Does anyone have a clue as to how to fix this..?
Much obliged,
regards, Sami
Hello,
Thank you, just tried it; it got me at least one step forward: the error in file SynZip.pas is now gone.
What remains is
the 'incompatible types' error, which stubbornly still appears in file mORMot.pas:
function TIPBan.Add(const aIP: RawUTF8): boolean;
var ip4: cardinal;
begin
result := false;
if (self=nil) or not IPToCardinal(aIP,ip4) then
exit;
fSafe.Lock; // ERROR here at compilation: gives' incompatible types'
Thanks for this hint, ab. Made sure that the correct Synopse.inc
was being used (renamed the file to synopsenew.inc and changed all the .inc references in the *.pas files accordingly).
However, compilation still stops in the file SynZip.pas
due to the following declaration.
Var
PDataStart: ZipPtrUint;
Am definitely using D2009 (checked that the constant CompilerVersion does return 20),
so this is really puzzling... it seems as if the D2009 directives are having no effect..?
Hello everyone,
Downloaded the latest build (25 Jul 2018, this morning)
and came across an error that appears during compilation
and just can't figure out...
The error msg when compiling with Delphi 2009
is 'incompatible types', and it appears in the file mORMot.pas:
function TIPBan.Add(const aIP: RawUTF8): boolean;
var ip4: cardinal;
begin
result := false;
if (self=nil) or not IPToCardinal(aIP,ip4) then
exit;
fSafe.Lock; // ERROR here at compilation: gives' incompatible types'
I wonder what is happening at compilation
(I do know that D2009 is buggy with respect to NativeUInt, which
can usually be avoided by changing the type to Cardinal or NativeInt).
Would really appreciate any comments here, as am stuck with Delphi 2009 :-)
Kind regards,
Sami
Hello,
Tried updating my mORMot version to a later release (June '18),
and due to the changes in TZipRead.RetrieveFileInfo (file SynZip.pas),
get a compiler error there with D2009.
This is probably due to an internal problem with NativeInt in D2009;
indeed, the error disappears when changing the declaration
changing the declaration
Var
PDataStart: ZipPtrUint;
to:
Var
PDataStart: NativeInt;
but as a result of that change, get another error,
'incompatible types', in file mORMot.pas:
function TIPBan.Add(const aIP: RawUTF8): boolean;
var ip4: cardinal;
begin
result := false;
if (self=nil) or not IPToCardinal(aIP,ip4) then
exit;
fSafe.Lock; // ERROR here at compilation: gives' incompatible types'
Would be very grateful for any hints/pointers on how to deal with this issue..
Regards,
Sami
OK, thanks a lot for your reply!
Don't use JSON, so will
try and figure out something. Will post if I come up with
something useful for the mORMot community.
Hello,
TSQLTableToGrid is just great when used with TSQLTable.
Lately have been trying out TSynBigTable
and was just wondering if there's any nice solution
for displaying its data in a similar way to TSQLTableToGrid...
Regards,
Sami
Just a quick update: am using the nice
TSynBigTable successfully. The above problem with
TSynBigTableString is still a bit unclear, but
now guess it has something to do with the
fact that the holding file was not being
created successfully, and I was not checking the
error status...
Hello,
Tried the above code in a new, separate application, and it works just fine.
However, when used together with the original application with the native Oracle
driver unit SynDBOracle, it just won't write a file when called
via UpdateToFile.
It does report a file size, however, using
FileSizeOnDisk,
which is suspiciously large: over 40 GB.
Could this be some kind of memory interference...
Much obliged for your comments. In the meantime,
will try using TSynBigTable instead.
Hello,
Have just updated the framework and am using Delphi 2009.
Can't seem to get the file saved to disk for the info stored in
TSynBigTableString; perhaps it did save once out of five trials or so...
Checked that the info does go into TSynBigTableString
before the attempt to save the file..
Also checked that the data indexes are unique..
Could you please help me out as to what the reason could be...?
Var
BT_: TSynBigTableString;
Raw_2: RawByteString; // was missing: holds the data retrieved below
BEGIN
BT_ := TSynBigTableString.Create(stud_dir +GG_STUD_DIR + 'Stud_Info' + GG_TXT_); // create file handle
//....
BT_.Add(student_id_str + GG_SEP + Email_str
+ GG_SEP + FullName_str
+ GG_SEP + GPA_str
+ GG_SEP + Total_Credits_str,
StringToUtf8(student_id_str)); // add data
write_log(BT_.FileName + ' '+ inttostr(BT_.Count) + ' entries',True); // save to log nr of entries
If BT_.Get(StringToUtf8('123456'+GG_SEP),Raw_2) Then
write_log('found ' + Raw_2 + '!',True) // Student '123456' is indeed there
Else
write_log('Not found!',True);
BT_.UpdateToFile(); // fails to save file
BT_.UpdateToFile(True); // also fails to save file
BT_.Free;
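Since the later follow-up to this thread suggests the save error status was never checked, it may help to test each step's result explicitly. A hedged sketch, under the assumption that UpdateToFile reports success through a Boolean result (please check the actual signature in SynBigTable.pas):

```pascal
var
  ok: Boolean;
begin
  // ... BT_.Add() calls as above ...
  ok := BT_.UpdateToFile(True); // True = force an immediate flush to disk
  if ok then
    write_log('Saved ' + BT_.FileName + ', ' + inttostr(BT_.Count) + ' entries', True)
  else
    write_log('UpdateToFile FAILED for ' + BT_.FileName, True);
end;
```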
Hello,
Thanks a lot for both of your suggestions.
Using:
aDynArray(TypeInfo(TStudents),MyStudents).LoadFromStream(MStream_)
would not compile; perhaps the error is elsewhere..
Anyhow, got rid of the memory streams, and using the following
code no longer produces an access error,
but it now fails to find the given student (which should be in the file).
Still need to try this with a newer framework...
Var
Raw_: RawByteString;
indx_,StudentsCount:Integer;
BEGIN
Raw_ := BinToBase64(StringFromFile(GG_STUD_DIR + GG_S_
+ 'Stud_Info' + GG_TXT));
StudentsCount := 0;
aDynArray.Init(TypeInfo(TStudents),MyStudents, nil, nil, nil, @StudentsCount);
aDynArray.Capacity := 120000;
DynArray(TypeInfo(TStudents),MyStudents).LoadFrom(Pointer(Raw_));
aStudent.sStudentId := '296380'; // student actually is in file...
indx_ := aDynArray.FindHashed(aStudent); // indx_ returns -1
END;
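For what it's worth, two hedged observations on the snippet above: LoadFrom expects the raw binary produced by SaveTo, so the BinToBase64 re-encoding will make it fail silently; and the load should go through the same hashed wrapper, followed by a ReHash, before FindHashed can work. Also note that by default TDynArrayHashed hashes on the record's first field (sEmail in TStudent), so a lookup by sStudentId alone may need a custom hash/compare. A sketch:

```pascal
var
  Raw_: RawByteString;
  StudentsCount, indx_: Integer;
begin
  // raw binary, exactly as written by aDynArray.SaveTo - no Base64 step
  Raw_ := StringFromFile(GG_STUD_DIR + GG_S_ + 'Stud_Info' + GG_TXT);
  StudentsCount := 0;
  aDynArray.Init(TypeInfo(TStudents), MyStudents, nil, nil, nil, @StudentsCount);
  aDynArray.LoadFrom(Pointer(Raw_)); // through the same hashed instance
  aDynArray.ReHash;                  // rebuild the hash table after a bulk load
  aStudent.sStudentId := '296380';
  indx_ := aDynArray.FindHashed(aStudent);
end;
```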
Hello,
Was successfully using TDynArray, saving and loading the data
from a file. Since the generated file was a bit large, tried
switching to TDynArrayHashed instead..
However, keep getting an access violation at the
line:
DynArray(TypeInfo(TStudents),MyStudents).LoadFromStream(MStream_);
when trying to load data from the file into the hashed array.
Any comments/hints would be much appreciated.
P.S. Am using an older version of the framework, released in Feb '17.
// simple student DB
type
TStudent = packed record
sEmail: string;
sGPA,sTotalCredits:Currency;
sStudentId: string;
sGender,sFullName: string;
end;
TStudents = array of TStudent;
Var
MyStudents: TStudents;
aStudent : TStudent;
aDynArray: TDynArrayHashed;
MStream_ : TMemoryStream;
StudentsCount: Integer;
BEGIN
MStream_ := TMemoryStream.Create;
MStream_.LoadFromFile(GG_STUD_DIR + GG_S_
+ 'Stud_Info' + GG_TXT);
aDynArray.Init(TypeInfo(TStudents),MyStudents, nil, nil, nil, @StudentsCount);
aDynArray.Capacity := 1200000; // should be enough
DynArray(TypeInfo(TStudents),MyStudents).LoadFromStream(MStream_); // access violation
MStream_.Free;
END;
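A hedged guess at the access violation: the global DynArray() call on the marked line builds a fresh, unhashed TDynArray that knows nothing about the external StudentsCount pointer aDynArray was initialised with, so the two wrappers disagree about the array's length. A sketch of the load going through the same hashed instance instead (assuming the Feb '17 TDynArray stream API):

```pascal
begin
  // load through the wrapper that owns the count, not a fresh DynArray()
  MStream_ := TMemoryStream.Create;
  try
    MStream_.LoadFromFile(GG_STUD_DIR + GG_S_ + 'Stud_Info' + GG_TXT);
    aDynArray.LoadFromStream(MStream_);
    aDynArray.ReHash; // hashes must be rebuilt before any FindHashed
  finally
    MStream_.Free;
  end;
  // saving should be symmetric, via the same instance:
  // aDynArray.SaveToStream(SomeStream);
end;
```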
Just to add to the previous post: downloaded the latest nightly build,
and now BatchSend works fine the first time when adding a set of records.
If, however, the same set of records is added again, the
application returns error code 500, 'internal server error'.
Could it be that the new mORMot
is now complaining about key violations that went unnoticed before...
Am using the following for adding records...:
GG_globalClient.BatchAdd(mytbl, True);
Thank you again, did that switch all over the code and finally got it to compile!
What I can't figure out now is why BatchSend fails (it used to work
with the old mORMot) and returns error code 400 (bad request).
Made sure that the server code too got updated & compiled...
Regards,
Sami
GG_globalClient.BatchSend(AllResults);
Hello,
Many thanks for all your help; going back and
changing NativeUInt to UInt did finally solve this mysterious problem!
Now, however, trying to compile a D2009 application
containing the BatchSend command (as below), which used
to compile fine with a very old mORMot (from Jan '14),
gives the following error with a new mORMot from Jan '17:
'There is no overloaded version of BatchSend that can be called with
these arguments..'
Would be very grateful for any hints; am not aware of any changes in BatchSend...
Cheers,
Sami
Var
AllResults: TIntegerDynArray;
GG_globalClient.BatchSend(AllResults);
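If the 'no overloaded version' error points at the Results argument, one hedged explanation is that later 1.18 revisions changed BatchSend to fill a TIDDynArray rather than a TIntegerDynArray (check the signature in your local mORMot.pas):

```pascal
var
  AllResults: TIDDynArray; // TID-based results in newer revisions
begin
  GG_globalClient.BatchSend(AllResults);
end;
```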
Hello,
Have been using mORMot successfully for quite some time now, albeit an older
version. Tried now to install several later builds,
for instance one released last August (44d7dcce95aa324),
along with the latest stable version.
In both cases, the Delphi 2009 compiler stops/breaks at unit SynZip.pas,
apparently at the end of function TZipRead.RetrieveFileInfo(Index: integer; var Info: TFileInfo): boolean;
and just says, pointing to line 1321, 'F2084 Internal Error: C12074'.
Wonder if anyone has come across something similar..?
Am using Win 7 with D2009.
Any help would be much appreciated,
Cheers,
Sami
Yes, will do that, thanks. Just wanted to get 'the big picture' right.
Hello,
For those of us who are developing so-called basic
Windows client-server applications (using SQLite3) that are *not* Web applications,
I understand the way to go is still to use
the HTTP server (http.sys) for remote access.
If one does not want to use interfaces
and write the server application as a service, e.g. using TServiceFactoryServer,
then we just use
TSQLite3HttpServer and TSQLite3HttpClient...
Now, if I understood correctly, it still makes
sense to run the server console application as a service
for maximum flexibility, and this can be achieved using
CreateNewService/CreateOpenService?
With thanks for your comments,
Sami
OK, will try that, thanks!
Hello Arnaud,
Am doing a multi-page report using the nice
mORMotReport, but have been
stuck when trying to re-generate the same line of text on another
page using DrawTextAt.
Have tried various tricks, like saving the last
row from CurrentYPos, but this has side effects (for the header, etc.).
Are you by any chance considering extending DrawTextAt
with an additional parameter for specifying YPos?
Kind Regards,
Sami
Many thanks! Was looking for the error in the wrong place, thinking that
some additional binding was required.
Hello,
Having successfully connected to Oracle, am now trying to run
a simple query with parameters (without parameters it works fine).
The query is a simple select with one parameter, as shown below.
When running it, get the following error:
'Prepare called with wrong ExpectedResults'
which apparently means that mORMot does not
recognize it as a legitimate select statement.
Any help would be greatly appreciated.
Regards,
Sami
The code is as follows (the sql statement sql_ is actually read from an ini file):
=============
Var
sql_ : RawUTF8;
Props : TSQLDBOracleConnectionProperties;
BEGIN
sql_ := 'select * from tbl.studentId where accepted_date > To_date(?, ''DD-MM-YYYY'')';
Props := TSQLDBOracleConnectionProperties.Create(MyAlias,'',
MyUser,Myencrp_str);
Props.ExecuteNoResult(sql_ ,['01.09.2009']); // Error here
With Props.Execute(sql_ ,[],@row) Do
While Step Do
Begin (* Step thru results *)
// .....
End; (* while *)
End; (* with *)
END;
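Since 'Prepare called with wrong ExpectedResults' is raised when a row-returning statement goes through a no-result execution path, here is a hedged sketch of the same query run only through Execute, with the date bound via the Params array:

```pascal
var
  sql_: RawUTF8;
  Props: TSQLDBOracleConnectionProperties;
begin
  sql_ := 'select * from tbl.studentId where accepted_date > To_date(?, ''DD-MM-YYYY'')';
  Props := TSQLDBOracleConnectionProperties.Create(MyAlias, '', MyUser, Myencrp_str);
  try
    // a SELECT returns rows, so use Execute, not ExecuteNoResult
    with Props.Execute(sql_, ['01.09.2009']) do
      while Step do
      begin
        // read columns here, e.g. via ColumnUTF8()
      end;
  finally
    Props.Free;
  end;
end;
```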
Thanks a lot! Felt that something was missing, but could not figure out what.
The client can now run successfully without having
to run as 'admin' (am using localhost on a single PC).
However, when it doesn't run as admin, it seems to be slower...
When running the server under the debugger, at the following statement
aServer := TSQLHttpServer.Create('888',[DB]);
got an error saying that administrator rights are
needed for 'root'. The error does *not* appear
when the server is started normally, outside the IDE.
Tried running the application as the administrator, and
it seems to work now.
Am in the habit of avoiding working under the administrator
account (for protection against malware).
Would not have guessed this to be the issue, so thank you so much
for your kind assistance!
Thanks for this info; have now compiled the latest unstable version with no problem.
Can now finally add data to the server tables (not using batch yet);
however, when stepping over the statement:
GG_globalClient.TransactionBegin(TSQLMyTable2)
still get the error 'Exception class EOSError' with message 'System Error. Code 12152',
which apparently means the following:
HTTP Status 12152 The server has been taken down momentarily for database or server maintenance, or there has been a network error. This status will generally come up when attempting to upload.
Wonder if you have any ideas on this?
Thanks for the tip; now realize I had moved
VirtualTableExternalRegisterAll(aModel,aProps)
too high in the code: it was supposed to come just before
TSQLRestServerDB.Create().
Will instead try:
Try
aModel := get_model_HEVDB; // tbls defined in unit HEVDB
VirtualTableExternalRegisterAll(aModel,aProps);
Had downloaded the latest 'unstable' 1.18 version,
but there was a D2009 issue at compilation
(will try to reproduce it and post it),
so went back a little in history...
Thanks for your pointers; in fact, it turns out that
only one of the tables works with a regular Add
(both have a similar structure, though),
and neither of them will work with
TransactionBegin:
GG_globalClient.TransactionBegin(TSQLTable1);
causes a mysterious 'Exception class EOSError'
with message 'System Error. Code 12152',
apparently generated from InternalGetInfo32 in
TWinHttpAPI.Request in the unit SynCrtSock.
The server code is very simple:
Try
aProps := TSQLDBSQLite3ConnectionProperties.Create(ChangeFileExt(paramstr(0),'.db'),'','','');
VirtualTableExternalRegisterAll(aModel,aProps); // bug (see follow-up): aModel is not yet assigned here
Try
aModel := get_model_HEVDB; // tbls defined in unit HEVDB
DB := TSQLRestServerDB.Create(aModel,'Hev.DB3',
CHECK_USERS);
DB.CreateMissingTables;
Try
aServer := TSQLHttpServer.Create('888',[DB]);
Write('Press [Enter] to shut down server.');
finally
aServer.Free;
end;
finally
aModel.Free;
end;
finally
aProps.Free;
Stepping through the code in the server would require attaching some event handler to
the server, I guess?
Hello,
For some reason, am not getting the addition of records via batch
to a client-server setup (HTTP, no Ajax or Web) to work.
Immediate updates to another table do work, though.
The following call returns 0, so it seems to work:
indx := GG_globalClient.BatchAdd(d_TableInstance, true);
The following call returns 200, so it would also seem to work,
but on closer inspection
AllResults[0] and AllResults[1] are 0,
so there seems to be a problem.
However, cannot figure out how to find the
cause of this error?
Var
AllResults: TIntegerDynArray;
BEGIN
ix := GG_globalClient.BatchSend(AllResults);
GG_globalClient.Commit;
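One hedged way to narrow this down: after BatchSend, each AllResults entry carries the outcome of one queued operation (a new row ID for a successful Add, or 0 on failure, in the revisions I am aware of), so the zeros identify exactly which rows the server refused:

```pascal
var
  AllResults: TIntegerDynArray;
  ix, i: integer;
begin
  ix := GG_globalClient.BatchSend(AllResults);
  if ix <> 200 then
    writeln('BatchSend failed as a whole, status = ', ix)
  else
    for i := 0 to high(AllResults) do
      if AllResults[i] = 0 then
        writeln('batch operation #', i, ' was rejected by the server');
  GG_globalClient.Commit;
end;
```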
Made sure the server is working properly.
The client starts as follows:
GG_globalClient := TSQLHttpClient.Create('someIP',
'888', GG_Model);
Would be very grateful for your pointers over here.
Sami
Yes, am actually using TSQLRestClientURINamedPipe to
connect on the client side, although the server and the client are still
on the same PC.
The client can now successfully retrieve data from a table defined on
the server. My mistake, sorry: no additional definitions are
needed on the client side; thanks for pointing this out.
I assume that when the server is physically located on another PC,
this method should still operate rather fast?
Thanks a lot,
Sami
Hello,
Am trying to develop a simple client-server DB application to which a client can connect from
a different PC using service names (no Web browsing, so no HTTP).
Was thinking that if one puts the code to generate the tables on the server side,
then one still needs that table-definition code on the client side
in order to write the routines that input data into the tables.
So is one supposed to write a shared unit for the DB table
definitions that gets shared by both the server and the client side,
or am I missing something?
Am grateful for any pointers,
Regards
Sami
Below is a sample of the server code.
Var
aModel : TSQLModel;
aProps : TSQLDBSQLite3ConnectionProperties;
aServer : TSQLRestServerDB;
BEGIN
aProps := TSQLDBSQLite3ConnectionProperties.Create(ChangeFileExt(paramstr(0),'.db'),'','','');
Try
aModel := TSQLModel.Create(
[TSQLAuthGroup,TSQLAuthUser,
TSQLBabies],ROOT_NAME);
VirtualTableExternalRegisterAll(aModel,aProps);
aServer := TSQLRestServerDB.Create(aModel,'Babies.DB3',
CHECK_USERS);
Try
With aServer Do
Try
CreateMissingTables;
........
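To the shared-definitions question above: the usual mORMot pattern is indeed one unit holding the TSQLRecord classes and a model factory, compiled into both the client and the server projects; a minimal sketch, modeled on the framework's SampleData sample (the property names here are made up):

```pascal
unit SampleData; // used by BOTH the client and the server project

interface

uses
  SynCommons, mORMot;

type
  // the ORM table definition lives in one place only
  TSQLBabies = class(TSQLRecord)
  private
    fName: RawUTF8;
    fBirthDate: TDateTime;
  published
    property Name: RawUTF8 read fName write fName;
    property BirthDate: TDateTime read fBirthDate write fBirthDate;
  end;

// both sides call this, so client and server models always match
function CreateBabiesModel: TSQLModel;

implementation

function CreateBabiesModel: TSQLModel;
begin
  result := TSQLModel.Create([TSQLAuthGroup, TSQLAuthUser, TSQLBabies], 'root');
end;

end.
```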
Thanks so much for pointing this out;
somehow got confused about the order of the parameters :-).
Thanks for your help,
and sorry, my mistake: am using a newer version, from March 2013,
but left out one parameter in the example,
so am actually using this:
TSQLDBOracleConnectionProperties.Create
('','tnsnames.ora','USERId', 'mypassword');
Will try the other way you suggested;
hopefully will find all the required parameters in the tnsnames.ora
file.
Many thanks,
Sami
Hello,
Am trying to change an old application that used a commercial
component to connect to Oracle so that it just uses the great mORMot framework.
Am calling the following
TSQLDBOracleConnectionProperties.Create('database','user','password');
and still can't connect.
The old component used the SID (from the tnsnames.ora file) as a connection parameter.
If I understood correctly, what is needed now as the first parameter
is the *actual* name of the tnsnames.ora file?
(Have the Instant Client and tnsnames.ora in the same directory along
with my application.)
Much obliged for your help,
Sami
Hello, was trying to delete hundreds of rows from
a table based on a date value. Tried first
retrieving the rows matching the date and then
deleting them using:
BatchDelete
BatchSend
followed by Commit.
This returned success (200),
but for some reason the rows were not removed.
Anyhow, am now trying the following simpler way:
Var
sql8:putf8Char;
where_str:AnsiString;
where_str := 'date(datetaken) > ''2010-10-15''';
sql8 := 'DELETE FROM MYTABLE WHERE ' + where_str;
globalClient.EngineExecuteFmt(sql8,[]);
This code will not compile because of a type error.
Could not find any easy way of converting a string to
PUTF8Char. Could you please help me out here?
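A hedged sketch of one way around the type error: declaring the statement as RawUTF8 (which assigns freely from string literals) avoids PUTF8Char entirely, with StringToUTF8 converting the AnsiString part and FormatUTF8 assembling the statement (whether Execute/ExecuteFmt is exposed depends on the framework revision in use):

```pascal
var
  sql8: RawUTF8;         // RawUTF8 instead of PUTF8Char
  where_str: AnsiString;
begin
  where_str := 'date(datetaken) > ''2010-10-15''';
  // % inserts the argument as-is into the UTF-8 statement
  sql8 := FormatUTF8('DELETE FROM MYTABLE WHERE %', [StringToUTF8(where_str)]);
  globalClient.Execute(sql8);
end;
```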
Thanks a lot for any pointers.
Regards,
Sami
OK, thanks a lot for your kind help.
Can figure out a manual workaround then..
Thanks for pointing it out; it was my mistake with the call
CreateSQLMultiIndex(....,TRUE)
It was indeed returning False.
So now the indexes are created, and am getting the
exception 'constraint failed'.
However, am using BatchAdd(..) to add the records,
so I imagine the key-violation error/exception
doesn't get triggered until BatchSend(..).
Is there a way to trap this key violation
easily and get hold of the control, so as to
perform a record update on key violation?
Am really grateful for any suggestions.
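One hedged approach to trapping the violation: since the constraint is only checked when the batch is flushed, inspect the per-row results from BatchSend and retry each failed row as an update of the existing record. A sketch; TSQLmyTable, the key field names, and the rec variable are placeholders for your actual table:

```pascal
var
  Results: TIntegerDynArray;
  i: integer;
  existing: TSQLmyTable;
begin
  // ... BatchStart + BatchAdd calls as before ...
  if globalClient.BatchSend(Results) = 200 then
    for i := 0 to high(Results) do
      if Results[i] = 0 then // this Add hit the unique index ('constraint failed')
      begin
        // hypothetical lookup by the unique key fields, then update in place
        existing := TSQLmyTable.Create(globalClient,
          'KeyField1=? AND KeyField2=?', [rec.KeyField1, rec.KeyField2]);
        try
          if existing.ID <> 0 then
          begin
            // copy the new values onto the existing row here, then:
            globalClient.Update(existing);
          end;
        finally
          existing.Free;
        end;
      end;
end;
```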
Thanks for your tip.
I already tried the following to create a multi-field index (6 fields).
It seems to have no effect,
as the application happily keeps inserting duplicates
into the same table without giving any error..
TSQLRestClientDB(globalClient).Server.CreateMissingTables(0);
TSQLRestClientDB(globalClient).Server.CreateSQLMultiIndex(TSQLmyTable,
G_CRSTBL_INDEXES_Ar,True);
G_CRSTBL_INDEXES_Ar contains the
6 field names, starting from index 0.
I'd like to get control in the code when there
is a key violation, to update
the existing record if necessary.
Know that there is a method such as
TSynTableFieldProperties.Validate()
but couldn't find an example of how to use it.
Could you give me one more hint, please?
Thanks!