The JSONToClientDataSet function sets FieldType to sftDateTime for the field Order_ID (it should be sftUTF8Text).
JSON:

[{"id":69736,"Order_ID":"1435051262-45869-63626","First_Name":"Alex","Last_Name":"Markov","Email":"xxx@xxx.com","program_id":13,"Subscribe_Date":"2015-06-23T09:21:02","subscribed":0}]

The problem is in procedure TSQLTable.InitFieldTypes: the function Iso8601ToTimeLogPUTF8Char returns a non-zero value for the input "1435051262-45869-63626":
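The misdetection is easy to reproduce outside Pascal. As a sketch in JavaScript (illustrative only, not mORMot's actual parser), a check anchored on the full ISO-8601 layout tells the two strings apart:

```javascript
// Illustrative strict check: a value is treated as ISO-8601 only if it
// matches YYYY-MM-DD, optionally followed by Thh:mm:ss (a hypothetical rule,
// stricter than the lenient prefix scan that misfires here).
function looksLikeIso8601(s) {
  return /^\d{4}-\d{2}-\d{2}(T\d{2}:\d{2}:\d{2})?$/.test(s);
}

console.log(looksLikeIso8601('2015-06-23T09:21:02'));    // true
console.log(looksLikeIso8601('1435051262-45869-63626')); // false
```

A fix along these lines would keep Order_ID as sftUTF8Text while still detecting genuine Subscribe_Date values.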
for i := 1 to fRowCount do
if U^=nil then // search for a non void column
inc(U,FieldCount) else begin
if Iso8601ToTimeLogPUTF8Char(U^,0)<>0 then // U = "1435051262-45869-63626"
FieldType := sftDateTime; // this was a ISO-8601 date/time value
break;
end;

function DataSetToJSON(Data: TDataSet): RawUTF8;

When Data contains a calculated field, the exception "Argument out of range" is raised.
For example, DevExpress TdxMemData component contains the hidden calculated field RecID. And this function cannot return JSON for TdxMemData.
The fix is to change Data.FieldDefs[f].Name to Data.Fields[f].FieldName:
// get col names and types
SetLength(W.ColNames,Data.FieldCount);
for f := 0 to high(W.ColNames) do
StringToUTF8(Data.Fields[f].FieldName,W.ColNames[f]); // instead of Data.FieldDefs[f].Name

Approximately one month ago, handling of the TSQLRawBlob property in the ObjectToJson and JsonToObject functions stopped working correctly.
Below is a sample code that demonstrates the problem:
Type
TTestStorage = class(TSQLRecord)
private
FName : RawUTF8;
FPicture : TSQLRawBlob;
published
property Name : RawUTF8 read FName write FName;
property Picture : TSQLRawBlob read FPicture write FPicture;
end;
procedure Test;
var
FS : TTestStorage;
begin
FS := TTestStorage.Create;
try
FS.Name := 'test';
Fs.Picture := StringFromFile('d:\source.png');
ObjectToJSONFile(FS, BlockFileName, [woHumanReadable, woSQLRawBlobAsBase64]);
finally
Fs.Free;
end;
FS := TTestStorage.Create;
try
JSONFileToObject(BlockFileName, Fs);
FileFromString(Fs.Picture, 'd:\saved.png', true);
finally
Fs.Free;
end;
end;

ObjectToJSONFile generates the following JSON file:
{
"ID": 0,
"Name": "test",
"Picture": ""iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAJzSURBVDhPhZP9S1NRGMdPmU5dtbYh/tof1U+ZEZVBUaYrY8lKM52WJr6Uy0iCHBRoZYmRKWxObRWufo0iSKvNue3evbq7u5dvzzm7rA2CDny4Gzuf5/me89yxf6yq8ekN9z3nJ9x1fiZ8oO8Ym/ZhlPN4AyPE0MN3HtpbU1QqV/UYCYUCiEKJPCdfJEfcmfoI2ltXVBjbQ1QR+wj93NIXKIoCOZoSROQiUjSNWFJFJpvH7UkvL1BPiFVld3jdA5Pv0X//A+GF3bGOPocXvZyJddwkHjzxYUdOQlHzsE+sobGxUa/5rLrPsfbf2F0jHvhDUaQzOXSPetDQ0LBf81lNzzgvUEByNytI7OYE8WQOsVRWxLYNufBzW0ZKycE27ILZbD6g+Ux3fXhFdEykVMSpAJeiGjKdm8e23lrGj0BENOgcXK4sYB10iQIlMUliQhVEEhkRu733Lb7/CiNOv18ZWILJZDqo+ay2o39JnJULEnWU6BmOqQjFMghFMyL2he7X+La5Q8VVXLIvwmg0GjSf1bX1LIqLisQzAi7tCBQEiQTFPndtHl83g6JJ2403MBgMhzSf1Z/vWkA2V/gryQq2iaCkIEDwo53pfIUtf4iOpIo05Qn0Z23zVCAvpG0pLeBigJ7+cFp0PWWdw9jUCiVZQEvH09Xy90B/+upLqDQqLgQiJEZIJH6HdwVSXMUJy3PodLrDPLom87dXLP3Jyy/ErPkI+S3HaJwyTUKiCUh0J2Eq0Nw+y1/fUuzyVdvU6nQftzxDs2VWbGy6OItjbTNFWmdwlDjS8miV9pZil6+9BP9n8bnyDhx+wxyjBv9cEZsxxv4AETtm2YLat/UAAAAASUVORK5CYII="
}

But after executing JSONFileToObject, the Picture property is empty.
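For comparison, the round-trip that woSQLRawBlobAsBase64 is expected to perform can be sketched in Node.js (helper names here are illustrative, not mORMot's API; mORMot does this in Pascal internally): encoding to base64 on write and decoding on read must be lossless:

```javascript
// Sketch of the intended blob round-trip: binary -> base64 string inside
// JSON -> binary again. Helper names are illustrative, not mORMot's API.
function blobToJson(buf) {
  return JSON.stringify({ Picture: buf.toString('base64') });
}

function jsonToBlob(json) {
  return Buffer.from(JSON.parse(json).Picture, 'base64');
}

const source = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // PNG signature bytes
const restored = jsonToBlob(blobToJson(source));
console.log(source.equals(restored)); // true
```

Note also that the generated file above contains a doubled quote before the base64 payload ("Picture": ""iVBOR...), which would make the JSON invalid and may explain why the Picture comes back empty.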
I call ExecuteList from the client side to determine the contacts' birthdays:

SQL := 'Select * from Emails Where strftime(''%m-%d'',Birthday) = strftime(''%m-%d'',''now'')';
Client.ExecuteList([TEmails], SQL);

After the first call of ExecuteList I change the computer date to get a list of other contacts.
Even if I call Client.ServerCacheFlush(TEmails), the returned data is still the old (cached) data.
If I call Server.FlushInternalDBCache from the server side, the returned data is new.
How to clear DB cache from the client side?
property name: RawUTF8 index 32 read fname write fname;

"name" is a reserved SQL word, so the generated SQL should be ALTER TABLE dbo.res_shop ADD [name] for MS SQL, or ALTER TABLE dbo.res_shop ADD `name` for MySQL.
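For illustration, dialect-aware identifier quoting could look like this (a JavaScript sketch with a hypothetical helper, not mORMot's actual quoting routine):

```javascript
// Illustrative helper: quote an SQL identifier per dialect so that reserved
// words like "name" stay valid. Not mORMot's actual function.
function quoteIdent(ident, dialect) {
  switch (dialect) {
    case 'mssql': return '[' + ident.replace(/]/g, ']]') + ']';
    case 'mysql': return '`' + ident.replace(/`/g, '``') + '`';
    default:      return '"' + ident.replace(/"/g, '""') + '"'; // ANSI SQL
  }
}

console.log('ALTER TABLE dbo.res_shop ADD ' + quoteIdent('name', 'mssql'));
console.log('ALTER TABLE dbo.res_shop ADD ' + quoteIdent('name', 'mysql'));
```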
By default mORMot is compiled with the WITHLOG conditional define, and all server method calls are written to the in-memory log. Is there a way to periodically clear the memory log, to prevent excessive memory usage when a service application runs for a long period of time?
This error still occurs:
procedure TSQLRestServer.SessionDelete(aSessionIndex: integer;
Ctxt: TSQLRestServerURIContext);
begin
if (self<>nil) and (cardinal(aSessionIndex)<cardinal(fSessions.Count)) then
with TAuthSession(fSessions.List[aSessionIndex]) do begin
if Services is TServiceContainerServer then
TServiceContainerServer(Services).OnCloseSession(IDCardinal);
{$ifdef WITHLOG}
fLogFamily.SynLog.Log(sllUserAuth,'Deleted session %/% from %/%',
[User.LogonName,IDCardinal,RemoteIP,Ctxt.Call^.LowLevelConnectionID],self); //<------ Ctxt = nil -> SessionDelete(i,nil); in function TSQLRestServer.SessionAccess(Ctxt: TSQLRestServerURIContext): TAuthSession;
{$endif}
if Assigned(OnSessionClosed) then
OnSessionClosed(self,fSessions.List[aSessionIndex],Ctxt);
fSessions.Delete(aSessionIndex);
fStats.ClientDisconnect;
end;
end;

I think that we should use the Unicode versions of the GetUserName and GetComputerName functions for Delphi 2009+ in procedure SetExecutableVersion:
procedure SetExecutableVersion(aMajor,aMinor,aRelease: integer);
var setVersion,i: integer;
{$ifdef MSWINDOWS}
TmpSize: cardinal;
{$ifdef unicode}
Tmp : array[byte] of Char;
{$else}
Tmp: array[byte] of AnsiChar;
{$endif}
{$endif}
...and use GetUserNameW/GetComputerNameW for Unicode, and GetUserNameA/GetComputerNameA for non-Unicode versions.
Thanks, ab
I think that we need the same modification in
procedure TPropInfo.CopyValue(Source, Dest: TObject; DestInfo: PPropInfo);
..
obj: S := GetObjProp(Source);
if (DestInfo=@self) or
((kD=tkClass) and (DestInfo^.PropType^.ClassSQLFieldType=ft)) then begin
D := DestInfo.GetObjProp(Dest);
{$ifndef LVCL}
if S.InheritsFrom(TCollection) then
CopyCollection(TCollection(S),TCollection(D)) else
{$endif} begin
D.Free; // release previous D instance then set a new copy of S
// Insert :
if S=nil then
D := nil else begin
D := ClassInstanceCreate(S.ClassType);
CopyObject(S,D);
end;
DestInfo.SetOrdProp(Dest,PtrInt(D));
// instead of:
// DestInfo.SetOrdProp(Dest,PtrInt(DestInfo^.CopyToNewObject(S))); // - this line
..

Using a raw TPersistent in TSQLRecord does not make sense IMHO: the ORM won't be able to unserialize its content.
Even if the previous version of CreateCopy did work for the in-memory copy, I'm quite sure that you won't be able to read the TAccount.Properties value from the database via the ORM.
In my application TAccount is stored in the database and it has the property AccountType
property AccountType : TAccountType read FAccountType write SetAccountType;

I suppose that when the ORM deserializes the content of TAccount, the "AccountType" property is set before "Properties", and this calls the TAccount.SetAccountType method, in which the appropriate class is created:
case FAccountType of
act_smpt : FProperties := TISPProperties.Create;
act_amazon_ses : FProperties := TAmazonSESProperties.Create;
end;

Therefore deserialization works correctly.
This even works for the ObjectToJson -> JSONToObject round-trip.
For example ObjectToJson(Account):
{"ID":0,"AccountName":"Test Account","AccountType":1,"Properties":{"SMTPServer":"localhost","SMTPPort":25,"MyBooleanValue":false,"MyStringValue":""}}

After converting the JSON back to an account:

JSONToObject(TargetAccount, Pointer(AccountJson), Valid);

I can read all properties:
if TargetAccount.AccountType = act_smpt then
begin
server := (TargetAccount.Properties as TISPProperties).SMTPServer;
port := (TargetAccount.Properties as TISPProperties).SMTPPort;
...
StartSendingViaSMTP;
end
else
if TargetAccount.AccountType = act_amazon_ses then
begin
AccessKey := (TargetAccount.Properties as TAmazonSESProperties).AccessKey;
SecretKey := (TargetAccount.Properties as TAmazonSESProperties).SecretKey;
..
StartSendingViaAmazonSES
end;

Probably I will use ObjectToJson -> JSONToObject instead of TargetAccount := Account.CreateCopy as TAccount; to avoid problems in the future.
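The pattern this relies on, serializing a type discriminator before the nested object so that the setter can instantiate the matching concrete class during deserialization, can be sketched in JavaScript (all names and enum values are illustrative, mirroring the TAccount sample in this thread):

```javascript
// Sketch of the discriminator pattern: AccountType is serialized before
// Properties, so restoring it first lets us create the right concrete shape.
const factories = {
  1: () => ({ SMTPServer: '', SMTPPort: 0 }),   // act_smpt (hypothetical value)
  2: () => ({ AccessKey: '', SecretKey: '' })   // act_amazon_ses (hypothetical)
};

function deserializeAccount(json) {
  const raw = JSON.parse(json);
  const account = { AccountType: raw.AccountType };
  // create the concrete shape first, then fill its fields from the JSON
  account.Properties = factories[raw.AccountType]();
  Object.assign(account.Properties, raw.Properties);
  return account;
}

const json = '{"AccountType":1,"Properties":{"SMTPServer":"localhost","SMTPPort":25}}';
console.log(deserializeAccount(json).Properties.SMTPServer); // localhost
```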
I think that the old code
try
D := ClassInstanceCreate(S.ClassType); // create new child instance
CopyObject(S,D); // copy child content
except
FreeAndNil(D); // avoid memory leak if error during new instance copy
end;
SetInstance(Dest,D);

is a better approach than the new code:
SetInstance(Dest,TSQLPropInfoRTTIObject(DestInfo).PropInfo^.CopyToNewObject(S));

It seems that in the new version the function TSQLRecord.CreateCopy does not copy TPersistent properties. In the previous version it worked fine.
A demo project that demonstrates the problem:
unit Unit13;
interface
uses
Winapi.Windows, Winapi.Messages, System.SysUtils, System.Variants, System.Classes, Vcl.Graphics,
Vcl.Controls, Vcl.Forms, Vcl.Dialogs, mORMot, Vcl.StdCtrls, SynCommons;
Type
TAccountType = (act_notdefined, act_smpt, act_amazon_ses);
Type
TCommonProperties = class(TPersistent)
private
FMyBooleanValue : Boolean;
FMyStringValue : String;
published
property MyBooleanValue : Boolean read FMyBooleanValue write FMyBooleanValue;
property MyStringValue : string read FMyStringValue write FMyStringValue;
end;
Type
TISPProperties = class(TCommonProperties)
private
FSMTPServer : String;
FSMTPPort : integer;
published
property SMTPServer : String read FSMTPServer write FSMTPServer;
property SMTPPort : integer read FSMTPPort write FSMTPPort;
end;
Type
TAmazonSESProperties = class(TCommonProperties)
private
FEndPoint : String;
FAccessKey : String;
FSecretKey : String;
published
property EndPoint : String read FEndPoint write FEndPoint;
property AccessKey : String read FAccessKey write FAccessKey;
property SecretKey : String read FSecretKey write FSecretKey;
end;
Type
TAccount = class(TSQLRecord)
private
FAccountName : String;
FAccountType : TAccountType;
FProperties : TPersistent;
function GetProperties: TPersistent;
procedure SetAccountType(const Value: TAccountType);
public
constructor Create; override;
destructor Destroy; override;
published
property AccountName : String read FAccountName write FAccountName;
property AccountType : TAccountType read FAccountType write SetAccountType;
property Properties : TPersistent read GetProperties write FProperties;
end;
type
TForm13 = class(TForm)
Button1: TButton;
procedure Button1Click(Sender: TObject);
private
{ Private declarations }
public
{ Public declarations }
end;
var
Form13: TForm13;
implementation
{$R *.dfm}
{ TAccount }
constructor TAccount.Create;
begin
inherited;
FAccountType := act_notdefined;
end;
destructor TAccount.Destroy;
begin
if Assigned(FProperties) then
FProperties.Free;
inherited;
end;
function TAccount.GetProperties: TPersistent;
begin
case FAccountType of
act_smpt : Result := TISPProperties(FProperties);
act_amazon_ses : Result := TAmazonSESProperties(FProperties);
end;
end;
procedure TAccount.SetAccountType(const Value: TAccountType);
begin
if Value <> FAccountType then
begin
FAccountType := Value;
if Assigned(FProperties) then
FProperties.Free;
case FAccountType of
act_smpt : FProperties := TISPProperties.Create;
act_amazon_ses : FProperties := TAmazonSESProperties.Create;
end;
end;
end;
procedure TForm13.Button1Click(Sender: TObject);
var
SourceAccount : TAccount;
TargetAccount : TAccount;
Prop : TPersistent;
FSourceJson : RawUTF8;
FTargetJson : RawUTF8;
begin
SourceAccount := TAccount.Create;
SourceAccount.AccountType := act_smpt;
SourceAccount.AccountName := 'Test Account';
(SourceAccount.Properties as TISPProperties).SMTPServer := 'localhost';
(SourceAccount.Properties as TISPProperties).SMTPPort := 25;
FSourceJson := ObjectToJSON(SourceAccount);
TargetAccount := SourceAccount.CreateCopy as TAccount;
TargetAccount.AccountType := SourceAccount.AccountType;
FTargetJson := ObjectToJSON(TargetAccount);
ShowMessage(FSourceJson + #13#10+ '=============='#13#10 + FTargetJson );
end;
end.

Would it be better to use ADSI (Active Directory Service Interfaces) or LDAP?
Windows 8.1 - client, Server 2012 as server
The LogonUser function is called with LogonType = 9 (LOGON32_LOGON_NEW_CREDENTIALS). LOGON32_LOGON_NEW_CREDENTIALS causes LogonUser to always return true, even when wrong credentials are provided: with this logon type Windows does not validate the credentials immediately, but only when they are first used on a network resource.
There is the same problem with TSQLRestServerAuthenticationActiveDirectory.Auth
function TSQLRestServerAuthenticationActiveDirectory.Auth(Ctxt: TSQLRestServerURIContext): boolean;
..
result := LogonUser(pointer(UTF8ToString(Login)),pointer(UTF8ToString(Domain)),
pointer(UTF8ToString(Password)),9 ,LOGON32_PROVIDER_WINNT50,hToken);When I try with a different logon type (LOGON32_LOGON_NETWORK, LOGON32_LOGON_INTERACTIVE, LOGON32_LOGON_SERVICE) the LogonUser to always returns false.
Does anybody have success to implement TSQLRestServerAuthenticationActiveDirectory authentication?
The test.txt file content (it contains the Unicode symbol »):
Read more »
Next Line

Loading the file content into the Value property and generating a JSON file:
Type
TMyRecord = class(TSQLRecord)
private
FValue : TSQLRawBlob;
published
property Value: TSQLRawBlob read FValue write FValue;
end;
...
FRecord := TMyRecord.Create;
try
FRecord.Value := StringFromFile('d:\test.txt');
ObjectToJSONFile(FRecord, 'd:\output.txt');
finally
FRecord.free;
end;

Output file (the Value property does not contain the text after »):

{"ID":0,"Value":"Read more "}

I think the problem is in procedure TTextWriter.AddAnyAnsiString(..).
Maybe we should check codepage:
case CodePage of
CP_SQLRAWBLOB:
Add(pointer(s),0,Escape); // direct write of TSQLRawBlob content
....

I also have a problem with InternalBatchStop.
Sometimes the SQL generated in the line
SQL := SQL+','+CSVOfValue('('+CSVOfValue('?',fieldCount)+')',rowCount-1);

'insert or ignore into Emails (RowID,Email,First_Name,Last_Name,Recipient_Name,Subscribed,Subscribe_Date,GroupID,Fields) values (?,?,?,?,?,?,?,?,?),'

has an extra, unnecessary comma at the end, which leads to an exception at Statement^.Prepare(DB.DB,SQL).
As a workaround I use the following code:
FCSVValues := CSVOfValue('('+CSVOfValue('?',fieldCount)+')',rowCount-1);
if FCSVValues <> '' then
SQL := SQL +',' + FCSVValues;

I found a similar problem in this topic: http://synopse.info/forum/viewtopic.php?id=2228
I think it should be fixed.
Thanks.
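The off-by-one is easy to visualize if CSVOfValue is mimicked, e.g. in JavaScript (an illustrative stand-in, assuming CSVOfValue(pattern, n) simply repeats the pattern n times, comma-separated): with a single row the repeated part is empty, yet appending ',' unconditionally still produces the trailing comma:

```javascript
// Minimal stand-in for CSVOfValue('(?,..)', n): repeat a value n times,
// comma-joined. Names are illustrative, not mORMot's actual code.
function csvOfValue(value, count) {
  return Array(count).fill(value).join(',');
}

function buildInsert(fieldCount, rowCount) {
  const row = '(' + csvOfValue('?', fieldCount) + ')';
  let sql = 'insert into Emails values ' + row;
  const tail = csvOfValue(row, rowCount - 1);
  if (tail !== '') sql += ',' + tail; // the guard avoids the stray comma
  return sql;
}

console.log(buildInsert(2, 1)); // insert into Emails values (?,?)
console.log(buildInsert(2, 3)); // insert into Emails values (?,?),(?,?),(?,?)
```

Without the emptiness guard, rowCount = 1 would yield "... values (?,?)," and the prepare step would fail, exactly as reported.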
Happy New Year!
What we may do is add some new attribute for TSQLAllowRemoteExecute, and explicitly check the TSQLAccessRights for the current logged user.
I think this will be the best solution.
You should install 32bit ODBC driver for PostgreSQL.
I need to check if the current logged user has sufficient rights to perform BatchSend.
Though the function TSQLRestServer.EngineBatchSend calls RecordCanBeUpdated, and it would be possible to check the user access rights in the overridden RecordCanBeUpdated function,
in that case the check would run for each inserted record, which would considerably slow down the process of adding records.
function TGroupServer.RecordCanBeUpdated(Table: TSQLRecordClass; ID: integer;
Action: TSQLEvent; ErrorMsg: PRawUTF8): boolean;
var
U : TAuthUser;
intU : TAuthUser;
G : TAuthGroup;
f : TServiceRunningContext;
begin
Result := True; // allow by default, otherwise Result is undefined on the early Exit
if (Table <> TEmails) and (Table <> TExclusionList) then exit;
f := CurrentServiceContext;
if Assigned(f.Request) then begin
U := Self.SessionGetUser(f.Request.Session) as TAuthUser;
if Assigned(U) then
try
intU := TAuthUser.CreateAndFillPrepare(self, 'LogonName=?', [U.LogonName]);
try
intU.FillOne;
g := TAuthGroup.CreateAndFillPrepare(self, 'ID=?', [intU.GroupRights.ID]);
try
g.FillOne;
if ((Action = seUpdate) and ((g.Ident = 'Guest') or not (contact_edit in G.AccessInfo))) or
((Action = seAdd) and ((g.Ident = 'Guest') or not (contact_add in G.AccessInfo))) or
((Action = seDelete) and ((g.Ident = 'Guest') or not (contact_delete in G.AccessInfo))) then
begin
ErrorMsg^ := StringToUTF8(sNoRights);
Result := False;
end;
finally
g.Free;
end;
finally
intU.Free;
end;
finally
U.Free;
end;
end;
end;

I slightly modified the function TSQLRestServer.EngineBatchSend so that the user access rights are checked only once, before the batch starts:
function TSQLRestServer.EngineBatchSend(Table: TSQLRecordClass;
const Data: RawUTF8; var Results: TIntegerDynArray): integer;
...
begin
Sent := pointer(Data);
if Sent=nil then
raise EORMBatchException.CreateUTF8('%.EngineBatchSend(%,"")',[self,Table]);
if Table<>nil then begin
// unserialize expected sequence array as '{"Table":["cmd",values,...]}'
while not (Sent^ in ['{',#0]) do inc(Sent);
if Sent^<>'{' then
raise EORMBatchException.CreateUTF8('%.EngineBatchSend: Missing {',[self]);
inc(Sent);
TableName := GetJSONPropName(Sent);
if (TableName='') or (Sent=nil) then
raise EORMBatchException.CreateUTF8('%.EngineBatchSend: Wrong "Table":"%"',
[self,TableName]);
// NEW CODE ===========================================
if not BatchCanPerformed(Table,@ErrMsg) then
raise EORMBatchException.CreateUTF8('%.EngineBatchSend impossible: %',
[self,ErrMsg]);
// END NEW CODE ========================================
end; //
What do you think about this modification?
Hello to all
I have the same troubles with Kerberos authentication.
In our organization we have one Windows 2012 server (server-pc12) and two client computers (alex-pc and boss-pc).
The server and client computers are in the same domain GLOCK.local
The mORMot service application is installed on server-pc12, and the SPN MyServer/server-pc12.GLOCK.local has been registered properly on the server.
In mORMot service application I have 3 users (1. GLOCK/Alex; 2. GLOCK/Admin; 3. GLOCK/Boss)
The Windows server also has these users.
When the client from alex-pc computer logs in to the server MyClient.SetUser('','MyServer/server-pc12.GLOCK.local'), he is logged in as GLOCK/Admin, not as GLOCK/Alex.
From the log:
TSQLRestServerAuthenticationSSPI(01C6FBE0) NTLM Authentication success for GLOCK\admin

What are the correct settings to make clients from different computers log in under their own user names?
To do this you can create a custom AuthUser class that inherits from TSQLAuthUser, for example:
Type
TAuthUser = class(TSQLAuthUser)
protected
FCreated : TDateTime;
FEmail : RawUTF8;
FFirstName : RawUTF8;
FLastName : RawUTF8;
FValidTill : TDateTime;
FGuid : RawUTF8;
published
property Created : TDateTime read FCreated write FCreated;
property Email : RawUTF8 read FEmail write FEmail;
property FirstName : RawUTF8 read FFirstName write FFirstName;
property LastName : RawUTF8 read FLastName write FLastName;
property ValidTill : TDateTime read FValidTill write FValidTill;
property Guid : RawUTF8 read FGuid write FGuid;
end;

And you must add this new class to the model:
GroupsModel := TSQLModel.Create([TAuthUser, TAuthGroup, ...], FormatUtf8('wp/%/group',[ServerLogon.WPID]));

Since user access rights are not verified in the TSQLRestServer.EngineBatchSend function, but the RecordCanBeUpdated event is called, is there a way to get the SessionID in the RecordCanBeUpdated event, in order to identify the user who is executing the current operation?
function TWPServer.OnUpdate(Sender: TSQLRestServer; Event: TSQLEvent;
aTable: TSQLRecordClass; aID: integer): boolean;
var
u : TAuthUser;
begin
if (Event = seDelete) and (aTable = TEmails) then
begin
U := GroupServer.SessionGetUser(SessionID) <- How to get SessionID here?
end
end;

I uploaded the whole JavaScript project to Dropbox:
https://www.dropbox.com/s/ex9pfemvwmp4kba/www.zip?dl=0
but to understand how it works, it is better to download the software (G-Lock EasyMail) from this topic: http://synopse.info/forum/viewtopic.php?id=1954
and visit this URL in the browser: http://127.0.0.1:50888/wps/www/index.html
This code is used in a real application (see http://synopse.info/forum/viewtopic.php?id=1954).
syn-auth.js
angular.module('syn-auth', ['em-utils'], function($provide) {
// the number of login retries on 403 before logout
var MAX_RECONNECT_COUNT = 5;
$provide.factory('SynAuth', function(crc32, sha256) {
function SynAuth(host, port) {
var defaults = {
host: '',
defaultPort: '888',
User : "",
fRoot : "",
fSessionID : 0,
fSessionIDHexa8 : "",
fSessionPrivateKey : 0,
fSessionTickCountOffset : 0,
fLastSessionTickCount : 0,
PasswordHashHexa : "",
fServerTimeStampOffset : 0,
fcallBack : null,
ffailCallBack : null
}; // default connection settings
this.connectionReady = false;
this.readyCallbacks = [];
for ( var p in defaults ) {
this[p] = defaults[p];
}
if ( host ) {
this.setHost(host, port);
}
}
var sp = SynAuth.prototype;
sp.ready = function(cb) {
if ( this.connectionReady ) {
cb(this);
} else {
this.readyCallbacks.push(cb);
}
};
sp.fireReadyCbs = function() {
var _this = this;
angular.forEach(this.readyCallbacks, function(cb){
cb(_this);
});
};
sp.wrap = function(method){
var that = this;
return function(){
method.apply(that, arguments);
};
};
sp.setHost = function(host, port) {
if ( !host.match(/\:\d+$/) ) {
host += ':'+(port || this.defaultPort);
}
this.host = host;
};
sp.LogIn = function (root, username, password, callback, failCallback){
this.fRoot = root;
this.User = username;
this.PasswordHashHexa = sha256.hash("salt"+password);
if (callback) {this.fcallBack = callback;}
if (failCallback) {this.ffailCallback = failCallback;}
$.get(this.host+"/"+root+"/TimeStamp", this.wrap(this.gotTimeStamp));
}
sp.LogInAgain = function(callback){ //after timeout error for silent re-login
this.fSessionID = 0;
this.fSessionIDHexa8 = "";
this.fSessionPrivateKey = 0;
if (callback) {this.fcallBack = callback;} else {this.fcallBack = null;}
$.get(this.host+"/"+this.fRoot+"/TimeStamp", this.wrap(this.gotTimeStamp));
}
sp.gotTimeStamp = function (timestamp) {
var s = '', d = new Date(), clientTime = '';
timestamp = parseInt(timestamp, 10);
s = d.getFullYear().toString(2);
while(s.length < 13) { s = '0'+s;}
clientTime = s;
s = d.getMonth().toString(2);
while(s.length < 4) { s = '0'+s;}
clientTime = clientTime +s;
s = (d.getDate()-1).toString(2);
while(s.length < 5) { s = '0'+s;}
clientTime = clientTime +s;
s = d.getHours().toString(2);
while(s.length < 5) { s = '0'+s;}
clientTime = clientTime +s;
s = d.getMinutes().toString(2);
while(s.length < 6) { s = '0'+s;}
clientTime = clientTime +s;
s = d.getSeconds().toString(2);
while(s.length < 6) { s = '0'+s;}
clientTime = clientTime +s;
this.fServerTimeStampOffset = (timestamp - Math.floor(d.getTime()/10));
$.get(this.host+"/"+this.fRoot+"/auth?UserName="+this.User, this.wrap(this.gotNonce));
}
sp.gotNonce = function (aNonce){
var that = this;
//create client nonce
var aClientNonce = "", s = "", d = new Date();
aClientNonce = d.getFullYear().toString();
s = d.getMonth().toString();
if (s.length === 1) { s = '0'+s;}
aClientNonce = aClientNonce + '-' + s;
s = d.getDate().toString();
if (s.length === 1) { s = '0'+s;}
aClientNonce = aClientNonce + '-' + s + ' ';
s = d.getHours().toString();
if (s.length === 1) { s = '0'+s;}
aClientNonce = aClientNonce + s;
s = d.getMinutes().toString();
if (s.length === 1) { s = '0'+s;}
aClientNonce = aClientNonce + ':' + s;
s = d.getSeconds().toString();
if (s.length === 1) { s = '0'+s;}
aClientNonce = aClientNonce + ':' + s;
aClientNonce = sha256.hash(aClientNonce);
s = this.host+"/"+ this.fRoot+"/auth?UserName="+this.User+"&Password=" +
sha256.hash(this.fRoot+aNonce.result+aClientNonce+this.User+this.PasswordHashHexa )+
"&ClientNonce="+aClientNonce;
$.ajax({
type: "GET",
dataType: "json",
url: s,
success: function(){ that.gotSession.apply(that, arguments); },
error: function(){ that.ffailCallback.apply(that, arguments); }
});
};
sp.gotSession = function (aSessionKey){
var sessArr = aSessionKey.result.split('+');
this.fSessionID = parseInt(sessArr[0], 10);
this.fSessionIDHexa8 = this.fSessionID.toString(16);
while ( this.fSessionIDHexa8.length < 8 ) { this.fSessionIDHexa8 = '0'+this.fSessionIDHexa8; }
this.fSessionPrivateKey = crc32(this.PasswordHashHexa, crc32(aSessionKey.result, 0));
if (this.fcallBack != null) {
this.connectionReady = true;
this.fireReadyCbs();
this.fcallBack();
}
}
sp.SessionSign = function (url) {
var Tix, Nonce, s, ss;
Tix = Date.now(); // # of ms since Epoch
if ( Tix <= this.fLastSessionTickCount ) {
this.fLastSessionTickCount += 1;
} else {
this.fLastSessionTickCount = Tix;
}
Nonce = Tix.toString(16);
while ( Nonce.length < 8 ) { Nonce = '0'+Nonce; }
if ( Nonce.length > 8 ) { Nonce = Nonce.slice(Nonce.length-8) }
ss = crc32(url, crc32(Nonce, this.fSessionPrivateKey)).toString(16);
while ( ss.length < 8 ) { ss = '0'+ss; }
s = url.indexOf("?") == -1 ? url+'?session_signature=' : url+'&session_signature=';
return s + this.fSessionIDHexa8 + Nonce + ss;
}
sp.getURL = function(uri) {
return this.host+'/'+this.SessionSign(uri);
}
sp.Logout = function (callback) {
if (this.fSessionID == 0) {if (callback){callback();}} else {
$.get(this.host+"/"+this.fRoot+"/auth?UserName="+this.User+"&Session="+this.fSessionID, callback);
this.fRoot = '';
this.User = '';
this.fSessionID = 0;
this.fSessionIDHexa8 = "";
this.fSessionPrivateKey = 0;
}
}
return SynAuth;
});
// wps
// wp/%/settings
// wp/%/group
// wp/%/bounce
// infinitypropertycomau
// garethinfinitypropertycomau
// bernieinfinitypropertycomau
var forEach = angular.forEach;
function isArray(value) {
return Object.prototype.toString.apply(value) == '[object Array]';
}
function isObject(value){return value != null && typeof value == 'object';}
function sortedKeys(obj) {
var keys = [];
for (var key in obj) {
if (obj.hasOwnProperty(key)) {
keys.push(key);
}
}
return keys.sort();
}
function forEachSorted(obj, iterator, context) {
var keys = sortedKeys(obj);
for ( var i = 0; i < keys.length; i++) {
iterator.call(context, obj[keys[i]], keys[i]);
}
return keys;
}
function encodeUriQuery(val, pctEncodeSpaces) {
return encodeURIComponent(val).
replace(/%40/gi, '@').
replace(/%3A/gi, ':').
replace(/%24/g, '$').
replace(/%2C/gi, ',').
replace(/%20/g, (pctEncodeSpaces ? '%20' : '+'));
}
function buildQueryString(params) {
var parts = [];
forEachSorted(params, function(value, key) {
if (value == null || value == undefined) return;
if (!isArray(value)) value = [value];
forEach(value, function(v) {
if (isObject(v)) {
v = toJson(v);
}
parts.push(encodeUriQuery(key) + '=' +
encodeUriQuery(v));
});
});
return parts.join('&');
}
function buildUrl(url, params) {
if (!params) return url;
var qs = buildQueryString(params);
return url + ((url.indexOf('?') == -1) ? '?' : '&') + qs;
}
$provide.factory('synConn', function(SynAuth, emAuth, $http, $q, emsLoadingBar) {
var synConn = {
opts: {
host: emAuth.get('host'),
port: emAuth.get('port'),
email: emAuth.get('email'),
password: emAuth.get('password'),
secure: emAuth.get('secure')
}
};
var roots = [
'wps',
'wp/\\d+/settings',
'wp/\\d+/group',
'wp/\\d+/bounce'
];
var connCache = {};
function setOpts(o) {
synConn.opts = o;
if ( !o.host ) { throw new Exception('[SynConn] setOpts. "host" option is not defined.'); }
if ( !o.email ) { throw new Exception('[SynConn] setOpts. "email" option is not defined.'); }
if ( !o.password ) { throw new Exception('[SynConn] setOpts. "password" option is not defined.'); }
o.port = o.port || '888';
}
function supplant (str, o) {
return str.replace(
/\{([^{}]*)\}/g,
function (a, b) {
var r = o[b];
return typeof r === 'string' || typeof r === 'number' ? r : a;
}
);
}
function getBaseURL() {
var pr = synConn.opts.secure ? 'https' : 'http';
return pr+'://'+synConn.opts.host+':'+synConn.opts.port+'/';
}
function createNewSynAuth(root, done) {
var pr = synConn.opts.secure ? 'https' : 'http';
//return { SynAuth: 'dummy'+(new Date()).getTime() };
var sa = new SynAuth(pr+'://'+synConn.opts.host, synConn.opts.port);
sa.LogIn(root, synConn.opts.email, synConn.opts.password, function(){
done(null, sa);
}, function(xhr, status, msg){
// failback
var err = new Error(msg);
err.status = xhr.status;
done(err);
});
return sa;
}
function getConnectionFromURi(uri, done) {
var found = false;
// looking for cached connection
for ( var root in connCache ) {
if ( uri.indexOf(root) === 0 ) {
return connCache[root].ready(function(conn){
done(null, conn);
});
}
}
// creating new connection
for (var r, res, i = 0; i < roots.length; i++) {
if ( res = (new RegExp(roots[i],'i')).exec(uri) ) {
r = res[0]; // matched root uri, like wp/1/settings
found = true;
connCache[r] = createNewSynAuth(r, function(err, sa){
if ( err ) {
delete connCache[r];
return done(err);
}
return done(null, sa);
});
}
}
if ( !found ) {
done(new Error('getConnectionFromURi: Cannot get rootURi for '+uri));
}
}
function removeCachedConnection(sa) {
var found = false;
// looking for cached connection
for ( var root in connCache ) {
if ( connCache[root] === sa) {
delete connCache[root];
break;
}
}
}
function http(o) {
var that = {
sa: null,
err: null,
errorCb: null,
successCb: null
};
if ( !o.headers ) {
o.headers = {};
}
var reconnect = 0;
var wp = emAuth.get('workplace');
if ( wp ) {
synConn.opts.wpId = emAuth.get('workplace').ID;
}
o.uri = supplant(o.uri, synConn.opts);
function connect() {
getConnectionFromURi(o.uri, function(err, sa){
if ( err ) { that.err = err; }
that.sa = sa;
setTimeout(function() {
if ( err ) { return that.errorCb(err); }
runRequest();
}, 1);
});
}
connect();
function runRequest() {
var url;
if ( o.params ) {
url = that.sa.getURL( buildUrl(o.uri, o.params) );
delete o.params;
} else {
url = that.sa.getURL(o.uri);
}
if ( o.data ) {
if ( o.headers['Content-Type'] === 'application/x-www-form-urlencoded' ) {
o.data = buildQueryString(o.data);
}
}
var opts = angular.extend({ url : url }, o);
delete opts.uri;
opts.method = opts.method || 'GET';
$http(opts).success(function(){
that.successCb.apply(this, arguments);
emsLoadingBar.stop();
reconnect = 0;
}).error(function(data){
var err = new Error(data.ErrorText);
err.ErrorCode = data.ErrorCode;
if ( err && err.ErrorCode == 403 ) {
if ( reconnect == MAX_RECONNECT_COUNT ) {
alert('Authentication failure. You will be logged out now. Shall this error repeat, contact your server administrator.');
window.location.hash = '#/logout';
return;
}
removeCachedConnection(that.sa);
reconnect += 1;
connect();
} else {
var args = Array.prototype.slice.call(arguments);
args.unshift(err);
that.errorCb.apply(this, args);
emsLoadingBar.stop();
}
});
emsLoadingBar.start();
}
return {
success: function(cb){
that.successCb = cb;
return this;
},
error: function(cb){
that.errorCb = cb;
return this;
}
};
}
function touch(uri, done) {
getConnectionFromURi(uri, done);
}
synConn.logout = function() {
connCache = {};
};
//getConnectionFromURi('wp/1/settings/GetTemplateMessage?ID=');
synConn.setOpts = setOpts;
synConn.http = http;
synConn.touch = touch;
synConn.getBaseURL = getBaseURL;
return synConn;
});
});

Please pay attention to the ROOT definitions:
var roots = [
'wps',
'wp/\\d+/settings',
'wp/\\d+/group',
'wp/\\d+/bounce'
];

em-utils.js
angular.module('em-utils', [], function($provide) {
var makeCRCTable = function(){
var c;
var crcTable = [];
for(var n =0; n < 256; n++){
c = n;
for(var k =0; k < 8; k++){
c = ((c&1) ? (0xEDB88320 ^ (c >>> 1)) : (c >>> 1));
}
crcTable[n] = c;
}
return crcTable;
}
var crcTable = makeCRCTable();
var crc32 = function(str, init) {
var crc = (typeof init === 'undefined') ? 0 ^ (-1) : init^0xFFFFFFFF;
//var crc = 0 ^ (-1);
for (var i = 0; i < str.length; i++ ) {
crc = (crc >>> 8) ^ crcTable[(crc ^ str.charCodeAt(i)) & 0xFF];
}
return (crc ^ (-1)) >>> 0;
};
/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
/* SHA-256 implementation in JavaScript | (c) Chris Veness 2002-2010 | www.movable-type.co.uk */
/* - see http://csrc.nist.gov/groups/ST/toolkit/secure_hashing.html */
/* http://csrc.nist.gov/groups/ST/toolkit/examples.html */
/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
var Sha256 = {}; // Sha256 namespace
/**
* Generates SHA-256 hash of string
*
* @param {String} msg String to be hashed
* @param {Boolean} [utf8encode=true] Encode msg as UTF-8 before generating hash
* @returns {String} Hash of msg as hex character string
*/
Sha256.hash = function(msg, utf8encode) {
utf8encode = (typeof utf8encode == 'undefined') ? true : utf8encode;
// convert string to UTF-8, as SHA only deals with byte-streams
if (utf8encode) msg = Utf8.encode(msg);
// constants [§4.2.2]
var K = [0x428a2f98, 0x71374491, 0xb5c0fbcf, 0xe9b5dba5, 0x3956c25b, 0x59f111f1, 0x923f82a4, 0xab1c5ed5,
0xd807aa98, 0x12835b01, 0x243185be, 0x550c7dc3, 0x72be5d74, 0x80deb1fe, 0x9bdc06a7, 0xc19bf174,
0xe49b69c1, 0xefbe4786, 0x0fc19dc6, 0x240ca1cc, 0x2de92c6f, 0x4a7484aa, 0x5cb0a9dc, 0x76f988da,
0x983e5152, 0xa831c66d, 0xb00327c8, 0xbf597fc7, 0xc6e00bf3, 0xd5a79147, 0x06ca6351, 0x14292967,
0x27b70a85, 0x2e1b2138, 0x4d2c6dfc, 0x53380d13, 0x650a7354, 0x766a0abb, 0x81c2c92e, 0x92722c85,
0xa2bfe8a1, 0xa81a664b, 0xc24b8b70, 0xc76c51a3, 0xd192e819, 0xd6990624, 0xf40e3585, 0x106aa070,
0x19a4c116, 0x1e376c08, 0x2748774c, 0x34b0bcb5, 0x391c0cb3, 0x4ed8aa4a, 0x5b9cca4f, 0x682e6ff3,
0x748f82ee, 0x78a5636f, 0x84c87814, 0x8cc70208, 0x90befffa, 0xa4506ceb, 0xbef9a3f7, 0xc67178f2];
// initial hash value [§5.3.1]
var H = [0x6a09e667, 0xbb67ae85, 0x3c6ef372, 0xa54ff53a, 0x510e527f, 0x9b05688c, 0x1f83d9ab, 0x5be0cd19];
// PREPROCESSING
msg += String.fromCharCode(0x80); // add trailing '1' bit (+ 0's padding) to string [§5.1.1]
// convert string msg into 512-bit/16-integer blocks arrays of ints [§5.2.1]
var l = msg.length/4 + 2; // length (in 32-bit integers) of msg + '1' + appended length
var N = Math.ceil(l/16); // number of 16-integer-blocks required to hold 'l' ints
var M = new Array(N);
for (var i=0; i<N; i++) {
M[i] = new Array(16);
for (var j=0; j<16; j++) { // encode 4 chars per integer, big-endian encoding
M[i][j] = (msg.charCodeAt(i*64+j*4)<<24) | (msg.charCodeAt(i*64+j*4+1)<<16) |
(msg.charCodeAt(i*64+j*4+2)<<8) | (msg.charCodeAt(i*64+j*4+3));
} // note running off the end of msg is ok 'cos bitwise ops on NaN return 0
}
// add length (in bits) into final pair of 32-bit integers (big-endian) [§5.1.1]
// note: most significant word would be (len-1)*8 >>> 32, but since JS converts
// bitwise-op args to 32 bits, we need to simulate this by arithmetic operators
M[N-1][14] = ((msg.length-1)*8) / Math.pow(2, 32); M[N-1][14] = Math.floor(M[N-1][14]);
M[N-1][15] = ((msg.length-1)*8) & 0xffffffff;
// HASH COMPUTATION [§6.1.2]
var W = new Array(64); var a, b, c, d, e, f, g, h;
for (var i=0; i<N; i++) {
// 1 - prepare message schedule 'W'
for (var t=0; t<16; t++) W[t] = M[i][t];
for (var t=16; t<64; t++) W[t] = (Sha256.sigma1(W[t-2]) + W[t-7] + Sha256.sigma0(W[t-15]) + W[t-16]) & 0xffffffff;
// 2 - initialise working variables a, b, c, d, e, f, g, h with previous hash value
a = H[0]; b = H[1]; c = H[2]; d = H[3]; e = H[4]; f = H[5]; g = H[6]; h = H[7];
// 3 - main loop (note 'addition modulo 2^32')
for (var t=0; t<64; t++) {
var T1 = h + Sha256.Sigma1(e) + Sha256.Ch(e, f, g) + K[t] + W[t];
var T2 = Sha256.Sigma0(a) + Sha256.Maj(a, b, c);
h = g;
g = f;
f = e;
e = (d + T1) & 0xffffffff;
d = c;
c = b;
b = a;
a = (T1 + T2) & 0xffffffff;
}
// 4 - compute the new intermediate hash value (note 'addition modulo 2^32')
H[0] = (H[0]+a) & 0xffffffff;
H[1] = (H[1]+b) & 0xffffffff;
H[2] = (H[2]+c) & 0xffffffff;
H[3] = (H[3]+d) & 0xffffffff;
H[4] = (H[4]+e) & 0xffffffff;
H[5] = (H[5]+f) & 0xffffffff;
H[6] = (H[6]+g) & 0xffffffff;
H[7] = (H[7]+h) & 0xffffffff;
}
return Sha256.toHexStr(H[0]) + Sha256.toHexStr(H[1]) + Sha256.toHexStr(H[2]) + Sha256.toHexStr(H[3]) +
Sha256.toHexStr(H[4]) + Sha256.toHexStr(H[5]) + Sha256.toHexStr(H[6]) + Sha256.toHexStr(H[7]);
}
Sha256.ROTR = function(n, x) { return (x >>> n) | (x << (32-n)); }
Sha256.Sigma0 = function(x) { return Sha256.ROTR(2, x) ^ Sha256.ROTR(13, x) ^ Sha256.ROTR(22, x); }
Sha256.Sigma1 = function(x) { return Sha256.ROTR(6, x) ^ Sha256.ROTR(11, x) ^ Sha256.ROTR(25, x); }
Sha256.sigma0 = function(x) { return Sha256.ROTR(7, x) ^ Sha256.ROTR(18, x) ^ (x>>>3); }
Sha256.sigma1 = function(x) { return Sha256.ROTR(17, x) ^ Sha256.ROTR(19, x) ^ (x>>>10); }
Sha256.Ch = function(x, y, z) { return (x & y) ^ (~x & z); }
Sha256.Maj = function(x, y, z) { return (x & y) ^ (x & z) ^ (y & z); }
//
// hexadecimal representation of a number
// (note toString(16) is implementation-dependant, and
// in IE returns signed numbers when used on full words)
//
Sha256.toHexStr = function(n) {
var s="", v;
for (var i=7; i>=0; i--) { v = (n>>>(i*4)) & 0xf; s += v.toString(16); }
return s;
}
/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
/* Utf8 class: encode / decode between multi-byte Unicode characters and UTF-8 multiple */
/* single-byte character encoding (c) Chris Veness 2002-2010 */
/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
var Utf8 = {}; // Utf8 namespace
/**
* Encode multi-byte Unicode string into utf-8 multiple single-byte characters
* (BMP / basic multilingual plane only)
*
* Chars in range U+0080 - U+07FF are encoded in 2 chars, U+0800 - U+FFFF in 3 chars
*
* @param {String} strUni Unicode string to be encoded as UTF-8
* @returns {String} encoded string
*/
Utf8.encode = function(strUni) {
// use regular expressions & String.replace callback function for better efficiency
// than procedural approaches
var strUtf = strUni.replace(
/[\u0080-\u07ff]/g, // U+0080 - U+07FF => 2 bytes 110yyyyy, 10zzzzzz
function(c) {
var cc = c.charCodeAt(0);
return String.fromCharCode(0xc0 | cc>>6, 0x80 | cc&0x3f); }
);
strUtf = strUtf.replace(
/[\u0800-\uffff]/g, // U+0800 - U+FFFF => 3 bytes 1110xxxx, 10yyyyyy, 10zzzzzz
function(c) {
var cc = c.charCodeAt(0);
return String.fromCharCode(0xe0 | cc>>12, 0x80 | cc>>6&0x3F, 0x80 | cc&0x3f); }
);
return strUtf;
}
/**
* Decode utf-8 encoded string back into multi-byte Unicode characters
*
* @param {String} strUtf UTF-8 string to be decoded back to Unicode
* @returns {String} decoded string
*/
Utf8.decode = function(strUtf) {
// note: decode 3-byte chars first as decoded 2-byte strings could appear to be 3-byte char!
var strUni = strUtf.replace(
/[\u00e0-\u00ef][\u0080-\u00bf][\u0080-\u00bf]/g, // 3-byte chars
function(c) { // (note parentheses for precedence)
var cc = ((c.charCodeAt(0)&0x0f)<<12) | ((c.charCodeAt(1)&0x3f)<<6) | ( c.charCodeAt(2)&0x3f);
return String.fromCharCode(cc); }
);
strUni = strUni.replace(
/[\u00c0-\u00df][\u0080-\u00bf]/g, // 2-byte chars
function(c) { // (note parentheses for precedence)
var cc = (c.charCodeAt(0)&0x1f)<<6 | c.charCodeAt(1)&0x3f;
return String.fromCharCode(cc); }
);
return strUni;
}
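A quick illustration of the byte layout produced by the 2-byte branch of Utf8.encode (standalone re-implementation of just that branch, so the snippet runs on its own):

```javascript
// Mirrors the 2-byte branch of Utf8.encode above: a char in
// U+0080..U+07FF becomes the byte pair 110yyyyy 10zzzzzz.
function encode2(c) {
  var cc = c.charCodeAt(0);
  return String.fromCharCode(0xc0 | cc >> 6, 0x80 | cc & 0x3f);
}
// 'é' is U+00E9, whose UTF-8 encoding is the byte pair 0xC3 0xA9
var bytes = encode2('\u00e9').split('').map(function (ch) {
  return ch.charCodeAt(0).toString(16);
});
console.log(bytes.join(' ')); // "c3 a9"
```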
/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
$provide.factory('crc32', function() {
return crc32;
});
$provide.factory('sha256', function() {
return Sha256;
});
});
I think we should add a check of the user's rights to the method ExecuteORMGet.
When the client calls MultiFieldValues or ExecuteList, the user's rights are not checked in ExecuteORMGet.
procedure TSQLRestServerURIContext.ExecuteORMGet;
var SQLSelect, SQLWhere, SQLSort, SQLDir, SQL: RawUTF8;
SQLStartIndex, SQLResults, SQLTotalRowsCount: integer;
NonStandardSQLSelectParameter, NonStandardSQLWhereParameter: boolean;
SQLisSelect: boolean;
ResultList: TSQLTableJSON;
P: PUTF8Char;
i,j,L: integer;
Blob: PPropInfo;
tbIndex: integer;
begin
case Method of
mLOCK,mGET: begin
if Table=nil then begin
if (Method<>mLOCK) then begin
if (Call.InBody='') and (Parameters<>nil) and
(reUrlEncodedSQL in Call.RestAccessRights^.AllowRemoteExecute) then begin
// GET with a SQL statement sent in URI, as sql=....
while not UrlDecodeValue(Parameters,'SQL=',SQL,@Parameters) do
if Parameters=nil then break;
end else
// GET with a SQL statement sent as UTF-8 body
SQL := Call.InBody;
SQLisSelect := isSelect(pointer(SQL));
//check User Access
tbIndex := Server.Model.GetTableIndexFromSQLSelect(SQL, false);
if not (tbIndex in Call.RestAccessRights^.GET) then // check User Access
begin
Call.OutStatus := HTML_NOTALLOWED;
exit;
end;
// end check User Access
...
In the test computer I use an SSD.
The import process is more complex than simply generating values and sending them via batch.
It reads a CSV file and performs field mapping from each CSV record to a TSQLRecord (takes about 4 sec, including the BatchAdd call, for 1M records).
Type
TFieldItem = class(TCollectionItem)
private
FFieldName : String;
FFieldValue : Variant;
public
published
property Name : String read FFieldName write FFieldName;
property Value : Variant read FFieldValue write FFieldValue;
end;
Type
TFields = Class(TCollection)
private
function GetItem(index: integer): TFieldItem;
procedure SetItem(index: integer; value: TFieldItem);
function GetItemByName(name: string): TFieldItem;
procedure SetItemByName(name: string; value: TFieldItem);
public
constructor Create;
function Add(_FFieldName: String; _FFieldValue: Variant): TFieldItem;
property Items[index: integer]: TFieldItem read GetItem write SetItem; default;
property Item[name: string]: TFieldItem read GetItemByName write SetItemByName;
end;
Type
TEmails = class(TSQLRecord)
private
FGroupID : Integer;
FEmail: String;
FFirst_Name: String;
FLast_Name: String;
FRecipientName : String;
FSubscribed: Integer;
FSubscribe_Date: TDateTime;
FFields: TFields;
public
constructor Create; override;
destructor Destroy; override;
published
property Email: String read FEmail Write FEmail;
property First_Name: String read FFirst_Name Write FFirst_Name;
property Last_Name: String read FLast_Name Write FLast_Name;
property Recipient_Name : String read FRecipientName write FRecipientName;
property Subscribed: Integer read FSubscribed Write FSubscribed;
property Subscribe_Date: TDateTime read FSubscribe_Date Write FSubscribe_Date;
property GroupID : Integer read FGroupID write FGroupID;
property Fields: TFields read FFields write FFields;
end;
Plus, several indexes are created in the DB:
FGroupServer.CreateSQLMultiIndex(TEmails, ['Email','GroupID'], True);
FGroupServer.CreateSQLIndex(TEmails, 'Subscribed', False);
FGroupServer.CreateSQLIndex(TEmails, 'Email', False);
FGroupServer.CreateSQLIndex(TEmails, 'GroupID', False);
I tried exclusive locking mode, but the speed did not grow significantly (about 25 sec).
Plus, the HTTP connection uses SynAes encryption.
In any case, the insertion speed is very high for my purposes, and none of the competitors' software has an import speed close to mine.
SUPER! Now 1M records via HTTP on a local computer are inserted in 29 sec instead of 43 sec before.
Thanks ab!
Ctxt = nil
procedure TSQLRestServer.SessionDelete(aSessionIndex: integer;
Ctxt: TSQLRestServerURIContext);
...
{$ifdef WITHLOG}
if Assigned(User) then
Ctxt.Log.Log(sllUserAuth,'Deleted session %/% from %/%', <---------------------------------------------- Ctxt = nil
[User.LogonName,IDCardinal,RemoteIP,ConnectionID],self);
{$endif}
When the SessionDelete function is called from TSQLRestServer.SessionAccess, Ctxt is nil:
function TSQLRestServer.SessionAccess(Ctxt: TSQLRestServerURIContext): TAuthSession;
var i: integer;
Tix64: Int64;
begin // caller shall be locked via fSessionCriticalSection
if (self<>nil) and (fSessions<>nil) then begin
// first check for outdated sessions to be deleted
Tix64 := GetTickCount64;
for i := fSessions.Count-1 downto 0 do
with TAuthSession(fSessions.List[i]) do
if Tix64>LastAccess64+TimeOutMS then
SessionDelete(i,nil); <------------------------------------------------------------------ Ctxt - nil
// retrieve session
for i := 0 to fSessions.Count-1 do begin
result := TAuthSession(fSessions.List[i]);
if result.IDCardinal=Ctxt.Session then begin
result.fLastAccess64 := Tix64; // refresh session access timestamp
Ctxt.SessionUser := result.User.fID;
Ctxt.SessionGroup := result.User.GroupRights.fID;
Ctxt.SessionUserName := result.User.LogonName;
Ctxt.SessionRemoteIP := result.RemoteIP;
Ctxt.SessionConnectionID := result.ConnectionID;
exit;
end;
end;
end;
result := nil;
end;
The program allows having several groups of email addresses.
Email addresses of all groups are stored in one database which is managed by FGroupServer.
A group cannot contain duplicate email addresses; for this purpose I built the index on 'Email','GroupID'.
FGroupServer := TGroupServer.Create(GroupsModel, Format(DefaultWorkPlace + '%d\groups.db3' ,[FWorkPlaceID]), true);
FGroupServer.CreateSQLMultiIndex(TExclusionList, ['Email','GroupID'], True);
For quick import of an email list I use BatchStart/BatchSend.
There is no way to identify duplicates in the email lists unless you first read the data and search for the email.
Before, when EngineBatchSend worked at the RestServer level, there was no problem with duplicate emails:
ID := EngineAdd(RunTableIndex,Value);
Results[Count] := ID;
If the email address was already in the group, then ID = 0 (TSQLRestServerDB.InternalExecute doesn't raise an exception), and there was no problem with the import.
Now if the email address is already in the database, then when inserting data in TSQLRestServerDB.InternalBatchStop, an exception is raised at:
repeat until Statement^.Step<>SQLITE_ROW;
After that, RollBack(CONST_AUTHENTICATION_NOT_USED) is performed and the entire batch of AutomaticTransactionPerRow rows is ignored.
I understand that with my code Results: TIntegerDynArray will not always contain correct IDs; in my situation that's not important.
So that the changes are not applied to external databases, I added an additional parameter to the function BatchStart.
Here is how I transmit the "IgnoreDuplicates" parameter to TSQLRestServerDB.InternalBatchStop.
If I need to ignore duplicate emails in BatchSend, I transmit the additional parameter SkipDuplicates:
const
AUTOMATICTRANSACTIONPERROW_PATTERN = '"AUTOMATICTRANSACTIONPERROW":';
AUTOMATICTRANSACTIONPERROW_IGNORE = '"AUTOMATICTRANSACTIONDUPIGNORE":';
...
function TSQLRest.BatchStart(aTable: TSQLRecordClass;
AutomaticTransactionPerRow: cardinal; SkipDuplicates: boolean): boolean;
...
if SkipDuplicates then
begin
fBatch.AddShort(AUTOMATICTRANSACTIONPERROW_IGNORE);
fBatch.Add(Integer(SkipDuplicates));
fBatch.Add(',');
end;
On the server I check if the parameter SkipDuplicates exists:
function TSQLRestServer.EngineBatchSend(Table: TSQLRecordClass;
const Data: RawUTF8; var Results: TIntegerDynArray): integer;
...
if IdemPChar(Sent,AUTOMATICTRANSACTIONPERROW_IGNORE) then begin
inc(Sent,Length(AUTOMATICTRANSACTIONPERROW_IGNORE));
SkipDuplicates := GetNextItemInteger(Sent,',');
end
else
SkipDuplicates := 0;
...
RunningBatchRest.InternalBatchStop(SkipDuplicates=1);
In InternalBatchStop the SkipDuplicates flag is checked and the corresponding SQL text is generated:
procedure TSQLRestServerDB.InternalBatchStop(SkipDuplicates: Boolean= false);
...
if SkipDuplicates then
SQL := Decode.EncodeAsSQLPrepared(Props.SQLTableName, soInsertIgnore ,'')
else
SQL := Decode.EncodeAsSQLPrepared(Props.SQLTableName,soInsert,'');
Now if I need to quickly import data without worrying about duplicate addresses, I simply pass SkipDuplicates to the BatchStart function.
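For illustration, here is a small JavaScript sketch of the marker that the modified BatchStart appends to the batch header. Only the two constant names come from the code above; the helper function and the surrounding batch JSON layout are simplified assumptions, not the actual mORMot wire format:

```javascript
// Simplified sketch of the batch header prefix: the standard
// AUTOMATICTRANSACTIONPERROW marker, optionally followed by the
// proposed AUTOMATICTRANSACTIONDUPIGNORE flag. The real mORMot batch
// payload contains more fields; this shows only the prefix discussed.
function batchHeader(transactionPerRow, skipDuplicates) {
  var s = '"AUTOMATICTRANSACTIONPERROW":' + transactionPerRow + ',';
  if (skipDuplicates) s += '"AUTOMATICTRANSACTIONDUPIGNORE":1,';
  return s;
}
console.log(batchHeader(5000, true));
// "AUTOMATICTRANSACTIONPERROW":5000,"AUTOMATICTRANSACTIONDUPIGNORE":1,
```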
In my program, users import hundreds or thousands of emails into the database.
Naturally there are duplicates in the email lists.
I need the import NOT to break when it detects a duplicate, as was the case in the previous version.
I would like to be able to do it automatically.
SQLite can ignore duplicates automatically using INSERT OR IGNORE INTO in SQL.
For example, add a new Occasion:
TSQLOccasion = (
soSelect,
soInsert,
soUpdate,
soDelete,
soInsertIgnore);
A little modification in function TJSONObjectDecoder.EncodeAsSQLPrepared:
function TJSONObjectDecoder.EncodeAsSQLPrepared(const TableName: RawUTF8;
Occasion: TSQLOccasion; const UpdateIDFieldName: RawUTF8;
MultipleInsertCount: integer): RawUTF8;
const SQL: array[Boolean] of PUTF8Char = (
'insert into %%','update % set % where %=?');
var F: integer;
P: PUTF8Char;
tmp: RawUTF8;
begin
result := '';
if FieldCount<>0 then begin
if ((Occasion<>soInsert) and (Occasion<>soInsertIgnore)) or (MultipleInsertCount<=0) then
MultipleInsertCount := 1;
SetLength(tmp,FieldNameLen+4*FieldCount*MultipleInsertCount+24); // max length
P := pointer(tmp);
case Occasion of
soUpdate: begin
// returns 'COL1=?,COL2=?' (UPDATE SET format)
for F := 0 to FieldCount-1 do begin
P := AppendRawUTF8ToBuffer(P,DecodedFieldNames[F]);
PInteger(P)^ := Ord('=')+Ord('?')shl 8+Ord(',')shl 16;
inc(P,3);
end;
dec(P);
end;
soInsert, soInsertIgnore: begin
// returns ' (COL1,COL2) VALUES (?,?)' (INSERT format)
PWord(P)^ := Ord(' ')+ord('(')shl 8;
inc(P,2);
for F := 0 to FieldCount-1 do begin
P := AppendRawUTF8ToBuffer(P,DecodedFieldNames[F]);
P^ := ',';
inc(P);
end;
P := AppendRawUTF8ToBuffer(P-1,') VALUES (');
repeat
for F := 1 to FieldCount do begin
PWord(P)^ := Ord('?')+Ord(',')shl 8;
inc(P,2);
end;
P[-1] := ')';
dec(MultipleInsertCount);
if MultipleInsertCount=0 then
break;
PWord(P)^ := Ord(',')+Ord('(')shl 8;
inc(P,2);
until false;
end;
else
raise EORMException.Create('Invalid EncodeAsSQLPrepared() call');
end;
assert(P-pointer(tmp)<length(tmp));
SetLength(tmp,P-pointer(tmp));
end else
if Occasion=soUpdate then
exit else
tmp := ' default values';
if Occasion = soInsertIgnore then
Result := FormatUTF8('insert or ignore into %%',[TableName,tmp,UpdateIDFieldName])
else
result := FormatUTF8(SQL[Occasion=soUpdate],[TableName,tmp,UpdateIDFieldName]);
end;
In procedure TSQLRestServerDB.InternalBatchStop(IgnoreDuplicates: boolean = false):
if IgnoreDuplicates then
SQL := Decode.EncodeAsSQLPrepared(Props.SQLTableName, soInsertIgnore ,'')
else
SQL := Decode.EncodeAsSQLPrepared(Props.SQLTableName,soInsert,'');
end;
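To make the intent concrete, here is an illustrative JavaScript sketch (not mORMot code) of the SQL text shape that EncodeAsSQLPrepared produces for soInsert versus soInsertIgnore with a multi-row VALUES list:

```javascript
// Illustrative sketch of the SQL text built by EncodeAsSQLPrepared:
// a multi-row VALUES list of '?' placeholders, with "insert or ignore"
// selected when duplicates should be skipped silently by SQLite.
function encodeInsert(table, fields, rowCount, ignoreDuplicates) {
  var placeholders = '(' + fields.map(function () { return '?'; }).join(',') + ')';
  var rows = [];
  for (var i = 0; i < rowCount; i++) rows.push(placeholders);
  var verb = ignoreDuplicates ? 'insert or ignore into ' : 'insert into ';
  return verb + table + ' (' + fields.join(',') + ') VALUES ' + rows.join(',');
}
console.log(encodeInsert('Emails', ['RowID', 'Email'], 2, true));
// insert or ignore into Emails (RowID,Email) VALUES (?,?),(?,?)
```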
Oops, a little late.
I think it should be InternalExecute(SQL,@LastID,nil,nil,nil) instead of InternalExecute(SQL,nil,nil,nil,@LastID):
if fBatchFirstID=0 then begin
SQL := 'select max(rowid) from '+SQL;
if InternalExecute(SQL,@LastID, nil,nil,nil) then
fBatchFirstID := LastID+1 else begin
fBatchFirstID := -1; // will force error for whole BATCH block
exit;
end;
The same problem may exist in TSQLRestServerDB.MainEngineUpdateField:
if not InternalExecute(FormatUTF8('select RowID from % where %=:(%):',
[SQLTableName,WhereFieldName,WhereValue]),nil,nil,@ID) then
exit else
if ID=nil then begin
result := true; // nothing to update, but return success
exit;
end;
Steps to reproduce:
1. Start the server with an empty DB
2. Add data (BatchStart, BatchSend...) - all OK
3. Stop the server
4. Start the server again (DB contains data from Step 2)
5. Add data (BatchStart, BatchSend...) - FAILURE: problem in MainEngineAdd.
function TSQLRestServerDB.MainEngineAdd
...
if fBatchFirstID=0 then begin
SQL := 'select max(rowid) from '+SQL;
if InternalExecute(SQL,nil,nil,nil,@LastID) then <------------------------------------ LastID = 0 ???
fBatchFirstID := LastID+1 else begin
fBatchFirstID := -1; // will force error for whole BATCH block
exit;
end;
procedure TSQLRestServerDB.InternalBatchStop;
...
repeat until Statement^.Step<>SQLITE_ROW; <-------------- EXCEPTION 'Error SQLITE_CONSTRAINT (19) - "UNIQUE constraint failed: Emails.ID"'
Generated SQL:
insert into Emails (RowID,Email,First_Name,Last_Name,Recipient_Name,Subscribed,Subscribe_Date,GroupID,Fields) VALUES (?,?,?,?,?,?,?,?,?),(?,?,?,?,?,?,?,?,?);
VALUES:
('1', 'email1@domain.com', 'First', 'Last', '', '0', '2014-08-27T14:49:31', '306', '[]', '2', 'email2@domain.com', 'Gary', 'Booher', '', '0', '2014-08-27T14:49:31', '306', '[]')
The version from http://synopse.info/fossil/info/7335db42cb does not insert the record. The generated SQL contains the RowID field; I think there should be no RowID.
SQL:
INSERT INTO Emails (RowID,Email,First_Name,Last_Name,Recipient_Name,Subscribed,Subscribe_Date,GroupID,Fields) VALUES (:(1):,:(''email@domain.com''):,:(''''):,:(''''):,:(''''):,:(1):,:(''2014-08-27T14:26:35''):,:(302):,:(''[]''):);
procedure TSQLRestServerDB.InternalBatchStop;
...
if fBatchValuesCount=1 then begin // handle single record insertion as usual
Decode.Decode(fBatchValues[0],nil,pInlined,fBatchFirstID);
SQL := 'INSERT INTO '+Props.SQLTableName+Decode.EncodeAsSQL(False)+';';
InternalExecute(SQL);
exit;
end;
I think that the error is in the implementation of TSQLRestServer.EngineBatchSend.
First TransactionBegin is called, then adding records:
ID := EngineAdd(RunTableIndex,Value);
After RowCountForCurrentTransaction reaches AutomaticTransactionPerRow, Commit is called on the transaction.
And at the end of EngineBatchSend, RunningBatchRest.InternalBatchStop is called, which inserts the same data again through an INSERT SQL statement.
About inserting duplicates:
When we use BatchSend(FArray), we expect FArray to contain the IDs of inserted records, or 0 if the insert fails (duplicates, for example).
ID := EngineAdd(RunTableIndex,Value);
Results[Count] := ID;
If we use the new implementation that batch-inserts records through an INSERT INTO SQL statement, we cannot skip duplicates: the exception is raised and BatchSend fails entirely.
Did you try with an external SQLite3 table?
No, there is no need to use an external SQLite3 table in my project.
Could you try with BatchStart(TEmails,10000) instead of 5000?
I tried. It makes no big difference.
The new version of BATCH adding is significantly slower than the previous version. And BATCH fails when inserting a duplicate record if a field has a unique index (the generated SQL should be INSERT OR IGNORE INTO tablename / INSERT OR REPLACE INTO tablename).
From timeline: [590567d3f0] Leaf: BATCH adding in TSQLRestServerDB will now perform SQLite3 multi-INSERT statements: performance boost is from 2x (memory with transaction) to 60x (full locking, w/out transaction)
In my case inserting 1M records takes more than 3 minutes. In the previous version it took about 43 sec!
Client Code:
dm.GroupClient.BatchStart(TEmails, 5000);
...
dm.GroupClient.BatchAdd(FEmails, True);
inc(FInsertCount);
if FInsertCount >= 5000 then
begin
FInsertCount := 0;
if dm.GroupClient.BatchSend(FArray) <> HTML_SUCCESS then
begin
ShowLastClientError(dm.GroupClient);
ModalResult := mrNone;
dm.GroupClient.BatchAbort;
if NeedToCreateTable then
dm.SettingsClient.Delete(TGroups, FGroupID);
exit;
end;
end;
...
I think it is a very good idea to create a repository or sticky topic with links and short descriptions of software created using mORMot.
EasyMail7 - a client-server email marketing solution for Windows.
G-Lock EasyMail7 is the first email marketing software for Windows built on a client-server model.
You get two components: EasyMail7 Server and EasyMail7 Client. You can install the EasyMail7 Server
on a local computer, on a network Windows server, or on a cloud-based server, for example Amazon EC2 or a Windows Azure Virtual Machine.
We included a basic demo web client in the program installation, which shows how to create a RESTful JavaScript client (using Angular).
Both desktop and JavaScript clients use mORMot's secure RESTful authentication scheme.
It would be great to create a topic with links to the software created using mORMot.
In my project I use a method to upload content (MIME multipart encoding) from JavaScript:
procedure TSettingsServer.UploadContent(Ctxt: TSQLRestServerURIContext);
var
FBody : RawUTF8;
boundary : string;
ContentType : String;
Header : TStrings;
Decoder : TMimeDecoder;
i : Integer;
...
begin
...
ContentType := UTF8ToString(FindIniNameValue(pointer(Ctxt.Call.InHead),'CONTENT-TYPE: '));
Header := TStringList.Create;
try
ExtractHeaderFields([';'], [' '], PChar(ContentType), Header, False, False);
Boundary := Header.Values['boundary'];
finally
Header.Free;
end;
FBody := Ctxt.Call.InBody;
Decoder := TMimeDecoder.Create;
try
Decoder.AllContent := (FBody);
Decoder.Boundary := StringToAnsi7(boundary);
Decoder.UploadPath := DefaultWorkPlace + Format('%s\%d\%s\%s\', [sType, FWPID, SGuid, SFolder]);
if not ForceDirectories(Decoder.UploadPath) then
begin
Ctxt.Error(FormatUtf8('Unable to create folder: %', [Decoder.UploadPath]));
exit;
end
else
Decoder.ProcessMimeData(Length(FBody));
finally
Decoder.Free;
end;
end;
Source of the TMimeDecoder class:
unit ProcessMime;
interface
uses System.Classes, System.SysUtils, Web.HTTPApp, SynCommons;
Type
TMimeDecoder = class
private
Data : RawByteString;
function ReadMultipartRequest(const Boundary: RawByteString;
ARequest: RawByteString; var AHeader: TStrings; var Data: RawByteString): RawByteString;
public
AllContent, Boundary: RawByteString;
UploadPath : String;
procedure ProcessMimeData(ContentLength: Cardinal);
end;
implementation
function TMimeDecoder.ReadMultipartRequest(const Boundary: RawByteString;
ARequest: RawByteString; var AHeader: TStrings; var Data: RawByteString): RawByteString;
var
Req, RHead: RawByteString;
i: Integer;
begin
Result := '';
AHeader.Clear;
Data := '';
if (Pos(Boundary, ARequest) < Pos (Boundary + '--', ARequest))
and (Pos(Boundary, ARequest) = 1) then
begin
Delete(ARequest, 1, Length(Boundary) + 2);
Req := Copy(ARequest, 1, Pos(Boundary, ARequest) - 3);
Delete(ARequest, 1, Length(Req) + 2);
RHead := Copy(Req, 1, Pos(#13#10#13#10,Req)-1);
Delete(Req, 1, Length(RHead) + 4);
AHeader.Text := RHead;
for i := 0 to AHeader.Count - 1 do
if Pos(':', AHeader.Strings[i]) > 0 then
AHeader.Strings[i] := Trim(Copy(AHeader.Strings[i], 1,
Pos(':', AHeader.Strings[i])-1)) + '=' + Trim(Copy(AHeader.Strings[i],
Pos(':', AHeader.Strings[i])+1, Length(AHeader.Strings[i]) -
Pos(':', AHeader.Strings[i])));
Data := Req;
Result := ARequest;
end
end;
procedure TMimeDecoder.ProcessMimeData(ContentLength: Cardinal);
var
Header, HList: TStrings;
FileName : String;
begin
if ContentLength = Length(AllContent) then
while Length(AllContent) > Length('--' + Boundary + '--' + #13#10) do
begin
Header := TStringList.Create;
HList := TStringList.Create;
try
AllContent := ReadMultipartRequest('--' + Boundary, AllContent,
Header, Data);
ExtractHeaderFields([';'], [' '],
PChar(Header.Values['Content-Disposition']), HList, False, True);
if (Header.Values['Content-Type'] <> '') and (Data <> '') then
begin
FileName := Utf8ToAnsi(ExtractFileName(HList.Values['filename']));
FileFromString(Data, UploadPath + FileName, true);
end
finally
Header.Free;
HList.Free;
end;
end;
end;
end.
Thank you so much for the patch.
The file was created by standalone 32-bit WinZip. The previous version of SynZip unzipped this file successfully.
Oops. It does not work in Win32 either.
Hello to all,
I am trying to move my application to the 64-bit architecture (Delphi XE3).
I found a problem with reading a ZIP file.
constructor TZipRead.Create(BufZip: pByteArray; Size: cardinal);
...
for i := 1 to lhr^.totalFiles do begin // totalFiles = 10
if H^.signature+1<>$02014b51 then begin
UnMap;
raise ESynZipException.Create('ZIP format');
end;
lfhr := @BufZip[H^.localHeadOff];
with lfhr^.fileInfo do
if flags and (1 shl 3)<>0 then begin // crc+sizes in "data descriptor"
if (zcrc32<>0) or (zzipSize<>0) or (zfullSize<>0) then
raise ESynZipException.Create('ZIP extended format');
// UnZip() will call RetrieveFileInfo()
end else
if (zzipSize=0) or (zfullSize=0) then
raise ESynZipException.Create('ZIP format size=0') else // - Exception here
The zip file can be downloaded from:
I also have a problem when starting TestMustache in Delphi XE3.
constructor TSMEngine.Create(aManager: TSMEngineManager);
...
fCx.Options := [jsoVarObjFix,jsoBaseLine,jsoTypeInference,jsoIon,jsoAsmJs];
Error in line JS_SetOptions(@self,uint32(Value)):
procedure JSContext.SetOptions(const Value: TJSOptions);
begin
JS_SetOptions(@self,uint32(Value))
end;
Value (from the debugger):
[jsoUnused5,jsoUnused11,jsoNoScriptRVal,jsoUnrootedGlobal,jsoBaseLine,jsoPcCount,(out of bound) 22,(out of bound) 25]
Report.SaveToStream(Stream);
Stream.seek(0,0);
Should I create a ticket?