See TSChannelNetTls.AfterBind logic and comments.
Convert to PFX:
openssl pkcs12 -inkey myprivatekey.key -in mycertificate.pem -export -out mycert.pfx
It's expected behavior.
String type is AnsiString in Delphi 7 and UnicodeString in Delphi 12, but RawUtf8 is AnsiString(65001) in both.
So the code in the first example is wrong: the DynArray.IndexOf parameter must be of the same exact type as the dynamic array element, i.e. RawUtf8.
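A minimal sketch of the correct usage (assuming mormot.core.base and mormot.core.data are in the uses clause):
var
  values: TRawUtf8DynArray;
  da: TDynArray;
  v, needle: RawUtf8;
begin
  da.Init(TypeInfo(TRawUtf8DynArray), values);
  v := 'first';
  da.Add(v);
  v := 'second';
  da.Add(v);
  needle := 'second'; // RawUtf8 - the same exact type as the array element
  assert(da.IndexOf(needle) = 1);
  // passing a plain string variable here would hand the wrong memory layout
  // to the untyped const parameter, so the item would not be found
end;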
There are two options, choose the appropriate one:
1. Include mormot.crypt.openssl and mormot.lib.openssl11, define USE_OPENSSL and FORCE_OPENSSL in project options and call RegisterOpenSsl. Then use CryptCertOpenSsl[caaRS256].
2. Include mormot.crypt.x509, call RegisterX509 and then use CryptCert[caaRS256].
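For option 2, a minimal sketch (assuming the ICryptCert / New factory pattern exposed by mormot.crypt.secure):
uses
  mormot.crypt.secure,
  mormot.crypt.x509;

var
  cert: ICryptCert;
begin
  RegisterX509;                    // register the pure-pascal X.509 engine
  cert := CryptCert[caaRS256].New; // RSA-SHA256 certificate instance
  // then call Generate / Load / Sign ... on the ICryptCert as needed
end;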
Maybe this helps:
https://blog.synopse.info/?post/2023/04 … n-mORMot-2
For the DNS challenge we need to change the DNS server configuration and provide a DNS TXT record with the key:
_acme-challenge.www.example.org. 300 IN TXT "gfj9Xq...Rg85nM"
You can change the line in TAcmeClient.CreateOrder from "if v2[0].Idem('HTTP-01') then" to "if v2[0].Idem('DNS-01') then" and use the OnChallenges callback.
In the callback, modify your DNS server configuration to provide the DNS record.
If this works for you, we can add a TAcmeClient.ChallengeType property to mORMot (or change the TAcmeChallenge record and add DnsUrl, DnsToken and DnsKey fields).
Which Windows version are you using on the client?
You need to override the abstract methods of TOrmPropInfo. I did not provide my solution because it contains some application-specific logic.
For an example, see TOrmPropInfoRttiVariant.
The main methods are SetValue and GetValue.
For example:
procedure TOrmPropInfoVariantArray.SetValue(Instance: TObject; Value: PUtf8Char;
  ValueLen: PtrInt; wasString: boolean);
var
  V: PVariant;
  tmp: TSynTempBuffer;
begin
  V := GetFieldAddr(Instance);
  if ValueLen > 0 then
  begin
    tmp.Init(Value, ValueLen);
    try
      GetVariantFromJsonField(tmp.buf, wasString, V^, nil);
    finally
      tmp.Done;
    end;
  end
  else
    VarClear(V^);
end;

procedure TOrmPropInfoVariantArray.GetValueVar(Instance: TObject;
  ToSql: boolean; var result: RawUtf8; wasSqlString: PBoolean);
var
  wasString: boolean;
  V: PVariant;
begin
  V := GetFieldAddr(Instance);
  VariantToUtf8(V^, result, wasString);
  if wasSqlString <> nil then
    wasSqlString^ := not VarIsEmptyOrNull(V^);
end;
Also, you need to override NormalizeValue (doing nothing), GetBinary and SetBinary (see TOrmPropInfoRttiVariant).
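For instance, the NormalizeValue override can be a simple no-op, as in this sketch for the TOrmPropInfoVariantArray class above:
procedure TOrmPropInfoVariantArray.NormalizeValue(var Value: RawUtf8);
begin
  // nothing to do: the value is stored as JSON and is already normalized
end;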
There are two ways:
1) One field in the database. Use a Variant property and a DocVariant to store the additional properties in a single JSON field.
See:
https://www.delphipraxis.net/210843-mor … tellt.html
2) Multiple fields in the database.
Describe your properties like this:
  TPropInfo = packed record
    Name: RawUtf8;
    FieldType: TOrmFieldType;
    FieldWidth: Integer;
    // maybe any additional property info, like Caption
  end;
  TPropInfoDynArray = array of TPropInfo;
  TPropValueArray = array[0..MAX_SQLFIELDS - 1] of Variant;
Then declare an ORM class, with the property descriptions for entities of that class and the property values:
  TOrmMyEntity = class(TOrm)
  protected
    FMyValues: TPropValueArray;
  public
    class var
      FMyProps: TPropInfoDynArray;
    class procedure Init;
  end;
Then initialize the property descriptions at application start:
class procedure TOrmMyEntity.Init;
begin
  // load the FMyProps array from the client options,
  // maybe via DynArrayLoadJson or something else
end;
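For illustration only, a hard-coded version of this Init (the two property definitions below are made up; oftInteger and oftUtf8Text are standard TOrmFieldType values):
class procedure TOrmMyEntity.Init;
begin
  SetLength(FMyProps, 2);
  FMyProps[0].Name := 'Width';
  FMyProps[0].FieldType := oftInteger;
  FMyProps[0].FieldWidth := 10;
  FMyProps[1].Name := 'Comment';
  FMyProps[1].FieldType := oftUtf8Text;
  FMyProps[1].FieldWidth := 100;
end;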
Create a TOrmPropInfoCustom descendant that can store your fields:
  TOrmPropInfoMyField = class(TOrmPropInfoCustom)
    ...
  end;

// code to access the property value:
// var
//   Value: PVariant;
// begin
//   Value := GetFieldAddr(Instance);
// end;
And register your properties in InternalRegisterCustomProperties, as above.
Note, this is a complicated solution, and you will need to dig deep into the mormot.orm.base.pas and mormot.orm.core.pas source code.
2) Not going so deep: it seems like the whole TDocVariant (the entire JSON object - the image with all its properties in the mentioned example) will be put into ONE separate field. Am I right?
Yes. Then, in SQL queries, you can use database functions to work with JSON, like json_extract etc.
But there is a more complex way, if you need separate fields in the database.
Create a TOrmPropInfoCustom descendant that can store your fields in any way, and override InternalRegisterCustomProperties to add your props.
class procedure TOrmMyEntity.InternalRegisterCustomProperties(Props: TOrmProperties);
var
  i: Integer;
begin
  inherited InternalRegisterCustomProperties(Props);
  for i := 0 to Length(FMyProps) - 1 do
    Props.Fields.Add(TOrmPropInfoMyField.Create(...,
      {aProperty=}@TOrmMyEntity(nil).FMyValues[i]));
end;
You can try TMemoryMap, and then JsonDecode/JsonArrayDecode.
It's a bug in mormot.lib.sspi.pas.
CertOpenStore does not have CertOpenStoreA and CertOpenStoreW variants.
Ansi or Unicode usage depends on the first parameter, for example CERT_STORE_PROV_SYSTEM_A or CERT_STORE_PROV_SYSTEM_W.
P.S.
And the first parameter is PAnsiChar, so:
var LProvider: AnsiChar := #10; // CERT_STORE_PROV_SYSTEM_W
var LMy: UnicodeString := 'MY';
You can use certmgr.msc to import .p7b file and then export as .pfx.
This code works for me:
function SendTgDoc(const Token, Chat, Text: RawUtf8;
  const Doc: RawByteString): boolean;
var
  LMultiPart: THttpMultiPartStream;
  LClient: THttpClientSocket;
begin
  LMultiPart := THttpMultiPartStream.Create;
  try
    LMultiPart.AddContent('chat_id', Chat, 'text/plain');
    LMultiPart.AddContent('caption', Text, 'text/plain');
    LMultiPart.AddFileContent('document', 'attach.pdf', Doc, 'application/pdf', 'binary');
    LMultiPart.Flush;
    LClient := THttpClientSocket.Open('api.telegram.org', '443', nlTcp, 5000, True);
    try
      // sendDocument returns HTTP 200 on success
      Result := LClient.Post('/bot' + Token + '/sendDocument',
        LMultiPart, LMultiPart.MultipartContentType) = 200;
    finally
      LClient.Free;
    end;
  finally
    LMultiPart.Free;
  end;
end;
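For reference, a hypothetical call (the token, chat id and file name are placeholders; StringFromFile is from mormot.core.os):
var
  ok: boolean;
begin
  ok := SendTgDoc('123456:ABC-DEF', '987654321', 'Monthly report',
    StringFromFile('report.pdf'));
end;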
In JavaScript you need to use the "synopsejson" protocol (in the WebSocket constructor).
For example, this works:
{{#Equals Category,"Admin"}}Welcome, Mister Administrator!{{/Equals}}
But this doesn't work:
{{#Equals "Admin",Category}}Welcome, Mister Administrator!{{/Equals}}
I think the problem is at line 705 in mormot.core.mustache.pas.
When a value starts with a digit, '"', '[' or '{', it is interpreted as a single JSON value.
You can try ssl_password_file for nginx (https://nginx.org/en/docs/http/ngx_http … sword_file).
Maybe it helps.
Please try the following changes:
--- a/src/net/mormot.net.acme.pas
+++ b/src/net/mormot.net.acme.pas
@@ -868,8 +868,8 @@ var
begin
try
// Generate a new PKCS#10 Certificate Signing Request
- csr := fHttpClient.fCert.CertAlgo.CreateSelfSignedCsr(
- fSubjects, aPrivateKeyPassword, pk);
+ csr := PemToDer(fHttpClient.fCert.CertAlgo.CreateSelfSignedCsr(
+ fSubjects, aPrivateKeyPassword, pk));
// Before sending a POST request to the server, an ACME client needs to
// have a fresh anti-replay nonce to put in the "nonce" header of the JWS
fHttpClient.Head(fNewNonce);
Note the added PemToDer conversion.
We already have a function for this purpose: Utf8TruncatedLength.
Also, in TRestStorageExternal methods like EngineList, EngineRetrieve and so on, exceptions are suppressed in a try..except block, without being stored anywhere.
Thanks, now it works.
But there's another problem after I updated mORMot:
Before, I changed the FavIcon by assigning the TRestHttpServer.FavIcon property.
Now, after the addition of URI routing, if I try to call HttpServer.SetFavIcon, there is an EUriRouter exception with message 'TUriRouter.Setup('/favicon.ico'): already registered'.
So I don't see a way to change the FavIcon.
On 64-bit Linux the THandle type cannot be used in the resource management functions, because it is a 32-bit file handle, while HInstance handles are 64-bit. This leads to an Access Violation. We must use PtrUInt (or TLibHandle) instead.
I see problems in:
ResourceToRawByteString, ResourceSynLZToRawByteString, TExecutableResource, TSynTimeZone, TZipRead.
Variable "raiseonfailure" is not initialized to ESqlDBPostgres on function start.
Can we make PostgreSQL pipelining optional (in mormot.db.raw.postgres.pas)?
My problem: I build a 32-bit Windows application and use a 32-bit libpq.dll. But 32-bit builds are only available up to version 12, which has no pipelining support.
Version 14, with pipelining support, has only 64-bit builds.
So for now I changed ESqlDBPostgres to nil in the call to Resolve.
Maybe there is a 32-bit libpq.dll with pipelining?
In real life the URL length is limited (by web browsers, proxies, CDNs etc.). I've encountered such limitations several times. The best practice is to keep the URL length < 2000 characters (so 128 parameters is far enough IMHO).
Yes, for GET requests the parameters are limited to 2000-4000 bytes, but for POST there are no such limits (DataTables uses POST).
It means that WebSocketsEnable requires a websockets-enabled kind of server, i.e. one in HTTP_BIDIR mode.
In the previous example I use two useBidirSocket servers, one for HTTP and one for HTTPS. But the REST server's NotifyCallback can be used with only one useBidirSocket server (the last registered).
Also, the ConnectionID of those two useBidirSocket servers is the same, so we cannot distinguish between them in IsActiveWebSocket, even if we call NotifyCallback on all HTTP servers.
Please test it, and give some feedback here to fix any problems before the actual release!
Some tips.
1. In a configuration like the one below, where one REST server is registered in both the HTTP and the HTTPS server, websockets work only for the last registered server (HttpsServer in this case).
It may be fixed (which can be difficult) or at least documented.
var
  Server: TRestServerFullMemory;
  HttpServer: TRestHttpServer;
  HttpsServer: TRestHttpServer;
begin
  Server := TRestServerFullMemory.CreateWithOwnModel([]);
  HttpServer := TRestHttpServer.Create('80', [Server], ...);
  HttpServer.WebSocketsEnable(Server, '');
  HttpsServer := TRestHttpServer.Create('443', [Server], ...);
  HttpsServer.WebSocketsEnable(Server, '');
end;
2. Next, in TRestServerUriContext.FillInput I changed the limit from 128 parameters to 512, because I use the DataTables JavaScript library (https://datatables.net/), and it sends up to a dozen parameters per table column, so 128 is not enough.
Try this:
<script>
window.templates = {};
window.templates.vtSettings = '{{ToJson VTSettings}}';
</script>
Isn't it fixed?
I checked CreateCopy and SetFieldVariant. Now it works OK.
Thanks to you and commit https://github.com/synopse/mORMot2/comm … e371698401 !
Yes, it helps.
But I think the root cause is in TRttiProp.SetVariantProp - it does not handle varVariantByRef.
But varByRef is used in many parts of the core units.
For example, TOrm.SetFieldVariant may also be affected:
v := _Json('{arr:[1,2]}');
a.SetFieldVariant('Opts', v.arr); // v.arr will be returned by reference
CreateCopy copies DocVariant fields by reference, because GetVariantProp is called with byref=true, and SetVariantProp calls VarCopyDeep, but VarCopyDeep cannot find the custom variant type in FindCustomVariantType for the varVariantByRef type, and calls oleaut32.VariantCopy.
Then, when the original object is destroyed, the reference becomes invalid, and bad things happen.
In my case, TJsonWriter.WriteObject fails when I pass an ORM object with a DocVariant field that was previously copied via CreateCopy, after the original object has been destroyed.
Let's say I have an object like this:
  TOrmA = class(TOrm)
  protected
    FOpts: Variant;
  published
    property Opts: Variant read FOpts write FOpts;
  end;
Then I load the object from the DB or create a new one, and call CreateCopy:
a := TOrmA.Create;
a.Opts := _Obj([]);
b := a.CreateCopy;
a.Free;
The next call to TJsonWriter.WriteObject then fails with an EJsonException with message 'TJsonWriter.AddVariant VType=<random number>'.
Great, everything works now.
Thanks!
I am not able to reproduce.
My bad, sorry.
The real case is in an MVC Web application (mormot.rest.mvc.pas).
i = interface(IMvcApplication)
  procedure m(s: RawJson);
end;
Query: http://localhost/root/m?s=["x"]
In mORMot 1.18: ["x"]
In mORMot 2: "[\"x\"]"
Maybe using a RawJson parameter in an MVC Web App is not a good idea.
In client-server services via interfaces I used RawJson in mORMot 1.18, and it was transmitted "as is", without serialization.
But in mORMot 2 the transmission of RawJson has changed.
Is that expected?
For example:
i = interface
  procedure m(s: RawJson);
end;
mORMot 1.18:
s := ' ["x"] '
i.m(s) -> ' ["x"] '
mORMot 2:
s := ' ["x"] '
i.m(s) -> ' "[\"x\"]" '
It is just a review of your code, with more integration into other units, as you suggested.
It's a very good review!
Some small fixes:
https://github.com/synopse/mORMot2/pull/119
2. Instead of using unfinished mormot.crypt.x509.pas we could just use DerToEcc() from mormot.crypt.ecc.pas which does the parsing for us.
Yes, it works (currently only ES256).
But I can adapt it to use DerParse for 32, 48 or 66 bytes of signature (ECC_SIZE).
P.S.
It's not so simple, because for ES512 there is a 2-byte length value in the DER:
https://gist.github.com/achechulin/3212 … b0595bfc24
I created a basic ACME client:
https://gist.github.com/achechulin/7aa9 … e5f940b3bb
I think it needs a critical review.
And maybe some parts, like the JSON Web Signature (JWS), can be moved to the JWT support unit.
BN_bn2bin can be moved to BIGNUM, GetEccPubKey and GetRsaPubKey to EVP_PKEY, CreateCsr to mormot.crypt.openssl, and so on.
Also, I have not fully understood the ECC coordinate compression details, so GetEccPubKey and DerToEccSign may not be optimal in some ways.
Note that I have found proper ACME v2 support in the OverbyteIcsSslX509Certs unit of ICS.
So we may have some reference code in Pascal to interface with Let's Encrypt.
The Overbyte code is too complicated, but it answered some questions about ECC, thanks!
Something like this:
var
  doc: variant;
  P: PDocVariantData;
begin
  doc := _Arr([]);
  // or TDocVariantData(doc).Init(mDefault, dvArray)
  with _Safe(doc)^ do
  begin
    // add a new item
    AddItem(MESSAGE_FORTUNES);
  end;
  // or
  P := _Safe(doc);
  P^.AddItem(MESSAGE_FORTUNES);
end;
Thanks!
Now it works fine.
Perhaps FPC_X64MM conditional is not coherent between your mormot package and your project.
I do not use packages and do not use mormot.core.fpcx64mm.
I use FPC 3.2.3-554-g8b21bf1cce; maybe it's too old.
Look at the code: RedirectRtlCall is not the same as PatchCode. It does not redirect one function in mormot.core.rtti, it parses the binary of one function in mormot.core.rtti to extract a low-level RTL call, which is then redirected.
In my case RedirectRtlCall calls RedirectCode($FFFFFFFFF14BF09F, $0000000000D659E0) - the function address is invalid (negative) - and then an AV occurs.
With the latest version (2.0.3676) there are no changes.
Generated ASM for _fpc_setstring_ansistr:
https://drive.google.com/file/d/1GqN080 … sp=sharing
Some tips I found in the source:
1. https://github.com/synopse/mORMot2/blob … r.pas#L716
Inherited Create is called twice.
2. https://github.com/synopse/mORMot2/blob … .pas#L4259
IsRemoteIPBanned always returns True.
3. https://github.com/synopse/mORMot2/blob … .inc#L1208
RedirectRtlCall does something strange. Maybe some defines are missing. It redirects one function in mormot.core.rtti to another function in mormot.core.rtti.
It crashes with an AV on my x86_64 Linux.
After the latest changes (added TOpenSslNetTls.SetupCtx), the browser asks for a client certificate when connecting via HTTPS:
https://drive.google.com/file/d/1SoyGOv … Rk0ZDmH1t/
So we need to change IgnoreCertificateErrors to True to avoid the client certificate request.
But this can only be done in THttpServerSocketGeneric.WaitStarted.
Maybe add a new option to THttpServerOptions, like hsoTlsClientCert (or some other name, hsoTlsVerifyPeer?), and in WaitStarted:
  // now the server socket has been bound, and is ready to accept connections
  if (hsoEnableTls in fOptions) and
     (CertificateFile <> '') and
     (fSock <> nil) and // may be nil at first
     not fSock.TLS.Enabled then
  begin
    ...
    fSock.TLS.IgnoreCertificateErrors := not (hsoTlsClientCert in fOptions);
    InitializeTlsAfterBind; // validate TLS certificate(s) now
    ...
  end;
And there is a problem in THttpServerSocketGeneric.WaitStarted - for THttpAsyncServer, fSock is nil. We need to use Async.Server instead.
Or maybe in THttpAsyncServer set fSock := Async.Server.
Some little patches.
For mormot.net.server.pas, in THttpServerSocketGeneric.WaitStarted, because on Windows CertificateFile is enough:
@@ -1620,7 +1620,7 @@ begin
until false;
// now the server socket has been bound, and is ready to accept connections
if (hsoEnableTls in fOptions) and
- (PrivateKeyFile <> '') and
+ (CertificateFile <> '') and
not fSock.TLS.Enabled then
begin
StringToUtf8(CertificateFile, fSock.TLS.CertificateFile);
For mormot.net.sock.windows.inc, in TSChannelNetTls.AfterBind, for a proper error when the .pfx cannot be read:
@@ -1117,6 +1117,8 @@ begin
// openssl pkcs12 -inkey privkey.pem -in cert.pem -export -out mycert.pfx
fAcceptCertStore := PFXImportCertStore(@blob, nil,
PKCS12_INCLUDE_EXTENDED_PROPERTIES);
+ if fAcceptCertStore = nil then
+ ESChannelRaiseLastError(SEC_E_CERT_UNKNOWN);
end;
// find first certificate in store
fAcceptCert := CertFindCertificateInStore(fAcceptCertStore, 0, 0,
For mormot.net.async.pas, in TPollAsyncSockets.ProcessRead, because the worker thread terminates after an exception in the TLS code:
@@ -1353,7 +1353,13 @@ begin
not (fFirstRead in connection.fFlags) then
begin
include(connection.fFlags, fFirstRead);
- fOnFirstRead(connection); // e.g. TAsyncServer.OnFirstReadDoTls
+ try
+ fOnFirstRead(connection); // e.g. TAsyncServer.OnFirstReadDoTls
+ except
+ // TLS error -> abort
+ UnlockAndCloseConnection(false, connection, 'ProcessRead OnFirstRead');
+ exit;
+ end;
end;
repeat
if fRead.Terminated or
There is one problem with OpenSSL on Linux.
When OpenSSL writes to a socket whose other end is already closed, a SIGPIPE is generated and the process is terminated.
It's because OpenSSL uses "send" without MSG_NOSIGNAL.
The CURL authors suggest changes to the default BIO:
https://github.com/openssl/openssl/pull/17734
I see two options:
1. Add fpSignal(SIGPIPE, SIG_IGN) (see the sketch below) and wait for the OpenSSL changes.
2. Add a custom BIO with MSG_NOSIGNAL, like https://github.com/alanxz/rabbitmq-c/pull/402
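A minimal sketch of option 1 (FPC on Linux, assuming the BaseUnix unit; the exact SIG_IGN cast may vary between FPC versions):
uses
  BaseUnix;

initialization
  // ignore SIGPIPE process-wide: a write to a half-closed socket then
  // returns EPIPE instead of terminating the process
  fpSignal(SIGPIPE, SignalHandler(SIG_IGN));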
Thanks for the merge.
But I see a commit in the history:
https://github.com/synopse/mORMot2/comm … 057af03443
Maybe it's wrong? Git is quite complex.
Initial TLS support:
https://github.com/synopse/mORMot2/pull/92
Next, I want to fix compilation when OPENSSLSTATIC or OPENSSLFULLAPI is defined.
For information, a patch to use OpenSSL 3.0 (as installed on Ubuntu 22.04):
https://gist.github.com/achechulin/adbb … 1810dc78c6
Yesterday, I prepared moving all SSPI/SChannel code to mormot.lib.sspi.
Great, thanks!
ab, can I use mormot.lib.sspi in mormot.net.sock.windows.inc?
And why do we load secur32.dll dynamically? It's available since Windows 95, and on Windows NT 4 with the Directory Service Client installed, and it's always available since Windows 2000.
Sorry, I was not clear enough.
We need some TLS-library-specific context.
For example, for SChannel we open the system certificate store or create a new store in memory and load the certificate from a file. Then we obtain a certificate handle, and then create a credentials handle from those handles.
And for OpenSSL we allocate an SSL_CTX structure, set its options, and load the certificate and the corresponding private key.
We may put the TLS-library-specific context in the connection-specific object (TSChannelClient), or in a server-specific object - an opaque pointer in TNetTlsContext, initialized at server start and copied from the bind socket to the accept socket.
So, let's start with first option.
Next, I will measure server performance, and then we can decide on further development.