Please stay safe, @mpv!
I'm Russian and was shocked when this insane bastard decided to fully open this Pandora's box.
I feel deep shame for our government and all this kleptocracy.
One of my grandfathers was Ukrainian, and his stories about WWII were horrifying.
I hope the invaders get enough bloody arguments to stop this mess.
I never thought that Kiev (where I have been several times) would be bombarded with cruise missiles in the 21st century.
Slava Ukraini! - Heroiam slava! (Glory to Ukraine! - Glory to the heroes!)
You can e-mail me directly (the address is at the end of the library description): eugene.ilyin (at) gmail (dot) com
But right now I'm working on increasing Brotli/Zopfli compression speed and memory reuse.
Maybe we should create a separate mORMotBP2 project on GitHub.
P. S.
Or maybe we can put this library into mORMot, as AB suggested a couple of years ago.
Hm, OK, partial support of the JSON spec for this rare case is good enough.
Anything is better than an unexpected exception for incoming JSONs.
Thanks, fix confirmed; the output is now:
FALSE
FALSE
FALSE
Hi, the recent 1.18.6318:
program Test;

{$APPTYPE CONSOLE}

uses
  SynCommons;

var
  V: Variant;
  VD: TVarData;
begin
  with TDocVariantData(V) do
  begin
    InitJSON('{"":1}');
    Writeln(GetValueIndex('') = 0);
    Writeln(GetVarData('', VD, @StrComp));
    Writeln(AddValueFromText('', 'test') = 0);
  end;
  Readln;
end.
Expected output:
TRUE
TRUE
TRUE
Actual output:
Exception class $C0000005 with message 'access violation at 0x0052573a: read of address 0xfffffffc'. Process Test.exe (11556)
TDocVariantData.GetValueIndex problem place:
...
result := FindNonVoidRawUTF8(pointer(VName),aName,aNameLen,VCount) else
result := FindNonVoidRawUTF8I(pointer(VName),aName,aNameLen,VCount);
[Actual]
FindNonVoidRawUTF8:
for result := 0 to count-1 do // all VName[]<>'' so n^<>0
  if (PStrLen(n^-_STRLEN)^=len) and CompareMemFixed(pointer(n^),name,len) then
FindNonVoidRawUTF8I:
  if (PStrLen(n^-_STRLEN)^=len) and IdemPropNameUSameLen(pointer(n^),name,len) then
[Expected]
FindNonVoidRawUTF8:
for result := 0 to count-1 do
  if ((n^=0) and (len=0)) or
     ((n^<>0) and (PStrLen(n^-_STRLEN)^=len) and CompareMemFixed(pointer(n^),name,len)) then
FindNonVoidRawUTF8I:
  if ((n^=0) and (len=0)) or
     ((n^<>0) and (PStrLen(n^-_STRLEN)^=len) and IdemPropNameUSameLen(pointer(n^),name,len)) then
TDocVariantData.GetVarData problem place:
[Actual]
if (integer(VType)<>DocVariantVType) or not(dvoIsObject in VOptions) or
(VCount=0) or (aName='') then
[Expected]
if (integer(VType)<>DocVariantVType) or not(dvoIsObject in VOptions) or
(VCount=0) then
TDocVariantData.AddValueFromText problem place:
[Actual]
function TDocVariantData.AddValueFromText(const aName,aValue: RawUTF8;
Update, AllowVarDouble: boolean): integer;
begin
if aName='' then begin
result := -1;
exit;
end;
result := GetValueIndex(aName);
[Expected]
// if aName='' then begin
// result := -1;
// exit;
// end;
result := GetValueIndex(aName);
Any request containing an empty name in a JSON object will crash the request-processing thread.
This affects any TDocVariantData access: Exists(aName), NameIndex(aName), AddValue, AddValueFromText, SearchItemByProp, ReduceAsArray, Rename, Delete(aName), GetValueOrDefault, GetValueOrNull, GetValueOrEmpty, GetValueEnumerate, GetAsPVariant, RetrieveValueOrRaiseException, SetValueOrItem, AddOrUpdateValue, GetOrAddIndexByName, GetOrAddPVariantByName, GetPVariantByName, IntSet, DoFunction, Exists, ExistsOrLock, AddExistingPropOrLock, ...
And any JSON iteration, search, etc.
Is an empty name allowed for JSON object members?
Yes, following the spec:
json
    element
elements
    element
    element ',' elements
element
    ws value ws
value
    object
object
    '{' members '}'
members
    member
    member ',' members
member
    ws string ws ':' element    <-- string HERE for the member name
string
    '"' characters '"'          <-- characters HERE for the string value
characters
    ""                          <-- empty content HERE is valid for characters
    character characters
Coverage
No empty-name tests are provided in the sanity checks.
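A sketch of what such a check could look like, mirroring the [Expected] TRUE/TRUE/TRUE behavior of the repro program above (a hypothetical addition in the SynSelfTests style, not an actual test from the suite):

procedure TTestLowLevelTypes.EmptyObjectNames; // hypothetical test method
var
  V: Variant;
  VD: TVarData;
begin
  with TDocVariantData(V) do
  begin
    InitJSON('{"":1}');
    Check(GetValueIndex('') = 0);            // empty name found at index 0
    Check(GetVarData('', VD, @StrComp));     // value retrievable by the empty name
    Check(AddValueFromText('', 'test') = 0); // no -1 rejection, no access violation
  end;
end;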
P. S. Please check the other places where the Name property is assumed not to be empty (I'm not sure I found all the cases).
Yeah, but it was so useful not to generate all this try ... finally nesting; now all libraries and all sources using IAutoFree must be rewritten.
Especially for TSQLRecord and its ancestors.
Fix confirmed, thanks!
Hi, the recent 1.18.6309:
program Test;

{$APPTYPE CONSOLE}

uses
  SynCommons;

var
  R1, R2: TRawUTF8DynArray;
begin
  CSVToRawUTF8DynArray('AA,BB,CC,DD', ',', ',', R1);
  CSVToRawUTF8DynArray('A,B,C,D', ',', ',', R2);
  Writeln(RawUTF8ArrayToCSV(R1));
  Writeln(RawUTF8ArrayToCSV(R2));
  Readln;
end.
Expected output:
AA,BB,CC,DD
A,B,C,D
Actual output:
AA,BB,CC,DD
A,B,C
CSVToRawUTF8DynArray problem place (the loop parsing the CSV input):
Actual:
while offs<length(CSV) do begin
Expected:
while offs<=length(CSV) do begin
SynECC.pas:
TECDHEProtocol.FromKey
CSVToRawUTF8DynArray(c,',','',chain);
mORMot.pas:
TSQLRecordPropertiesMapping.InternalCSVToExternalCSV
CSVToRawUTF8DynArray(CSVFieldNames,Sep,SepEnd,IntFields);
The SynSelfTests.pas TTestLowLevelCommon._UTF8 doesn't cover the listed case:
CSVToRawUTF8DynArray(res,',','',arr);
Check(arr[0]='one');
Check(arr[1]='two');
Check(arr[2]='three');
Finalize(arr);
CSVToRawUTF8DynArray('one=?,two=?,three=?','=?,','=?',arr);
Check(arr[0]='one');
Check(arr[1]='two');
Check(arr[2]='three');
Finalize(arr);
Hi htits2008,
Just get the recent release, follow the instructions, build the Demo project, run the HTTP server locally, open the root page, and see the headers, GZip/Brotli compression, CSP directives, cache behaviour, static assets, etc.
Then play with the options and see the results, or check how the test cases work.
Then try to build one of your favorite front-ends: Angular, React, Vue, Svelte, Solid, VanillaJS; play with SSR and experiment with the mORMot on-the-fly mustache template engine, etc.
Then connect all of it with your favorite reverse proxy like nginx, openhttplite, a CDN, etc.
Feel free to ask if something is not clear.
Yep, k6 is NOT for maximum-performance testing.
k6 is for complex scenarios, to check performance degradation/increase relative to previous runs.
That's why it's in my "Class B" for the speed tests.
As for the raw HTTP query flow, "Class A" exists (really, try Oha: after 10-12 Oha runs you will love its console-like UI and performance measurements. Oha is written in Rust - very stable, small, fast, and predictable; the same run sequences show the same performance results with a very low deviation).
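For example, a quick Oha run roughly equivalent to the k6 "hello world" below would be (flags quoted from memory - check oha --help):
oha -c 256 -n 1000000 http://127.0.0.1:8080/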
@mpv, you can easily migrate to k6.
It's a Swiss Army knife for modern complex load-testing scenarios:
Asserts, Thresholds, HTTP/1.1, HTTP/2, WebSocket, gRPC, Cookies, Crypto, Custom metrics, Encodings, Environment variables, JSON, HTML forms, files.
Configuration-as-a-script, and scenarios are very pleasant to write.
Not tied to Linux only.
Here is a "hello world" HTTP load test under Windows:
import http from "k6/http";

export default function() {
  let response = http.get("http://127.0.0.1:8080/");
}
k6.exe run -u 256 -i 1000000 script.js
My favorite HTTP benchmark/performance-testing tools used for mORMot:
Class A
Minimal resource consumption per worker, maximum parallel threads, precise measurements, fast start per instance.
Oha (the best one for resource usage, quality, parallel measurements, reporting, and UI)
Class B
Class C
Class D
The others are listed in awesome-http-benchmark.
@ab,
Please consider another alternative fix for recent Delphi compilers: a Managed Record inside TAutoFree.
One out-of-the-box benefit is the reverse auto-invocation order of Finalize() in the routine's epilogue.
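A minimal sketch of the idea, assuming Delphi 10.4+ custom managed records (TAutoFreeRec is a hypothetical name):

type
  TAutoFreeRec = record
    Instance: TObject;
    class operator Initialize(out Dest: TAutoFreeRec);
    class operator Finalize(var Dest: TAutoFreeRec);
  end;

class operator TAutoFreeRec.Initialize(out Dest: TAutoFreeRec);
begin
  Dest.Instance := nil; // safe default: Free on nil is a no-op
end;

class operator TAutoFreeRec.Finalize(var Dest: TAutoFreeRec);
begin
  Dest.Instance.Free; // the compiler calls this in the routine epilogue, in reverse declaration order
end;

Each local TAutoFreeRec then frees its Instance automatically at scope exit, without any try ... finally nesting.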
Small amendment:
Following the Naming Conventions and Section 3.4 of the Object Pascal Style Guide, which is a base for both Delphi and Free Pascal:
Except for reserved words and directives, which are in all lowercase, all Pascal identifiers should use InfixCaps, which means the first letter should be a capital, and any embedded words in an identifier should be in caps, as well as any acronym that is embedded
And
Method names should use the InfixCaps style. Start with a capital letter, and capitalize the first letter of any subsequent word in the name, as well as any letters that are part of an acronym. All other characters in the name are lower case.
But as for me the Kotlin approach above looks more balanced.
*REST* became *Rest*, TJWT* into TJwt* and TSQL* into TSql*
Ok, I will print "Id and EntityId, not ID and EntityID!" on A4 and meditate on it for an hour per day.
Unfortunately, the Object Pascal Style Guide and the Free Pascal guidelines are not clear about acronym notation in names.
Maybe the Kotlin (successor of all Android dev) balance is the best approach?
When using an acronym as part of a declaration name, capitalize it if it consists of two letters (IOStream); capitalize only the first letter if it is longer (XmlFormatter, HttpInputStream).
Ok, here is a typical case below.
Which is better for fast and accurate reading/re-understanding of your own or another developer's old code?
function TMyClass.DoSomething(const AValueA, AValueB: Utf8; const APosition: Integer): Utf8;
var
Index: Integer;
StrA, StrB: Utf8;
...
function TMyClass.DoSomething(const AValueA, AValueB: RawUTF8; const APosition: Integer): RawUTF8;
var
Index: Integer;
StrA, StrB: RawUTF8;
...
function TMyClass.DoSomething(const AValueA, AValueB: UTF8String; const APosition: Integer): UTF8String;
var
Index: Integer;
StrA, StrB: UTF8String;
...
function TMyClass.DoSomething(const AValueA, AValueB: UTF8Str; const APosition: Integer): UTF8Str;
var
Index: Integer;
StrA, StrB: UTF8Str;
...
IMHO, RawUTF8 is OK; Utf8 is too short and ugly, and could hamper fast code reading.
Maybe the suggested alternative UTF8String (or UTF8Str) is long enough, but is it worth refactoring the code and libraries?
Also, it would not be good at all to introduce another mORMot.UTF8String/System.UTF8String name collision with the standard type.
It is easy to mess things up when you use both: some Delphi libraries expecting System.UTF8String and some mORMot UTF8String routines in the same code.
From the other point of view: the introduction of mORMot.UTF8String would be a good protection barrier to throw out libraries which work with System.UTF8String.
P. S.
Btw (if we talk about such core changes in naming),
I'm very allergic to ab's CamelCase notation for almost all abbreviations, all these: Html, Http, Utf, Xml, Json, etc. in places where HTTP, XML, UTF, JSON are expected... brrrrrr... (thanks that SQL is not cursed to Sql). I wonder: is it because it's hard to keep the Shift key pressed while typing, or does some code post-processor trash it out before commit?
With all respect, fixing this would improve readability/predictability more than the RawUTF8 (hm... RawUtf8) renaming.
What do you think? Maybe it's a separate topic to discuss.
@damiand,
There is an easy way to avoid this mORMotHTTPServer exception on FPC:
Detach DBServer/DBServers from your HTTPServer before destruction, like so:
{$IFDEF FPC} HTTPServer.RemoveServer(Server); {$ENDIF}
I've updated the Demo sample; please see this small amendment here.
No exceptions on FPC 3.2.0 / Lazarus 2.0.10 / mORMot 1.18.6186 on Win64.
Please let me know if the issue is gone or still exists on the latest mORMot revision once you add this line to the end of your source code.
@vlad,
If you have a single project (not a dynamically generated bunch of projects),
please use RFG - Real Favicon Generator to generate all the necessary favicons.
This service is free and generates a state-of-the-art set of required icons, aligned with the most recent guidelines.
Then you can pack all the favicons into a single file, pre-compute GZipped and Brotli versions of each asset at the maximum available compression, and make all clients' browsers happy.
To embed the necessary HTML tags into the generated HTML template, you can use the mORMot mustache template engine with an isolated favicons.partial file:
{{#favicons}}
<link rel="image_src" type="image/png" href="/apple-touch-icon-1024x1024.png"/>
<link rel="apple-touch-icon" href="/apple-touch-icon-1024x1024.png">
<link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png">
<link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png">
<link rel="mask-icon" href="/safari-pinned-tab.svg" color="{{{themeColor}}}">
<meta name="msapplication-TileColor" content="{{{themeColor}}}">
<meta name="theme-color" content="{{themeColor}}">
{{/favicons}}
Following caching best practices, I recommend embedding a hash version into the file names and stripping the hash when the asset is requested (see the lookup sketch after the header examples below):
<link rel="image_src" type="image/png" href="/apple-touch-icon-1024x1024.7a6ed56a.png">
<link rel="apple-touch-icon" href="/apple-touch-icon-1024x1024.7a6ed56a.png">
<link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.19f7af26.png">
<link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.29731e46.png">
<link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.686f4164.png">
<link rel="mask-icon" href="/safari-pinned-tab.77664866.svg" color="{{{themeColor}}}">
<meta name="msapplication-TileColor" content="{{{themeColor}}}">
<meta name="theme-color" content="{{themeColor}}">
If you do so, you can make such assets immutable:
Cache-Control: public, no-transform, max-age=31536000, immutable
Or when a CDN/NGINX reverse proxy is used:
Cache-Control: public, no-transform, max-age=31536000, immutable, stale-while-revalidate=2592000, stale-if-error=2592000
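As for stripping the hash on lookup, something along these lines could work (a minimal sketch; StripAssetHash is a hypothetical helper, and the 8-hex-digit hash segment is an assumption matching the examples above):

uses
  SysUtils;

// '/favicon-32x32.29731e46.png' -> '/favicon-32x32.png'
function StripAssetHash(const Url: string): string;
var
  LastDot, PrevDot, I: Integer;
  Hash: string;
begin
  Result := Url;
  LastDot := LastDelimiter('.', Url); // dot before the extension
  if LastDot = 0 then
    Exit;
  PrevDot := LastDelimiter('.', Copy(Url, 1, LastDot - 1)); // dot before the hash
  if PrevDot = 0 then
    Exit;
  Hash := Copy(Url, PrevDot + 1, LastDot - PrevDot - 1);
  if Length(Hash) <> 8 then
    Exit; // not an 8-hex-digit hash segment
  for I := 1 to Length(Hash) do
    if not CharInSet(Hash[I], ['0'..'9', 'a'..'f']) then
      Exit;
  Delete(Result, PrevDot, LastDot - PrevDot); // drop '.29731e46'
end;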
To pre-compress, compute ETag hashes, embed all the icon assets as a single resource right into the executable, and then distribute them, you can use mORMotBP, which derives directly from TSQLHttpServer.
Add TBoilerplateHTTPServer.OnGetAsset for external, computable assets and files, with custom redirection support.
This is useful for:
Dynamically computed content.
Custom redirects for any request: now you can run project migrations, A/B testing, synthetic/deprecated link support, etc.
Precaching rarely updated dynamic content, compressed with GZip, Zopfli, or Brotli.
Customizing the transfer of any stored files, or saving some dynamic content to files and delegating the transfer to the low-level HTTP API.
Change TAsset.Timestamp type to TUnixTime for better performance and modification checks
Add TAsset.Clear and TAsset.Assign methods
@ab
What do you think of making a list of colliding types in mORMot?
We can find all the collisions between unit interface-section identifiers in two different ways (please let me know if you find an easier way to do it):
Deep-crawl the Embarcadero site for all identifiers (not sure that every identifier is present in the docs):
http://docwiki.embarcadero.com/Librarie … /Unit_List
http://docwiki.embarcadero.com/Librarie … /en/System
http://docwiki.embarcadero.com/Librarie … em.Actions
http://docwiki.embarcadero.com/Libraries/Sydney/en/Vcl
http://docwiki.embarcadero.com/Librarie … /en/Winapi
http://docwiki.embarcadero.com/Libraries/Sydney/en/Xml (maybe)
Deep-crawl the Free Pascal site for all identifiers by the following paths (again, not sure that every identifier is present in the docs):
https://www.freepascal.org/docs-html/cu … index.html
https://www.freepascal.org/docs-html/cu … index.html
https://www.freepascal.org/docs-html/cu … index.html
Deep-crawl the mORMot API reference site (is this reference up to date, auto-generated, and equal to the recent fossil version?):
https://synopse.info/files/html/Synopse … _FRAMEWORK
Extract all identifiers in the interface sections of all the listed units and build the collision list.
Or parse all *.pas files of the Embarcadero / Free Pascal / mORMot sources with one of the following Delphi language AST builders:
Extract all identifiers from the AST interface-section nodes and build the collision list.
P. S.
I agree with the other developers that the SynTrim naming is short, elegant, collision-free, and does not require conditional defines or other magic to use.
As for methods with fully identical signatures, the replacement with more efficient code can keep the same name.
Small amendments related to the names of functions (not classes, class procedures, etc.).
mORMot is not alone when you develop a project;
I have to write a lot of code mixing string and RawUTF8 types.
And because in the uses section the RTL units (like SysUtils) always go before any libraries (like SynCommons), it's very annoying to write SysUtils.Trim() everywhere strings are used.
I'll be happy to see distinct prefixes/postfixes in mORMot for all collisions with the standard Delphi utils (like SynTrim, TrimU, etc.).
As for classes, a distinct prefix like TSyn or TSy would be good for recognition and for avoiding collisions with any other libraries and frameworks (common names like TService collide very easily if you have 2-3 libraries/services working with TService from mORMot, TService from a library, and your own domain core class TCustomService; so TSynService looks better, IMHO).
Thanks Pavel, it's a great alternative to the default MIME types list!
Yep,
it was an interesting challenge to fetch, parse, combine, deduplicate, and prioritize ~2000 IANA documents, merge them with the Apache and Mozilla versions, and provide them as an O(1) hash list of TSynNameValue values.
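The lookup structure itself is simple; a minimal sketch of the O(1) idea with SynCommons' TSynNameValue (the actual mORMotBP list is pre-built from the merged IANA/Apache/Mozilla data), e.g.:

var
  MimeTypes: TSynNameValue;
begin
  MimeTypes.Init({aCaseSensitive=}false); // hash-indexed name/value list
  MimeTypes.Add('json', 'application/json');
  MimeTypes.Add('woff2', 'font/woff2');
  Writeln(MimeTypes.Value('JSON')); // application/json - no linear scan over ~1490 entries
end.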
Upgrade Assets to HTML Boilerplate v8.0.0
Align options with Apache Server Configs v4.0.0
Upgrade Brotli compression to v1.0.9
Support dynamic Brotli compression (saves about 25% CPU usage compared to GZip on 64-bit systems, with 10% less delivery time and traffic utilization).
Add support for 1490 MIME type file extensions from IANA, Apache, and Mozilla (see BoilerplateAssets.KnownMIMETypes as an alternative to SynCommons.GetMimeContentType)
bpoDelegateUnauthorizedTo404 sets the content for the HTTP 401 "Unauthorized" response code equal to /404
bpoDelegateNotAcceptableTo404 sets the content for the HTTP 406 "Not Acceptable" response code equal to /404
bpoDelegateHidden blocks access to all hidden files and directories, except for the visible content within the /.well-known/ hidden directory
bpoDisableTRACEMethod prevents TRACE requests from being made via JavaScript
TStrictSSL supports strictSSLIncludeSubDomainsPreload
New DNSPrefetchControl property controls DNS prefetching
TAssets.SaveToFile now forces the file's directories before saving and returns a boolean success value
assetslz no longer stores compressed content if its size is greater than the size of the identity content (which prevents unnecessary bundle growth)
Sbj, thanks
Thanks
Hi, all sub-folder searches in FindFiles fail under Windows.
The bug is in the SynCommons.pas SearchRecValidFolder function:
function SearchRecValidFolder(const F: TSearchRec): boolean;
begin
result := (F.Attr and (faDirectory {$ifdef MSWINDOWS}and faHidden{$endif})=faDirectory) and
(F.Name<>'') and (F.Name<>'.') and (F.Name<>'..');
end;
Due to faDirectory = 16 and faHidden = 2, under Windows we get faDirectory and faHidden = 0, so all sub-folders are skipped by the search.
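The intended mask was presumably faDirectory or faHidden, so hidden folders are excluded while regular sub-folders still match; a sketch of the likely fix (see the pull request for the actual change):

function SearchRecValidFolder(const F: TSearchRec): boolean;
begin // faDirectory or faHidden = 18: matches directories which are not hidden
  result := (F.Attr and (faDirectory {$ifdef MSWINDOWS}or faHidden{$endif})=faDirectory) and
    (F.Name<>'') and (F.Name<>'.') and (F.Name<>'..');
end;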
Please merge the fix: https://github.com/synopse/mORMot/pull/336
@macfly,
No, I think the issue is that ab is busy now with the v2 migration.
No rush - just pinging him from time to time.
#307 is fully backward compatible and solves two issues:
1. The hardcoded Accept: */*
2. Adding QueryDataAvailable before ReadData (see the separate forum topic related to it)
Both changes are minor (a couple of lines) and related only to the Windows API.
"I believe the mORMot HTTP clients are not very flexible to be used outside the ORM."
Let me disagree with you: mORMot HTTP client requests are good for feed fetching, cancellation of long requests, Brotli compression support, and many other applications beyond a simple call to a remote REST endpoint.
I can solve almost anything with inheritance, except hardcoded values or the missing QueryDataAvailable call, which is required.
#307 just fixes this without any regression or changes in app code.
pull@request:~307$ tracert ab
Tracing route to Arnaud Bouchez [62.210.254.173]
over a maximum of 3 hops:
1 72 ms 72 ms 72 ms 51.158.8.71
2 72 ms 71 ms 71 ms 51.158.8.83
3 67 ms 67 ms 67 ms 62-210-254-173.rev.poneytelecom.eu [62.210.254.173]
Trace complete.
Hm...
Hi ab,
Do you need any additional changes from my side to accept #307?
It changes just a couple of lines:
Before:
  Bytes := InternalReadData(tmp,0);
  if Bytes=0 then
    break;
After:
  Bytes := InternalQueryDataAvailable;
  if Bytes=0 then
    break;
  Bytes := InternalReadData(tmp,0,Bytes);
  if Bytes=0 then
    break;
All SynSelfTests pass.
There is no impact on performance, and TWinHttpAPI.InternalReadData will no longer freeze during data feed fetching.
Checked for all three cases ("OnDownload", "fetch with Content-Length", and "fetch without Content-Length") for both implementations: TWinHTTP and TWinINet.
Hi,
I have an issue with data feed fetching from different APIs
with the current TWinHttpAPI.OnDownload implementation.
The issue is that data chunks are not provided to the client
immediately when delivered over the network, but get stuck in an
almost infinite ReadData wait.
To demonstrate the issue, let's create a simple server which
responds with 5 separate chunks, one per second, and a
simple TWinHttpAPI client that shows the received chunks on the console.
Minimal code to reproduce (sorry for the formatting; it's not trivial to show server and client code and avoid ab's long-code warning):
program WinHttpAPIFeed;
{$APPTYPE CONSOLE}
{$I 'Synopse.inc'}
uses
  SynCommons, SynCrtSock;

type
  THttpServerFeed = class(THttpServer)
    procedure Process(ClientSock: THttpServerSocket; ConnectionID: THttpServerConnectionID;
      ConnectionThread: TSynThread); override;
  end;
  TClientFeed = class
    function Download(Sender: TWinHttpAPI; CurrentSize, ContentLength, ChunkSize: Cardinal;
      const ChunkData): boolean;
  end;

procedure THttpServerFeed.Process(ClientSock: THttpServerSocket;
  ConnectionID: THttpServerConnectionID; ConnectionThread: TSynThread);
var Index: Integer;
begin
  with ClientSock do begin
    SockSend('HTTP/1.1 200 OK'#$D#$A'Transfer-Encoding: chunked'#$D#$A); TrySockSendFlush;
    for Index := 1 to 5 do begin
      Sleep(1000); SockSend('10'#$D#$A'{"data":"feed"}'#$A); TrySockSendFlush;
    end;
    SockSend('0'#$D#$A#$D#$A); TrySockSendFlush; KeepAliveClient := False;
  end;
end;

function TClientFeed.Download(Sender: TWinHttpAPI; CurrentSize, ContentLength,
  ChunkSize: Cardinal; const ChunkData): boolean;
var Content: RawByteString;
begin
  FastSetString(RawUTF8(Content), Pointer(@ChunkData), ChunkSize);
  Write(Content); Result := True;
end;

var
  Server: THttpServerFeed; // socket-based THttpServer descendant, bound to the port in its constructor
  Client: TWinHttpAPI;
  ClientFeed: TClientFeed;
  OutHeader, OutData: SockString;
begin
  TAutoFree.Several([@Server, THttpServerFeed.Create('9000', nil, nil, '', 1),
    @Client, TWinHTTP.Create('http://localhost:9000'), @ClientFeed, TClientFeed.Create]);
  Client.OnDownload := ClientFeed.Download;
  Client.Request('/', 'GET', 0, '', '', '', OutHeader, OutData);
  Writeln('Done'); Readln;
end.
In real life, feeds can stay open much longer than 5 seconds,
with data chunks arriving from time to time.
If you run the sample, OnDownload occurs only once, after the whole
5-second period, with all the data fetched at once (because the connection
is closed), but not during each chunk delivery.
The problem is that QueryDataAvailable is not called before ReadData,
as shown in all Microsoft API code samples for ReadData.
Because we do not request the currently available data before the ReadData
call, we get an infinite freeze until the connection closes. This makes the
chunk-by-chunk fetching used in feed APIs impossible.
You can check that Windows HTTP API usage expects QueryDataAvailable
to always be called before ReadData, to report the current number
of bytes available to read and to prevent reads from freezing.
You can check the WinHttpQueryDataAvailable / WinHttpReadData samples used by TWinHTTP,
or check InternetQueryDataAvailable used by TWinINet.
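For reference, the documented read loop looks roughly like this (a sketch following the MSDN WinHttpQueryDataAvailable sample; the exact Pascal binding signatures are assumed, error handling elided):

var
  Available, BytesRead: Cardinal;
  Buffer: RawByteString;
begin
  repeat
    // ask how many bytes can be read right now - returns when a chunk arrives
    if not WinHttpQueryDataAvailable(hRequest, @Available) then
      break;
    if Available = 0 then
      break; // end of response
    SetLength(Buffer, Available);
    // read only what is available, so the call never blocks until connection close
    if not WinHttpReadData(hRequest, Pointer(Buffer)^, Available, @BytesRead) then
      break;
    // process BytesRead bytes of Buffer here (e.g. fire OnDownload)
  until false;
end;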
@ab, I've added a fully backward-compatible minor amendment to pull request #307 to fix this;
it makes OnDownload fetch feed chunks as expected.
Hi ab,
As for now, both Windows-related HTTP client classes (TWinINet, TWinHTTP)
add a hard-coded Accept: */* header in their InternalCreateRequest methods.
It's OK for most cases, but when I try to customize the Accept header in my
GET / POST requests, the WinHTTP library appends the new value
to this */* instead of replacing it.
As a result, I'm unable to set the Accept header to the exact required value:
the */* prefix is always hardcoded.
InHeaders := 'Accept: application/x-json-stream';
...Request(..., InHeaders, ...)
Expected HTTP Packet header:
Accept: application/x-json-stream
Actual HTTP Packet header:
Accept: */*, application/x-json-stream
To keep full backward compatibility with the current implementation,
the TWinHttpAPI.NoAllAccept property was added, giving the ability
to exclude this hardcoded */* value.
Please check pull request #307.
Hi sakura,
Seems like HTTP 402 is reserved as per RFC 7231:
6.5.2. 402 Payment Required
The 402 (Payment Required) status code is reserved for future use.
And in real practice this HTTP code has very unclear semantics:
402 Payment Required
Reserved for future use. The original intention was that this code might be used as part of some form of digital cash or micropayment scheme, as proposed, for example, by GNU Taler, but that has not yet happened, and this code is not widely used. Google Developers API uses this status if a particular developer has exceeded the daily limit on requests. Sipgate uses this code if an account does not have sufficient funds to start a call. Shopify uses this code when the store has not paid their fees and is temporarily disabled. Stripe uses this code for failed payments where parameters were correct, for example blocked fraudulent payments.
Maybe it would be better to revert #303 - what do you think?
Thanks, ab.
Hi ab,
Just a couple of lines with minor amendments (removing some compiler warnings in stubs, plus one additional constant):
HTTP_PARAMETER_REQUIRED = 402;
function TOpenSSLConnectionClient.Connect(...): boolean;
begin
result := false;
end;
function TOpenSSLConnectionClient.SecureWrite(...): boolean;
begin
result := false;
end;
Good point!
I was thinking about how to integrate this check and prevent such leaks in the future, and remembered that SynSelfTests checks for memory leaks on completion.
Thanks
Thanks ab,
All allocated memory is now released properly after the 1.18.5944 release.
Hi Daniel,
The issue is in the construction/destruction sequence:
For the given creation sequence:
CreateModel
CreateDataBase
CreateRestServer
It's better to destruct everything in reverse order:
DestroyRestServer
DestroyDataBase
DestroyModel
So changing
DB.Free;
Rest.Free;
to
Rest.Free;
DB.Free;
will finalize/destroy/unassign all properties in the proper sequence, and the leaks are gone.
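For example (a sketch with typical mORMot sample classes; TSQLMyRecord is a placeholder):

Model := TSQLModel.Create([TSQLMyRecord]); // 1. create the model
Rest := TSQLRestServerDB.Create(Model, 'data.db3'); // 2. create the REST server (a separate DB object follows the same rule)
try
  // ... work with Rest ...
finally
  Rest.Free; // destroy in reverse order of creation
  Model.Free;
end;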
Thanks macfly,
The issue is in internal buffer management of TTextWriter.
When the data is bigger than the provided external stack-allocated buffer, the TTextWriter code switches to heap memory allocation
in SynCommons:55649:
if twoBufferIsExternal in fCustomOptions then // use heap, not stack
  exclude(fCustomOptions,twoBufferIsExternal) else
  FreeMem(fTempBuf); // with big content comes bigger buffer
GetMem(fTempBuf,fTempBufSize);
As you can see, the twoBufferIsExternal flag is excluded from fCustomOptions to indicate that the memory must be released later, in TTextWriter.Destroy or during future internal buffer reallocations.
But! Externally, the TDocVariant.ToJSON code restores fCustomOptions from some strange backup variable (?).
As a result, the twoBufferIsExternal flag is brought back into fCustomOptions, and all the heap memory allocated by TTextWriter is never released.
SynCommons:48215
W.Add(']');
end;
W.fCustomOptions := backup;
end else
There are two possible fixes. Either restore all flags from this backup variable except twoBufferIsExternal:
W.fCustomOptions := backup - [twoBufferIsExternal] + W.fCustomOptions * [twoBufferIsExternal];
Or refactor the TDocVariant.ToJSON code so that the external TDocVariant class does not use the protected section of the TTextWriter class.
As of now, any JSON request, serialization, or data transfer larger than ~40KB (~4KB gzipped) produces memory leaks on the server.
Hi,
I have an issue with any JSON serialization for content sizes above ~40KB.
Minimal code to reproduce:
program ToJSONMemoryLeak;

{$APPTYPE CONSOLE}

uses
  SynCommons;

var
  Index: Integer;
  JSON: RawUTF8;
  V: Variant;
begin
  ReportMemoryLeaksOnShutdown := True;
  // Let's build some DocVariant array of objects [{...}, {...}, ...]
  JSON := '[';
  for Index := 1 to 1400 do
    JSON := JSON + '{"name":"value","prop":false},';
  JSON[Length(JSON)] := ']';
  V := _Json(JSON);
  TDocVariantData(V).ToJSON; // Houston, we have a problem
end.
On the recent 1.18.5940 (for both 32-bit and 64-bit platforms) on Delphi 10.3 Rio I have:
Unexpected Memory Leak
An unexpected memory leak has occurred. The sizes of unexpected leaked medium and large blocks are: 16424
It seems a regression in the memory management or JSON serialization structures was introduced.
Besides the D5 (maybe D6) drop, can we also drop Kylix support (due to FPC domination)? Writing a library for the mORMot framework with Kylix support (because mORMot supports it) is not fun.
SemVer is perfect (hope that 2.* will live as long as 1.18 has).
Please keep syncing with git - this is the easiest way to have your own git branches with the required framework tuning.
Some "what's new" section/blog/file would be nice (maybe as git releases with notes), to avoid analyzing the sources for changes over the last couple of months just to see which new functions were created/updated.
Speed techniques for different aspects of the framework: mORMot has 4-8 ways to do the same work, and I always have a feeling that my calls/usage are not optimal for my cases. Of course, some function docs have "good/better/best" hints, but not for all day-by-day calls: string manipulation, search, arrays, JSON, TDocVariantData, etc.
Thanks, works as planned.
Please check #274 with JSONToVariant added.
"Please follow the forum rules.
Don't post such huge pieces of code in the forum."
Sorry, the code is small; I just added proc annotations to save your time if you plan to add it.
The primary difference from VariantLoadJSON() is that it handles all RFC 8259 types in the same manner, including parsing of objects and arrays, like '{}' and '[]', '[1e300]', or '{"n": -1e-300}':
V := _JSONStrict('{}'); // TDocVariantData(V).Kind = dvObject
V := VariantLoadJSON('{}'); // TDocVariantData(V).VarType = varNull
V := _JSONStrict('[]'); // TDocVariantData(V).Kind = dvArray
V := VariantLoadJSON('[]'); // TDocVariantData(V).VarType = varNull
V := _JSONStrict('[1e300]'); // TVarData(V._(0)).VType = varDouble;
V := VariantLoadJSON('[1e300]'); // TDocVariantData(V).VarType = varNull
V := _JSONStrict('{"n": -1e-300}'); // TVarData(V._(0)).VType = varDouble;
V := VariantLoadJSON('{"n": -1e-300}'); // TDocVariantData(V).VarType = varNull
Anyway, if you don't see a reason to add a unified parser for all RFC 8259 types, that's OK - I will use it internally.
I tried to make it more mORMot`ish and removed all the intermediate procs (now this code contains the fastest available calls). Please confirm.
interface
/// retrieve a variant value from a JSON as per RFC 8259, RFC 7159, RFC 7158
// - follows TTextWriter.AddVariant() format (calls GetVariantFromJSON)
// - will instantiate either an Integer, Int64, currency, double or string value
// (as RawUTF8), guessing the best numeric type according to the textual content,
// and string in all other cases, except TryCustomVariants points to some options
// (e.g. @JSON_OPTIONS[true] for fast instance) and input is a known object or
// array, either encoded as strict-JSON (i.e. {..} or [..]), or with some
// extended (e.g. BSON) syntax
// - warning: by default dvoAllowDoubleValue is set and 32-bit floating-point
// conversion is tried, with potential loss of precision during the conversion
// - warning: the JSON buffer will be modified in-place during process - use
// a temporary copy or the overloaded functions with RawUTF8 parameter
// if you need to access it later
function _JsonStrictInPlace(const JSON: PUTF8Char;
Options: TDocVariantOptions = [dvoReturnNullForUnknownProperty];
const AllowDouble: Boolean = True): Variant;
{$ifdef HASINLINE}inline;{$endif}
/// retrieve a variant value from a JSON as per RFC 8259, RFC 7159, RFC 7158
// - follows TTextWriter.AddVariant() format (calls GetVariantFromJSON)
// - will instantiate either an Integer, Int64, currency, double or string value
// (as RawUTF8), guessing the best numeric type according to the textual content,
// and string in all other cases, except TryCustomVariants points to some options
// (e.g. @JSON_OPTIONS[true] for fast instance) and input is a known object or
// array, either encoded as strict-JSON (i.e. {..} or [..]), or with some
// extended (e.g. BSON) syntax
// - this overloaded procedure will make a temporary copy before JSON parsing
// and return the variant as result
// - warning: by default dvoAllowDoubleValue is set and 32-bit floating-point
// conversion is tried, with potential loss of precision during the conversion
function _JsonStrict(const JSON: RawUTF8;
Options: TDocVariantOptions = [dvoReturnNullForUnknownProperty];
const AllowDouble: Boolean = True): Variant;
{$ifdef HASINLINE}inline;{$endif}
/// retrieve a variant value from a JSON as per RFC 8259, RFC 7159, RFC 7158
// - this global function is an handy alias to:
// ! _JsonStrict(JSON,JSON_OPTIONS_FAST,AllowDouble);
// - follows TTextWriter.AddVariant() format (calls GetVariantFromJSON)
// - will instantiate either an Integer, Int64, currency, double or string value
// (as RawUTF8), guessing the best numeric type according to the textual content,
// and string in all other cases, except TryCustomVariants points to some options
// (e.g. @JSON_OPTIONS[true] for fast instance) and input is a known object or
// array, either encoded as strict-JSON (i.e. {..} or [..]), or with some
// extended (e.g. BSON) syntax
// - this overloaded procedure will make a temporary copy before JSON parsing
// and return the variant as result
// - warning: by default dvoAllowDoubleValue is set and 32-bit floating-point
// conversion is tried, with potential loss of precision during the conversion
function _JsonStrictFast(JSON: RawUTF8;
const AllowDouble: Boolean = True): Variant;
{$ifdef HASINLINE}inline;{$endif}
implementation
function _JsonStrictInPlace(const JSON: PUTF8Char; Options: TDocVariantOptions;
const AllowDouble: Boolean): Variant;
var wasString: boolean;
Val, Dest: PUTF8Char;
begin
if JSON = nil then
ZeroFill(@result) // varEmpty
else if IdemPChar(JSON, 'NULL') then
TVarData(result).VType := varNull
else begin
Dest := JSON;
if AllowDouble then
Dest := TDocVariantData(result).InitJSONInPlace(
Dest, Options + [dvoAllowDoubleValue])
else
Dest := TDocVariantData(result).InitJSONInPlace(Dest, Options);
if Dest = nil then begin
Dest := JSON;
Val := GetJSONField(Dest,Dest,@wasString);
GetVariantFromJSON(Val,wasString,result,nil,AllowDouble);
end;
end;
end;
function _JsonStrict(const JSON: RawUTF8; Options: TDocVariantOptions;
const AllowDouble: Boolean): Variant;
var tmp: TSynTempBuffer;
begin
tmp.Init(JSON); // temp copy before in-place decoding
try
Result := _JsonStrictInPlace(tmp.buf, Options, AllowDouble);
finally
tmp.Done;
end;
end;
function _JsonStrictFast(JSON: RawUTF8; const AllowDouble: Boolean): Variant;
begin
Result := _JsonStrict(JSON, JSON_OPTIONS_FAST, AllowDouble);
end;
ab, thanks for the hints!
After analyzing the sources of the provided functions, I think I found how to handle RFC 8259 JSONs received from third-party vendor APIs with mORMot (with respect to all the peculiarities related to Double values).
Please find the code in the next topic reply.
Here are the results of _JsonStrict, _JsonStrictFast, _JsonStrictInPlace:
var
V: Variant;
begin
V := _JsonStrict('null'); // TDocVariantData(V).VarType = varNull
V := _JsonStrict('false'); // TDocVariantData(V).VarType = varBoolean
V := _JsonStrict('true'); // TDocVariantData(V).VarType = varBoolean
V := _JsonStrict('-0'); // TDocVariantData(V).VarType = varInteger
V := _JsonStrict('"t1 \r\n t2"'); // TDocVariantData(V).VarType = varString
V := _JsonStrict('-1E-300'); // TDocVariantData(V).VarType = varDouble
V := _JsonStrict('{}'); // TDocVariantData(V).Kind = dvObject
V := _JsonStrict('[]'); // TDocVariantData(V).Kind = dvArray
V := _JsonStrict(''); // TDocVariantData(V).VarType = varEmpty
V := _JsonStrict('$%#@'); // TDocVariantData(V).VarType = varNull
V := _JsonStrict('{"n": -1E-300}'); // TVarData(V._(0)).VType = varDouble
V := _JsonStrict('9223372036854775807'); // TDocVariantData(V).VarType = varInt64
V := _JsonStrict('9223372036854775808'); // TDocVariantData(V).VarType = varDouble
end;
Maybe you can add this or similar functions to the Framework?
Btw, from JSON.org:
json
    element
element
    ws value ws
value
    object
    array
    string
    number
    "true"
    "false"
    "null"
And they also support the standard exponent notation for numbers, which I get frequently from different 3rd-party APIs.
Hi,
From mORMot documentation:
With _Json() or _JsonFmt(), either a document or array variant instance will be initialized with data supplied as JSON.
The supplied JSON can be either in strict JSON syntax.
Right now mORMot partially supports JSON from the obsolete RFC 4627 (created 13 years ago):
JSON-text = object / array
Are there any plans to support the actual RFC 8259 (created 2 years ago) or RFC 7159 (created 6 years ago):
JSON-text = ws value ws
value = false / null / true / object / array / number / string
false = %x66.61.6c.73.65 ; false
null = %x6e.75.6c.6c ; null
true = %x74.72.75.65 ; true
The changes are not radical, and they extend the current implementation with full backward compatibility (maybe except for 'null', which is currently parsed as '{}').
What is needed is to make the following JSON checks true:
TVarData(_Json('null')).VType = varNull
TVarData(_Json('false')).VType = varBoolean
TVarData(_Json('true')).VType = varBoolean
VarIsStr(_Json('"text"')) = True
VarIsFloat(_Json('0.5')) = True // Currency by default (why not make all floats Double?)
VarIsFloat(_Json('-1E-10')) = True // Single precision
VarIsFloat(_Json('-1e-300')) = True // Double precision
VarIsFloat(_Json('-1E-010')) = True // Single with exponent started from 0
VarIsFloat(_Json('-1e-0300')) = True // Double with exponent started from 0
VarIsOrdinal(_Json('0')) = True // Integer
VarIsOrdinal(_Json('5000000000')) = True // Int64
VarIsOrdinal(_Json('-0')) = True // Integer
VarIsOrdinal(_Json('-5000000000')) = True // Int64
Browsers have no issues with:
console.log(JSON.parse('null'), JSON.parse('false'), JSON.parse('"text"'), JSON.parse('-1E-300'));
Btw, _Json('{"n": 1E3}') or _Json('{"n": -1e-0300}') is a correct RFC 4627 strict JSON, but mORMot parse such correct numbers as string:
number = [ minus ] int [ frac ] [ exp ]
decimal-point = %x2E ; .
digit1-9 = %x31-39 ; 1-9
e = %x65 / %x45 ; e E
exp = e [ minus / plus ] 1*DIGIT
frac = decimal-point 1*DIGIT
int = zero / ( digit1-9 *DIGIT )
minus = %x2D ; -
plus = %x2B ; +
zero = %x30 ; 0
It seems the UnCompressMem routine in PasZip has an issue: it processes only the first 2 bytes of the compressed data, then exits with a success result.
The solution is simple: use the SynZip unit, based on the zlib library, instead of the PasZip unit. It's more recent, better maintained, gives you control over the compression level, and lets you choose the compression method (Deflate, GZip, or ZLib).
You don't need to change anything in your code (the parameter order and types are the same).
program DeflateTest;

{$APPTYPE CONSOLE}

uses
  SynZip,
  // PasZip, // Uncomment to check PasZip CompressMem
  SysUtils;

var
  Index, Size: Integer;
  Data, Compressed, Decompressed: array of AnsiChar;
begin
  SetLength(Data, 1000);
  SetLength(Compressed, 1000);
  SetLength(Decompressed, 1000);
  for Index := Low(Data) to High(Data) do
    Data[Index] := AnsiChar(Ord('A') + Index mod 10);
  Size := CompressMem(Data, Compressed, Length(Data), Length(Compressed));
  Writeln('Compressed size: ', Size);
  Size := UnCompressMem(Compressed, Decompressed, Size, Length(Decompressed));
  Writeln('Decompressed size: ', Size);
  Writeln('Equality: ', CompareMem(Data, Decompressed, Length(Data)));
  Readln;
end.
Don't set the compressed buffer size equal to the source data size.
You must add some extra bytes for the cases where the compressed sequence requires more bytes than the source data (this is possible when you try to compress high-entropy random data, or already well-compressed data like synlz archives, png, pdf, docx, etc.). Additionally, you have to reserve some extra bytes for the compression header overhead.
If you change AnsiChar(65 + j) to AnsiChar(Random(255)), you will see that the compression fails, because the compressed data requires at least 1005 bytes on average (set the compressed buffer to 2000, for example, and check the actual compressed size).
The maximum memory possibly required for (GZip, ZLib, or Deflate) compression is computed in SynZip.pas:5368, in CompressInternal:
procedure CompressInternal(var Data: ZipString; Compress, ZLib: boolean);
...
DataLen: integer;
begin
...
DataLen := length(Data);
...
SetString(Data,nil,DataLen+256+DataLen shr 3); // max mem required
So 12.5% overhead plus an additional 256 bytes for the header (i.e. Size + Size shr 3 + 256) is a good heuristic for the maximum size of any compressed data (it also covers your possible plans to switch from Deflate to GZip):
begin
  SetLength(Data, 1000);
  for Index := Low(Data) to High(Data) do
    Data[Index] := AnsiChar(Ord('A') + Index mod 10);
  Size := Length(Data);
  SetLength(Compressed, Size + Size shr 3 + 256);
  Size := CompressMem(Data, Compressed, Length(Data), Length(Compressed));
  Writeln('Compressed size: ', Size);
  SetLength(Decompressed, Length(Data));
  Size := UnCompressMem(Compressed, Decompressed, Size, Length(Decompressed));
  Writeln('Decompressed size: ', Size);
  Writeln('Equality: ', CompareMem(Data, Decompressed, Length(Data)));
  Readln;
end.