Thank you!
You are right as always.
Hello,
After upgrading to the latest stable version, I found that sufficiently long data encrypted in 1.18 cannot be decrypted in 2.3.8854.
Lazarus 2.0.12; FPC: 3.2.0.
Is it a bug (somewhere in the padding)?
// Data to encrypt: "SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA SOME TEST DATA"
function EncryptAES256CTR(Data: Pointer; DataLength: Int64; const Key: THash256;
  out EncryptedData: RawByteString): Boolean;
var
  AES: TAESCTR;
  EncryptedDataSize: Int64;
begin
  Result := False;
  EncryptedData := '';
  try
    AES := TAESCTR.Create(Key);
    try
      EncryptedDataSize := AES.EncryptPKCS7Length(DataLength, True);
      SetString(EncryptedData, nil, EncryptedDataSize);
      Result := AES.EncryptPKCS7Buffer(Data, @EncryptedData[1], DataLength,
        EncryptedDataSize, True);
    finally
      AES.Free; // runs even if encryption raises, avoiding a leak
    end;
  except
    Result := False;
  end;
end;
// ===
function DecryptAES256CTR(Data: Pointer; DataLength: Int64; const Key: THash256;
  out DecryptedData: RawByteString): Boolean;
var
  AES: TAESCTR;
begin
  Result := False;
  DecryptedData := '';
  try
    AES := TAESCTR.Create(Key);
    try
      DecryptedData := AES.DecryptPKCS7Buffer(Data, DataLength, True);
      Result := True;
    finally
      AES.Free; // runs even if decryption raises, avoiding a leak
    end;
  except
    Result := False;
  end;
end;
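For what it's worth, a round-trip sketch using the two functions above can help isolate whether the failure is in the padding itself or only in cross-version compatibility (the key and data here are arbitrary test values):

```pascal
var
  Key: THash256;
  Plain, Encrypted, Decrypted: RawByteString;
begin
  FillChar(Key, SizeOf(Key), 7); // arbitrary fixed test key
  Plain := StringOfChar('A', 180); // long enough to span many AES blocks
  if EncryptAES256CTR(Pointer(Plain), Length(Plain), Key, Encrypted) and
     DecryptAES256CTR(Pointer(Encrypted), Length(Encrypted), Key, Decrypted) then
    WriteLn('round-trip ok: ', Decrypted = Plain)
  else
    WriteLn('encrypt/decrypt failed');
end;
```

If the round-trip succeeds within a single version but 1.18 ciphertext fails in 2.x, the difference is in the versions' padding or CTR handling rather than in this calling code.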
AB, congratulations on reaching the milestone!
Synopse mORMot 2 is an Open Source Client-Server ORM SOA MVC framework for Delphi 7 up to Delphi 11 Alexandria and FPC 3.2/trunk, targeting Windows/Linux/BSD/MacOS for servers, and any platform for clients (including mobile or AJAX).
Does it mean that mORMot 2 supports the Delphi 11 Linux compiler?
UPD:
'Kylix or Delphi for MacOS/Linux/Mobile are unsupported'
'-> we recommend using FPC for POSIX platforms'
Thanks, I will use a suitable temporary fix.
It would be cool if mORMot v2 were fully Unicode aware (for FPC, including file paths and file operations).
I still use 1.18.
The code below works, but it requires loading the entire file.
{$mode delphi}
var
  FilePathStr: RawUtf8; // 'C:\Program Files (x86)\'#208#161#208#181#209#128#208#178#208#181#209#128'\test.txt'
  FileData: RawByteString;
  Result: RawUtf8;
begin
  FileData := ReadFileToString(FilePathStr);
  Result := MD5(FileData);
end;
So, the only option is to load the file (using UTF-8 ready methods) and then calculate the hash from the buffer (because there is no option to pass a stream)?
The same happens with Utf8ToWinAnsi: Cyrillic symbols become "?".
UTF8ToString(SrcPath) works only when the system encoding is Russian.
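A chunked-hashing sketch that avoids loading the whole file, assuming SynCrypto's TMD5 record exposes Init/Update/Final and that MD5DigestToString exists in your 1.18 sources (verify the exact member names there):

```pascal
// Hypothetical helper: hash a file in 64 KB chunks instead of one big read.
procedure HashFileStreamed(const FilePath: TFileName; out Hash: RawUtf8);
var
  Stream: TFileStream;
  MD5: TMD5;
  Digest: TMD5Digest;
  Buffer: array[0..65535] of Byte;
  Read: Integer;
begin
  Stream := TFileStream.Create(FilePath, fmOpenRead or fmShareDenyWrite);
  try
    MD5.Init;
    repeat
      Read := Stream.Read(Buffer, SizeOf(Buffer));
      if Read > 0 then
        MD5.Update(Buffer, Read); // feed each chunk into the running digest
    until Read <= 0;
    MD5.Final(Digest);
    Hash := MD5DigestToString(Digest);
  finally
    Stream.Free;
  end;
end;
```

The UTF-8 path problem remains the same, though: TFileStream still takes a TFileName, so the path conversion has to happen before opening the stream.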
Hi!
The HashFile method accepts the string type instead of RawUtf8.
I found an issue when the Windows encoding is set to United States and the file path is RawUtf8 (and includes UTF-8 Cyrillic characters).
{$mode delphi}
var
  FilePathStr: TFileName;
  Result: RawUtf8;
begin
  FilePathStr := SrcPath; // 'C:\Program Files (x86)\'#208#161#208#181#209#128#208#178#208#181#209#128'\test.txt'
  // FilePathStr := UTF8ToString(SrcPath);
  // FilePathStr := UTF8ToSys(SrcPath);
  Result := HashFile(FilePathStr, THashAlgo.hfMD5); // empty hash
end;
Compiler options: -FcUTF8
How can I pass RawUtf8 to HashFile?
mORMot 2 looks so pretty, and it seems the SM engine will be included as well.
In mORMot 1.18, in order to get a working SyNode, we must use a separate branch.
How will it be implemented in mORMot 2? Will the SyNode code appear on the master branch?
Thank you mpv!
Pinging unitybase.info reports the same IP, 91.214.182.35.
I believe this is some sort of cross-country restriction.
But I have no idea how it affects the Tor browser as well.
I tested from 4 separate ISPs and 3 different devices on different networks (including a mobile phone).
Unable to connect.
Firefox can't establish a connection to the server at unitybase.info.
I also checked from the Tor browser, just in case (with an IP address from France).
Precompiled binaries can be downloaded here:
- Win x32: https://unitybase.info/media/files/synm … 32dlls.zip
- Win x64: https://unitybase.info/media/files/synm … 64dlls.zip
- Linux x64: https://unitybase.info/media/files/libsynmozjs52.zip
It seems the links are broken.
We should create a "mormot.base.pack" (or mormot.base.run, following the "suffix pattern" in which "run" stands for runtime packages and "dsgn" stands for an IDE-installed package).
Then, create another one just for Zeos: mormot.db.zeos.pack.
The Zeos team should use mormot.base.pack instead of .inc files, conditionals, or paths for mORMot.
So, will Zeos always require mORMot?
mormot.base.pack is the entire framework, not just a common module. Will it affect the exe size?
Currently, I'm not familiar with the new mORMot 2 module naming,
but I can share some thoughts.
In order to compile Zeos with mORMot, Zeos requires some files, for example SynCommons (which requires SynLZ as well).
With those modules, Zeos should compile.
So, you could create mormot2.common.lpk and include the modules that do not require much of your other modules, just the common part like SynCommons and SynLZ (and, definitely, mormot.common.lpk should not rely on Zeos).
In that case, the Zeos team can add a custom package option (like you did); if checked, the user should manually add the mormot.common.lpk dependency to the zcomponent package in order to compile Zeos.
Next, your modules that rely on Zeos should be extracted to a separate package, mormot2.zeos.lpk; this one should require mormot2.common.lpk and the appropriate Zeos packages (zdbc, zcore).
There should be another main package, something like mormot.base.lpk (or mormot.core.lpk, which sounds better); this one will require mormot2.common.lpk (and mormot.cross.lpk, if that one exists in mORMot 2 and is mandatory for mormot.base.lpk; otherwise mormot.cross.lpk may be a standalone package).
When the user compiles mormot.base.lpk with the option "NOSYNDBZEOS" enabled (it would be better to rename it to USESYNDBZEOS), mORMot should compile fine without any relation to Zeos.
When the user compiles mormot.base.lpk with the option "NOSYNDBZEOS" disabled, they must also add the mormot2.zeos.lpk dependency (mormot2.zeos.lpk will automatically require the appropriate Zeos packages, and Lazarus will try to find them).
It would be good to add mormot2.synode.lpk as well (which would require mormot2.common.lpk).
And last, but not least:
every package should be properly versioned (with version data from SynopseCommit.inc).
The Zeos team may also derive the package build number from the commit code. SVN generates numerical revisions; for example, the latest is 7366, so the full package version may be 8.0.7366.
---
Later, when Lazarus 2.2 is released, you can specify for the Zeos package dependencies the min and max versions that should work with mORMot 2 (for example >= 8.0.xxxx).
---
By the way, the conditional defines in zeos.inc allow turning off some drivers, which reduces the resulting exe size (the package must be recompiled).
Such switches are good and should be present in the package "custom options defines" as well; that way the package will be more user-friendly to tune.
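The versioned-dependency idea above could look roughly like the following .lpk fragment. This is an illustrative sketch only: the package names are the hypothetical ones from this discussion, and the exact element and attribute names should be checked against what the Lazarus IDE actually writes when you edit a package's requirements.

```xml
<!-- Hypothetical mormot2.zeos.lpk excerpt: its own version, plus
     versioned dependencies on the common mORMot package and on Zeos -->
<Package Version="4">
  <Name Value="mormot2_zeos"/>
  <Version Major="2" Minor="0" Release="6247"/>
  <RequiredPkgs Count="2">
    <Item1>
      <PackageName Value="mormot2_common"/>
      <MinVersion Major="2" Valid="True"/>
    </Item1>
    <Item2>
      <PackageName Value="zcomponent"/>
      <MinVersion Major="8" Valid="True"/>
      <MaxVersion Major="8" Valid="True"/>
    </Item2>
  </RequiredPkgs>
</Package>
```

With such constraints in place, Lazarus can refuse to build against a Zeos or mORMot revision outside the tested range instead of failing later with unit errors.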
Are you sure that Zeos is using SynCommons in its official repository?
Yes, but it's optional and disabled by default.
Currently they even have mORMot 2 integration, which can be enabled from the inc file.
Soon, Lazarus will get an update with fix 38587; then, I hope, the Lazarus package control feature will work as expected.
If so, AB would be able (if he wants) to rely on these features and create a few packages for FPC users, at least for mORMot 2.
In order to solve circular dependencies, Zeos can optionally rely on some common mORMot package, while the mORMot common package will not require any Zeos packages (even when Zeos integration in mORMot is switched on).
Additionally, every package should have a proper version for each build (in order to work properly with Lazarus).
About multiple versions of a package: I created a ticket, 38587.
It turns out that this issue is a bug, and it may be fixed in Lazarus 2.2 (or in a later minor update).
There is also a workaround available.
git already has everything needed to manage dependencies as sub-repositories; you do not need to create a readme file (which is hard to maintain).
I use a local GitLab instance and just excluded dependencies\mormot from my repo.
Like NPM packages, which are not supposed to be inside the repo. I used the readme just to store some useful tips about the packages in use.
and we do not need a lpk at all
That's the default option for the mORMot framework.
But I don't understand why packages behave so strangely, so I opened an issue; we will see what the Lazarus maintainers say.
mpv, do you use Zeos in the same way, without an lpk?
I couldn't find a manual on how to add Zeos directly.
The Lazarus package system is not as good as NPM, for sure.
I found strange behaviour that looks like a Lazarus bug.
We can put our dependencies in a project subfolder:
dependencies
dependencies\readme.md
dependencies\mormot
In the readme file we can store all the dependency details, like the path to the GitHub repo, the revision (commit ID), etc.
Now, if I open mormot_base.lpk from this subfolder and click "Use" - "Add to project", the package appears in the project inspector as a dependency.
The lpi file indeed does not contain any information about where the package lives:
<Item1>
<PackageName Value="mormot_base"/>
</Item1>
But, from the project inspector, we can right-click on mormot_base and select the option "store file name as default for this dependency".
Save the changes. Now the lpi file contains the proper relative path:
<Item1>
<PackageName Value="mormot_base"/>
<DefaultFilename Value="..\dependencies\mormot\Packages\mormot_base.lpk"/>
</Item1>
But for some reason, this path will not be used as the default.
So if we have another package and repeat all the steps, the mormot_base package will be loaded from the last opened path.
This really looks like a bug or wrong behaviour.
I've checked a second way.
I created a global dependencies folder, holding multiple versions of each package:
lazarus-packages
lazarus-packages\mormot\master\ // <- only this one connected to git repo
lazarus-packages\mormot\1.18.6247\
lazarus-packages\mormot\1.18.6248\
I copied the src files to the proper folders, opened the lpk from each one, and configured the lpk version to match SynopseCommit.inc.
Each time we open an lpk package, Lazarus remembers the package name + version + path (see Menu - Package - Package Links).
I removed the dependencies from the project and re-added them from the project inspector: right-click on dependencies, find mormot_base and specify the "minimum" and "maximum" versions.
Now each project requests an exact mormot_base version, and Lazarus knows where it is (checked Menu - Packages - Package Links).
But for some strange reason, Lazarus can't properly load the requested package for the second project.
I don't know how such strange logic is possible.
Is there any chance that they can fix it?
https://bugs.freepascal.org/my_view_page.php
A Lazarus package is a collection of units and components, containing information about how they can be compiled and how they can be used by projects, other packages, or the IDE.
Packages are ideal for sharing code between projects.
Like NPM in the JavaScript world.
If your Lazarus project specifies its dependencies, it gives a clear picture of which third-party code you need, and in which version.
When a package is well organized, it's easy to use inside your project by just adding it as a dependency.
When you have multiple projects and want to share some code, a package may be a great solution.
A Lazarus package may be design-time or runtime.
Runtime packages must be included as a dependency in the project tree, while design-time packages require in-IDE installation.
Both package types are useful and worth using.
Also, Lazarus has an online package manager (with an auto-update feature) and a corresponding repository, where mORMot is also listed.
(I don't use this, and I don't like the idea that a package can be updated automatically.)
Hi.
Instead, add the `zcomponent` package from ZeosLib into it (item 5 from the README).
I use Zeos 8; there were some issues that were recently fixed.
With the latest Zeos and mORMot 1.18, I don't have any package compile errors with {$DEFINE USE_SYNCOMMONS} disabled in zeos.inc.
If {$DEFINE USE_SYNCOMMONS} is enabled in zeos.inc, a clean mORMot package compilation ("recompile clean") will fail:
SynDBZeos.pas(343,26) Error: Identifier not found "TZJSONComposeOptions"
---
At some point it compiled. What I did:
1. mormot_base: compile without any changes, to get a compiled SynCommons.
2. zcomponent: compile without {$DEFINE USE_SYNCOMMONS}, then with it.
3. mormot_base: add the zcomponent dependency and compile.
4. mormot_base: remove NOSYNDBZEOS from the package options and compile twice!
The issue comes from circular dependencies.
For example, zcomponent requires SynCommons from mormot_base, which in turn requires zcomponent.
With a clean package installation, how should this be resolved? Maybe I don't get something about Lazarus packages.
In the mormot_base.lpk options, I added a relative path to the folder where Zeos.inc is located.
Now I have successfully compiled mormot_base.lpk; SynCommons works, as well as Zeos.
Meanwhile, my project does not contain any additional strings in the compiler options ("Other unit files", "Include files" and "Libraries").
All dependencies are installed just for the project. Awesome!
It would make sense to add a note to Packages\README.md about linking to the Zeos.inc folder, for other users.
I installed the Zeos package as a project dependency (without Lazarus tool palette icons, just the runtime package).
This does not require any additional lines in the compiler path (from the project parameters).
Tested, and it works.
Next, I found that mORMot provides an lpk as well.
I've read Packages\README.md.
I removed the package define "-dNOSYNDBZEOS" and added zcomponent as a package dependency.
During compilation, I get an error:
SynDBZeos.pas(122,2) Fatal: Cannot open include file "Zeos.inc"
Do I still need to fill in "Other unit files", "Include files" and "Libraries" for both mORMot and Zeos?
In my mind, since I have already added Zeos as a mORMot package dependency, it should just compile; am I wrong?
PS: latest mORMot 1.18 and Zeos 8.0 (patch branch)
Great job. So awesome that mORMot 2 is coming to us,
nicely structured, with new features.
Next logical step is to work on OpenSSL integration of the TLS layer
ab, what exactly does this mean?
Will the socket-based web server support HTTPS mode (using a key and certificate)?
Hi!
SynCrtSock.pas
6608 ctxt.Prepare(URL,Method,HeaderGetText(fRemoteIP),Content,ContentType,'',ClientSock.fTLS);
Why is aRemoteIP an empty string here?
6608 ctxt.Prepare(URL,Method,HeaderGetText(fRemoteIP),Content,ContentType,fRemoteIP,ClientSock.fTLS);
With that change, Ctxt.RemoteIP from the THttpServer.OnRequest handler will be filled.
How can I ensure that the server has been started?
If I use a port that is already in use, I see an error in the debugger, but it happens outside of the try..except block (it seems, in a separate thread, some time later).
About WaitStarted, the documentation says:
Ensure the HTTP server thread is actually bound to the specified port.
You should call and check this method result
But this is just a procedure; there is no result.
try
  if Assigned(SecondHttpServer) then
  begin
    SecondHttpServer.Shutdown();
    FreeAndNil(SecondHttpServer);
  end;
  // Create a new instance
  SecondHttpServer := THttpServer.Create(NewPort.ToString, nil, nil, 'LS',
    30 {thread pool}, 0 {disable keep-alive}, True, True);
  SecondHttpServer.OnRequest := RequestRouter;
  SecondHttpServer.Start;
  SecondHttpServer.WaitStarted(10);
except
  on E: Exception do
    ; // Something went wrong (e.g. the port is already in use)
end;
Thanks! Exactly what I was looking for.
Accept-Encoding is missing in Ctxt: THttpServerRequest.
Tested with MS Edge (Chromium) and Firefox.
I'm performing a request to the server; the browser sends some request headers,
and almost all of them are present in Ctxt.InHeaders, except "Connection" and "Accept-Encoding".
Is this some feature? How can I ensure that the client accepts gzip?
function RequestRouter(Ctxt: THttpServerRequest): cardinal;
begin
  WriteLn(Ctxt.InHeaders);
  {
    Host: 127.0.0.1:8080
    Pragma: no-cache
    Cache-Control: no-cache
    DNT: 1
    Upgrade-Insecure-Requests: 1
    User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.102 Safari/537.36 Edg/85.0.564.51
    Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
    Sec-Fetch-Site: cross-site
    Sec-Fetch-Mode: navigate
    Sec-Fetch-User: ?1
    Sec-Fetch-Dest: document
    Accept-Language: en-GB,en;q=0.9,en-US;q=0.8,ru;q=0.7
  }
end;
SQLHttpServer := TSQLHttpServer.Create('8080', [], '+',
  TSQLHttpServerOptions.useHttpSocket, 30 {thread pool},
  TSQLHttpServerSecurity.secNone);
SQLHttpServer.AccessControlAllowOrigin := '*';
SQLHttpServer.HttpServer.OnRequest := RequestRouter;
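If the goal is to serve gzip responses, the server can handle the content negotiation itself once a compressor is registered; a sketch, assuming the 1.18 API where SynZip provides the CompressGZip callback and THttpServerGeneric exposes RegisterCompress (verify both against your sources):

```pascal
uses
  SynZip; // assumed to provide the CompressGZip callback in 1.18

// Register gzip once after server creation. The server then inspects the
// client's Accept-Encoding itself and compresses eligible responses, which
// may also explain why that header is consumed before reaching OnRequest.
SQLHttpServer.HttpServer.RegisterCompress(CompressGZip);
```

With this in place, the OnRequest handler does not need to check Accept-Encoding at all.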
I found that Basic Authentication does not work with TCurlHTTP.
The server does not receive the Authorization header.
{$IFDEF WINDOWS}
  THTTPClient = TWinHTTP;
{$ELSE}
  THTTPClient = TCurlHTTP;
{$ENDIF}
// ...
function TCTR.GetConnector(): THTTPClient;
begin
  Result := THTTPClient.Create(Settings.BackendUrl, Settings.ProxyServer, '',
    60 {sec} * 1000, 60 {sec} * 1000, 60 {sec} * 1000, False);
  Result.AuthScheme := THttpRequestAuthentication.wraBasic;
  Result.AuthUserName := 'AuthUserName ';
  Result.AuthPassword := 'AuthPassword ';
end;
In my application, on the server side, I treat DateTime as being in UTC+0; when writing a Date/DateTime to JSON, I add 'Z' at the end (manually).
I used the same approach.
In another project, written in another language, I implemented similar methods (for ISO 8601 conversions) that also support the timezone part.
Even if DateTimeToIso8601 and Iso8601ToDateTime are updated (as I posted above), the current approach will continue to work if the date string contains "Z" at the end, but it will also work correctly with timezones.
In general, it is more accurate ISO 8601 handling, which gives more options.
An ISO 8601 datetime string may include a timezone.
For example: 2020-08-28T15:55:56+05:00
A datetime in the UTC timezone: 2020-08-28T15:55:56Z
The Iso8601ToDateTime() method converts "2020-08-28T15:55:56+05:00" to 2020-08-28 15:55:56; it ignores the timezone part.
Theoretically, Iso8601ToDateTime could respect the timezone part and return the date converted to the local timezone with the proper timezone offset.
If you live in France (GMT+2), then for "2020-08-28T15:55:56+05:00" the correct date would be 2020-08-28T12:55:56.
When the value contains "Z" (which means UTC+0), the value may be converted as it is now (2020-08-28T15:55:56Z = 2020-08-28 15:55:56).
The second part is the DateTimeToIso8601() method.
Currently it does not allow specifying a timezone for the TDateTime value.
For example, the method signature could be changed to support timezones:
// - Use UTC to specify that D is in the UTC timezone (will add Z at the end of the string). When true, TimeZoneOffset will be ignored.
// - Use TimeZoneOffset to specify the timezone offset in minutes, for example 210 for +03:30
function DateTimeToIso8601(D: TDateTime; Expanded: boolean; FirstChar: AnsiChar='T'; WithMS: boolean=false; QuotedChar: AnsiChar=#0; UTC: boolean=false; TimeZoneOffset: integer=0): RawUTF8; overload;
And maybe it would also be useful to add a GetTimezoneOffset(): integer method, similar to JavaScript.
As I understand it, Iso8601ToDateTime and DateTimeToIso8601 simply ignore timezones (and the UTC marker "Z").
I can ensure that all incoming date strings are in the UTC timezone.
Then, after DateTimeToIso8601, I can just add "Z" at the end.
Maybe there is a better way?
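A sketch of that workaround, plus shifting an explicitly-zoned value to UTC before formatting. DateTimeToIso8601 is the SynCommons function discussed above; the two helper names and the offset handling are hypothetical:

```pascal
// Hypothetical helper: format a TDateTime already in UTC as ISO 8601 + 'Z'.
function UtcToIso8601Z(const D: TDateTime): RawUTF8;
begin
  Result := DateTimeToIso8601(D, {Expanded=}True) + 'Z';
end;

// Hypothetical helper: shift a zoned value to UTC before formatting,
// given its offset in minutes (e.g. +05:00 -> 300).
// TDateTime stores days, so minutes are divided by 24*60.
function ZonedToIso8601Z(const D: TDateTime; OffsetMinutes: Integer): RawUTF8;
begin
  Result := UtcToIso8601Z(D - OffsetMinutes / (24 * 60));
end;
```

This keeps all stored and transmitted values in UTC, which sidesteps the parsing side entirely as long as the senders do the same.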
As mentioned earlier in this topic RawUTF8 is the same as the FPC String
It's the same as UTF8String, not the plain string type.
And the table "Assign string literals to different string types" says that, by default, we can't assign a constant string directly to UTF8String (or to RawUTF8).
Instead of converting, you can add {$codepage utf8} at the beginning of the unit, which tells the compiler that all constant strings in that unit should be interpreted as UTF-8.
If you would like to apply this mode to the entire project, add the compilation flag "-FcUTF8" (Project Options -> Custom Options -> Add -FcUTF8).
you may try to use the hexa constant variation #$e2#$82#$ac instead of '€'
No, that does not help; it works only with {$codepage utf8}.
I found a table that describes which assignments are allowed.
In that table they have UTF8String, which is the same thing as RawUTF8; both = type AnsiString(CP_UTF8).
Based on that table, direct value assignments from source code to RawUTF8 are not allowed.
The {$codepage utf8} directive may help here.
PS: I use UTF-8 for the file encoding.
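A minimal sketch of the directive's effect, assuming the source file itself is saved as UTF-8. With {$codepage utf8} the literal should keep its raw E2 82 AC bytes when assigned to UTF8String; without it, the compiler may transcode through the system codepage, producing the double-encoded bytes seen in this thread:

```pascal
{$codepage utf8}
{$mode delphi}
program CodepageDemo;

uses
  SysUtils;

var
  S: UTF8String; // same definition as RawUTF8: AnsiString(CP_UTF8)
  I: Integer;
begin
  // With the directive, the '€' literal is interpreted as UTF-8 and
  // assigned without transcoding.
  S := '€';
  for I := 1 to Length(S) do
    Write(IntToHex(Ord(S[I]), 2), ' ');
  WriteLn;
end.
```

Compiling once with and once without the first line makes the difference directly visible in the printed bytes.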
I have a question about RawUTF8.
I am trying to build a JSON object and send it to an external API service, but I get unexpected character encoding.
As stated, RawUTF8 is an AnsiString with codepage CP_UTF8.
This example works; the external service receives the correct character code ("€" = HEX: E2 82 AC):
// Each snippet is compiled with
{$mode delphi}

function BuildJsonObject1(): RawUTF8;
var
  JSONObj: variant;
begin
  with TDocVariantData(JSONObj) do
  begin
    AddValue('c', '€');
    Result := ToJSON();
  end;
end;

// From another place:
HTTPClient.Request('/API/v1/test', 'POST', KeepAlive, RequestHeaders,
  BuildJsonObject1(), RequestDataType, ResponseHeaders, ResponseData);
This code produces unexpected character encoding ("€" = HEX: C3 A2 C2 82 C2 AC):
function testUtf8Char(): RawUTF8;
begin
  Result := '€';
end;

function BuildJsonObject2(): RawUTF8;
var
  JSONObj: variant;
begin
  with TDocVariantData(JSONObj) do
  begin
    AddValue('c', testUtf8Char());
    Result := ToJSON();
  end;
end;
The same happens when I use AddValueFromText ("€" = HEX: C3 A2 C2 82 C2 AC):
function BuildJsonObject3(): RawUTF8;
var
  JSONObj: variant;
begin
  with TDocVariantData(JSONObj) do
  begin
    AddValueFromText('c', '€');
    Result := ToJSON();
  end;
end;
And one more question: how does AnsiString(CP_UTF8) store data internally?
strRawUtf8 := '€'; // EXPECTED UTF-8 bytes = HEX: E2 82 AC
WriteLn('strRawUtf8 codepage: ' + StringCodePage(strRawUtf8).ToString); // 65001 (UTF-8)
WriteLn('strRawUtf8 hex: ' + BinToHex(strRawUtf8)); // C3 A2 C2 82 C2 AC
Should i use {$codepage utf8}?
aProxyName seems to be a string that must contain the proxy server address and, I guess, a port.
Is it possible to set a proxy login and password somehow?
---
Basic authentication on the proxy can probably be implemented via request headers.
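A sketch of that header-based approach, assuming SynCommons' BinToBase64 is available and that the client lets you pass extra headers into the request (RFC 7617 defines the value as base64("user:password")):

```pascal
uses
  SynCommons; // BinToBase64 (assumed available in 1.18)

// Build a Proxy-Authorization header line for Basic proxy authentication.
function ProxyBasicAuthHeader(const User, Password: RawUTF8): RawUTF8;
begin
  Result := 'Proxy-Authorization: Basic ' + BinToBase64(User + ':' + Password);
end;
```

The resulting line could then be appended to the request headers string; whether the proxy honours it depends on the proxy speaking Basic authentication.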
Is there an option to allow multiple-read, single-write logic, like TMultiReadExclusiveWriteSynchronizer does?
It is useful when an object must be thread safe with a large number of read operations and rare write operations.
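For reference, the standard-library class mentioned above is used like this (a sketch; the shared value and helper names are illustrative, and any mORMot-specific equivalent may differ):

```pascal
uses
  SysUtils; // TMultiReadExclusiveWriteSynchronizer lives here in FPC/Delphi

var
  Lock: TMultiReadExclusiveWriteSynchronizer; // create once at startup:
                                              // Lock := TMultiReadExclusiveWriteSynchronizer.Create;
  SharedValue: Integer;

function ReadShared: Integer;
begin
  Lock.BeginRead; // many readers may hold the lock concurrently
  try
    Result := SharedValue;
  finally
    Lock.EndRead;
  end;
end;

procedure WriteShared(const Value: Integer);
begin
  Lock.BeginWrite; // a writer gets exclusive access
  try
    SharedValue := Value;
  finally
    Lock.EndWrite;
  end;
end;
```

The try..finally pairs guarantee the lock is released even if the protected code raises.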
Based on description, yes
Thanks!
I saw that, but I have no static files on disk.
All UI files are packed into a single content file and embedded as a resource.
As a result, all files are available directly from memory.
It's a pity; it was a nice idea to keep the folder empty, hide the files from direct access, and provide fast access without disk IO.
Maybe TSQLRestServerURIContext.ReturnBlob will help.
So far so good)
I can't find how to send stream data using the THttpServerRequest object,
for example to send an image or a font file.
I use TSQLHttpServer, with the SQLHttpServer.HttpServer.OnRequest event.
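A sketch of returning in-memory binary data from OnRequest by filling the request's output fields (assuming THttpServerRequest exposes OutContent/OutContentType as in 1.18's SynCrtSock; the resource-lookup helper is hypothetical):

```pascal
function RequestRouter(Ctxt: THttpServerRequest): cardinal;
var
  Data: RawByteString;
begin
  // GetEmbeddedFile is a hypothetical helper returning the bytes of a
  // file packed into the executable's resource.
  Data := GetEmbeddedFile(Ctxt.URL);
  if Data = '' then
    exit(404);
  Ctxt.OutContent := Data;            // body served straight from memory
  Ctxt.OutContentType := 'image/png'; // pick the proper MIME type per file
  Result := 200;
end;
```

Since the whole payload already lives in memory as a RawByteString, no stream object is needed; the server sends OutContent as the response body.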
SynCommons.pas contains:
{$ifndef UNICODE}
type
  /// low-level API structure, not defined in older Delphi versions
  TOSVersionInfoEx = record
    <SOME CODE HERE>
  end;
{$endif UNICODE}
...
var
  ...
  /// the current Operating System information, as retrieved for the current process
  OSVersionInfo: TOSVersionInfoEx;
When the project is configured to use the key -MDelphiUnicode,
TOSVersionInfoEx is not available and the application can't compile.
There are other issues as well.
Is -MDelphiUnicode not supported for now?
---
FPC 3.2.0 with fixes.
Thanks, I will try it.
Does TSQLHttpServer work on both Windows and Linux?
And does it support the features described above on both platforms?
Hello!
As I remember from the "REST-tester" demo, client-server communication supports a few protocols that work pretty well when both the client and the server are Pascal applications.
But what is the best way to create a web server that supports:
- Non-administrative privileges (does not require an admin account (or UAC elevation) / root)
- Multi-threading
- Windows + Linux
- A custom method-based API (some code that can handle a request and return a response with custom HTTP headers)
- File transfer (the ability to return stream data)
- http/https (https is not mandatory)
- A standalone application (no additional software, like nginx)
Where the client applications are:
- Browsers (SPA)
- Other applications that can send http/https requests (not Pascal based, without JavaScript support).
Is THttpApiServer the way to go? Does it fit?
Well, this message says that the new compiler is even better and produces smaller code.
Second point: while fpcupdeluxe works, which is awesome, it has a "specific look" and performs a lot of operations, which creates a feeling of caution, especially on first use.
I decided to try it on my DEV machine when I found that AB recommended this tool.
You may say fpcupdeluxe is open source and 100% safe; it downloads sources on the fly and compiles them.
So, it's just a measure of caution.
About the difference: 2 KB is nothing,
especially compared with the sizes of .NET applications (with all their required dependencies).
you would have to compare the source code between the versions.
If you mean the ASM, then unfortunately I don't have enough practice to determine whether the additional 2 KB comes from compiler or LCL changes rather than from fpcupdeluxe.
Here is the ASM for both files, in case someone can and would like to check:
test (onlinedisassembler)
test_fpcupdeluxe (onlinedisassembler)
program test;

{$mode objfpc}
{$H+}

begin
  // don't show empty heaptrc output
  {$IF DECLARED(GlobalSkipIfNoLeaks)}
  GlobalSkipIfNoLeaks := True;
  {$ENDIF}
  WriteLn('Hello, from lazarus!');
  ReadLn;
end.
Release build, debug disabled.
Stock Lazarus 2.0.8 + FPC 3.0.4; cross to win32 = 31.0 KB (exe size)
Lazarus "fixes2.0" + FPC "fixes3.2" (fpcupdeluxe); cross to win32 = 33.0 KB (exe size)
A 2 KB difference. What could it be?
It may be many bug fixes in Lazarus, or just the one bug with cardinal values in the compiler.
Since 2.0.8 is the stable version, it was more likely a bug in the compiler.
Nice, now the demo project passes the build.
By the way, the 2020-04 lazarus-2.0.8-fpc-3.2.0_fixes-44680 build does not work.
I can't run an empty form project due to an error (it doesn't work if Application.DoubleBuffered).
If I disable it, the error appears in another module. I have many problems with an empty project.
I suppose the issue comes from the compiler itself.
I will test another version.
---
https://synopse.info/forum/viewtopic.ph … 695#p31695
FPC "fixes3.2" and Lazarus "fixes2.0" work.