#1 2020-05-18 13:25:21

radexpol
Member
From: Poland, Krk
Registered: 2019-11-29
Posts: 116

2tier to 3tier migration. AdoDataset -> RestDataset solution

My app (a Contact Center application) consists of two parts: a client and a server. Both the client and the server have a DB connection; the client loads customer data, displays tickets, call history, etc., and allows administration of the system.
I'm using MS SQL.

Because my customer requires that client apps do not connect to the DB directly, I've created a framework that allows migrating from Ado* components to Rest*-compatible equivalents. My company's project is very large and replacing its logic with classic REST methods is not possible in a short time, so I built a workaround that quickly cuts the connection between the client app and the database; all queries are forwarded through the server. How it works:

  • All queries, commands and stored procedures are executed through the REST server

   I tried to create a mechanism that is compatible with ADO*, so you can write the same SQL queries and use the same query parameters as you did with Ado*. The migration process is also very fast: usually you only have to rename Ado* to Rest* in the .pas and .dfm files.

  • The connection is secure

   For client-server communication I use the Indy* components with SSL.

  • The solution is partially mORMot-based

   I used the mORMot JSON implementation for fast processing.

  • Connection pool

   A connection pool of x concurrent connections has been implemented to serve concurrent queries.

  • Query responses are cached (optional)

   For every query coming from a client the REST server creates a cache entry for future use by other clients. The RestServer extracts the tables used in the query, the parameters and the command text, and saves that data in memory together with the (compressed) response payload. If any data in one of the query's tables changes, the cache items bound to that table are deleted. How do I know that the data in the DB changed?

The RestServer creates a trigger on each table (on startup) for OnDelete, OnUpdate and OnInsert; when the data changes, that information goes to the CACHE table (a sketch of such a trigger is shown below, just before the code samples):

   TableName   DateModified
   ---------------------------------
   CARS        2020-02-02 12:45:55
   OWNERS      2020-02-02 11:11:11

An internal thread monitors that table; if a date changes, all cached queries referencing that table are removed.

   Another thread removes all cached queries that have not been used by any client for xx minutes.

  • Data are compressed

   The response dataset is compressed to save bandwidth (HTTP gzip).
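
Here is a minimal sketch of the startup trigger creation mentioned in the caching point above (illustrative only, not the exact code from my project; it assumes MS SQL Server 2016+ for CREATE OR ALTER and the CACHE table with TableName/DateModified columns):

// Illustrative: generate a change-tracking trigger for one table on startup.
// Requires ADODB and SysUtils in the uses clause.
procedure CreateCacheTrigger(AConnection: TADOConnection; const ATableName: string);
var
  sql: string;
begin
  sql := Format(
    'CREATE OR ALTER TRIGGER trg_cache_%0:s ON [%0:s] ' + sLineBreak +
    'AFTER INSERT, UPDATE, DELETE AS ' + sLineBreak +
    'BEGIN ' + sLineBreak +
    '  SET NOCOUNT ON; ' + sLineBreak +
    '  UPDATE CACHE SET DateModified = GETDATE() WHERE TableName = ''%0:s''; ' + sLineBreak +
    '  IF @@ROWCOUNT = 0 ' + sLineBreak +
    '    INSERT INTO CACHE (TableName, DateModified) VALUES (''%0:s'', GETDATE()); ' + sLineBreak +
    'END', [ATableName]);
  AConnection.Execute(sql);
end;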

  • Use cases, parts of code:

Query content:

{"commandtext":"select top 100000 cl_id as [id] , * from [calls] cl\r\njoin devices on dv_id = cl_dv_id\r\nwhere 
cl_id>:cl","commandtimeout":30,"usecache":true,"parameters":[{"Name":"cl","DataType":"ftInteger","Direction":"pdInput","Size":-1,"Value":777}]}

Server side:

type
  TJsonString = RawUTF8;
  TJsonList = variant;
  TJsonObject = variant;
  TRemoteCommand = (rcClearCache, rcCacheStats);

  TCacheItem = class(TPersistent)
  private
    fTables: TStringList;
    fParameters: TJSONList;
    FCommandText: string;
    FJsonResponse: TJsonString;
    FLastAccess: TDateTime;
    FRecordCount: integer;
    function GetJsonResponse: TJsonString;
  public
    property RecordCount: integer read FRecordCount write FRecordCount;
    property CommandText: string read FCommandText write FCommandText;
    property Parameters: TJSONList read FParameters write FParameters;
    property Tables: TStringList read FTables;
    property JsonResponse: TJsonString read GetJsonResponse write FJsonResponse;
    property LastAccess: TDateTime read FLastAccess;
    constructor Create;
    destructor Destroy; override;
  end;

  TQueryCache = class
  private
    items: TObjectList;
    criticalSection: TMultiReadExclusiveWriteSynchronizer;
    function GetCacheItem(index: Integer): TCacheItem;
  public
    procedure RemoveTableCache(const ATableName: string);
    function FindCacheItem(const ACommandText: string; AJsonParameters: TJSONList): TCacheItem;
    procedure Add(const ACommandText: string; var AJsonParameters: TJSONObject; QueryTables: TStrings; const ARecordCount: Integer; const AJsonResponse: string);
    function Count: integer;
    property CacheItems[index: Integer]: TCacheItem read GetCacheItem;
    procedure GarbageCleanup;
    function Cleanup: integer;
    constructor Create;
    destructor Destroy; override;
    procedure ReadLock;
    procedure ReadUnlock;
  end;

  TCacheWorker = class(TThread)
  private
    dsCache: TAdoDataset;
    cache: TQueryCache;
    connection: TAdoConnection;
    tableList: TStringList;
    procedure ProcessTable(const ATableName: string; const AChangeDate: TDateTime);
    procedure SetConnectionString(const Value: string);
    function GetConnectionString: string;
  protected
    procedure Execute; override;
  public
    property ConnectionString: string read GetConnectionString write SetConnectionString;
    constructor Create(ACache: TQueryCache); reintroduce; overload;
    destructor Destroy; override;
  end;
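
For illustration, the cache worker's polling loop could look roughly like this (a simplified sketch, not my exact implementation; the 1-second interval and the TableName=DateTime format stored in tableList are assumptions):

// Sketch of the cache worker loop.
// Requires ADODB, Classes, SysUtils and Windows (for Sleep) in the uses clause.
procedure TCacheWorker.Execute;
var
  tableName: string;
  modified: TDateTime;
begin
  while not Terminated do
  begin
    dsCache.CommandText := 'select TableName, DateModified from CACHE';
    dsCache.Open;
    try
      while not dsCache.Eof do
      begin
        tableName := dsCache.FieldByName('TableName').AsString;
        modified := dsCache.FieldByName('DateModified').AsDateTime;
        // invalidate cached queries when a trigger bumped DateModified
        if tableList.Values[tableName] <> DateTimeToStr(modified) then
          ProcessTable(tableName, modified);
        dsCache.Next;
      end;
    finally
      dsCache.Close;
    end;
    Sleep(1000);
  end;
end;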

Client code:

 

  TRestConnection = class(TComponent)
  private
    FClients: TList;
    FDataSets: TList;
    FRestAddress: string;
    FCommandResource: string;
    FQueryResource: string;
    FStoredProcedureResource: string;
    fParametersResource: string;
    FSSL: boolean;
    function DefaultRestAddress: Boolean;
    function DefaultCommandResource: Boolean;
    function DefaultQueryResource: Boolean;
    function DefaultStoredProcedureResource: Boolean;
    function DefaultParametersResource: Boolean;
    function DefaultSSL: Boolean;
    procedure ClearRefs;
    function GetDataSetCount: Integer;
    function GetDataSet(Index: Integer): TDataSet;
  published
    property RestAddress: string read FRestAddress write FRestAddress stored DefaultRestAddress;
    property QueryResource: string read FQueryResource write FQueryResource stored DefaultQueryResource;
    property CommandResource: string read FCommandResource write FCommandResource stored DefaultCommandResource;
    property ParametersResource: string read FParametersResource write FParametersResource stored DefaultParametersResource;
    property StoredProcedureResource: string read FStoredProcedureResource write FStoredProcedureResource stored DefaultStoredProcedureResource;
    property SSL: boolean read FSSL write FSSL stored DefaultSSL;
  end;

  TRestDataset = class(TkbmCustomMemTable)
  private
    vJson: TJsonObject;
    httpRequest: TIdHTTP;
    SSLHandler: TIdSSLIOHandlerSocketOpenSSL;
    CompressorZlib: TIdCompressorZlib;
    FConnection: TRestConnection;
    FCommandText: String;
    FParameters: TParameters;
    FCommandTimeout: integer;
    FParamCheck: Boolean;
    FUseCache: Boolean;
    FTables: TStringList;
    FAfterOpen: TDataSetNotifyEvent;
    function GetParameters: TParameters;
    procedure SetParameters(const Value: TParameters);
    procedure Connected(Sender: TObject);
    procedure SetConnection(const Value: TRestConnection);
  protected
    CommandType: TCommandType;
    procedure SetCommandText(const Value: String);
    function GetCommandText: String;
    procedure InternalInitFieldDefs; override;
    procedure DoBeforeOpen; override;
    procedure DoAfterOpen; override;
    procedure DoBeforePost; override;
    procedure DoBeforeDelete; override;
    function PSGetKeyFields: string; override;
    procedure AssignCommandText(const Value: WideString; Loading: Boolean = False);
    function ComponentLoading: Boolean;
    procedure InternalRefreshParameters;
    procedure DoAfterScroll; override;
    procedure DoBeforeScroll; override;
    procedure InternalRefresh; override;
  public
    constructor Create(AOwner: TComponent); override;
    destructor Destroy; override;
  published
    property Active;
    property BeforeOpen;
    property BeforeDelete;
    property AfterDelete;
    property BeforeScroll;
    property AfterScroll;
    property OnCalcFields;
    property AfterInsert;
    property BeforeEdit;
    property AfterEdit;
    property BeforePost;
    property AfterPost;
    property AfterOpen: TDataSetNotifyEvent read FAfterOpen write FAfterOpen;
    property Connection: TRestConnection read FConnection write SetConnection;
    property CommandText: String read GetCommandText write SetCommandText;
    property Parameters: TParameters read GetParameters write SetParameters;
    property CommandTimeout: Integer read FCommandTimeout write FCommandTimeout default 30;
    property ParamCheck: Boolean read FParamCheck write FParamCheck default True;
    property UseCache: Boolean read FUseCache write FUseCache default True;
  end;

  TRestCommand = class(TComponent)
  private
    FCommandText: String;
    FConnection: TRestConnection;
    httpRequest: TIdHTTP;
    SSLHandler: TIdSSLIOHandlerSocketOpenSSL;
    FParameters: TParameters;
    FParamCheck: Boolean;
    FCommandTimeout: Integer;
    function GetParameters: TParameters;
    procedure SetCommandText(const Value: String);
    procedure SetParameters(const Value: TParameters);
    procedure SetConnection(const Value: TRestConnection);
    procedure Connected(Sender: TObject);
  protected
    CommandType: TCommandType;
    procedure AssignCommandText(const Value: WideString; Loading: Boolean = False);
    function ComponentLoading: Boolean;
    procedure InternalRefreshParameters;
  public
    constructor Create(AOwner: TComponent); override;
    destructor Destroy; override;
    procedure Execute; overload;
  published
    property Connection: TRestConnection read FConnection write SetConnection;
    property CommandText: String read FCommandText write SetCommandText;
    property Parameters: TParameters read GetParameters write SetParameters;
    property CommandTimeout: Integer read FCommandTimeout write FCommandTimeout default 0;
    property ParamCheck: Boolean read FParamCheck write FParamCheck default True;
  end;

  TRestStoredProc = class(TRestDataset)
  public
    constructor Create(AOwner: TComponent); override;
    procedure ExecProc;
  published
    property ProcedureName: String read GetCommandText write SetCommandText;
  end;
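
To give an idea of the migration effort, typical usage after renaming Ado* to Rest* looks like this (a simplified sketch; the property values are just examples, component cleanup is omitted, and the Rest* unit is assumed to be in the uses clause):

// Simplified usage sketch - same pattern as with TADODataSet.
var
  conn: TRestConnection;
  ds: TRestDataset;
begin
  conn := TRestConnection.Create(nil);
  conn.RestAddress := 'https://myserver:8080';  // example address
  conn.QueryResource := '/api/querydata';       // example resource path
  conn.SSL := True;

  ds := TRestDataset.Create(nil);
  ds.Connection := conn;
  ds.UseCache := True;
  ds.CommandText := 'select top 100000 cl_id as [id] , * from [calls] cl ' +
                    'join devices on dv_id = cl_dv_id where cl_id > :cl';
  // parameter created from :cl by ParamCheck (assumption based on the declaration above)
  ds.Parameters.ParamByName('cl').Value := 777;
  ds.Open;  // the query is sent to the REST server instead of the database
end;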

Finally I was able to meet the customer requirements. The solution is very fast and quite pleasant to use, and it may be useful for someone else. So the question is: does this solution seem useful to anyone besides me? If so, I could create a GitHub repo and extract the mechanism from my project into a clean standalone one. Unfortunately I cannot fully support this idea on my own, because I have a lot of tasks at work.

Last edited by radexpol (2020-05-18 13:44:24)

Offline

#2 2020-05-18 14:12:25

macfly
Member
From: Brasil
Registered: 2016-08-20
Posts: 374

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

I have an application with similar requirements and I have often thought about doing this, for a quick and less painful migration.

Very good to know that you managed to make this migration successfully. This is encouraging.

Did you implement a DataSet with the same properties as TADOQuery / TADOTable, or extend these classes?

Offline

#3 2020-05-18 14:38:28

radexpol
Member
From: Poland, Krk
Registered: 2019-11-29
Posts: 116

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

macfly wrote:

I have an application with similar requirements and I have often thought about doing this, for a quick and less painful migration.

Very good to know that you managed to make this migration successfully. This is encouraging.

Did you implement a DataSet with the same properties as TADOQuery / TADOTable, or extend these classes?

Look at the code from my post. You'll see the RestDataset properties.

Offline

#4 2020-05-18 14:52:34

macfly
Member
From: Brasil
Registered: 2016-08-20
Posts: 374

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

Sorry, I did a quick read and didn't see that there was more code further down the page. :(

You used a memtable as the base.

Cool, this seems like a good solution for emulating ADO datasets.

Offline

#5 2020-05-18 19:30:18

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,706
Website

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

@radexpol
Please follow the forum rules and don't post huge piece of code in the thread itself.
https://synopse.info/forum/misc.php?action=rules

Offline

#6 2020-05-19 09:21:05

JD
Member
Registered: 2015-08-20
Posts: 118

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

@radexpol

Thanks for sharing your experience. I am quite interested in how you solved your problem. If you do go ahead and create a GitHub repo, it will certainly be most welcome to people like me.

JD

Last edited by JD (2020-05-19 09:21:50)

Offline

#7 2020-05-19 10:33:17

tech
Member
Registered: 2014-01-13
Posts: 110

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

Nice, I'm in the same situation and have a huge business application developed with Firebird and UIB components.
Needs:

  • Use the app over a VPN network, which is a nightmare because the Firebird protocol is much too slow

  • Have the possibility of changing the RDBMS to meet some customers' obligations, like MS SQL or PostgreSQL

Problems:

  • The app represents about 10 years of development and we can't refactor it quickly to adopt mORMot and enjoy its many advantages

  • The DB layer is hard-coded with a lot of SQL queries.

@radexpol
Can your solution address some of these needs?

Regards,

Offline

#8 2020-05-19 12:28:18

radexpol
Member
From: Poland, Krk
Registered: 2019-11-29
Posts: 116

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

Nice, I'm in the same situation and have a huge business application developed with Firebird and UIB components.
Needs:

  • Use the app over a VPN network, which is a nightmare because the Firebird protocol is much too slow

    You won't have such problems, because the server queries the database, not the client; multiple connections are always open and the data is compressed.

  • Have the possibility of changing the RDBMS to meet some customers' obligations, like MS SQL or PostgreSQL

    It is not so simple to migrate from one database to another, and this is not about my solution but about the SQL syntax, which differs for each database server. For example, MS SQL uses cursors for loops while Firebird uses <for select into>. If you use simple queries such as <select * from users where id > :id>, my solution should meet your needs.

Problems:

  • The app represents about 10 years of development and we can't refactor it quickly to adopt mORMot and enjoy its many advantages

    The same situation :) - 15 years

  • The DB layer is hard-coded with a lot of SQL queries.

@radexpol
Can your solution address some of these needs?

Yes, but extra code is required if you use IBTable or IBEvents. Some modifications must also be made on the server side, because my solution is based on Ado* components, not IBX*, and you will have to replace all Ib* with Rest*.

Regards,

Offline

#9 2020-05-19 13:10:37

EMartin
Member
From: Buenos Aires - Argentina
Registered: 2013-01-09
Posts: 337

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

I had similar needs, but migrating from 3-tier (with RemObjects) to 3-tier (with mORMot). I implemented TSynRestDataset using full mORMot features (no Indy, no OpenSSL, and with mORMot validators; see https://synopse.info/forum/viewtopic.php?id=2712). We made small adaptations in our client applications (5, 10, 15 years old) and everything works fine. The greatness of the mORMot framework is infinite.


Esteban

Offline

#10 2020-05-19 13:20:17

radexpol
Member
From: Poland, Krk
Registered: 2019-11-29
Posts: 116

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

EMartin wrote:

I had similar needs, but migrating from 3-tier (with RemObjects) to 3-tier (with mORMot). I implemented TSynRestDataset using full mORMot features (no Indy, no OpenSSL, and with mORMot validators; see https://synopse.info/forum/viewtopic.php?id=2712). We made small adaptations in our client applications (5, 10, 15 years old) and everything works fine. The greatness of the mORMot framework is infinite.

Yeah, but my SQL queries on the client side contain millions of lines using MS SQL-specific syntax: declaring variables, creating MS SQL temporary tables, etc. :) We use them in reports, statistics, executing queries as dynamic strings (sp_executesql) with dynamically changing SQL texts, etc.

We built completely different solutions, I think.

Offline

#11 2020-05-19 13:27:14

EMartin
Member
From: Buenos Aires - Argentina
Registered: 2013-01-09
Posts: 337

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

Sure, I mean that the mORMot framework allows custom solutions, although I prefer to use the mORMot features. RemObjects (my old version) also uses Indy and OpenSSL, but TWinHTTP is far superior.


Esteban

Offline

#12 2020-05-19 15:01:02

tech
Member
Registered: 2014-01-13
Posts: 110

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

@radexpol

Thanks for the answer, I'll explore your solution asap.

Offline

#13 2020-05-19 15:53:16

radexpol
Member
From: Poland, Krk
Registered: 2019-11-29
Posts: 116

Re: 2tier to 3tier migration. AdoDataset -> RestDataset solution

Some logs from app:

[17:37:16][INFO] [RESTProtocol] Request from 127.0.0.1, command: /api/queryparameters
[17:37:16][INFO] [RESTProtocol] Retrieving query parameters. Query:select top 10000 * from calls where cl_id > :no
[17:37:16][INFO] [RESTProtocol] Retrieving complete

------------- Uncached query -----------

[17:37:36][INFO] [RESTProtocol] Request from 127.0.0.1, command: /api/querydata
[17:37:36][INFO] [RESTProtocol] Query from 127.0.0.1: select top 10000 * from calls where cl_id > :no. Parameters: [{"Name":"no","DataType":"ftInteger","Direction":"pdInput","Size":4,"Value":1}]
[17:37:36][INFO] [RESTProtocol] Query not found in cache, executing...
[17:37:37][INFO] [RESTProtocol] Query executed in 1011 ms. Recordcount: 10000

------------- Cached query -----------

[17:37:58][INFO] [RESTProtocol] Request from 127.0.0.1, command: /api/querydata
[17:37:58][INFO] [RESTProtocol] Query from 127.0.0.1: select top 10000 * from calls where cl_id > :no. Parameters: [{"Name":"no","DataType":"ftInteger","Direction":"pdInput","Size":4,"Value":1}]
[17:37:58][INFO] [RESTProtocol] Query found in cache. Recordcount: 10000

The table [calls] contains about 25 columns: integers, dates, strings. The log above shows the caching in action: the first query builds the cached response for future use by other clients, while the second query does not send a request to the database at all, it just returns the previously cached data.

The GET method on this resource (/api/querydata) provides helpful commands for managing the cache:

https://localhost/api/querydata

command=clearcache - clears all cached queries
     Sample: http://localhost/api/querydata?command=clearcache
command=cachestats - shows statistics of the cached queries
     Sample: http://localhost/api/querydata?command=cachestats

For example, the cache statistics for the test above look like this:

Query:
select top 10000 * from calls where cl_id > :no
Parameters:
[{"Name":"no","DataType":"ftInteger","Direction":"pdInput","Size":4,"Value":1}]
Last Access:
2020-05-19 17:37:58
Payload size:
258701


The payload size is the size of the compressed JSON data: a stream ready to be sent to other clients.
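
For completeness, exercising these endpoints from a small Delphi test client could look like this (a hypothetical sketch based on the URLs and payloads above; adjust host, port and SSL settings to your setup):

// Hypothetical test client for the /api/querydata endpoints shown above.
// Requires IdHTTP, IdSSLOpenSSL, IdCompressorZlib, Classes and SysUtils in the uses clause.
procedure TestQueryData;
var
  http: TIdHTTP;
  payload: TStringStream;
  reply, stats: string;
begin
  http := TIdHTTP.Create(nil);
  try
    http.IOHandler := TIdSSLIOHandlerSocketOpenSSL.Create(http); // HTTPS, as used by the Rest* components
    http.Compressor := TIdCompressorZlib.Create(http);           // transparent gzip/deflate
    http.Request.ContentType := 'application/json';
    payload := TStringStream.Create(
      '{"commandtext":"select top 10000 * from calls where cl_id > :no",' +
      '"commandtimeout":30,"usecache":true,' +
      '"parameters":[{"Name":"no","DataType":"ftInteger","Direction":"pdInput","Size":4,"Value":1}]}',
      TEncoding.UTF8);
    try
      reply := http.Post('https://localhost/api/querydata', payload);  // compressed dataset JSON
    finally
      payload.Free;
    end;
    stats := http.Get('https://localhost/api/querydata?command=cachestats');  // cache statistics
    http.Get('https://localhost/api/querydata?command=clearcache');           // clear all cached queries
  finally
    http.Free;
  end;
end;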

Last edited by radexpol (2020-05-19 16:10:28)

Offline
