#1 2016-04-15 12:40:28

emk
Member
Registered: 2013-10-24
Posts: 96

Serialize JSON for a dataset which already has JSON content

Hello,

I'm trying to make a method-based service which optionally accepts as an argument a Firebird SQL script ("execute block as ... insert ... update ... delete") that brings updates from the client.
This script is not executed by TSQLDBProperties (because it can't execute scripts); it's executed by a TZConnection or an IBObjects connection (IBO has native access to Firebird, bypassing TDataset; it also has TIB_Cursor, which is native, very fast and doesn't allocate memory) - I haven't decided yet between ZEOS and IBO.

My method-based service returns data from a log table (a log for replication via db-triggers). My log table could look like:

LOGID - int64
TRANSACTIONID - int64
TIME - time
LOG_TEXT - varchar(3000)

At first I wanted to save in the LOG_TEXT field the SQL statements recorded by the triggers, but I think it's better to have it in JSON format, so that I can optionally parse it and present to the user the history of an object such as an Invoice, an Order, etc. So LOG_TEXT could be:

{"Table": "Invoice", "Method": "Insert", "ID": 1234, "No_of_products": 10, "Amount": 100}
or
{"Table": "Invoice", "Method": "Update", "ID": 1234, "No_of_products": 9, "Amount": 90}
or
{"Table": "Invoice", "Method": "Delete", "ID": 1234}

What should I do so that my method-based service does not escape the JSON twice, and maybe combines it into one nice JSON like this:

{"LOGID": 10010, "TRANSACTIONID": 5678, "TIME": int64_time, "Table": "Invoice", "Method": "Insert", "ID": 1234, "No_of_products": 10, "Amount": 100}
...

Offline

#2 2016-04-16 08:49:34

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,240
Website

Re: Serialize JSON for a dataset which already has JSON content

AFAIR TSQLDBProperties does support "execute block...." since we use it for batch insert/update/delete at ORM level.

If you use TSQLDBProperties, you could retrieve the JSON directly from a SELECT.
Consider using Zeos/ZDBC, since it has direct JSON support with TSQLDBProperties and is highly optimized for speed. Just use the 7.2/7.3 branch.

To manage JSON in a custom way, take a look at TDocVariant in the documentation.
It is a fast and easy way of manipulating JSON.
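
For instance, here is a minimal sketch of merging the JSON already stored in LOG_TEXT with the log columns, without escaping it twice (the values below are only placeholders):

uses
  SynCommons;

procedure CombineLogJSON;
var
  doc: variant;
  combined: RawUTF8;
begin
  // parse the JSON already stored in LOG_TEXT (placeholder content here)
  doc := _JsonFast('{"Table":"Invoice","Method":"Insert","ID":1234,"No_of_products":10,"Amount":100}');
  // add the log columns via late-binding: the stored JSON is parsed once,
  // never re-escaped as a string
  doc.LOGID := 10010;
  doc.TRANSACTIONID := 5678;
  // serialize the whole document in a single pass
  combined := VariantSaveJSON(doc);
  // -> {"Table":"Invoice",...,"LOGID":10010,"TRANSACTIONID":5678}
end;

The properties added by late-binding end up after the original ones, but the content is the combined document you described, serialized only once.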

Offline

#3 2016-04-16 14:54:31

emk
Member
Registered: 2013-10-24
Posts: 96

Re: Serialize JSON for a dataset which already has JSON content

"AFAIR TSQLDBProperties does support "execute block...." since we use it for batch insert/update/delete at ORM level."

That's great news. Thank you. I'll try it.

"To manage JSON in a custom way, take a look at TDocVariant in the documentation.
It is a fast and easy way of manipulating JSON."

I'll look at it too, but I think my solution lies in TDynArray, since the clients of this replication service are Delphi apps only. My priority is to be as fast as possible and, if possible, with the smallest content; that's why I wanted to get rid of escaping the JSON twice:

type
  Log = record
    LOGID: Int64;
    TRANSACTIONID: Int64;
    LOG_TEXT: RawUTF8;
  end;
  LogA = array of Log;
var
  DynLogA: TDynArray; // wrapper around a LogA instance for fast binary serialization

And as its result, my method-based service will send DynLogA, which will be serialized as compact binary and, like you said in the documentation, encoded as Base64. So that's great. I have to try it.
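
Roughly what I have in mind (just a sketch - I still have to check the exact calls):

// TDynArray and BinToBase64 come from SynCommons.pas
var
  Logs: LogA;            // filled from the log table query
  Bin: RawByteString;
  B64: RawUTF8;
begin
  DynLogA.Init(TypeInfo(LogA), Logs);
  // ... fill Logs from the replication log ...
  Bin := DynLogA.SaveTo;    // compact binary serialization of the whole array
  B64 := BinToBase64(Bin);  // Base64 only needed if it travels inside JSON
end;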

"If you use TSQLDBProperties, you could retrieve the JSON directly from a SELECT.
Consider using Zeos/ZDBC, since it has direct JSON support with TSQLDBProperties and is highly optimized for speed. Just use the 7.2/7.3 branch."

I'm considering Zeos (with or without TSQLDBProperties) because I need direct access to the UTF-8 buffer, so I can get rid of conversions.

Offline

#4 2016-04-16 21:21:59

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,240
Website

Re: Serialize JSON for a dataset which already has JSON content

Or just try the SynDBRemote unit...

Offline

#5 2016-04-16 22:34:19

emk
Member
Registered: 2013-10-24
Posts: 96

Re: Serialize JSON for a dataset which already has JSON content

SynDBRemote was the first option I studied, but it's not good for me since it creates one HttpServer per SQLDBProperties, and I have thousands of databases that I need to put online. So, step 2: I'm looking at services. I'll take your advice and create one TZConnection (or TSQLDBProperties, if it can execute "execute block...") per database, guarded by a TCriticalSection, and with one housekeeping thread that will disconnect the connection after, let's say, 15 minutes - something along the lines of the sketch below.
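
Just a sketch to illustrate the idea (all names are made up, and the housekeeping loop still has to be wrapped in a real thread):

uses
  SysUtils, SyncObjs, SynCommons;

type
  // one cached entry per database
  TPooledDb = class
  public
    DatabaseName: RawUTF8;
    Connection: TObject;     // TZConnection or TSQLDBConnectionProperties
    Lock: TCriticalSection;  // guards Connection while a request uses it
    LastUsed: Int64;         // GetTickCount64 of the last access
    constructor Create(const aDatabaseName: RawUTF8);
    destructor Destroy; override;
  end;

constructor TPooledDb.Create(const aDatabaseName: RawUTF8);
begin
  inherited Create;
  DatabaseName := aDatabaseName;
  Lock := TCriticalSection.Create;
  LastUsed := GetTickCount64;
end;

destructor TPooledDb.Destroy;
begin
  Connection.Free; // releases (and disconnects) the underlying connection
  Lock.Free;
  inherited;
end;

// to be called periodically from the housekeeping thread
procedure CloseIdleConnections(const Pool: array of TPooledDb);
const
  IDLE_TIMEOUT_MS = 15 * 60 * 1000;
var
  i: integer;
begin
  for i := 0 to high(Pool) do
    if Pool[i].Connection <> nil then
    begin
      Pool[i].Lock.Acquire; // never close a connection currently in use
      try
        if GetTickCount64 - Pool[i].LastUsed > IDLE_TIMEOUT_MS then
          FreeAndNil(Pool[i].Connection);
      finally
        Pool[i].Lock.Release;
      end;
    end;
end;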

Will a TDynArray of records be returned as binary encoded as Base64 in Delphi 10? I ask because in one place I read that it will, but in another place it says this applies only to versions older than Delphi 2010.

Last edited by emk (2016-04-16 23:52:42)

Offline

#6 2016-04-17 08:07:49

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,240
Website

Re: Serialize JSON for a dataset which already has JSON content

With a method-based service, no need to Base64 the binary: just compress it (e.g. using SynLZ), and send it to the client as a binary blob.
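
A rough sketch, reusing the Log/LogA names from your post (the server class and method names are only placeholders):

uses
  SynCommons, mORMot;

type
  TLogRestServer = class(TSQLRestServerFullMemory)
  published
    procedure GetLog(Ctxt: TSQLRestServerURIContext);
  end;

procedure TLogRestServer.GetLog(Ctxt: TSQLRestServerURIContext);
var
  Logs: LogA;
  DynLogA: TDynArray;
  bin: RawByteString;
begin
  DynLogA.Init(TypeInfo(LogA), Logs);
  // ... fill Logs from the replication log table ...
  bin := SynLZCompress(DynLogA.SaveTo); // compact binary, then SynLZ
  Ctxt.ReturnBlob(bin); // sent as a raw binary body - no Base64 involved
end;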

Offline

#7 2016-04-17 11:28:36

emk
Member
Registered: 2013-10-24
Posts: 96

Re: Serialize JSON for a dataset which already has JSON content

I was able to modify Sample 06 and return a dynamic array of records. Looking at it from a browser, Delphi 10 serializes it as normal JSON. By the way, in the coming days I want to make new samples, starting from Sample 06, which make it very clear how a method-based service can return different structures of data (a plain result, not over named pipes; a record; an array of records; a file; ...), because a lot of mORMot beginners like me get dizzy with so many options. That is my conclusion from reading this forum, together with my own small learning experience. People know what a REST server is, and they want to know quickly how to create a SOA with results visible (as JSON) from a browser. When they are ready I'll send you an email; maybe you can publish them.

By the way, speaking of so many options: which functions/procedures should I use so that an array of records (or a similar fast, non-TDataset-compatible structure), i.e. the result set of a query, is returned by a method-based service in the fastest way? What do you use to serialize the result set of a query to binary? Thank you.

Offline

#8 2016-04-17 11:46:08

emk
Member
Registered: 2013-10-24
Posts: 96

Re: Serialize JSON for a dataset which already has JSON content

I think I found it: TSQLDBStatement.FetchAllToBinary(Dest: TStream; MaxRowCount: cardinal=0; DataRowPosition: PCardinalDynArray=nil): cardinal; virtual;

but what does "DataRowPosition" mean? The documentation is lacking on this.

1. What is the reverse function, for use on the client side?
2. Is this the best/fastest way for binary serialization? (so many options :) )
3. Do I have to manually compress "Dest" with SynLZ, or is it enough if I register compression on the server and set a header on the client accepting this compression?

Offline

#9 2016-04-17 20:20:30

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,240
Website

Re: Serialize JSON for a dataset which already has JSON content

Check

  /// read-only virtual TDataSet able to access a binary buffer as returned
  // by TSQLDBStatement.FetchAllToBinary method or directly a TSQLDBStatement
  TSynBinaryDataSet = class(TSynVirtualDataSet)
  ...

or

/// fetch a SynDB's TSQLDBStatement.FetchAllToBinary buffer into a VCL DataSet
// - just a wrapper around TSynBinaryDataSet.Create + Open
// - if you need a writable TDataSet, you can use the slower ToClientDataSet()
// function as defined in SynDBMidasVCL.pas
function BinaryToDataSet(aOwner: TComponent;
  const aBinaryData: RawByteString): TSynBinaryDataSet;

in SynDBVcl.pas unit.

See also sample "17 - TClientDataset use", which shows how to fill a TDBGrid with data coming from several sources, including FetchAllToBinary.
As you will see from the timings, FetchAllToBinary has very high performance and low memory use.

If you transmit the binary buffer over the wire, using a method-based service, directly as a blob (maybe after SynLZ compression + encryption), you may have exactly what you need.
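
A rough outline of both sides (a sketch only - the TMyServer class, fProps, the SQL text and the method names are placeholders):

uses
  Classes, SynCommons, SynDB, SynDBVcl, mORMot;

type
  TMyServer = class(TSQLRestServerFullMemory)
  protected
    fProps: TSQLDBConnectionProperties; // e.g. a TSQLDBZEOSConnectionProperties
  published
    procedure GetData(Ctxt: TSQLRestServerURIContext);
  end;

// server side: run the query and return the compressed binary buffer
procedure TMyServer.GetData(Ctxt: TSQLRestServerURIContext);
var
  stmt: ISQLDBStatement;
  stream: TRawByteStringStream;
begin
  stmt := fProps.NewThreadSafeStatementPrepared(
    'select * from INVOICE', {ExpectResults=}true, {RaiseExceptionOnError=}true);
  stmt.ExecutePrepared;
  stream := TRawByteStringStream.Create;
  try
    stmt.FetchAllToBinary(stream);
    Ctxt.ReturnBlob(SynLZCompress(stream.DataString));
  finally
    stream.Free;
  end;
end;

// client side: decompress the received blob and open it as a read-only TDataSet
function LogBlobToDataSet(aOwner: TComponent; const blob: RawByteString): TSynBinaryDataSet;
begin
  result := BinaryToDataSet(aOwner, SynLZDecompress(blob));
  // plug the result into a TDataSource / TDBGrid as usual
end;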

Offline
