#1 2016-03-11 07:39:55

squirrel

Expand or Collapse Json

ISQLDBRows.FetchAllAsJSON gives the option to specify the format of the JSON as either expanded or not.  Is there a function or way to expand or collapse a RawUTF8 dataset so that it matches the formats returned by either ISQLDBRows.FetchAllAsJSON(true) or ISQLDBRows.FetchAllAsJSON(false)?

I am asking whether it's possible to go both ways, but the initial need is to collapse data returned by ISQLDBRows.FetchAllAsJSON(true) so that it matches the format of ISQLDBRows.FetchAllAsJSON(false).  This is to ensure that only one format is used consistently.

The issue comes in when using an array of records in a TDocVariant and trying to export it as JSON.  That only produces the expanded format.

type
  TMyType = record
    DateData: TDateTime;
    IntData: Integer;
    StrData: string;
  end;
  TMyTypeArray = array of TMyType;
var
  doc: TDocVariantData;
  jsonResult: RawUTF8;
  myType: TMyTypeArray;
begin
  [...] // fill myType with values and do processing
  doc.InitFromTypeInfo(myType, TypeInfo(TMyTypeArray), true, []);
  jsonResult := doc.ToJSON; // this is expanded, and must be collapsed
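
For illustration (using the sample values that appear later in this thread; the exact layout may differ slightly between framework versions), doc.ToJSON and FetchAllAsJSON(true) produce the expanded layout, one object per row:

[{"DateData":"2016-03-14T13:23:25","IntData":1,"StrData":"Hello"},
 {"DateData":"2016-03-14T13:23:45","IntData":2,"StrData":"Bye"}]

while FetchAllAsJSON(false) packs the field names and all row values into one flat array:

{"fieldCount":3,"values":["DateData","IntData","StrData",
 "2016-03-14T13:23:25",1,"Hello","2016-03-14T13:23:45",2,"Bye"],"rowCount":2}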

#2 2016-03-14 08:31:25

squirrel

Re: Expand or Collapse Json

It feels a bit clumsy, but this should copy the data so that it can be exported in the same JSON format as when expanded=false is used.  Does anyone have suggestions on how this can be improved?

function CollapseJSON(const data: RawUTF8): RawUTF8;
var
  Source, Target: TDocVariantData;
  TargetValues, valueList: variant;
  fieldCnt, rowCnt, I, J: integer;
begin
  Source.InitJSON(data);
  Target.InitFast;
  TargetValues := _ArrFast([]);
  fieldCnt := 0;
  rowCnt := Source.Count;

  if rowCnt > 0 then
  begin
    // use the first row to retrieve the field names
    valueList := Source.Value[0];
    fieldCnt := DocVariantData(valueList)^.Count;
    Target.Value['fieldCount'] := fieldCnt;
    for I := 0 to fieldCnt - 1 do
      DocVariantData(TargetValues)^.AddItem(DocVariantData(valueList)^.Names[I]);

    // then append every value of every row into the same flat array
    for I := 0 to rowCnt - 1 do
    begin
      valueList := Source.Value[I];
      for J := 0 to fieldCnt - 1 do
        DocVariantData(TargetValues)^.AddItem(DocVariantData(valueList)^.Value[J]);
    end;
  end;
  Target.AddValue('values', TargetValues);
  Target.Value['rowCount'] := rowCnt;

  Result := Target.ToJSON;
end;

function RecordToCollapsedJSON(const data; const RecTypeInfo: pointer): RawUTF8;
var
  Source: TDocVariantData;
begin
  Source.InitFromTypeInfo(data, RecTypeInfo, true, []);
  result := CollapseJSON(Source.ToJSON);
end;

Using the initial example, this can then be used like so:

  result := RecordToCollapsedJSON(mytype, TypeInfo(TMyTypeArray));

#3 2016-03-14 10:23:56

ab

Re: Expand or Collapse Json

Using a TTextWriter and a process similar to JsonReformat() should be much faster.
I definitely need to add such a method to the framework.

#4 2016-03-14 11:29:24

squirrel

Re: Expand or Collapse Json

Thanks, I'll take a look at TTextWriter again.

Since the JSON data is usually used to display information in tables, I have to process every dataset on the client side into a more recognisable "table" structure, which is a slow process in JavaScript.

It would be useful to be able to provide a format or template to a function such as JsonReformat.  When the data has to be displayed in a table or shared with third parties, the structure below is very logical, since the field names are not part of the data and every row is a set of its own:

{
    "fieldCount": 3,
    "rowCount": 2,
    "fieldNames":
        [
            "DateData", "IntData", "StrData"
        ],
    "values":
        [
            ["2016-03-14T13:23:25", 1, "Hello"],
            ["2016-03-14T13:23:45", 2, "Bye"]
        ]
}

#5 2016-03-14 14:06:13

mpv

Re: Expand or Collapse Json

IMHO the fastest way is to serialize the dataset to the non-expanded format with FetchAllAsJSON(false) and transform it into the format you need on the client side using JavaScript. This will definitely be much faster than serializing to the expanded format (less traffic, cheaper client-side JSON.parse). On the client side the transformation will take microseconds if properly written.

#6 2016-03-15 08:13:45

squirrel

Re: Expand or Collapse Json

@mpv: Yes, that might be the case when working on my own projects, but I provide data to third parties.  They expect certain standards, such as a record containing the data of one record only, and not all the fields of all the records as one big list.

#7 2016-03-15 10:52:28

ab

Re: Expand or Collapse Json

I've added the TDocVariantData.ToNonExpandedJSON method, which may help in your case.
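
With the declarations from the first post, usage would then be along these lines (a minimal sketch):

  doc.InitFromTypeInfo(myType, TypeInfo(TMyTypeArray), true, []);
  jsonResult := doc.ToNonExpandedJSON; // same layout as FetchAllAsJSON(false)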

#8 2016-03-15 11:04:32

squirrel

Re: Expand or Collapse Json

Excellent!  Thank you! :)

#9 2016-03-15 11:41:37

squirrel

Re: Expand or Collapse Json

I've added a simple parameter that allows this function to return datasets as datasets, i.e. with field names and rows:

// when SplitRows=true, the field names are written into a separate "fields":[..]
// array and every row becomes its own sub-array under "values":[..]
function TDocVariantData.ToNonExpandedJSON(const SplitRows: boolean = false): RawUTF8;
var fields: TRawUTF8DynArray;
    fieldsCount: integer;
    W: TTextWriter;
    r,f: integer;
    row: PDocVariantData;
begin
  if VKind<>dvArray then begin
    result := '';
    exit;
  end;
  if VCount=0 then begin
    result := '[]';
    exit;
  end;
  with _Safe(VValue[0])^ do
    if VKind=dvObject then begin
      fields := VName;
      fieldsCount := VCount;
    end else
      fieldsCount := 0;
  if fieldsCount=0 then
    raise EDocVariant.Create('ToNonExpandedJSON: Value[0] is not an object');
  W := DefaultTextWriterJSONClass.CreateOwnedStream;
  try
    if SplitRows then
      W.Add('{"fieldCount":%,"rowCount":%,"fields":[',[fieldsCount,VCount])
    else
      W.Add('{"fieldCount":%,"rowCount":%,"values":[',[fieldsCount,VCount]);
    for f := 0 to fieldsCount-1 do begin
      W.Add('"');
      W.AddJSONEscape(pointer(fields[f]));
      W.Add('"',',');
    end;
    if SplitRows then
    begin
      W.CancelLastComma;
      W.Add('],"values":[',[]);
    end;
    for r := 0 to VCount-1 do begin
      if SplitRows then
        W.Add('[',[]);
      row := _Safe(VValue[r]);
      if (r>0) and ((row^.VKind<>dvObject) or (row^.VCount<>fieldsCount)) then
        raise EDocVariant.CreateUTF8('ToNonExpandedJSON: Value[%] not object',[r]);
      for f := 0 to fieldsCount-1 do
        if (r>0) and not IdemPropNameU(row^.VName[f],fields[f]) then
          raise EDocVariant.CreateUTF8('ToNonExpandedJSON: Value[%] field=% expected=%',
            [r,row^.VName[f],fields[f]]) else begin
          W.AddVariant(row^.VValue[f],twJSONEscape);
          W.Add(',');
        end;
      if SplitRows then
      begin
        W.CancelLastComma;
        W.Add('],',[]);
      end;
    end;
    W.CancelLastComma;
    W.Add(']','}');
    W.SetText(result);
  finally
    W.Free;
  end;
end;

With SplitRows=true it will return the non-expanded JSON like this:

{
    "fieldCount": 3,
    "rowCount": 2,
    "fields":
        [
            "DateData", "IntData", "StrData"
        ],
    "values":
        [
            ["2016-03-14T13:23:25", 1, "Hello"],
            ["2016-03-14T13:23:45", 2, "Bye"]
        ]
}

#10 2016-03-17 08:45:36

squirrel

Re: Expand or Collapse Json

@ab I noticed that TDocVariantData.InitFromTypeInfo works as expected for TDateTime values, but when passing a TDate field, it is serialized as the underlying float value.  Is this correct?

#11 2016-03-17 10:04:37

ab

Re: Expand or Collapse Json

Yes, TDate values are not identified as such, nor are TTime values.
Only TDateTime is recognized.

#12 2017-01-27 12:54:57

igors233

Re: Expand or Collapse Json

With TDocVariantData.ToNonExpandedJSON saving a TDocVariantData in compact form, is there a way to load it back into a TDocVariantData? AFAIK loading through InitJSONInPlace only works with the expanded format.
Right now I'm exchanging data between server and client in the full format and it's around 8 MB of data; even zipped it's a large amount, since the clients are on a mobile network. In compact form I'm expecting about half of that, which is a big saving.

#13 2017-01-27 13:40:41

ab

Re: Expand or Collapse Json

Perhaps you could just use TSQLTableJSON for this.

#14 2017-01-28 02:56:25

igors233

Re: Expand or Collapse Json

ab wrote:

Perhaps you could just use TSQLTableJSON for this.

Thanks, it seems to work in a simple test. I would still suggest adding it to the ToDo list: a direct load from JSON would be faster, and it would complement ToNonExpandedJSON.
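
To illustrate what I mean by a direct load, a hand-rolled expansion could look roughly like this (just a sketch, not a framework method - the ExpandTableJSON name is mine - assuming the {"fieldCount":..,"rowCount":..,"values":[..]} layout produced by ToNonExpandedJSON without SplitRows):

procedure ExpandTableJSON(const NonExpanded: RawUTF8; out Dest: TDocVariantData);
var wrapper: TDocVariantData;
    valuesVar, row: variant;
    values: PDocVariantData;
    fieldCount, rowCount, r, f: integer;
begin
  wrapper.InitJSON(NonExpanded, JSON_OPTIONS_FAST);
  fieldCount := wrapper.Value['fieldCount'];
  rowCount := wrapper.Value['rowCount'];
  valuesVar := wrapper.Value['values'];
  values := _Safe(valuesVar);
  Dest.InitFast;
  for r := 0 to rowCount - 1 do begin
    row := _ObjFast([]);
    // the first fieldCount items of "values" are the column names,
    // followed by the row data, value by value
    for f := 0 to fieldCount - 1 do
      _Safe(row)^.AddValue(VariantToUTF8(values^.Values[f]),
        values^.Values[fieldCount + r * fieldCount + f]);
    Dest.AddItem(row);
  end;
end;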

#15 2017-01-28 04:04:25

igors233

Re: Expand or Collapse Json

igors233 wrote:

Thanks, it seems to work in a simple test.

On a more detailed test with real data there are differences: JSON passed through TSQLTableJSON and saved again as JSON won't contain any nulls (depending on the field type they become "" or 0). Also, if a record contains a child array or object, the nested value is stored as a string, with its quotes escaped by \.
For example:

var rec: TDocVariantData;
    t: TSQLTableJSON;
    v: variant;
    json: RawUTF8;
begin
  rec.InitJSON('[{"Field1":"Value1","Field2":null,"Field3":["elem1","elem2",3]}]', JSON_OPTIONS_FAST);
  json := rec.ToNonExpandedJSON;
  t := TSQLTableJSON.Create('', json);
  t.ToDocVariant(v, False);
  t.Free;
  json := VariantSaveJSON(v);
  rec.InitJSON(json, JSON_OPTIONS[True]);
  json := rec.ToJSON;
end;

Original json:
[{"Field1":"Value1","Field2":null,"Field3":["elem1","elem2",3]}]
becomes:
[{"Field1":"Value1","Field2":null,"Field3":"[\"elem1\",\"elem2\",3]"}]

#16 2021-01-15 13:13:22

squirrel

Re: Expand or Collapse Json

What would be the most efficient way to format ISQLDBRows results into JSON that uses the format below?

{
    "fieldCount": 3,
    "rowCount": 2,
    "fieldNames": ["DateData", "IntData", "StrData"],
    "values": [
        ["2016-03-14T13:23:25", 1, "Hello"],
        ["2016-03-14T13:23:45", 2, "Bye"]
    ]
}

I basically need to create a custom version of FetchAllToJSON.  Should I create a new TTextWriter and manually step through all the records and fields, or is there maybe a function that can return an entire row as JSON?

#17 2021-01-15 13:15:42

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,661
Website

Re: Expand or Collapse Json

You need to write your custom version.
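
A rough sketch of such a custom version, reusing only the TTextWriter calls already shown above (the RowsToTableJSON name is made up here, and it assumes the column metadata is already available from the executed statement, as it is for SynDB ISQLDBRows):

function RowsToTableJSON(const Rows: ISQLDBRows): RawUTF8;
var W: TTextWriter;
    f, fieldCount, rowCount: integer;
begin
  fieldCount := Rows.ColumnCount;
  rowCount := 0;
  W := DefaultTextWriterJSONClass.CreateOwnedStream;
  try
    // header with the column names
    W.Add('{"fieldCount":%,"fieldNames":[',[fieldCount]);
    for f := 0 to fieldCount-1 do begin
      W.Add('"');
      W.AddJSONEscape(pointer(Rows.ColumnName(f)));
      W.Add('"',',');
    end;
    W.CancelLastComma;
    W.Add('],"values":[',[]);
    // one JSON array per row
    while Rows.Step do begin
      W.Add('[',[]);
      for f := 0 to fieldCount-1 do begin
        W.AddVariant(Rows.ColumnVariant(f),twJSONEscape);
        W.Add(',');
      end;
      W.CancelLastComma;
      W.Add('],',[]);
      inc(rowCount);
    end;
    W.CancelLastComma;
    // rowCount is only known once the cursor has been fully traversed
    W.Add('],"rowCount":%}',[rowCount]);
    W.SetText(result);
  finally
    W.Free;
  end;
end;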

#18 2021-01-15 21:27:21

mpv

Re: Expand or Collapse Json

I already have such a serializer. Many years ago I thought it would simplify life for JS clients (browser, Node.js), but in practice it only increases memory consumption (an array object for each row).  A simple JS class that wraps a one-dimensional array will be faster and use less memory.

#19 2021-01-16 09:34:30

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,661
Website

Re: Expand or Collapse Json

Yes, the non-expanded format is efficient and not difficult to decode.
I don't see the relevance of trying other non-standard formats.

#20 2021-01-16 10:23:02

mpv

Re: Expand or Collapse Json

@squirrel - here is my implementation of an array-of-array serializer for a TSQLTableJSON descendant: https://git-pub.intecracy.com/unitybase … L4191-4305
Hope it helps.
