ISQLDBRows.FetchAllAsJSON gives the option to specify the format of the JSON as either expanded or not. Is there a function or way to expand or collapse a RawUTF8 dataset so that it matches the format returned by either ISQLDBRows.FetchAllAsJSON(true) or ISQLDBRows.FetchAllAsJSON(false)?
I am asking if it's possible to go both ways, but the initial need is to collapse data returned by ISQLDBRows.FetchAllAsJSON(true) so that it matches the format of ISQLDBRows.FetchAllAsJSON(false). This is to ensure that only one format is consistently used.
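For reference, the two shapes differ roughly like this (a sketch with made-up data; the exact output of FetchAllAsJSON also depends on the column types):

```javascript
// Expanded (FetchAllAsJSON(true)): one object per row, field names repeated.
const expanded = [
  { ID: 1, Name: "Ada" },
  { ID: 2, Name: "Bob" }
];
// Non-expanded (FetchAllAsJSON(false)): field names come first,
// then all row values follow in one flat "values" array.
const nonExpanded = {
  fieldCount: 2,
  values: ["ID", "Name", 1, "Ada", 2, "Bob"],
  rowCount: 2
};
```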
The issue comes in when using an array of record in a TDocVariant and trying to export it as JSON: that only produces the expanded format.
type
  TMyType = record
    DateData: TDateTime;
    IntData: Integer;
    StrData: string;
  end;
  TMyTypeArray = array of TMyType;
var
  doc: TDocVariantData;
  jsonResult: RawUTF8;
  myType: TMyTypeArray;
begin
  [...] //fill myType with values and do processing
  doc.InitFromTypeInfo(myType, TypeInfo(TMyTypeArray), true, []);
  jsonResult := doc.ToJSON; //this is expanded, and must be collapsed
end;
Last edited by squirrel (2016-03-14 08:00:55)
It feels a bit clumsy, but this should copy the data to allow exporting to the same JSON format as when expanded=false is used. Does anyone have suggestions on how this can be improved?
function CollapseJSON(const data: RawUTF8): RawUTF8;
var
  Source, Target: TDocVariantData;
  TargetValues, valueList: variant;
  fieldCnt, rowCnt, I, J: integer;
begin
  Source.InitJSON(data);
  Target.InitFast;
  TargetValues := _ArrFast([]);
  fieldCnt := 0;
  rowCnt := Source.Count;
  if rowCnt > 0 then
  begin
    valueList := Source.Value[0];
    fieldCnt := DocVariantData(valueList)^.Count;
    Target.Value['fieldCount'] := fieldCnt;
    // in the non-expanded layout, the first fieldCnt items are the field names
    for I := 0 to fieldCnt - 1 do
      DocVariantData(TargetValues)^.AddItem(DocVariantData(valueList)^.Names[I]);
    for I := 0 to rowCnt - 1 do
    begin
      valueList := Source.Value[I];
      for J := 0 to fieldCnt - 1 do
        DocVariantData(TargetValues)^.AddItem(DocVariantData(valueList)^.Value[J]);
    end;
  end;
  Target.AddValue('values', TargetValues);
  Target.Value['rowCount'] := rowCnt;
  Result := Target.ToJSON;
end;
function RecordToCollapsedJSON(const data; const RecTypeInfo: pointer): RawUTF8;
var
  Source: TDocVariantData;
begin
  Source.InitFromTypeInfo(data, RecTypeInfo, true, []);
  result := CollapseJSON(Source.ToJSON);
end;
Using the initial example, this can then be used like so:
result := RecordToCollapsedJSON(myType, TypeInfo(TMyTypeArray));
Last edited by squirrel (2016-03-15 05:52:35)
Thanks, I'll take a look at TTextWriter again.
Since the JSON data is usually used to display information in tables, I have to process every dataset on the client side into a more recognisable "table" structure, which is a slow process in JavaScript.
It would be useful to be able to provide a format or template to a function such as JsonReformat. When the data has to be displayed in a table or shared with third parties, this structure is very logical, since the field names are not part of the data and every row is a set of its own:
{
  "fieldCount": 3,
  "rowCount": 2,
  "fieldNames": ["DateData", "IntData", "StrData"],
  "values": [
    ["2016-03-14T13:23:25", 1, "Hello"],
    ["2016-03-14T13:23:45", 2, "Bye"]
  ]
}
Last edited by squirrel (2016-03-14 11:33:50)
IMHO the fastest way is to serialize the dataset to the non-expanded format FetchAllAsJSON(false) and transform it to the format you need on the client side using JavaScript. This will definitely be much faster than serializing to the expanded format (traffic, client-side JSON.parse). On the client side the transformation will take microseconds if properly written.
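As a sketch of what that client-side transformation could look like (a hypothetical helper, not part of any library), expanding the non-expanded payload into row objects is just a couple of loops:

```javascript
// Turn {fieldCount, rowCount, values:[name1..nameN, row1val1, ...]}
// into an array of plain row objects.
function expandRows(data) {
  const n = data.fieldCount;
  const names = data.values.slice(0, n);   // first n items are field names
  const rows = [];
  for (let i = n; i < data.values.length; i += n) {
    const row = {};
    for (let f = 0; f < n; f++)
      row[names[f]] = data.values[i + f];
    rows.push(row);
  }
  return rows;
}
```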
@mpv: Yes, that might be the case when working on your own projects, but I provide data for third parties. They expect some standards, such as a record only containing the data for one record, and not the fields of all the records as one big list.
Last edited by squirrel (2016-03-15 08:14:17)
Excellent! Thank you!
I've added a simple parameter which allows this function to return datasets as datasets, i.e. with field names and rows:
function TDocVariantData.ToNonExpandedJSON(const SplitRows: boolean = false): RawUTF8;
var
  fields: TRawUTF8DynArray;
  fieldsCount: integer;
  W: TTextWriter;
  r, f: integer;
  row: PDocVariantData;
begin
  if VKind <> dvArray then begin
    result := '';
    exit;
  end;
  if VCount = 0 then begin
    result := '[]';
    exit;
  end;
  with _Safe(VValue[0])^ do
    if VKind = dvObject then begin
      fields := VName;
      fieldsCount := VCount;
    end else
      fieldsCount := 0;
  if fieldsCount = 0 then
    raise EDocVariant.Create('ToNonExpandedJSON: Value[0] is not an object');
  W := DefaultTextWriterJSONClass.CreateOwnedStream;
  try
    if SplitRows then
      W.Add('{"fieldCount":%,"rowCount":%,"fields":[', [fieldsCount, VCount])
    else
      W.Add('{"fieldCount":%,"rowCount":%,"values":[', [fieldsCount, VCount]);
    for f := 0 to fieldsCount - 1 do begin
      W.Add('"');
      W.AddJSONEscape(pointer(fields[f]));
      W.Add('"', ',');
    end;
    if SplitRows then begin
      W.CancelLastComma;
      W.Add('],"values":[', []);
    end;
    for r := 0 to VCount - 1 do begin
      if SplitRows then
        W.Add('[', []);
      row := _Safe(VValue[r]);
      if (r > 0) and ((row^.VKind <> dvObject) or (row^.VCount <> fieldsCount)) then
        raise EDocVariant.CreateUTF8('ToNonExpandedJSON: Value[%] not object', [r]);
      for f := 0 to fieldsCount - 1 do
        if (r > 0) and not IdemPropNameU(row^.VName[f], fields[f]) then
          raise EDocVariant.CreateUTF8('ToNonExpandedJSON: Value[%] field=% expected=%',
            [r, row^.VName[f], fields[f]])
        else begin
          W.AddVariant(row^.VValue[f], twJSONEscape);
          W.Add(',');
        end;
      if SplitRows then begin
        W.CancelLastComma;
        W.Add('],', []);
      end;
    end;
    W.CancelLastComma;
    W.Add(']', '}');
    W.SetText(result);
  finally
    W.Free;
  end;
end;
It will return the non-expanded JSON like this:
{
  "fieldCount": 3,
  "rowCount": 2,
  "fields": ["DateData", "IntData", "StrData"],
  "values": [
    ["2016-03-14T13:23:25", 1, "Hello"],
    ["2016-03-14T13:23:45", 2, "Bye"]
  ]
}
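On the client side, the SplitRows layout above maps back to row objects in one line (a hypothetical snippet, assuming the "fields"/"values" keys shown above):

```javascript
// Rebuild row objects from {fieldCount, rowCount, fields, values:[[...], ...]}.
function splitRowsToObjects(data) {
  return data.values.map(row =>
    Object.fromEntries(data.fields.map((name, f) => [name, row[f]])));
}
```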
@ab I noticed that TDocVariantData.InitFromTypeInfo works as expected for TDateTime values, but when sending a TDate type, it is treated as the underlying float value. Is this correct?
With TDocVariantData.ToNonExpandedJSON saving TDocVariantData in compact form, is there a way to load it back into a TDocVariantData? AFAIK loading through InitJSONInPlace only works with the expanded format.
Right now I'm exchanging data between server and client in the full format and it's around 8 MB of data; even zipped it's a large amount, since clients are on mobile networks. In the compact form I'm expecting half of that, which is a big saving.
Perhaps you could just use TSQLTableJSON for this.
Thanks, it seems to work in a simple test. I would still suggest adding it to the ToDo list: a direct load from JSON would be faster, and it would complement ToNonExpandedJSON.
Thanks, it seems to work in a simple test.
In a more detailed test with real data, there are differences: JSON passed through TSQLTableJSON and saved again as JSON won't have any nulls (depending on the field type it will be "" or 0). Also, if a record contains a child array object, it will prefix " with \.
For example:
rec.InitJSON('[{"Field1":"Value1","Field2":null,"Field3":["elem1","elem2",3]}]', JSON_OPTIONS_FAST);
json := rec.ToNonExpandedJSON;
t := TSQLTableJSON.Create('', json);
t.ToDocVariant(v, False);
json := VariantSaveJSON(v);
rec.InitJSON(json, JSON_OPTIONS[True]);
json := rec.ToJSON;
Original JSON:
[{"Field1":"Value1","Field2":null,"Field3":["elem1","elem2",3]}]
becomes:
[{"Field1":"Value1","Field2":null,"Field3":"[\"elem1\",\"elem2\",3]"}]
What would be the most efficient way to format ISQLDBRows into JSON that uses the format below?
{
"fieldCount": 3,
"rowCount": 2,
"fieldNames": ["DateData", "IntData", "StrData"],
"values": [
["2016-03-14T13:23:25", 1, "Hello"],
["2016-03-14T13:23:45", 2, "Bye"]
]
}
I basically need to create a custom version of FetchAllToJSON. Should I create a new TextWriter and manually step through all the records and fields, or is there perhaps a function that can return an entire row as JSON?
Last edited by squirrel (2021-01-15 13:16:06)
I already have such a serializer. Many years ago I thought it would simplify life for JS clients (browser, Node.js), but in practice it only increases memory consumption (an array object for each row). A simple JS class that wraps a one-dimensional array will be faster and use less memory.
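A minimal sketch of that wrapper idea (hypothetical code, assuming the non-expanded layout with the field names in the first fieldCount slots of "values"):

```javascript
// Read-only view over the flat non-expanded payload:
// no per-row objects are ever built.
class FlatTable {
  constructor(data) {
    this.n = data.fieldCount;
    this.rowCount = data.rowCount;
    this.names = data.values.slice(0, this.n);  // field names
    this.values = data.values;                  // names + all row values
  }
  // value at (row, fieldName); rows are 0-based,
  // row r starts at offset n * (r + 1) in the flat array
  get(row, fieldName) {
    const f = this.names.indexOf(fieldName);
    return this.values[this.n * (row + 1) + f];
  }
}
```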
@squirrel - here is my implementation of an array-of-array serializer for a TSQLTableJSON descendant - https://git-pub.intecracy.com/unitybase … L4191-4305
Hope it helps