#2 mORMot 2 » Invalid JSON from ISqlDBStatement.ExecutePreparedAndFetchAllAsJson() » 2023-08-21 10:54:24

ma64
Replies: 2

Hello Arnaud, there is a small bug in the method TSqlDBPostgresStatement.ColumnToJson() (unit mormot.db.sql.postgres; latest version of mORMot 2). Values from PostgreSQL columns of type boolean are written as t and f instead of true and false, which leads to invalid JSON.
Here is an example result of a call to ISqlDBStatement.ExecutePreparedAndFetchAllAsJson(), which internally calls ColumnToJson(), showing this bug:

'{"fieldCount":1,"values":["autolinked",f,f,f,f,f,f],"rowCount":6}'

whereas it should be

'{"fieldCount":1,"values":["autolinked",false,false,false,false,false,false],"rowCount":6}'

The problem is that the boolean column is reported as type ftInt64 inside ColumnToJson(), so the PostgreSQL text value is written verbatim into the JSON. This can be fixed quickly by checking the real column type (as shown below), but this may not be an optimal solution, since the check is repeated for every row:

case ColumnType of
  ...

  ftInt64,
  ftDouble,
  ftCurrency:
    begin
      if PQ.ftype(fRes, Col) = BOOLOID then
      begin
        // PostgreSQL's text output for a boolean column is 't' or 'f'
        if PUtf8Char(P)^ = 't' then
          W.AddNoJsonEscapeUtf8('true')
        else
          W.AddNoJsonEscapeUtf8('false');
      end
      else
        // note: StrLen is slightly faster than PQ.GetLength for small content
        W.AddNoJsonEscape(P, StrLen(P));
    end;

  ...
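For reference, the mapping the patch performs is tiny; sketched here in Python since the logic itself is language-neutral (the function name is mine, not framework code):

```python
def pg_bool_to_json(text_value: str) -> str:
    # PostgreSQL's text-format output for a boolean column is the single
    # character 't' or 'f'; written verbatim this is invalid JSON, so it
    # must be expanded to the JSON literals true/false.
    return "true" if text_value == "t" else "false"

print(pg_bool_to_json("t"))  # -> true
print(pg_bool_to_json("f"))  # -> false
```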

Kind regards
Michael

#3 Re: mORMot 1 » Small addition to get the location of an error in a JSON string » 2022-08-05 09:59:04

Ok, thanks. I will try to find a better solution based on your helpful comments.

#4 mORMot 1 » Small addition to get the location of an error in a JSON string » 2022-08-04 17:22:47

ma64
Replies: 2

Hello Arnaud,

I was recently faced with the task of showing the first error in an invalid JSON string, and I came up with the following solution.

  1. In mormot.core.json I put a “var” in front of the parameter of the function TJsonGotoEndParser.GotoEnd() so that the index of the character that triggered the parser error gets returned:

    function TJsonGotoEndParser.GotoEnd(var P: PUtf8Char): PUtf8Char;
  2. Again in mormot.core.json I added an overloaded version of the function IsValidJsonBuffer() that returns the index of the erroneous JSON character in the third parameter:

    function IsValidJsonBuffer(P: PUtf8Char; strict: boolean; var ErrorPosition: uint64): boolean;
    var
      parser: TJsonGotoEndParser;
      pstart: PUtf8Char;
    begin
      {%H-}parser.Init(strict, nil);
      pstart := P;
      result := parser.GotoEnd(P) <> nil;
      if not result then
        ErrorPosition := P - pstart;
    end;
  3. These small changes made it possible to write the following code that shows a small portion of the JSON surrounding the erroneous character as well as a marker pointing at that character:

    class function TJsonValidator.ValidateFile(const JsonFileName: string): boolean;
    var
      content: PUtf8Char;
      mapping: THandle;
      fileSize: uint64;
      errPos, excStartPos, excEndPos: uint64;
      excerpt, marker: utf8string;
    begin
      content := MapFile(JsonFileName, mapping, fileSize);
      try
        result := IsValidJsonBuffer(pointer(content), true, errPos);
        WriteLn('');
        if result then
          WriteLn('File contains valid JSON.')
        else begin
          WriteLn('File contains invalid JSON:');
          // errPos is unsigned, so avoid the underflow of errPos - 20
          if errPos > 20 then
            excStartPos := errPos - 20
          else
            excStartPos := 0;
          excEndPos := Min(errPos + 20, fileSize - 1);
          SetLength(excerpt, excEndPos - excStartPos + 1);
          if Length(excerpt) > 0 then begin
            WriteLn('');
            Move((content + excStartPos)^, excerpt[1], Length(excerpt));
            WriteLn(Utf8ToString(excerpt));
            marker := StringOfChar(ansichar(' '), Length(excerpt));
            marker[errPos - excStartPos + 1] := '^';
            WriteLn(Utf8ToString(marker));
          end;
        end;
      finally
        UnmapFile(mapping, content);
      end;
    end;
    
    class function TJsonValidator.MapFile(const FileName: string; out Mapping: THandle; out Size: uint64): PUtf8Char;
    var
      fileHandle: THandle;
      lFileSize: LARGE_INTEGER;
    begin
      if not FileExists(FileName) then
        raise Exception.Create('File not found');
      fileHandle := FileOpen(FileName, fmOpenRead or fmShareDenyWrite);
      // FileOpen signals failure with INVALID_HANDLE_VALUE, not 0
      Win32Check(fileHandle <> INVALID_HANDLE_VALUE);
      try
        lFileSize.LowPart := GetFileSize(fileHandle, @lFileSize.HighPart);
        Size := uint64(lFileSize.QuadPart);
        if Size = 0 then
          raise Exception.Create('File is empty');
        Mapping := CreateFileMapping(fileHandle, nil, PAGE_READONLY, 0, 0, nil);
        Win32Check(Mapping <> 0);
      finally
        FileClose(fileHandle);
      end;
      result := MapViewOfFile(Mapping, FILE_MAP_READ, 0, 0, 0);
      Win32Check(result <> nil);
    end;
    
    class procedure TJsonValidator.UnmapFile(Mapping: THandle; Buffer: PUtf8Char);
    begin
      UnmapViewOfFile(Buffer);
      CloseHandle(Mapping);
    end;
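The excerpt-and-marker logic itself is language-neutral; as a rough Python sketch of the same idea (the function name and radius default are mine):

```python
def error_excerpt(text: str, err_pos: int, radius: int = 20) -> str:
    # Show up to `radius` characters on either side of the offending
    # position, with a caret on a second line marking the exact character.
    start = max(err_pos - radius, 0)
    end = min(err_pos + radius, len(text) - 1)
    excerpt = text[start:end + 1]
    marker = " " * (err_pos - start) + "^"
    return excerpt + "\n" + marker

print(error_excerpt('{"a": 1, "b": }', 14))
```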

Do you think this would be a helpful addition to mORMot?

#5 Re: mORMot 1 » DocVariantData trouble » 2015-11-28 19:41:15

Works for me too!
Thank you, ab, for your extraordinary support.

#6 Re: mORMot 1 » DocVariantData trouble » 2015-11-24 21:09:24

Thank you, mpv.
I've now been able to create a ticket for this DocVariant issue.

#7 Re: mORMot 1 » DocVariantData trouble » 2015-11-23 21:21:36

Hello ab,
how can one get registered for accessing the fossil system? I haven't been able to log in there with my forum user id and password, and hence couldn't open a ticket for the FPC 3.1.1 DocVariant issue mentioned in this post.
Regards, Michael

#8 mORMot 1 » App crashes when using DocVariant in Lazarus » 2015-10-23 08:58:49

ma64
Replies: 0

Hello,

I'm working on an application in Lazarus (latest version for Windows available from www.getlazarus.org, i.e. with FPC 3.1.1, Rev. 49965) and noticed a crash when accessing a newly created DocVariant. To reproduce the problem, please run the following code:

program project1;

{$IFDEF FPC}
{$mode delphi}
{$ELSE}
{$APPTYPE CONSOLE}
{$ENDIF}

uses
  SynCommons;

var
  j: variant;

begin
  j := TDocVariant.New;
  j.Add('abc');
  WriteLn(VariantSaveJson(j));
  ReadLn;
end.

In Delphi the expected JSON result is shown, while Lazarus throws an "External: SIGSEGV" exception when executing the .Add().

#9 Re: mORMot 1 » How to create a multi-column unique index? » 2011-01-30 15:24:36

You're right. Index name checks on the SQL level would require special commands for each supported database engine.
I think it would be a good compromise to add an optional parameter for the index name.

#10 Re: mORMot 1 » How to create a multi-column unique index? » 2011-01-28 17:40:18

Great, thanks!

CreateSQLIndex() can indeed be seen as a special case of CreateSQLMultiIndex().

There is however one small issue with both implementations (yours and mine) of CreateSQLMultiIndex(). Imagine a table with the three fields "cat", "fish" and "catfish" and two indexes created with the following commands:

CreateSQLMultiIndex(TAnimals, ['cat', 'fish'], true);
CreateSQLIndex(TAnimals, ['catfish'], true);

These two commands lead to the same name for both indexes and hence to a database error. Even if a separator is used, there is still a chance of an index name collision if the field names themselves contain the separator character. To prevent duplicate index name errors, one could perform a

SELECT * FROM sqlite_master WHERE type='index' AND name={name of the index to create}

and if a row is returned, then modify the name of the new index. Alternatively, indexes could generally be named with increasing numbers (say from "Index000" to "Index999"), and before creating a new index the last used index name could be determined with

SELECT MAX(name) FROM sqlite_master WHERE type='index'

Do you think a safety check against duplicate index names should be implemented?
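As a quick illustration of the first approach, here is a sketch in Python using the stdlib sqlite3 module as a stand-in for the framework's database layer (the helper name is mine, not framework code):

```python
import sqlite3

def unique_index_name(conn: sqlite3.Connection, base: str) -> str:
    # The check proposed above: query sqlite_master for an existing index
    # of that name and, if one is found, append an increasing numeric
    # suffix until the name is free.
    name, n = base, 0
    while conn.execute(
            "SELECT 1 FROM sqlite_master WHERE type='index' AND name=?",
            (name,)).fetchone() is not None:
        n += 1
        name = f"{base}{n}"
    return name

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Animals(cat, fish, catfish)")
conn.execute("CREATE INDEX IndexAnimalscatfish ON Animals(cat, fish)")
# The colliding name gets a suffix instead of raising a database error:
print(unique_index_name(conn, "IndexAnimalscatfish"))  # -> IndexAnimalscatfish1
```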

#11 Re: mORMot 1 » How to create a multi-column unique index? » 2011-01-28 02:37:09

Thanks for explaining!

Since indexes on multiple fields are a fairly common thing when working with databases, a dedicated framework function for the creation of such indexes might be helpful for a lot of users. Here's the method (a modification of the existing CreateSQLIndex()) that I added to SQLite3Commons.pas for that purpose:

function TSQLRestServer.CreateSQLMultiIndex(Table: TSQLRecordClass; const FieldNames: array of RawUTF8; Unique: boolean): boolean;
var
  SQL, SQLTableName: RawUTF8;
  TableIndex: integer;

  function InvalidFieldName(const FieldNames: array of RawUTF8): boolean;
  var
    i: integer;
  begin
    result := false;
    for i := 0 to High(FieldNames) do
      if Table.FieldIndex(FieldNames[i]) < 0 then
      begin
        result := true;
        exit;
      end;
  end;

  function Concat(const FieldNames: array of RawUTF8; Separator: AnsiChar): RawUTF8;
  var
    i, iMax: integer;
  begin
    result := '';
    iMax := High(FieldNames);
    if iMax >= 0 then
    begin
      result := FieldNames[0];
      for i := 1 to iMax do
        result := result + Separator + FieldNames[i];
    end;
  end;

begin
  TableIndex := Model.GetTableIndex(Table);
  if (TableIndex<0) or InvalidFieldName(FieldNames) or
     ((fStaticData<>nil) and (fStaticData[TableIndex]<>nil)) then begin
    result := false; // invalid Table or in Static data (index not needed)
    exit;
  end;
  SQLTableName := Table.SQLTableName;
  if Unique then
    SQL := 'UNIQUE ' else
    SQL := '';
  SQL := FormatUTF8('CREATE %INDEX IF NOT EXISTS Index%_% ON %(%);',
    [SQL, SQLTableName, Concat(FieldNames, '_'), SQLTableName,
     Concat(FieldNames, ',')]);
  result := EngineExecuteAll(SQL);
end;

Maybe you'll find it worth integrating into the official code base.

#12 mORMot 1 » How to create a multi-column unique index? » 2011-01-27 03:28:39

ma64
Replies: 6

Does the framework provide a built-in function for creating unique indexes on multiple columns?
I tried to use the overloaded version of CreateSQLIndex() that takes an array of field names as its second parameter, but instead of a multi-column index it created a separate index for each of the specified fields.

Example:

  CreateSQLIndex(TUsers, ['Surname', 'Name', 'Email'], true)

results in the three SQL commands

  CREATE UNIQUE INDEX IndexUsersSurname ON Users(Surname)
  CREATE UNIQUE INDEX IndexUsersName ON Users(Name)
  CREATE UNIQUE INDEX IndexUsersEmail ON Users(Email)

instead of the one that I expected:

  CREATE UNIQUE INDEX IndexUsers_Surname_Name_Email ON Users(Surname,Name,Email)

Did I overlook something?

Regards
