I'm having a problem with a stored procedure that returns 300+ columns. FetchAllToBinary throws a 'Too Many Columns' exception. Unfortunately I'm unable to modify the procedure to return fewer columns. Any suggestions?
  FillChar(Null,sizeof(Null),0);
  result := 0;
  W := TFileBufferWriter.Create(Dest);
  try
    W.WriteVarUInt32(FETCHALLTOBINARY_MAGIC);
    FMax := ColumnCount;
    W.WriteVarUInt32(FMax);
    if FMax>0 then begin
      // write column description
      SetLength(ColTypes,FMax);
      dec(FMax);
      for F := 0 to FMax do begin
        W.Write(ColumnName(F));
        ColTypes[F] := ColumnType(F,@FieldSize);
        W.Write1(ord(ColTypes[F]));
        W.WriteVarUInt32(FieldSize);
      end;
      // initialize null handling
      NullRowSize := (FMax shr 3)+1;
      if NullRowSize>sizeof(Null) then // <-- exception is raised here
        raise ESQLDBException.CreateUTF8(
          '%.FetchAllToBinary: too many columns',[self]);
Sadly it is hardcoded...
Yes, TSqlDbProxyStatementColumns is a set limited to 256 elements. Changing this to an array is a way around this limit.
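To see where the 256-column ceiling comes from: a Pascal `set of 0..255` occupies 256 bits, i.e. 32 bytes, and FetchAllToBinary stores one "is null" bit per column in it. A small Python sketch (an illustration of the arithmetic, not the mORMot source) reproduces the `NullRowSize := (FMax shr 3)+1` check:

```python
# Illustration of the limit: the Null set is a fixed 32-byte bitmap,
# so the per-row null bitmap overflows once there are more than 256 columns.

SIZEOF_NULL = 32  # bytes in a Pascal "set of 0..255" (256 bits)

def null_row_size(column_count: int) -> int:
    """Bytes needed for one null-bit per column: (FMax shr 3)+1, FMax = count-1."""
    fmax = column_count - 1
    return (fmax >> 3) + 1

def fits(column_count: int) -> bool:
    return null_row_size(column_count) <= SIZEOF_NULL

print(null_row_size(256), fits(256))  # 32 True  -> 256 columns still fit
print(null_row_size(257), fits(257))  # 33 False -> 'too many columns' is raised
```

So 256 columns exactly fill the 32-byte set, and the 257th column pushes `NullRowSize` to 33 bytes, triggering the exception.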
I met the same problem here. SynDB supports fetching an unlimited number of columns, but FetchAllToBinary breaks that completeness.
And FetchAllToJSON() loses the ColumnValueDBSize information (the field size declared by the original DBMS), so FetchAllToBinary is the only way in my environment. I hope it will be corrected.
I have hit the same limit for a table with more than 256 columns. Will there be a change or correction for the hard-coded limit?
256 columns is already too much! Time for DB refactoring!
See https://dba.stackexchange.com/a/3976
But anyway, since most SQL databases allow up to 1000 columns per table - with a performance impact, for sure - I removed the column count limit for SynDB remote binary serialization.
See https://synopse.info/fossil/info/3fb2ed0db6
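The general idea behind lifting such a limit is to replace the fixed 256-bit set with a null bitmap sized from the actual column count. A minimal Python sketch of that approach (an assumed illustration, not the actual commit):

```python
# Sketch (assumed approach): a null bitmap sized from the column count,
# so any number of columns gets its own "is null" bit.

def make_null_bitmap(column_count: int) -> bytearray:
    # one bit per column, rounded up to whole bytes
    return bytearray((column_count + 7) // 8)

def set_null(bitmap: bytearray, col: int) -> None:
    bitmap[col >> 3] |= 1 << (col & 7)

def is_null(bitmap: bytearray, col: int) -> bool:
    return bool(bitmap[col >> 3] & (1 << (col & 7)))

bm = make_null_bitmap(300)  # 38 bytes cover 300 columns
set_null(bm, 259)
print(len(bm), is_null(bm, 259), is_null(bm, 258))  # 38 True False
```

With a dynamically sized bitmap, the `NullRowSize > sizeof(Null)` check becomes unnecessary: the buffer simply grows with the column count.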
Thank you very much, it worked perfectly. Tested with 260 columns.