#1 Re: Source Code repository » SynCommons: TDynArray and Record compare/load/save using fast RTTI » 2012-10-15 22:35:39

Thank you for the links. I tried reading them again and again, but I still cannot work out how to use it. Why not make a small standalone program, with code that loads strings from a text file and sorts them with a custom compare function, which may help others? Not as part of SynCommons, but as a small demo program that shows how to use it. Thank you
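[Editor's note: not part of the original post. A minimal, untested sketch of what such a demo might look like, assuming SynCommons is on the unit search path; the file name `data.txt` and all identifiers are invented for illustration.]

```pascal
program SortDemo;

uses
  SysUtils, Classes, SynCommons;

type
  TStringArray = array of string;

// untyped const parameters, as TDynArray's compare callback expects
function CompareByText(const A, B): integer;
begin
  result := SysUtils.AnsiCompareStr(string(A), string(B));
end;

var
  Lines: TStringArray;
  DA: TDynArray;
  SL: TStringList;
  s: string;
  i: integer;
begin
  DA.Init(TypeInfo(TStringArray), Lines);  // wrap the dynamic array
  SL := TStringList.Create;
  try
    SL.LoadFromFile('data.txt');           // invented input file
    for i := 0 to SL.Count - 1 do begin
      s := SL[i];
      DA.Add(s);                           // append one string per line
    end;
  finally
    SL.Free;
  end;
  DA.Compare := CompareByText;             // register the custom compare
  DA.Sort;                                 // Lines[] is sorted in place
  for i := 0 to high(Lines) do
    writeln(Lines[i]);
end.
```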

#2 Re: Source Code repository » SynCommons: TDynArray and Record compare/load/save using fast RTTI » 2012-10-15 02:03:47

Thank you for this valuable code. I don't know how to use it, though, and the examples are not really helpful. I have a record made up of strings and integers.

I collect these records from files (2 to 3 separate files).
How do I load one file at a time and add its records to the array / wrapper?

I need to compare on a string field, which may not be the first field.
The testing code here shows a compare function with these parameters:

function FVSort(const A,B): integer;
begin
  result := SysUtils.StrComp(PChar(pointer(TFV(A).Detailed)),PChar(pointer(TFV(B).Detailed)));
end;
...............
AFP.CreateOrderedIndex(Index,FVSort);
for i := 1 to AUP.Count-1 do
  Check(AF[Index[i]].Detailed>AF[Index[i-1]].Detailed);
AFP.Compare := FVSort;
AFP.Sort;

but when I try this, Delphi 7 will not let me do that without parameters.
Do I need to change something?

function CompareSubLine(const Item1, Item2): integer;
begin
  result := SysUtils.AnsiCompareStr(TCustomRecord(Item1).fSubline,
                                    TCustomRecord(Item2).fSubline);
end;
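[Editor's note: a hedged sketch, not from the post, of how such a compare function might be wired to the wrapper. The record layout is guessed from the question; the key point is that `DA.Compare := CompareSubLine;` assigns the function rather than calling it, so no parameters are written at that spot.]

```pascal
type
  TCustomRecord = record
    fID: integer;
    fSubline: string;   // the field to sort on, not the first one
  end;
  TCustomRecordDynArray = array of TCustomRecord;

function CompareSubLine(const Item1, Item2): integer;
begin
  // cast the untyped parameters to the record type, then compare the field
  result := SysUtils.AnsiCompareStr(TCustomRecord(Item1).fSubline,
                                    TCustomRecord(Item2).fSubline);
end;

var
  Recs: TCustomRecordDynArray;
  DA: TDynArray;
begin
  DA.Init(TypeInfo(TCustomRecordDynArray), Recs);
  DA.Compare := CompareSubLine;  // assignment: no parentheses, no arguments
  DA.Sort;
end.
```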

I am also confused: when do I call SetLength on the array of records and populate it, and when do I use DA.Init? How do I populate the array from a file, where I call a routine that parses each line and splits the items into the record fields? The files are parsed differently.
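[Editor's note: an untested sketch of one answer to the Init/SetLength question, under the assumption that `TDynArray.Add` grows the wrapped array itself: call `DA.Init` once on the empty dynamic array, then `Add` each parsed record, with no `SetLength` at all. `ParseLine` and `file1.txt` are hypothetical stand-ins for the per-file parsing code.]

```pascal
procedure ParseLine(const Line: string; var Rec: TCustomRecord);
begin
  // hypothetical: split Line into the record's string/integer fields
  Rec.fSubline := Line;
end;

var
  Recs: TCustomRecordDynArray;
  DA: TDynArray;
  SL: TStringList;
  Rec: TCustomRecord;
  i: integer;
begin
  DA.Init(TypeInfo(TCustomRecordDynArray), Recs); // wrap once, while empty
  SL := TStringList.Create;
  try
    SL.LoadFromFile('file1.txt');   // invented file name
    for i := 0 to SL.Count-1 do begin
      ParseLine(SL[i], Rec);        // fill the record fields
      DA.Add(Rec);                  // TDynArray grows Recs for you
    end;
  finally
    SL.Free;
  end;
end.
```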

Right now I use 3 arrays: one array for the first file, a second array for the second file, and a third array into which the first two are combined. The problem with this is the huge memory overhead; it is not very efficient, as at some point I hold the data twice.
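[Editor's note: one way to avoid the duplicated data, sketched and untested: let both files feed the same wrapped array, so a third merge array is never needed. `LoadFileInto`, the per-line parsing, and the file names are all invented for illustration.]

```pascal
program MergeTwoFiles;

uses
  SysUtils, Classes, SynCommons;

type
  TCustomRecord = record
    fSubline: string;
  end;
  TCustomRecordDynArray = array of TCustomRecord;

procedure LoadFileInto(var DA: TDynArray; const FileName: string);
var
  SL: TStringList;
  Rec: TCustomRecord;
  i: integer;
begin
  SL := TStringList.Create;
  try
    SL.LoadFromFile(FileName);
    for i := 0 to SL.Count-1 do begin
      Rec.fSubline := SL[i];  // real code would parse per file format here
      DA.Add(Rec);
    end;
  finally
    SL.Free;
  end;
end;

var
  Recs: TCustomRecordDynArray;
  DA: TDynArray;
begin
  DA.Init(TypeInfo(TCustomRecordDynArray), Recs);
  LoadFileInto(DA, 'file1.txt');  // both files append to the same array,
  LoadFileInto(DA, 'file2.txt');  // so the data is held only once
end.
```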

Thank you again
