In our Source Code repository you may have noticed the SynLZ.pas and SynLZO.pas units.
These are two in-memory compression units, optimized for speed.
The SynLZO unit implements Synopse LZO Compression:
* it's an upgraded and fully rewritten version of our MyLZO unit;
* SynLZO is a very FAST portable lossless data compression library written in optimized pascal code for Delphi 3 up to Delphi 2010 with a tuned asm version available;
* offers *extremely* fast compression and decompression, with a good compression ratio considering its speed;
* original LZO written in ANSI C - pascal+asm conversion by A.Bouchez;
* simple but very fast direct file compression: reading/writing SynLZO compressed files is faster than copying plain files!
The SynLZ unit implements SynLZ Compression:
* SynLZ is a NEW lz-based algorithm implementing a very FAST lossless data compression library;
* it's written in optimized pascal code for Delphi 3 up to Delphi 2010, with a tuned asm version available;
* symmetrical compression and decompression speed (which is very rare among compression algorithms in the wild);
* good compression ratio (usually better than LZO);
* fastest average compression speed (ideal e.g. for xml/text communication).
While SynLZO is a pascal conversion of the LZO algorithm, the SynLZ unit implements a new compression algorithm with the following features:
* hashing+dictionary compression in one pass, with no huffman table;
* optimized 32-bit control word, embedded in the data stream;
* in-memory compression (the dictionary is the input stream itself);
* compression and decompression have the same speed (both use hashing);
* thread safe and lossless algorithm;
* supports overlapping compression and in-place decompression;
* code size for compression/decompression functions is smaller than LZO's.
Conversion notes:
- this format is NOT stream compatible with any lz* official format
=> use it internally in your application, not as exchange format
- very small code size (less than 1KB for both compressor/decompressor)
- the uncompressed data length is stored in the beginning of the stream and can be retrieved easily for proper out_p memory allocation
- please give correct data to the decompressor (i.e. CRC-check the in_p data first)
=> we recommend our very fast Adler32 procedure, or a zip-like container
- a 2nd more tuned algorithm is included, but is somewhat slower in practice
=> use the SynLZ[de]compress1*() functions in your applications
- tested and benchmarked with a lot of data types/sizes
=> use the asm code, which is very tuned: SynLZ[de]compress1asm()
- tested under Delphi 7, Delphi 2007 and Delphi 2009
- a hashing limitation makes SynLZ sometimes unable to pack continuous blocks of the same byte -> SynLZ is perfect for xml/text, but SynLZO is preferred for database files
- if you include it in your application, please give me some credits: "use SynLZ compression by http://synopse.info"
- use at your own risk!
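To make the notes above concrete, here is a hedged sketch of a buffer-level round-trip. The function names are taken from later posts in this thread (SynLZcompressdestlen, SynLZcompress1pas, SynLZdecompressdestlen, SynLZdecompress1pas); the exact signatures may vary between SynLZ revisions:

```pascal
uses
  SysUtils, SynLZ; // SynLZ.pas from the Synopse repository

// Compress a buffer with SynLZ, then decompress and verify it.
// SynLZcompressdestlen() returns the worst-case compressed size;
// SynLZdecompressdestlen() reads the uncompressed length stored
// at the beginning of the compressed stream.
procedure SynLZRoundTrip(const Data: RawByteString);
var
  comp, decomp: RawByteString;
  compLen, decompLen: integer;
begin
  SetLength(comp, SynLZcompressdestlen(length(Data)));
  compLen := SynLZcompress1pas(pointer(Data), length(Data), pointer(comp));
  SetLength(comp, compLen);
  decompLen := SynLZdecompressdestlen(pointer(comp));
  SetLength(decomp, decompLen);
  SynLZdecompress1pas(pointer(comp), compLen, pointer(decomp));
  Assert(decomp = Data); // lossless: the round-trip must match
end;
```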
Some benchmark on a Sempron computer:
- SynLZ compression is 20 times faster than zip, decompression 3 times;
- same compression ratio as lzo algo, but faster (up to 2x) on compression;
- the intermediate R and W speeds are measured at the compressed stream level, i.e. the throughput seen by the disk -> you can see that SynLZ behaves very well for real-time data compression, e.g. for backup purposes (a typical SATA disk drive has a speed of 50-100 MB/s - best SSD speed is about 200 MB/s)
- SynLZ beats the LZO algorithms for compression speed, but is a bit slower on decompression.
In the tables below, each line shows the algorithm, the compressed size, the compression ratio, then the intermediate R/W speeds during compression and during decompression.
KLOG.xml 6034 bytes
lz1 asm 1287 21.3% R 256 MB/s W 54 MB/s R 71 MB/s W 334 MB/s
lz1 pas 1287 21.3% R 184 MB/s W 39 MB/s R 58 MB/s W 274 MB/s
lz2 pas 1274 21.1% R 173 MB/s W 36 MB/s R 57 MB/s W 274 MB/s
lzo C 1347 22.3% R 185 MB/s W 41 MB/s R 111 MB/s W 501 MB/s
zip 806 13.4% R 14 MB/s W 1 MB/s R 14 MB/s W 110 MB/s
MiniLZO.cs 25252 bytes
lz1 asm 5775 22.9% R 246 MB/s W 56 MB/s R 70 MB/s W 306 MB/s
lz1 pas 5775 22.9% R 178 MB/s W 40 MB/s R 58 MB/s W 253 MB/s
lz2 pas 5762 22.8% R 166 MB/s W 37 MB/s R 57 MB/s W 250 MB/s
lzo C 5846 23.2% R 164 MB/s W 38 MB/s R 103 MB/s W 448 MB/s
zip 3707 14.7% R 15 MB/s W 2 MB/s R 22 MB/s W 154 MB/s
TestLZO.exe 158720 bytes
lz1 asm 110686 69.7% R 127 MB/s W 88 MB/s R 80 MB/s W 115 MB/s
lz1 pas 110686 69.7% R 98 MB/s W 68 MB/s R 63 MB/s W 90 MB/s
lz2 pas 109004 68.7% R 88 MB/s W 60 MB/s R 60 MB/s W 88 MB/s
lzo C 108202 68.2% R 40 MB/s W 27 MB/s R 164 MB/s W 241 MB/s
zip 88786 55.9% R 5 MB/s W 3 MB/s R 33 MB/s W 60 MB/s
Browsing.sq3db 46047232 bytes (46MB)
lz1 asm 19766884 42.9% R 171 MB/s W 73 MB/s R 73 MB/s W 171 MB/s
lz1 pas 19766884 42.9% R 130 MB/s W 56 MB/s R 59 MB/s W 139 MB/s
lz2 pas 19707346 42.8% R 123 MB/s W 52 MB/s R 59 MB/s W 139 MB/s
lzo asm 20629084 44.8% R 89 MB/s W 40 MB/s R 135 MB/s W 302 MB/s
lzo C 20629083 44.8% R 66 MB/s W 29 MB/s R 145 MB/s W 325 MB/s
zip 15564126 33.8% R 6 MB/s W 2 MB/s R 30 MB/s W 91 MB/s
TRKCHG.DBF 4572297 bytes (4MB)
lz1 asm 265782 5.8% R 430 MB/s W 25 MB/s R 29 MB/s W 510 MB/s
lz1 pas 265782 5.8% R 296 MB/s W 17 MB/s R 28 MB/s W 483 MB/s
lz2 pas 274773 6.0% R 258 MB/s W 15 MB/s R 27 MB/s W 450 MB/s
lzo C 266897 5.8% R 318 MB/s W 18 MB/s R 41 MB/s W 702 MB/s
zip 158408 3.5% R 25 MB/s W 0 MB/s R 11 MB/s W 318 MB/s
CATENA5.TXT 6358752 bytes
lz1 asm 3275269 51.5% R 132 MB/s W 68 MB/s R 66 MB/s W 129 MB/s
lz1 pas 3275269 51.5% R 103 MB/s W 53 MB/s R 57 MB/s W 112 MB/s
lz2 pas 3277397 51.5% R 95 MB/s W 49 MB/s R 57 MB/s W 112 MB/s
lzo C 3289373 51.7% R 63 MB/s W 33 MB/s R 90 MB/s W 175 MB/s
zip 2029096 31.9% R 4 MB/s W 1 MB/s R 29 MB/s W 91 MB/s
Here is a mail I just received about these units:
Hi Arnaud,
thank you very much for sending me the files!
I made a quick test and the compressor has amazing speed at a good
compression ratio. It is much faster than ZCompressStream with zcfastest
setting.
A 40MB TBitmap32 image is compressed to 19MB in less than a second!
Is SynLZ the faster one, compared to SynLZO? It would be helpful if you could add the following methods for stream (de)compression:

procedure LZCompressStream(const Source, Dest: TStream);
var
  Size: Integer;
  Buf, Tmp: Pointer;
  ComprSize: Integer;
begin
  Size := Source.Size;
  GetMem(Buf, Size);
  try
    Source.Seek(0, soFromBeginning);
    Source.Read(Buf^, Size);
    Size := SynLZcompressdestlen(Size);
    GetMem(Tmp, Size);
    try
      ComprSize := SynLZcompress1asm(Buf, Source.Size, Tmp);
      Dest.Write(ComprSize, SizeOf(ComprSize));
      if ComprSize > 0 then
        Dest.WriteBuffer(Tmp^, ComprSize);
    finally
      FreeMem(Tmp);
    end;
  finally
    FreeMem(Buf);
  end;
end;

procedure LZDecompressStream(const Source, Dest: TStream);
var
  Size: Integer;
  Buf, Tmp: Pointer;
  ComprSize: Integer;
begin
  Source.Read(ComprSize, SizeOf(ComprSize));
  GetMem(Tmp, ComprSize);
  try
    Source.Read(Tmp^, ComprSize);
    Size := SynLZdecompressdestlen(Tmp);
    GetMem(Buf, Size);
    try
      Size := SynLZdecompress1asm(Tmp, ComprSize, Buf);
      if Size > 0 then
        Dest.WriteBuffer(Buf^, Size);
    finally
      FreeMem(Buf);
    end;
  finally
    FreeMem(Tmp);
  end;
end;
I have to save/load ComprSize because the stream to decompress has a
header and decompression is done at current stream position.
I had no luck with reading directly from a TMemoryStream.Memory^, so I
have to use buffers with GetMem. Any chance to get rid of using GetMem
when I already have a TMemoryStream?
Maybe my problem is because of the current stream position..
Btw. I use D2007 with FastMM4. Fantastic work!
Best regards
Dirk
About speed: yes, in my tests SynLZ is faster than LZO for compression.
There is indeed no stream-based compression yet. Compressing a buffer could work well. There could be other approaches as well (e.g. on-the-fly compression using a fixed-size memory buffer, or memory-mapped files).
Thanks for your code sharing: I'll take a look at that, and optimize it for TMemoryStream if you think it's worth it. There is no need to store the size in your code, since both compressed and uncompressed sizes are already stored in our SynLZ/SynLZO units data.
Can this be used in commercial applications?
Thanks
As stated by the units content, they are both licensed under a MPL/GPL/LGPL tri-license.
So if you pickup the MPL license, you can use it to any commercial application, even statically linked (i.e. as a .dcu unit used to create the exe).
Please don't forget to put somewhere in your credit window or documentation, a link to http://synopse.info if you use any of these units (like any other components from us).
If your application is GPL, just use the GPL license.
If your application (or library, since LGPL is more library-devoted) is LGPL, just use the LGPL license.
Hello,
I took the code from above for LZ compression between streams and it works great. Then I tried to make the same code work with LZO, and it doesn't work. I just can't see what I am doing wrong here:
procedure LZODecompressStream(const Source, Dest: TStream);
procedure LZOCompressStream(const Source, Dest: TStream);
procedure LZODecompressStream(const Source, Dest: TStream);
var
Size: Integer;
Buf, Tmp: Pointer;
ComprSize: Integer;
begin
Source.Read(ComprSize, SizeOf(ComprSize));
GetMem(Tmp, ComprSize);
try
Source.Read(Tmp^, ComprSize);
Size:= lzopas_decompressdestlen(Tmp);
GetMem(Buf, Size);
try
Size:= lzopas_decompress(Tmp, ComprSize, Buf);
if Size > 0 then
Dest.WriteBuffer(Buf^, Size);
finally
FreeMem(Buf);
end;
finally
FreeMem(Tmp);
end;
end;
procedure LZOCompressStream(const Source, Dest: TStream);
var
Size: Integer;
Buf, Tmp: Pointer;
ComprSize: Integer;
begin
Size:= Source.Size;
GetMem(Buf, Size);
try
Source.Seek(0, soFromBeginning);
Source.Read(Buf^, Size);
Size := lzopas_compressdestlen(Size);
GetMem(Tmp, Size);
try
ComprSize := lzopas_compress(Buf, Source.Size, Tmp);
Dest.Write(ComprSize, SizeOf(ComprSize));
if ComprSize > 0 then
Dest.WriteBuffer(Tmp^, ComprSize);
finally
FreeMem(Tmp);
end;
finally
FreeMem(Buf);
end;
end;
Maybe more eyes can tell me how stupid I really am.
btw. is there a compatible C# implementation of LZ or LZO?
-TP-
I don't think it would be useful to code LZ or LZO in pure C#. I suspect the gain in speed would be none compared to native standard zlib compression, since the LZ/LZO code and algorithms rely heavily on pointer arithmetic and optimized hashing. The C# overhead would make it slower than standard zlib compression which, I think, is implemented in the .NET framework as unmanaged code.
Could you provide me with a whole testing program source code? Perhaps the problem is not in your code, but mine!
There is no "Seek" at the beginning of the decompression routine. That sounds fair, since your stream format apparently compresses multiple chunks of data in the same stream, but perhaps the problem is there (i.e. it doesn't read from the right position). I need the whole testing program...
I don't think it could be useful to code LZ or LZO in pure C#. I suspect the gain in speed will be none compared to native standard zlib...
OK, my bad, I always mix up a few things in the .NOT world. I meant any .NET compatible implementation which I/we could use. Safe or unsafe code, as long as it is working and fast, is fine by me.
Could you provide me with a whole testing program source code? Perhaps the problem is not in your code, but mine! ... I need the whole testing program...
Seems that it was totally my error. I made a testing program though, which maybe you could use also. To which email address would you like to receive it? (Seems that this forum doesn't allow attachments.)
-TP-
Whoooa...
I jumped the gun. I found an error (or stumbled upon one). Now you really want the test code.
There is some serious problem. That is why I saw a lot of problems before.
-TP-
Sorry.
I'm making a fool of myself... I'll get back when I get this sorted out.
-TP-
OK,
I forgot this totally.
The bug was between the screen and the back of my head.
Sorry about this; I hope there will be a native Stream "interface" in the next release, so I can get rid of this "hack".
It is very fast indeed, in-memory compression especially. Thanks for this!!!
-TP-
Last edited by TPrami (2010-10-29 11:38:21)
Hi, the unit is very good. I was looking for a zip compression unit, and the example is very good.
I also wanted to ask:
lzopas_compressfile('myfile.bmp','myfile.dat',nil,nil,nil);
I used this code to compress the file; is it possible to use a different compression level?
I don't know what to use for the methodComp argument, so I used 'nil,nil,nil' for the arguments.
Anyway the compression is very good: the bitmap file was 1.37 MB and now it's 190 KB.
Hi novaz,
This function will compress the file using our LZO algorithm, and not the ZIP algorithm.
There is only one compression level with our LZO implementation.
The methodComp argument can stay as nil; in this case, it'll use our LZO implementation.
The compression with LZO is good, but less effective than with ZIP or others (like LZMA). And the file format resulting from the lzopas_compressfile function is proprietary. But LZO is much faster than ZIP. And our SynLZ algorithm is even faster.
If you need a valid zip file format, take a look at our PasZip and SynZip/SynZipFiles units, in our source code repository. The first one is a pure pascal implementation of .zip; the 2nd is a more optimized implementation (with some fast asm), but uses a little more code size. Both have TZipRead and TZipWrite classes, which allow you to manage the .zip file format easily.
Hi, thank you very much, SynZip is very good.
i used:
ZipFileWrite:=TZipWrite.Create('myzipfile.zip');
ZipFileWrite.AddDeflated('file1.bmp',True,6);
ZipFileWrite.AddDeflated('file2.bmp',True,6);
ZipFileWrite.Free;
to extract the file:
ZipFileRead:=TZipRead.Create('myzipfile.zip');
a:=0;
b:=ZipFileRead.Count;
while a<b do begin
ZipFileRead.UnZip(a,'c:\mydir');
a:=a+1;
end;
ZipFileRead.Free;
Is this the correct code to create the zip file? The code works.
The unit is very good; I was trying to use only the default Delphi
components for the game, or also small dlls, so if I release the
source it's easy to compile with Delphi 6 or Delphi 7.
The code is OK.
Perhaps easier to read:
with TZipWrite.Create('myzipfile.zip') do
try
AddDeflated('file1.bmp',true,6);
AddDeflated('file2.bmp',true,6);
finally
Free;
end;
with TZipRead.Create('myzipfile.zip') do
try
for i := 0 to Count-1 do
UnZip(i,'c:\mydir');
finally
Free;
end;
But it's exactly the same...
Thanks for this wonderful unit. I'm facing a big problem though: How to (de)compress extremely large files? The original LZO routines allowed for (de)compressing data in chunks. Thus, extremely large files could be processed with a small memory footprint. Is that possible with SynLZO, too?
Many thanks,
Fredo
Last edited by Morkel (2011-05-23 13:58:14)
You have compression/decompression in chunks sample code in SynCommons.pas, functions FileSynLZ and FileUnSynLZ.
It works with 128 MB chunks, adds CRC checksums to each chunk, and uses a cardinal value as a "magic" signature to recognize your file content.
Hello,
Could we get the Stream interface for these units by default?
I have been pondering some kind of stream stack interface. I'll try to explain what I am talking about (not sure what such things are actually called).
Right now, you first build the data, then compress it, then encrypt it, and maybe at last you calculate some kind of checksum hash of the data.
In standard Delphi you have to create a new stream, and you more or less have to make copies to handle those operations.
I was thinking of a data handler stack into which you would feed the data, and it would internally pass those pieces of data forward in the stack. The operations would be done without making a copy of the whole data, then copying it into some compression stream, then to an encryption stream, and so on...
I think this is possible in some way in the .NOT...
Just rambling on about what has been bothering me in some projects I've had in the past...
This would be more or less important, because almost all the libs out there have a stream interface for the data...
-TP-
hey
with TZipWrite.Create('myzipfile.zip') do
try
AddDeflated('file1.bmp',true,6);
AddDeflated('file2.bmp',true,6);
finally
Free;
end;
on decompression with winrar gives
`! G:\UFastMM4\Test applications\sync compression\synopseCompresssion.zip: The archive is either in unknown format or damaged`
error. Can you tell me how to get rid of it?
Do you use the latest version from http://synopse.info/fossil ?
Can it be opened by Windows? (i.e. via a double-click in the Explorer)
Try to repair it using WinRar and send us both original and repaired file for comparison.
Send it to webcontact01 at synopse dot info.
Thanks
"Do you use the latest version from http://synopse.info/fossil ?"
I downloaded from http://synopse.info/fossil/dir?ci=tip
1.SynTaskDialog.pas
2.SynZip.pas
3.SynZipFiles.pas
4.Synopse.inc
5.deflate.obj
6.trees.obj
OK, as I was writing the reply I found my answer (my mistake):
in my code I had missed the call to 'Free',
so I was only adding the files, but the TZipWrite was never finalized... I'm sorry.
I just figured out that I had made changes in my code.
Thank you
Hi, is there a way to determine if (maybe invalid) data can be decompressed with SynLZ, without an access violation?
Something like
if is_synlzcompressed(str) then
synlz_decompress(str)
else
showmessage('invalid data');
Or is
try
synlz_decompress(str);
except
showmessage('invalid data');
end;
the only way?
Thanks
Thanks, hashing is a good idea
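The hashing idea boils down to storing a checksum of the compressed buffer next to it, and verifying it before decompression. A minimal sketch, assuming the Adler32 procedure from SynCommons mentioned in the unit notes above (the name Adler32Pas and its signature are assumptions, not the exact API):

```pascal
uses
  Classes, SynCommons; // Adler32Pas() is assumed to live here

// Write the compressed buffer prefixed by its Adler32 checksum...
procedure WriteChecked(Dest: TStream; const Compressed: RawByteString);
var
  crc: cardinal;
begin
  crc := Adler32Pas(0, pointer(Compressed), length(Compressed));
  Dest.Write(crc, SizeOf(crc));
  Dest.WriteBuffer(pointer(Compressed)^, length(Compressed));
end;

// ...and refuse to decompress unless the checksum matches, so that
// invalid data is rejected without risking an access violation
function IsValid(crc: cardinal; const Compressed: RawByteString): boolean;
begin
  result := crc = Adler32Pas(0, pointer(Compressed), length(Compressed));
end;
```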
Hi, I'm trying to compress / decompress streams using the code snippet from this thread, but I always get an access violation in SynLZDecompress1pas in Delphi XE6. Is there any other way to compress / decompress memory stream data with SynLZ or SynZip? (I have downloaded the newest versions before testing.)
Thx
Moehre
Hi, I've seen that you mentioned SynCommons to use streams with SynLZ - and found the StreamSynLZ and StreamUnSynLZ functions, which work for me in my test app. Only one question: what is this "magic number" useful for?
I have tried to write / read directly to a filestream giving this magic number, but it seems not to work?! It would reduce memory consumption, so it would be nice to use it. As a workaround I use 2 memory streams and store the stream size as a header in the filestream. Is there a way to use a filestream for StreamSynLZ / StreamUnSynLZ directly? (I only need the uncompressed stream internally.)
My example app (in short form):
procedure TestWrite;
var
SL: TStringList;
msIN,msOUT: TMemoryStream;
fs: TFileStream;
sLen: Int64;
begin
SL := TStringList.Create;
msIN := TMemoryStream.Create;
msOUT := TMemoryStream.Create;
SL.Add('... many lines of blala here');
...
SL.SaveToStream(msIN);
msIN.Position := 0;
SynCommons.StreamSynLZ(msIN,msOUT,4711);
fs := TFileStream.Create('F:\TEST.SYN',fmCreate);
sLen := msOUT.Size;
fs.Write(sLen,SizeOf(sLen));
fs.CopyFrom(msOUT,sLen);
SL.Clear;
msIN.Clear;
msOUT.Clear;
SL.Add('insert new text as second chunk...');
SL.SaveToStream(msIN);
msIN.Position := 0;
SynCommons.StreamSynLZ(msIN,msOUT,0815);
msOUT.Position := 0;
sLen := msOUT.Size;
fs.Write(sLen,SizeOf(sLen));
fs.CopyFrom(msOUT,sLen);
fs.Free;
msIN.Free;
msOUT.Free;
SL.Free;
end;
procedure TestRead;
var
SL: TStringList;
msIN,msOUT: TMemoryStream;
sLen: Int64;
fs: TFileStream;
begin
SL := TStringList.Create;
msIN := TMemoryStream.Create;
fs := TFileStream.Create('F:\TEST.SYN',fmOpenRead);
fs.Read(sLen,SizeOf(sLen));
msIN.CopyFrom(fs,sLen);
msIN.Position := 0;
msOUT := SynCommons.StreamUnSynLZ(msIN,4711);
msOUT.Position := 0;
SL.LoadFromStream(msOUT); // breakpoint to check data
SL.Clear;
msIN.Clear;
msOUT.Free;
fs.Read(sLen,SizeOf(sLen));
msIN.CopyFrom(fs,sLen);
msIN.Position := 0;
msOUT := SynCommons.StreamUnSynLZ(msIN,0815);
msOUT.Position := 0;
SL.LoadFromStream(msOUT); // Check data again
SL.Free;
fs.Free;
msIN.Free;
msOUT.Free;
end;
Thx in advance
Moehre
a hashing limitation makes SynLZ sometimes unable to pack continuous blocks of same byte -> SynLZ is perfect for xml/text, but SynLZO is prefered for database files
I would like to use the LZ compression to shrink Bitmap images in my application.
Because the bitmaps are screenshots it will have many repeating bytes.
Does the above limitation mean that I cannot use `SynLZ`, but must use `SynLZO` instead?
Last edited by Johan (2014-11-04 11:37:05)
Just use a TMemoryStream as a temporary buffer, then use SynLZ for processing the data.
See for instance StreamSynLZ/StreamUnSynLZ in the SynCommons.pas unit.
Those functions add a hash to check the data integrity before decompression.
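Following that advice, here is a hedged round-trip sketch using memory streams as temporary buffers. MAGIC is an arbitrary application-chosen constant, and the exact StreamSynLZ/StreamUnSynLZ signatures may differ between SynCommons versions:

```pascal
uses
  Classes, SynCommons;

const
  MAGIC = $0BADCAFE; // arbitrary application-specific signature

procedure RoundTripFile(const Plain, Compressed: TFileName);
var
  src, comp, back: TMemoryStream;
begin
  src := TMemoryStream.Create;
  comp := TMemoryStream.Create;
  try
    src.LoadFromFile(Plain);
    src.Position := 0;
    StreamSynLZ(src, comp, MAGIC);      // compress + integrity hash
    comp.SaveToFile(Compressed);
    comp.Position := 0;
    back := StreamUnSynLZ(comp, MAGIC); // nil if magic/hash do not match
    try
      Assert((back <> nil) and (back.Size = src.Size));
    finally
      back.Free;
    end;
  finally
    comp.Free;
    src.Free;
  end;
end;
```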
Hi, thank you very much, SynLZ is very fast.
I used FlexCompress and SynLZ for Oracle Database Backups on client PC:
InputDataSize := Integer(CurrentBufPtr)-Integer(FBuffer);
if TabArchiver.CompressionLevel=FlexCompress.clNone then begin
CompressedSize := SynLZcompress1(FBuffer, InputDataSize, SynLZBuffer);
ArchMemStream.WriteBuffer(SynLZBuffer^, CompressedSize);
end else
ArchMemStream.WriteBuffer(FBuffer^, InputDataSize);
TabArchiver.ExtractToStream(FileName, LoadStream);
L:=LoadStream.Size;
if TabArchiver.LastCompressionMode=0 then begin
P:=LoadStream.Memory;
aUnCompressedSize:=SynLZdecompressdestlen(P);
SynLZStream.Size := aUnCompressedSize;
if (SynLZdecompress1(P,L,SynLZStream.Memory)<>aUnCompressedSize) then
RaiseMsgError('Invalid SynLZ decompress');
CurrentBufPtr:=SynLZStream.Memory;
L:=SynLZStream.Size;
end else
CurrentBufPtr:=LoadStream.Memory;
The backup size with FlexCompress(PPM+Max) is 2 times smaller than with SynLZ.
The backup size with SynLZ is 3 times smaller than without compression
(the input buffer is very packed).
SynLZ compression+FlexCompress(Rijndael_256) is 18 times faster than FlexCompress(PPM+Max+Rijndael_256);
decompression time is 10 times faster, Rijndael_256 used in any case.
Compression time(SynLZ+Rijndael_256)=10% of Backup time. 90%=Fetch records
Compression time(PPM+Max+Rijndael_256)=70% of Backup time.
Last edited by okroma007 (2015-03-22 08:52:33)
Hi,
I want a fast backup of some SQLite database files, mostly smaller than 2 GB in size. Since it is stated that lzopas_compressfile is faster than copying files, I wanted to give it a try. Unfortunately, I found the comment below in version 1.18.2639 of the framework, in the SynLZO unit:
{.$define LZOFILE}
{ attempt to use in-place file compression using memory mapping mechanism
-> still not fully functional -> so do not define }
I wonder if the function is not to be used at all.
Thanks.
I see that function declaration as follows:
function FileSynLZ(const Source, Dest: TFileName; Magic: Cardinal): boolean;
I am not sure what value I should provide for parameter "Magic". Is it possible to have an example code, please?
Thanks.
As stated by the doc: "you should specify a Magic number to be used to identify the compressed file format".
Put whatever you want, some big random 32 bit integer, which would identify your file content, like the file extension does, but here it is internally to the file.
FileUnSynLZ() would check this number, to ensure the file has the expected format.
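Based on the FileSynLZ declaration quoted above, a hedged usage sketch (the magic value is arbitrary, as explained; FileUnSynLZ is assumed to be the matching decompression function with the same parameter order):

```pascal
uses
  SysUtils, SynCommons;

const
  BACKUP_MAGIC = $5A4C6E79; // arbitrary signature identifying our file format

procedure BackupFile(const Source, Dest: TFileName);
begin
  if not FileSynLZ(Source, Dest, BACKUP_MAGIC) then
    raise Exception.CreateFmt('Could not compress %s', [Source]);
end;

procedure RestoreFile(const Source, Dest: TFileName);
begin
  // the magic number (and per-chunk CRCs) are checked before decompression
  if not FileUnSynLZ(Source, Dest, BACKUP_MAGIC) then
    raise Exception.CreateFmt('%s is not a valid backup file', [Source]);
end;
```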
Even though I checked, I couldn't find it in the documentation. One last thing I would like to know about FileSynLZ/FileUnSynLZ: can I decompress a file compressed with FileSynLZ using some other tool?
I have already tried 7zip V15.14 and Winrar V5.30 and failed.
I tried SynZip already. Need something fast. Compression ratio is not my main concern. Though, even not as much as SynZip, FileSynLZ does compress.
Thanks.
Hello,
I highly doubt it, but it is better to ask: is it somehow possible to use FileSynLZ() to compress multiple files, like TZipWrite does?
If not, is it possible to use TZipWrite only as a container? In other words, can I have TZipWrite create a zip file containing several FileSynLZ()-created files, without trying to compress them again?
Thanks.
FileSynLZ is a single-file compression algorithm.
You can use the SynLZ algorithm within a .zip container via SynZipFiles.
But it is a proprietary extension, not recognized by anything else than this unit!