In our Source Code repository you may have noticed the SynLZ.pas and SynLZO.pas units.
These are two in-memory compression units, optimized for speed.
The SynLZO unit implements Synopse LZO Compression:
* it's an upgraded and fully rewritten version of our MyLZO unit;
* SynLZO is a very FAST portable lossless data compression library, written in optimized pascal code for Delphi 3 up to Delphi 2010, with a tuned asm version available;
* it offers *extremely* fast compression and decompression, with a good compression ratio in comparison with its speed;
* original LZO written in ANSI C - pascal+asm conversion by A.Bouchez;
* simple but very fast direct file compression: reading/writing SynLZO-compressed files is faster than copying the plain files!
The SynLZ unit implements SynLZ Compression:
* SynLZ is a NEW LZ-based algorithm implementing a very FAST lossless data compression library;
* it's written in optimized pascal code for Delphi 3 up to Delphi 2010, with a tuned asm version available;
* symmetrical compression and decompression speed (which is very rare among compression algorithms in the wild);
* good compression ratio (usually better than LZO);
* fastest average compression speed (ideal e.g. for xml/text communication).
Even if SynLZO is a pascal conversion of the LZO algorithm, the SynLZ unit implements a new compression algorithm with the following features:
* hashing+dictionary compression in one pass, with no huffman table;
* optimized 32-bit control word, embedded in the data stream;
* in-memory compression (the dictionary is the input stream itself);
* compression and decompression have the same speed (both use hashing);
* thread safe and lossless algorithm;
* supports overlapping compression and in-place decompression;
* code size for compression/decompression functions is smaller than LZO's.
- this format is NOT stream compatible with any lz* official format
=> use it internally in your application, not as exchange format
- very small code size (less than 1KB for both compressor/decompressor)
- the uncompressed data length is stored at the beginning of the stream, and can be retrieved easily for proper out_p memory allocation
- please give correct data to the decompressor (i.e. CRC the in_p data first)
=> we recommend our very fast Adler32 procedure, or a zip-like container
- a 2nd more tuned algorithm is included, but is somewhat slower in practice
=> use the SynLZ[de]compress1*() functions in your applications
- tested and benchmarked with a lot of data types/sizes
=> use the asm code, which is very tuned: SynLZ[de]compress1asm()
- tested under Delphi 7, Delphi 2007 and Delphi 2009
- a hashing limitation makes SynLZ sometimes unable to pack continuous blocks of the same byte -> SynLZ is perfect for xml/text, but SynLZO is preferred for database files
- if you include it in your application, please give me some credit: "use SynLZ compression by http://synopse.info"
- use at your own risk!
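To make the notes above concrete, here is a minimal compression round-trip sketch using the function names quoted later in this thread (SynLZcompressdestlen, SynLZcompress1asm, SynLZdecompressdestlen, SynLZdecompress1asm); treat it as an untested illustration rather than canonical usage:

```pascal
var
  src, comp, plain: AnsiString;
  compLen: integer;
begin
  src := '<some xml or text data>';
  // allocate the worst-case output size, as computed by SynLZcompressdestlen()
  SetLength(comp, SynLZcompressdestlen(length(src)));
  compLen := SynLZcompress1asm(pointer(src), length(src), pointer(comp));
  SetLength(comp, compLen); // shrink to the actual compressed size
  // the uncompressed length is stored at the beginning of the stream,
  // so the destination buffer can be allocated before decompression
  SetLength(plain, SynLZdecompressdestlen(pointer(comp)));
  SynLZdecompress1asm(pointer(comp), compLen, pointer(plain));
  // plain should now hold the original src content
end;
```

Remember the warning above: since the format is not compatible with any official lz* stream, use this only as an internal format, and checksum the compressed data before feeding it back to the decompressor.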
Some benchmarks on a Sempron computer:
- SynLZ compression is 20 times faster than zip, and decompression 3 times faster;
- same compression ratio as the LZO algorithm, but faster (up to 2x) on compression;
- the intermediate R and W speeds are measured at the compressed stream level, i.e. the throughput seen by the disk -> you can see that SynLZ behaves very well for real-time data compression, e.g. for backup purposes (a typical SATA disk drive has a speed of 50-100 MB/s - the best SSD speed is about 200 MB/s);
- SynLZ beats the LZO algorithm for compression speed, but is a bit slower on decompression.
(for each algorithm: compressed size, ratio, then R/W speeds during compression and during decompression)

KLOG.xml 6034 bytes
 lz1 asm     1287 21.3%  R 256 MB/s W 54 MB/s   R 71 MB/s  W 334 MB/s
 lz1 pas     1287 21.3%  R 184 MB/s W 39 MB/s   R 58 MB/s  W 274 MB/s
 lz2 pas     1274 21.1%  R 173 MB/s W 36 MB/s   R 57 MB/s  W 274 MB/s
 lzo C       1347 22.3%  R 185 MB/s W 41 MB/s   R 111 MB/s W 501 MB/s
 zip          806 13.4%  R 14 MB/s  W 1 MB/s    R 14 MB/s  W 110 MB/s

MiniLZO.cs 25252 bytes
 lz1 asm     5775 22.9%  R 246 MB/s W 56 MB/s   R 70 MB/s  W 306 MB/s
 lz1 pas     5775 22.9%  R 178 MB/s W 40 MB/s   R 58 MB/s  W 253 MB/s
 lz2 pas     5762 22.8%  R 166 MB/s W 37 MB/s   R 57 MB/s  W 250 MB/s
 lzo C       5846 23.2%  R 164 MB/s W 38 MB/s   R 103 MB/s W 448 MB/s
 zip         3707 14.7%  R 15 MB/s  W 2 MB/s    R 22 MB/s  W 154 MB/s

TestLZO.exe 158720 bytes
 lz1 asm   110686 69.7%  R 127 MB/s W 88 MB/s   R 80 MB/s  W 115 MB/s
 lz1 pas   110686 69.7%  R 98 MB/s  W 68 MB/s   R 63 MB/s  W 90 MB/s
 lz2 pas   109004 68.7%  R 88 MB/s  W 60 MB/s   R 60 MB/s  W 88 MB/s
 lzo C     108202 68.2%  R 40 MB/s  W 27 MB/s   R 164 MB/s W 241 MB/s
 zip        88786 55.9%  R 5 MB/s   W 3 MB/s    R 33 MB/s  W 60 MB/s

Browsing.sq3db 46047232 bytes (46MB)
 lz1 asm 19766884 42.9%  R 171 MB/s W 73 MB/s   R 73 MB/s  W 171 MB/s
 lz1 pas 19766884 42.9%  R 130 MB/s W 56 MB/s   R 59 MB/s  W 139 MB/s
 lz2 pas 19707346 42.8%  R 123 MB/s W 52 MB/s   R 59 MB/s  W 139 MB/s
 lzo asm 20629084 44.8%  R 89 MB/s  W 40 MB/s   R 135 MB/s W 302 MB/s
 lzo C   20629083 44.8%  R 66 MB/s  W 29 MB/s   R 145 MB/s W 325 MB/s
 zip     15564126 33.8%  R 6 MB/s   W 2 MB/s    R 30 MB/s  W 91 MB/s

TRKCHG.DBF 4572297 bytes (4MB)
 lz1 asm   265782  5.8%  R 430 MB/s W 25 MB/s   R 29 MB/s  W 510 MB/s
 lz1 pas   265782  5.8%  R 296 MB/s W 17 MB/s   R 28 MB/s  W 483 MB/s
 lz2 pas   274773  6.0%  R 258 MB/s W 15 MB/s   R 27 MB/s  W 450 MB/s
 lzo C     266897  5.8%  R 318 MB/s W 18 MB/s   R 41 MB/s  W 702 MB/s
 zip       158408  3.5%  R 25 MB/s  W 0 MB/s    R 11 MB/s  W 318 MB/s

CATENA5.TXT 6358752 bytes
 lz1 asm  3275269 51.5%  R 132 MB/s W 68 MB/s   R 66 MB/s  W 129 MB/s
 lz1 pas  3275269 51.5%  R 103 MB/s W 53 MB/s   R 57 MB/s  W 112 MB/s
 lz2 pas  3277397 51.5%  R 95 MB/s  W 49 MB/s   R 57 MB/s  W 112 MB/s
 lzo C    3289373 51.7%  R 63 MB/s  W 33 MB/s   R 90 MB/s  W 175 MB/s
 zip      2029096 31.9%  R 4 MB/s   W 1 MB/s    R 29 MB/s  W 91 MB/s
Here is a mail I just received about these units:
Thank you very much for sending me the files!
I made a quick test and the compressor has amazing speed at a good compression ratio. It is much faster than ZCompressStream with zcFastest.
A 40 MB TBitmap32 image is compressed to 19 MB in less than a second!
Is SynLZ the faster one, compared to SynLZO?
It would be helpful if you added the following methods for streams:

procedure LZCompressStream(const Source, Dest: TStream);
var
  Size: Integer;
  Buf, Tmp: Pointer;
  ComprSize: Integer;
begin
  Size := Source.Size;
  GetMem(Buf, Size);
  try
    Source.Seek(0, soFromBeginning);
    Source.Read(Buf^, Size);
    Size := SynLZcompressdestlen(Size);
    GetMem(Tmp, Size);
    try
      ComprSize := SynLZcompress1asm(Buf, Source.Size, Tmp);
      Dest.Write(ComprSize, SizeOf(ComprSize));
      if ComprSize > 0 then
        Dest.WriteBuffer(Tmp^, ComprSize);
    finally
      FreeMem(Tmp);
    end;
  finally
    FreeMem(Buf);
  end;
end;

procedure LZDecompressStream(const Source, Dest: TStream);
var
  Size: Integer;
  Buf, Tmp: Pointer;
  ComprSize: Integer;
begin
  Source.Read(ComprSize, SizeOf(ComprSize));
  GetMem(Tmp, ComprSize);
  try
    Source.Read(Tmp^, ComprSize);
    Size := SynLZdecompressdestlen(Tmp);
    GetMem(Buf, Size);
    try
      Size := SynLZdecompress1asm(Tmp, ComprSize, Buf);
      if Size > 0 then
        Dest.WriteBuffer(Buf^, Size);
    finally
      FreeMem(Buf);
    end;
  finally
    FreeMem(Tmp);
  end;
end;
I have to save/load ComprSize because the stream to decompress has a header, and decompression is done at the current stream position.
I had no luck with reading directly from TMemoryStream.Memory^, so I have to use buffers with GetMem. Any chance to get rid of using GetMem when I already have a TMemoryStream?
Maybe my problem is because of the current stream position...
Btw, I use D2007 with FastMM4.
About speed: yes, in my tests SynLZ is faster than LZO for compression.
Indeed, there is no stream-based compression yet. Compressing a whole buffer could work well. Other approaches (e.g. on-the-fly compression using a fixed-size memory buffer, or memory-mapped files) are possible as well.
Thanks for sharing your code: I'll take a look at it, and optimize it for TMemoryStream if you think it's worth it. There is no need to store the size in your code, since both compressed and uncompressed sizes are already stored in the data by our SynLZ/SynLZO units.
Can this be used in commercial applications?
As stated by the units content, they are both licensed under a MPL/GPL/LGPL tri-license.
So if you pick the MPL license, you can use it in any commercial application, even statically linked (i.e. as a .dcu unit used to create the exe).
Please don't forget to put a link to http://synopse.info somewhere in your credit window or documentation if you use any of these units (as for any other components from us).
If your application is GPL, just use the GPL license.
If your application (or library, since LGPL is more library-devoted) is LGPL, just use the LGPL license.
I took the code from above for LZ compression between streams and it works great. Then I tried to make the same code work with LZO, and it doesn't work. I just can't see what I am doing wrong here:
procedure LZODecompressStream(const Source, Dest: TStream);
procedure LZOCompressStream(const Source, Dest: TStream);

procedure LZODecompressStream(const Source, Dest: TStream);
var
  Size: Integer;
  Buf, Tmp: Pointer;
  ComprSize: Integer;
begin
  Source.Read(ComprSize, SizeOf(ComprSize));
  GetMem(Tmp, ComprSize);
  try
    Source.Read(Tmp^, ComprSize);
    Size := lzopas_decompressdestlen(Tmp);
    GetMem(Buf, Size);
    try
      Size := lzopas_decompress(Tmp, ComprSize, Buf);
      if Size > 0 then
        Dest.WriteBuffer(Buf^, Size);
    finally
      FreeMem(Buf);
    end;
  finally
    FreeMem(Tmp);
  end;
end;

procedure LZOCompressStream(const Source, Dest: TStream);
var
  Size: Integer;
  Buf, Tmp: Pointer;
  ComprSize: Integer;
begin
  Size := Source.Size;
  GetMem(Buf, Size);
  try
    Source.Seek(0, soFromBeginning);
    Source.Read(Buf^, Size);
    Size := lzopas_compressdestlen(Size);
    GetMem(Tmp, Size);
    try
      ComprSize := lzopas_compress(Buf, Source.Size, Tmp);
      Dest.Write(ComprSize, SizeOf(ComprSize));
      if ComprSize > 0 then
        Dest.WriteBuffer(Tmp^, ComprSize);
    finally
      FreeMem(Tmp);
    end;
  finally
    FreeMem(Buf);
  end;
end;
Maybe more eyes can tell me how stupid I really am.
Btw, is there a compatible C# implementation of LZ or LZO?
I don't think it would be useful to code LZ or LZO in pure C#. I suspect the gain in speed would be none compared to native standard zlib compression, since the LZ/LZO code and algorithm rely heavily on pointer arithmetic and optimized hashing. The C# overhead would make it slower than standard zlib compress, which, I think, is implemented in the .NET framework as unmanaged code.
Could you provide me with a whole testing program source code? Perhaps the problem is not in your code, but mine!
There is no "Seek" at the beginning of the decompression routine, which sounds fair since your stream format seems to compress multiple chunks of data in the same stream, but perhaps the problem is there (not reading from the right position). I need the whole testing program...
"I don't think it could be useful to code LZ or LZO in pure C#. I suspect the gain in speed will be none compared to native standard zlib..."
OK, my bad, I always mix up a few things in the .NOT world. I mean any .NET-compatible implementation which I/we could use. Safe or unsafe code, as long as it is working and fast, is fine by me.
"Could you provide me with a whole testing program source code? Perhaps the problem is not in your code, but mine! ... I need the whole testing program..."
Seems that it was totally my error. I made a testing program though, which maybe you could use as well. To which email address would you like to receive it (it seems that this forum doesn't allow attachments)?
I did jump the gun. I found an error (or stumbled upon one). Now you really want the test code.
There is some serious problem. That is why I saw a lot of problems before.
I'm making a fool of myself... I'll get back when I get this sorted out.
I totally forgot this: the bug was between the screen and the back of my head.
Sorry about this; I hope there will be a native Stream "interface" in the next release, so I can get rid of this "hack".
It is very fast indeed, especially the in-memory compression. Thanks for this!!!
Last edited by TPrami (2010-10-29 11:38:21)
Hi, the unit is very good. I was looking for a zip compression unit, and the example is very good.
Also, I wanted to ask:
I used this code to compress the file; is it possible to use a different compression level?
I don't know what to use for the methodComp argument, so I used 'nil,nil,nil' for the arguments.
Anyway the compression is very good: the bitmap file was 1.37 MB and now it's 190 KB.
This function will compress the file using our LZO algorithm, and not the ZIP algorithm.
There is only one compression level with our LZO implementation.
The methodComp argument can stay as nil; in this case, it'll use our LZO implementation.
The compression with LZO is good, but less effective than with ZIP or others (like LZMA). And the file format resulting from the lzopas_compressfile function is proprietary. But LZO is much faster than ZIP, and our SynLZ algorithm is even faster.
If you need a valid zip file format, take a look at our PasZip and SynZip/SynZipFiles units, in our source code repository. The first one is a pure pascal implementation of .zip; the second is a more optimized implementation (with some fast asm), but uses a little more code size. Both have TZipRead and TZipWrite classes, which let you manage the .zip file format easily.
Hi, thank you very much, SynZip is very good.
to extract the file:
while a<b do begin
Is this the correct code to create the zip file? The code works.
The unit is very good; I was trying to use only the default Delphi components for the game, or small DLLs, so if I release the source it's easy to compile with Delphi 6 or Delphi 7.
The code is OK.
Perhaps easier to read:
with TZipWrite.Create('myzipfile.zip') do
try
  AddDeflated('file1.bmp', true, 6);
  AddDeflated('file2.bmp', true, 6);
finally
  Free;
end;

with TZipRead.Create('myzipfile.zip') do
try
  for i := 0 to Count-1 do
    UnZip(i, 'c:\mydir');
finally
  Free;
end;
But it's exactly the same...
Thanks for this wonderful unit. I'm facing a big problem though: How to (de)compress extremely large files? The original LZO routines allowed for (de)compressing data in chunks. Thus, extremely large files could be processed with a small memory footprint. Is that possible with SynLZO, too?
Last edited by Morkel (2011-05-23 13:58:14)
You have sample code for compression/decompression in chunks in SynCommons.pas: the FileSynLZ and FileUnSynLZ functions.
It works with 128 MB chunks, adds CRC checksums to each chunk, and uses a cardinal value as a "magic" signature to recognize your file content.
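For reference, such a chunked scheme can be sketched roughly as follows (SourceStream/DestStream and the per-chunk header layout here are illustrative assumptions, not the actual FileSynLZ implementation from SynCommons.pas):

```pascal
const
  ChunkSize = 128 shl 20; // 128 MB chunks, as FileSynLZ uses

procedure CompressInChunks(SourceStream, DestStream: TStream);
var
  inBuf, outBuf: AnsiString;
  n, compLen: integer;
begin
  SetLength(inBuf, ChunkSize);
  // worst-case output size for one chunk
  SetLength(outBuf, SynLZcompressdestlen(ChunkSize));
  repeat
    n := SourceStream.Read(pointer(inBuf)^, ChunkSize);
    if n <= 0 then
      break;
    compLen := SynLZcompress1asm(pointer(inBuf), n, pointer(outBuf));
    // a real implementation would also write a "magic" signature
    // and a per-chunk crc here, as FileSynLZ does
    DestStream.Write(compLen, SizeOf(compLen));
    DestStream.WriteBuffer(pointer(outBuf)^, compLen);
  until n < ChunkSize;
end;
```

The point is that only one chunk-sized input buffer and one worst-case output buffer are ever allocated, so arbitrarily large files can be processed with a small, fixed memory footprint.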
Could we get the Stream interface for these units by default?
I have been pondering some kind of stream stack interface. I'll try to explain what I am talking about (not sure what these are actually called).
Say you first build the data, then compress it, then encrypt it, and maybe at last calculate some kind of checksum hash of the data.
In standard Delphi you have to create a new stream for each step, and more often than not make copies to handle those operations.
I was thinking of a data handler stack into which you would feed the data; it would internally pass the pieces of data forward in the stack, so the operations would be done without making a copy of the whole data and moving it into some compression stream, then an encryption stream, and so on...
I think this is possible in some way in the .NOT...
Just rambling on about something that has been bothering me in some projects I've had in the past...
This would be rather important, because almost all the libs out there have a stream interface for the data...
The code with TZipWrite.Create('myzipfile.zip') do ... produces an archive which, on decompression with WinRAR, gives
`! G:\UFastMM4\Test applications\sync compression\synopseCompresssion.zip: The archive is either in unknown format or damaged`
Can you tell me how to get rid of this error?
Do you use the latest version from http://synopse.info/fossil ?
Can it be opened by Windows? (i.e. via a double-click in the Explorer)
Try to repair it using WinRAR and send us both the original and the repaired file for comparison.
Send it to webcontact01 at synopse dot info.
"Do you use the latest version from http://synopse.info/fossil ?"
I downloaded it from http://synopse.info/fossil/dir?ci=tip
OK, as I was writing the reply I got my answer (my mistake).
In my code I had missed 'Free', so I was only adding the files but never closing the TZipWrite. I'm sorry, I just figured out that I had made changes in my code.
Hi, is there a way to determine whether (maybe invalid) data can be decompressed with SynLZ without an access violation?
Is
if is_synlzcompressed(str) then
the only way?
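One pragmatic approach, in line with the unit header's advice to CRC the input data before decompression: store a checksum next to the compressed data when you produce it, and verify it before calling the decompressor. A sketch (the Adler32Pas name and signature are an assumption here; substitute whatever checksum routine you actually use):

```pascal
function SafeSynLZDecompress(comp: pointer; compLen: integer;
  storedCrc: cardinal; out plain: AnsiString): boolean;
var
  len: integer;
begin
  result := false;
  if (comp = nil) or (compLen <= 0) then
    exit;
  // reject corrupted or foreign input before decompression
  // (Adler32Pas is a hypothetical checksum function)
  if Adler32Pas(0, comp, compLen) <> storedCrc then
    exit;
  len := SynLZdecompressdestlen(comp); // length stored in the stream
  SetLength(plain, len);
  if SynLZdecompress1asm(comp, compLen, pointer(plain)) <> len then
    exit;
  result := true;
end;
```

Note that this only protects data you compressed yourself: the raw SynLZ format carries no reliable in-band signature, so arbitrary bytes cannot be safely identified as SynLZ-compressed without an external checksum or container.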