#1 2022-02-18 14:29:43

TPrami
Member
Registered: 2010-07-06
Posts: 119

"chunked/stream" compression with SynLZCompress1() (etc)

Yo!

Didn't find anything with a quick search (or didn't understand what I saw).

So would it be possible to use SynLZCompress1() to compress data in, let's say, 10 KB chunks instead of the whole data at once?

I have a use case where I would like to avoid having a single large piece of the data in memory all at once (or there is a possibility of that).

To be clear, I mean that I would like to get the same result from the chunks as I would if I had all the data in one large buffer. So the data would be fed through the same pipeline, just not in one go. (So not just small individual pieces of compressed data, but the same binary data as if I had compressed it all in one go, hope someone will understand this smile )

-Tee-

Offline

#2 2022-02-18 16:35:32

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,659
Website

Re: "chunked/stream" compression with SynLZCompress1() (etc)

What makes SynLZ efficient is its in-memory process.
SynLZ is not the only algorithm working this way. For instance, Lizard has the same feature / limitation.
See https://github.com/inikep/lizard/blob/l … _format.md

So there is no "stream"-oriented interface for SynLZ.
What you could do is cut the main data into chunks.
This is what we do e.g. with FileSynLZ(), which uses 128 MB compression chunks.
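
A minimal sketch of that chunked approach, assuming the low-level SynLZ.pas functions SynLZcompressdestlen() and SynLZcompress1(); the 8-byte per-chunk header used here is made up for illustration and is not the FileSynLZ() layout:

// Compress a stream block by block with SynLZcompress1().
// Each block is stored as [plain length: 4 bytes][compressed length: 4 bytes][compressed data],
// so it can be read back and expanded block by block with SynLZdecompress1().
procedure ChunkedSynLZCompress(Source, Dest: TStream; ChunkSize: integer);
var
  plain, comp: RawByteString;
  plainLen, compLen: integer;
begin
  SetLength(plain, ChunkSize);
  repeat
    plainLen := Source.Read(pointer(plain)^, ChunkSize);
    if plainLen <= 0 then
      break;
    SetLength(comp, SynLZcompressdestlen(plainLen)); // worst-case output size
    compLen := SynLZcompress1(pointer(plain), plainLen, pointer(comp));
    Dest.WriteBuffer(plainLen, 4);
    Dest.WriteBuffer(compLen, 4);
    Dest.WriteBuffer(pointer(comp)^, compLen);
  until plainLen < ChunkSize;
end;

Note that each chunk is compressed independently, so the output is not byte-identical to compressing the whole buffer in a single SynLZCompress1() call; with SynLZ that single-pass result cannot be reproduced incrementally.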

Offline

#3 2022-02-19 18:11:13

TPrami
Member
Registered: 2010-07-06
Posts: 119

Re: "chunked/stream" compression with SynLZCompress1() (etc)

OK,

I could do something like that.

If the code is something like this:

while Data do
  Compress(GetFewBytesOfData);

then make Compress() keep a cache internally, which is flushed when the buffer gets full and once more at the end.
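
Something along these lines could work as that buffering wrapper (a rough sketch only; TChunkedCompressor and its framing are invented names for illustration, and it relies on the same SynLZ.pas functions as the snippet above):

type
  // Hypothetical helper: accumulates incoming bytes and compresses one
  // SynLZ block whenever the internal buffer fills up or Flush is called.
  TChunkedCompressor = class
  private
    fDest: TStream;
    fBuffer: RawByteString;
    fUsed: integer;
    procedure FlushBuffer;
  public
    constructor Create(Dest: TStream; BufferSize: integer = 10 * 1024);
    procedure Compress(const Data; Len: integer);
    procedure Flush; // call once after all data has been fed
  end;

constructor TChunkedCompressor.Create(Dest: TStream; BufferSize: integer);
begin
  fDest := Dest;
  SetLength(fBuffer, BufferSize);
end;

procedure TChunkedCompressor.FlushBuffer;
var
  comp: RawByteString;
  compLen: integer;
begin
  if fUsed = 0 then
    exit;
  SetLength(comp, SynLZcompressdestlen(fUsed));
  compLen := SynLZcompress1(pointer(fBuffer), fUsed, pointer(comp));
  fDest.WriteBuffer(fUsed, 4);   // plain length of this block
  fDest.WriteBuffer(compLen, 4); // compressed length of this block
  fDest.WriteBuffer(pointer(comp)^, compLen);
  fUsed := 0;
end;

procedure TChunkedCompressor.Compress(const Data; Len: integer);
var
  p: PAnsiChar;
  n: integer;
begin
  p := @Data;
  while Len > 0 do
  begin
    n := length(fBuffer) - fUsed;
    if n > Len then
      n := Len;
    Move(p^, PAnsiChar(pointer(fBuffer))[fUsed], n);
    inc(fUsed, n);
    inc(p, n);
    dec(Len, n);
    if fUsed = length(fBuffer) then
      FlushBuffer;
  end;
end;

procedure TChunkedCompressor.Flush;
begin
  FlushBuffer;
end;

Usage would then match the pseudocode above: call Compress() in a loop with each small piece, then Flush once the data runs out.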

I'll take a look at the file compression routine.

Thanks man.

-Tee-

Offline
