Compress big data

Sep 18, 2014 at 5:25 PM
What is the best way to compress relatively large data, 2-200 MB?

If it's a 200 MB file, I don't want to load it all into RAM and then compress it.
Would I get good performance if I compress the file with the stream method?

Do I lose performance using a stream with small files?

Thx!
Coordinator
Sep 22, 2014 at 10:16 AM
There is an LZ4Stream class for compressing large amounts of data. It is as fast as block compression; you may get a slightly worse compression ratio because it uses separate blocks (1 MB by default, but you can define your own size) instead of a "sliding window". If you are not interested in the details, then yes: LZ4Stream is your answer.
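For example, a rough sketch of choosing a larger block size; the extra constructor parameters shown here (highCompression, blockSize) are an assumption and may differ between lz4net versions, so check the actual signature:

// Sketch: compress with a custom block size; parameter names
// (highCompression, blockSize) are assumed and may vary by lz4net version.
using (var istream = new FileStream("bigFile", FileMode.Open))
using (var ostream = new FileStream("bigFile.lz4", FileMode.Create))
using (var lzStream = new LZ4.LZ4Stream(
    ostream,
    System.IO.Compression.CompressionMode.Compress,
    highCompression: false,
    blockSize: 4 * 1024 * 1024)) // 4 MB blocks instead of the default 1 MB
{
    istream.CopyTo(lzStream); // each block is compressed independently as it fills
}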
Sep 22, 2014 at 3:58 PM
I wasn't sure about the per-block overhead, but it's the only way to get a low memory footprint.

Thx Krashan.

Small example for others:
// Compression: wrap the output file in an LZ4Stream and copy the source into it
using (var istream = new FileStream("uncompressedFile", FileMode.Open))
using (var ostream = new FileStream("compressedFile", FileMode.Create))
using (var lzStream = new LZ4.LZ4Stream(ostream, System.IO.Compression.CompressionMode.Compress))
{
    istream.CopyTo(lzStream); // data is compressed block by block as it is copied
}
// Decompression: wrap the compressed file in an LZ4Stream and copy it to the output
using (var istream = new FileStream("compressedFile", FileMode.Open))
using (var ostream = new FileStream("uncompressedFile", FileMode.Create))
using (var lzStream = new LZ4.LZ4Stream(istream, System.IO.Compression.CompressionMode.Decompress))
{
    lzStream.CopyTo(ostream);
}