Beware of BrotliStream in .NET Core 2.1

I love compression algorithms and have built a fair share of my own for fun and profit. Striking an efficient balance between speed and compression ratio is hard, so when I saw Google's Brotli algorithm mentioned in the .NET Core 2.1 RC1 announcement, I was ecstatic!

The excitement lasted about 15 minutes, until I put BrotliStream to the test and found that it took 140x longer to compress a 518 MB file!

Curious whether this was a bug, I delved into the code and found that the default compression level is set to 11. For those not familiar with Brotli, that is the highest level available, and it prioritizes compression ratio over speed.

I reported it as an issue, but Microsoft seems to think that matching the configuration of Google's Brotli library is more important than a sane default, so it is only a matter of time before tons of Stack Overflow posts pop up complaining about the speed of BrotliStream.

If you want to use Brotli in your application (a wise choice, it is a good algorithm), be sure NOT to use the built-in CompressionLevel enum values in the BrotliStream constructor. Instead, cast an integer (I would recommend a value of 6) like this:

// sm is the underlying output stream; quality 6 balances speed and ratio
BrotliStream stream = new BrotliStream(sm, (CompressionLevel)6);

That is a good compromise between speed and compression ratio for most data.
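For completeness, here is a minimal sketch of compressing a file with that cast trick. The file names are placeholders, and this only works on .NET Core 2.1-era runtimes; later versions reject values outside the enum, as the updates below explain.

using System.IO;
using System.IO.Compression;

class Program
{
    static void Main()
    {
        // Compress input.bin to input.bin.br at quality 6 via the cast above.
        // File names are placeholders for illustration.
        using (FileStream input = File.OpenRead("input.bin"))
        using (FileStream output = File.Create("input.bin.br"))
        using (BrotliStream brotli = new BrotliStream(output, (CompressionLevel)6))
        {
            input.CopyTo(brotli);
        }
    }
}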

Update 2021-11-01: Microsoft decided to block user-specified compression levels (the cast shown above) in this merge request.

Update 2022-05-01: Microsoft now defaults to quality 4. The change will be part of .NET 7.

Update 2022-09-01: Microsoft acknowledges the problem but is sticking with quality 4 as the default.
