Performance testing

While developing it (and introducing small changes) I was continuously testing the performance of these 4 implementations. It was very disappointing that every time I ran the tests I got different results. Unfortunately, my PC does many things (Skype, Steam, virus scans), so every test runs under slightly different conditions.
I could run those tests for a long time and average the results, but averages have their own disadvantage - one poor run can ruin the whole result.
I decided to run the tests for a long time (actually 2 days of continuous work) and take the best results. Assuming that in an ideal world I would run performance tests on a machine doing nothing else (so the test gets 100% of the CPU), taking the best results is as close to that as I can get, and one poor run (when Skype decided to take 70% of the CPU for no reason) won't spoil the whole test.
Some 'best' results were a little bit surprising though, so I used averages to confirm them.
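To illustrate why best-of-N is more robust to background noise than a plain average, here is a small Python sketch. The `run_once` function and its numbers are purely illustrative (a fixed "true" cost plus simulated interference spikes), not measurements from the actual LZ4 tests:

```python
import random
import statistics

def run_once(rng):
    """Hypothetical benchmark run: a fixed true cost of 100 ms plus
    background-process interference (occasional heavy spikes)."""
    true_cost_ms = 100.0
    interference = rng.expovariate(1 / 5.0)  # small, ever-present jitter
    if rng.random() < 0.05:                  # rare spike (e.g. virus scan)
        interference += 200.0
    return true_cost_ms + interference

def benchmark(runs=1000, seed=42):
    rng = random.Random(seed)
    samples = [run_once(rng) for _ in range(runs)]
    return min(samples), statistics.mean(samples)

best, average = benchmark()
# The best run stays close to the true 100 ms cost, while the
# average is dragged upwards by the rare spikes.
```

The minimum converges on the undisturbed cost as the number of runs grows, while the mean keeps carrying the cost of every spike - which is why one poor run can ruin an average but not a best.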

Note: LZ4 itself comes in 2 flavours - 32-bit and 64-bit. One was supposed to be run on the x86 architecture and the other on x64. But I decided to implement both for both architectures, and I'm glad I did (see below).

So finally we have 4 different approaches (Mixed, C++/CLI, Unsafe, Safe), all of them in two flavours (32-bit and 64-bit), and all of them can be run on x86 and x64, giving 16 different combinations.
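The 16-combination test matrix can be enumerated mechanically; a quick Python sketch (the labels are just the ones used in this article, not identifiers from the code):

```python
from itertools import product

approaches = ["Mixed", "C++/CLI", "Unsafe", "Safe"]
flavours = ["32-bit", "64-bit"]   # LZ4 codec flavour
platforms = ["x86", "x64"]        # process architecture

# Cartesian product: 4 approaches x 2 flavours x 2 platforms = 16
combinations = list(product(approaches, flavours, platforms))
print(len(combinations))  # 16
```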

compare-max-nolz4sharp.png

As a picture is worth 1024 words:

Compression

compare-max-encode.png

There are things you probably expected, but also things that surprised me:

Decompression

compare-max-decode.png

This is something I wasn't expecting: every single implementation behaves a little bit differently.
I confirmed all the anomalies with average values:
compare-avg.png