Commit messages
|
ensure that `frametest` works fine with these values,
notably with a low LZ4_MEMORY_USAGE (in particular the dictionary tests),
following suggestions from @t-mat at #1016
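For context, LZ4_MEMORY_USAGE is the compile-time macro sizing LZ4's internal hash table (default 14, i.e. 16 KB). A minimal sketch of building with a low value; the value 10 and the single-translation-unit include are illustrative assumptions, not the project's actual test setup:

```c
/* Sketch: building with a reduced LZ4_MEMORY_USAGE.
 * The macro must be set when lz4.c itself is compiled
 * (e.g. cc -DLZ4_MEMORY_USAGE=10 -c lz4.c); here a
 * single-translation-unit include keeps the example self-contained. */
#define LZ4_MEMORY_USAGE 10   /* illustrative "low" value; the default is 14 */
#include "lz4.c"
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char src[] = "sample sample sample sample data";
    char dst[128];
    int const csize = LZ4_compress_default(src, dst, (int)strlen(src), (int)sizeof dst);
    printf("compressed %d -> %d bytes\n", (int)strlen(src), csize);
    return 0;
}
```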
|
ensure `dictBase` is only used
when there is actual dictionary content.
|
doesn't happen in my environment, though it's a different version of Visual Studio
|
only include <intrin.h> on vs2005+ (#947)
remove some useless #pragma
fix a few minor Visual warnings
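A minimal sketch of the kind of guard described; the exact condition and the intrinsic shown are assumptions for illustration (vs2005 corresponds to _MSC_VER >= 1400):

```c
/* Sketch: include <intrin.h> only on Visual Studio 2005 and newer;
 * earlier MSVC versions do not provide this header. */
#if defined(_MSC_VER) && (_MSC_VER >= 1400)
#  include <intrin.h>
#  pragma intrinsic(_BitScanForward)
#endif

/* Example use: count trailing zeroes with the intrinsic when available,
 * and a portable fallback otherwise (illustrative only; assumes val != 0). */
static unsigned countTrailingZeroes32(unsigned val)
{
#if defined(_MSC_VER) && (_MSC_VER >= 1400)
    unsigned long r = 0;
    _BitScanForward(&r, val);
    return (unsigned)r;
#else
    unsigned r = 0;
    while ((val & 1) == 0) { val >>= 1; r++; }
    return r;
#endif
}
```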
|
to help scan-build detect the condition
|
fix one (rare & complex) issue discovered by this test
|
applying the new, more accurate formula from LZ4_compress_HC_destSize()
also : fix some minor display issues in tests/frametest
|
properly track history
|
EndMark, the 4-byte value indicating the end of a frame,
must be `0x00000000`.
Previously, it was just described as a `0-size` block.
But such a definition could also encompass uncompressed blocks of size 0,
whose header has the value `0x80000000`,
while the intention was to also support uncompressed empty blocks,
which could be used as a keep-alive signal.
Note that compressed empty blocks are already supported;
they simply have a size of 1 instead of 0 (for the `0` token).
Unfortunately, the decoder implementation was also wrong,
and would interpret a `0x80000000` block header as an EndMark.
This issue evaded detection so far simply because
the situation never happens in practice: LZ4Frame always issues
a clean 0x00000000 value as an EndMark,
and it does not flush empty blocks.
This is fixed in this PR.
The decoder can now deal with empty uncompressed blocks
and no longer confuses them with EndMark.
The specification is also clarified.
Finally, FrameTest is updated to randomly insert empty blocks during fuzzing.
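For reference, a minimal sketch (not the actual lz4frame decoder) of reading a block header so that an empty uncompressed block (`0x80000000`) is not mistaken for the EndMark (`0x00000000`):

```c
#include <stdint.h>
#include <stddef.h>

/* Sketch only: reads a 4-byte little-endian LZ4 Frame block header and
 * distinguishes EndMark (0x00000000) from an empty uncompressed block
 * (0x80000000). */
typedef enum { block_endMark, block_uncompressed, block_compressed } blockKind;

static blockKind readBlockHeader(const uint8_t* p, size_t* blockSize)
{
    uint32_t const header = (uint32_t)p[0]
                          | ((uint32_t)p[1] << 8)
                          | ((uint32_t)p[2] << 16)
                          | ((uint32_t)p[3] << 24);
    if (header == 0x00000000U) {          /* EndMark : end of frame */
        *blockSize = 0;
        return block_endMark;
    }
    *blockSize = header & 0x7FFFFFFFU;    /* may be 0 : empty uncompressed block */
    return (header & 0x80000000U) ? block_uncompressed : block_compressed;
}
```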
|
could trigger a `% 0` (modulo by zero) under exceptional circumstances,
due to a wrong buffer size parameter.
|
It's now possible to select a custom LZ4_DISTANCE_MAX at compile time,
provided it's <= 65535.
However, in some cases (when compressing in byU16 mode),
the new distance wasn't respected,
as this mode used to imply that any match was necessarily within range.
Added a distance check for this case.
Also : added a new TravisCI test which ensures that
a custom LZ4_DISTANCE_MAX compiles correctly
and compresses correctly (relying on `assert()` to find outsized offsets).
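A minimal sketch of the kind of round-trip check such a test can perform; the value 4096 and the single-translation-unit include are illustrative assumptions, not the CI script itself:

```c
/* Sketch: single-translation-unit build with a custom match distance.
 * LZ4_DISTANCE_MAX must be set when lz4.c is compiled. */
#define LZ4_DISTANCE_MAX 4096   /* illustrative value; must be <= 65535 */
#include "lz4.c"
#include <assert.h>
#include <string.h>

int main(void)
{
    char src[8192];
    char dst[LZ4_COMPRESSBOUND(sizeof src)];
    char rt [sizeof src];
    memset(src, 'A', sizeof src);   /* highly compressible input */
    int const csize = LZ4_compress_default(src, dst, (int)sizeof src, (int)sizeof dst);
    assert(csize > 0);
    int const dsize = LZ4_decompress_safe(dst, rt, csize, (int)sizeof rt);
    assert(dsize == (int)sizeof src);
    assert(memcmp(src, rt, sizeof src) == 0);   /* round trip must be exact */
    return 0;
}
```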
|
so that noisy src decompression
doesn't generate output,
and doesn't count as a failure when decompression fails (which is expected).
|
and created target cxx17build
|
which is unable to understand that the variable is necessarily initialized
in spite of an assert just before.
|
was a false positive, but better to remove it anyway
|
ensure canary remains within buffer limits
|
One test could write a canary value out of bounds
under exceptional conditions involving multiple flushes,
triggered by -s3421 -t462948.
|
which actively tries to make it write out of bounds.
For this scenario to be possible,
it's necessary to set dstCapacity < LZ4F_compressBound().
When a compression operation fails,
the CCtx context is left in an undefined state,
therefore compression cannot resume.
As a consequence :
- round trip tests must be aborted, since there is nothing valid to decompress
- most users avoid this situation, by ensuring that dstCapacity >= LZ4F_compressBound()
For these reasons, this use case was poorly tested up to now.
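For context, a minimal sketch of the usual way to avoid that failure mode: sizing dst with LZ4F_compressBound() before each LZ4F_compressUpdate() call. The helper name compressChunk is illustrative, the cctx is assumed to have already gone through LZ4F_compressBegin(), and error handling is shortened:

```c
#include "lz4frame.h"
#include <stdlib.h>

/* Sketch: size dst with LZ4F_compressBound() so that a single
 * LZ4F_compressUpdate() call cannot fail for lack of dstCapacity. */
static size_t compressChunk(LZ4F_cctx* cctx, const void* src, size_t srcSize,
                            void** dstOut)
{
    size_t const dstCapacity = LZ4F_compressBound(srcSize, NULL); /* NULL : worst-case prefs */
    void* const dst = malloc(dstCapacity);
    if (dst == NULL) { *dstOut = NULL; return 0; }
    size_t const r = LZ4F_compressUpdate(cctx, dst, dstCapacity, src, srcSize, NULL);
    if (LZ4F_isError(r)) {
        /* with dstCapacity from compressBound this indicates another problem;
         * the cctx must not be reused for this frame afterwards */
        free(dst);
        *dstOut = NULL;
        return 0;
    }
    *dstOut = dst;
    return r;   /* bytes written into dst (can be 0 if data was buffered) */
}
```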
|
The "funny" thing with cppcheck
is that no two versions give the same list of warnings.
On Mac, I'm using v1.81, which had all warnings fixed.
On Travis CI, it's v1.61, and it complains about a dozen more/different things.
On Linux, it's v1.72, and it finds a completely different list of half a dozen warnings.
Some of these seem to be bugs/limitations in cppcheck itself.
The TravisCI version, v1.61, seems unable to understand %zu correctly, and appears to assume it means %u.
|
as a Makefile target and a Travis CI test.
Fixed the last cppcheck warnings in tests and examples
|
don't fix dictionaries of size 0 :
setting dictEnd == source triggers prefix mode,
thus removing the possibility to use the CDict.
|
The error can be reproduced using the following command :
./frametest -v -i100000000 -s1659 -t31096808
It's actually a bug in the streaming LZ4 API :
when starting a new stream
and providing a first chunk with size < MINMATCH,
the chunk becomes a dictionary.
No hash was generated and stored,
but the chunk is accessible, as default position 0 points to dictStart,
and position 0 is still within MAX_DISTANCE.
Then, the next attempt to read 32 bits from position 0 fails.
The issue would have been mitigated by starting from index 64 KB,
effectively eliminating position 0 as too far away.
The proper fix is to reject such a "dictionary" as too small,
which is what this patch does.
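For illustration, a minimal sketch of the streaming pattern the message refers to, where a tiny first chunk (smaller than MINMATCH, i.e. 4 bytes) acts as the implicit dictionary for the next block; the buffer contents and sizes are arbitrary:

```c
#include "lz4.h"
#include <stdio.h>

/* Sketch: streaming compression where the first chunk is < MINMATCH.
 * After the first call, chunk1 serves as history for chunk2. */
int main(void)
{
    LZ4_stream_t stream;
    LZ4_initStream(&stream, sizeof stream);

    const char chunk1[] = "ab";                 /* tiny first chunk, < MINMATCH */
    const char chunk2[] = "abababababababab";   /* may reference chunk1 as history */
    char out1[64], out2[64];

    int const c1 = LZ4_compress_fast_continue(&stream, chunk1, out1,
                                              (int)(sizeof chunk1 - 1), (int)sizeof out1, 1);
    int const c2 = LZ4_compress_fast_continue(&stream, chunk2, out2,
                                              (int)(sizeof chunk2 - 1), (int)sizeof out2, 1);
    printf("chunk1 -> %d bytes, chunk2 -> %d bytes\n", c1, c2);
    return 0;
}
```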
|
Not obvious : copying the state was copying cdict's compression level
|
note : only compression API is implemented and tested
still to do : decompression API
|
Compressor can set dictID on LZ4F_compressBegin()
Decompressor can retrieve it using LZ4F_getFrameInfo()
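A minimal sketch of that round trip; the dictID value 12345 and the buffer size are arbitrary, and the frame is left unfinished for brevity:

```c
#include "lz4frame.h"
#include <stdio.h>
#include <string.h>

/* Sketch: set a dictID when writing the frame header, then read it back
 * from that header with LZ4F_getFrameInfo().  Error handling shortened. */
int main(void)
{
    char frame[1024];

    LZ4F_preferences_t prefs;
    memset(&prefs, 0, sizeof prefs);
    prefs.frameInfo.dictID = 12345;   /* arbitrary illustrative ID */

    LZ4F_cctx* cctx = NULL;
    LZ4F_createCompressionContext(&cctx, LZ4F_VERSION);
    size_t const headerSize = LZ4F_compressBegin(cctx, frame, sizeof frame, &prefs);

    if (!LZ4F_isError(headerSize)) {
        LZ4F_dctx* dctx = NULL;
        LZ4F_createDecompressionContext(&dctx, LZ4F_VERSION);
        LZ4F_frameInfo_t info;
        size_t consumed = headerSize;
        if (!LZ4F_isError(LZ4F_getFrameInfo(dctx, &info, frame, &consumed)))
            printf("dictID read back : %u\n", info.dictID);
        LZ4F_freeDecompressionContext(dctx);
    }
    LZ4F_freeCompressionContext(cctx);
    return 0;
}
```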
|
and added entry "make list"