fix: add temporary fix for high compression ratio files

Currently, we use a hardcoded buffer size of 4x the compressed size for the
decompressed data, since DEFLATE does not encode the true decompressed
size (although the zip metadata does). So, on files with a very high
compression ratio, a buffer overflow is triggered because the destination
buffer is not large enough to hold the entire decompressed data.

As a temporary fix, this commit increases the size-estimation factor from
4 to 7. A future fix would use the true decompressed size from the zip
metadata.
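
For reference, a minimal Luau sketch (not part of this commit) of where that
true size could come from. The helper below is an assumption: it expects a
buffer positioned at the entry's local file header, where ZIP stores the
uncompressed size as a little-endian u32 at offset 22. When general-purpose
bit 3 is set, the local header records zero and the central directory entry
has to be read instead.

-- Sketch only: `localHeader` is assumed to start at the entry's local file
-- header; offset 22 is the uncompressed-size field from the ZIP spec.
local LOCAL_UNCOMPRESSED_SIZE_OFFSET = 22

local function readUncompressedSize(localHeader: buffer): number
    -- ZIP integer fields are little-endian, matching buffer.readu32
    return buffer.readu32(localHeader, LOCAL_UNCOMPRESSED_SIZE_OFFSET)
end

-- The destination could then be sized exactly:
-- local dest = buffer.create(readUncompressedSize(localHeader))
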
commit 1f4dd5715b
parent b1818de2f2
Author: Erica Marigold
Date: 2025-01-01 19:33:51 +00:00
Signed by: DevComp (GPG key ID: 429EF1C337871656)

@@ -344,7 +344,10 @@ end
 --- Main decompression function that processes DEFLATE compressed data
 local function uncompress(source: buffer): buffer
-    local dest = buffer.create(buffer.len(source) * 4)
+    -- FIXME: This is a temporary solution to avoid a buffer overflow
+    -- We likely want some type of reflection with the zip metadata to
+    -- have a definitive buffer size
+    local dest = buffer.create(buffer.len(source) * 7)
     local d = Data.new(source, dest)
     repeat
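
As a rough illustration of why the commit message calls this temporary (the
numbers below are made up, not from the source): any entry that compresses
better than 7:1 still outgrows the estimated destination buffer.

-- Illustration only; values are hypothetical.
local compressedLen = 1024                  -- size of the DEFLATE stream
local estimatedDest = compressedLen * 7     -- what this commit allocates
local trueDecompressed = 16 * 1024          -- a 16:1 ratio, easy to hit with long runs
print(trueDecompressed > estimatedDest)     -- true: the overflow condition remains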