From 0463f7c9aad060fcd56e98d025ce16185279e2bc Mon Sep 17 00:00:00 2001
From: DRC <information@libjpeg-turbo.org>
Date: Thu, 4 Feb 2016 18:34:38 -0600
Subject: [PATCH] Prevent overread when decoding malformed JPEG

The accelerated Huffman decoder was previously invoked if there were
> 128 bytes in the input buffer.  However, it is possible to construct a
JPEG image with Huffman blocks > 430 bytes in length
(http://stackoverflow.com/questions/2734678/jpeg-calculating-max-size).
While such images are pathological and could never be created by a JPEG
compressor, it is conceivable that an attacker could use such an
artificially constructed image to trigger an input buffer overrun in the
libjpeg-turbo decompressor and thus gain access to some of the data on
the calling program's heap.

This patch simply increases the minimum buffer size for the accelerated
Huffman decoder to 512 bytes, which should (hopefully) accommodate any
possible input.

This addresses a major issue (LJT-01-005) identified in a security audit
by Cure53.
---
 jdhuff.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/jdhuff.c b/jdhuff.c
index 4197cc5..59b065a 100644
--- a/jdhuff.c
+++ b/jdhuff.c
@@ -743,7 +743,7 @@ decode_mcu_fast (j_decompress_ptr cinfo, JBLOCKROW *MCU_data)
  * this module, since we'll just re-assign them on the next call.)
  */
 
-#define BUFSIZE (DCTSIZE2 * 2)
+#define BUFSIZE (DCTSIZE2 * 8)
 
 METHODDEF(boolean)
 decode_mcu (j_decompress_ptr cinfo, JBLOCKROW *MCU_data)