12.11 Compression
Iterated Function Systems (IFSs) are important in the domain of fractals
and fractal compression algorithms. Lutton, Levy-Vehel, Cretin,
Glevarec, and Roll (1995a,b) used genetic programming to solve the inverse
problem of identifying a mixed IFS whose attractor is a specific binary (black
and white) image of interest. The evolved program can then be taken to rep-
resent the original image. In principle, this can then be further compressed.
The technique is lossy, since the inverse problem can rarely be solved ex-
actly. No practical application or compression ratio results were reported
in (Lutton et al., 1995a,b). Using similar principles, Sarafopoulos (1999)
used GP to evolve affine IFSs whose attractors represent a binary image
containing a square (which was compressed exactly) and one containing a
fern (which was reproduced with some error in the finer details).
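The core of the fitness evaluation in such inverse-problem work can be
sketched as follows (an illustrative Python fragment, not the code of the
papers above): render the attractor of a candidate affine IFS with the
"chaos game" and score it pixel-by-pixel against the target binary image.

    import numpy as np

    def render_attractor(maps, size=64, n_points=50000):
        """Render the attractor of an affine IFS with the 'chaos game':
        iterate randomly chosen maps and plot the points visited."""
        img = np.zeros((size, size), dtype=bool)
        x = np.array([0.5, 0.5])
        rng = np.random.default_rng(0)
        for _ in range(n_points):
            A, b = maps[rng.integers(len(maps))]   # pick one map at random
            x = A @ x + b
            px, py = (x * (size - 1)).astype(int)  # unit square -> pixels
            if 0 <= px < size and 0 <= py < size:
                img[py, px] = True
        return img

    def fitness(maps, target):
        """Fraction of pixels on which the attractor matches the target."""
        return np.mean(render_attractor(maps, size=target.shape[0]) == target)

    # Example: the three-map IFS whose attractor is the Sierpinski triangle
    sierpinski = [(0.5 * np.eye(2), np.array(b))
                  for b in ([0.0, 0.0], [0.5, 0.0], [0.25, 0.5])]
    target = render_attractor(sierpinski)
    print(fitness(sierpinski, target))   # 1.0 for a perfect candidate

In a GP run the list of affine maps would itself be evolved, with this
pixel-matching score (or a more refined distance) used as fitness.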
Wavelets are frequently used in lossy image and signal compression.
Klappenecker and May (1995) used GP to evolve wavelet compression algo-
rithms (internal nodes represented conjugate quadrature filters, leaves rep-
resented quantisers). Results on a small set of real-world images were im-
pressive, with GP compression outperforming JPEG at all compression
ratios.
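While the evolved filters and quantisers are not reproduced here, the
general pipeline GP was searching over can be sketched with the PyWavelets
package (the db2 filter and the uniform quantiser below are arbitrary
illustrative choices, not the evolved components):

    import numpy as np
    import pywt

    def wavelet_compress(img, wavelet="db2", level=3, step=8.0):
        """Lossy compression sketch: wavelet transform, uniform
        quantisation, inverse transform.  Returns the reconstruction and
        the quantised coefficients (mostly zeros, hence cheap to
        entropy-code)."""
        coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
        arr, slices = pywt.coeffs_to_array(coeffs)
        q = np.round(arr / step)                  # the lossy step
        rec_coeffs = pywt.array_to_coeffs(q * step, slices,
                                          output_format="wavedec2")
        rec = pywt.waverec2(rec_coeffs, wavelet)
        return rec[:img.shape[0], :img.shape[1]], q

    def psnr(a, b):
        """Peak signal-to-noise ratio for 8-bit images."""
        mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
        return 10 * np.log10(255.0 ** 2 / mse)

    yy, xx = np.mgrid[0:128, 0:128]
    img = ((xx + yy) // 2).astype(np.uint8)       # a toy smooth image
    rec, q = wavelet_compress(img)
    print(psnr(img, rec), np.mean(q == 0))        # distortion vs. sparsity

In the evolved algorithms, the filter bank and the quantisers occupying
the internal nodes and leaves of the program tree played the roles taken
here by the fixed db2 filter and the fixed quantisation step.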
The first GP-based lossless compression technique (Fukunaga and Stechert,
1998) evolved non-linear predictors for images. These were used to
predict the gray level a pixel will take based on the gray values of a subset
of its neighbours (those that have already been visited in a row-by-row,
column-by-column scan of the image array). The prediction errors, together
with the model’s description, represent a compressed version of the image;
the errors were further compressed using Huffman coding. Results on five
images from the NASA Galileo Mission database were very promising, with
GP compression outperforming some of the best human-designed lossless
compression algorithms.
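A minimal sketch of this predictive-coding scheme follows (illustrative
code: the actual predictors were evolved, and zlib stands in here for the
Huffman coder used in the paper):

    import numpy as np
    import zlib

    def residuals(img, predictor):
        """Apply a causal predictor in raster-scan order: each pixel is
        predicted from its west, north and north-west neighbours, which
        have already been visited in the scan."""
        res = np.zeros(img.shape, dtype=np.int16)
        for y in range(img.shape[0]):
            for x in range(img.shape[1]):
                w = int(img[y, x - 1]) if x > 0 else 0
                n = int(img[y - 1, x]) if y > 0 else 0
                nw = int(img[y - 1, x - 1]) if x > 0 and y > 0 else 0
                res[y, x] = int(img[y, x]) - predictor(w, n, nw)
        return res

    # A simple non-linear predictor of the kind GP might evolve: the
    # median of the two neighbours and a local gradient estimate.
    def median_predictor(w, n, nw):
        return sorted((w, n, w + n - nw))[1]

    yy, xx = np.mgrid[0:64, 0:64]
    img = (2 * xx + yy).astype(np.uint8)          # a toy smooth image
    res = residuals(img, median_predictor)
    # Residuals of a good predictor are small and highly regular, so an
    # entropy coder (Huffman in the paper; zlib here) shrinks them.
    print(len(zlib.compress(img.tobytes())),
          len(zlib.compress(res.tobytes())))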
In many compression algorithms some form of pre-processing or transfor-
mation of the original data is performed before compression. This often im-
proves compression ratios. Parent and Nowe (2002) evolved pre-processors
for image compression using GP. The objective of the pre-processor was to
losslessly reduce the entropy of the original image. In tests with five images
from the Canterbury corpus, GP was successful in significantly reducing the
image entropy. As verified by applying bzip2, the resulting images
were markedly easier to compress.
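The effect can be reproduced in miniature with a hand-written reversible
transform (a sketch of the principle, not Parent and Nowe’s evolved
pre-processor):

    import bz2
    import numpy as np

    def entropy(img):
        """Zeroth-order entropy of the pixel histogram, in bits/pixel."""
        counts = np.bincount(img.ravel(), minlength=256)
        p = counts[counts > 0] / img.size
        return -np.sum(p * np.log2(p))

    def diff_transform(img):
        """A trivial lossless pre-processor: replace each pixel with its
        horizontal difference, modulo 256 (invertible by a cumulative
        sum), concentrating the histogram around zero."""
        out = img.astype(np.int16)
        out[:, 1:] = np.diff(out, axis=1)
        return (out % 256).astype(np.uint8)

    yy, xx = np.mgrid[0:128, 0:128]
    rng = np.random.default_rng(0)
    img = ((xx + yy) // 2 + rng.integers(0, 3, (128, 128))).astype(np.uint8)
    pre = diff_transform(img)
    print(entropy(img), entropy(pre))             # entropy drops sharply
    print(len(bz2.compress(img.tobytes())),       # and so does the
          len(bz2.compress(pre.tobytes())))       # bzip2 output size

GP’s task was to discover transformations of this lossless kind, but
tailored to the statistics of the images at hand.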
In (Krantz, Lindberg, Thorburn, and Nordin, 2002) the use of program-
matic compression was extended from images to natural videos. A program
was evolved that generates the intermediate frames of a video sequence,
where each frame is composed of a series of transformed regions taken
from the adjacent frames. The results were encouraging, in that the
evolved programs approximated the target frames well. While a significant
improvement in compression was achieved, programmatic compression was
very slow in comparison with standard compression methods.
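The representation can be sketched as follows (an illustrative encoding,
not the one used by Krantz et al.): an intermediate frame is assembled
from rectangular regions copied, with displacements, from the two
adjacent frames, and fitness is the distance of the assembled frame from
the true one.

    import numpy as np

    def compose_frame(prev_frame, next_frame, regions):
        """Assemble an intermediate frame from regions of its neighbours
        (no bounds checking, for brevity).

        regions: tuples (y, x, h, w, source, dy, dx) -- fill the h-by-w
        block at (y, x) of the new frame with the block at (y + dy,
        x + dx) of the chosen source frame ('prev' or 'next')."""
        out = np.zeros_like(prev_frame)
        for y, x, h, w, source, dy, dx in regions:
            src = prev_frame if source == "prev" else next_frame
            out[y:y + h, x:x + w] = src[y + dy:y + dy + h, x + dx:x + dx + w]
        return out

    def frame_error(candidate, target):
        """Mean absolute pixel error: a natural fitness measure for the
        evolved program."""
        return np.mean(np.abs(candidate.astype(float) - target.astype(float)))

    # Toy example: a bright square moving right by 4 pixels per frame.
    f0, f1, f2 = (np.zeros((32, 32), dtype=np.uint8) for _ in range(3))
    f0[12:20, 4:12] = f1[12:20, 8:16] = f2[12:20, 12:20] = 255
    # One 8x8 region copied from the following frame, displaced by 4
    # pixels, reconstructs the middle frame exactly.
    guess = compose_frame(f0, f2, [(12, 8, 8, 8, "next", 0, 4)])
    print(frame_error(guess, f1))   # 0.0

In the evolved programs, the choice of regions, sources and displacements
is what evolution must discover for each frame, which helps explain both
the compression gains and the heavy computational cost reported.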