Published:
May 2000
Proceedings:
Proceedings of the Thirteenth International Florida Artificial Intelligence Research Society Conference (FLAIRS 2000)
Volume/Issue:
Proceedings of the Thirteenth International Florida Artificial Intelligence Research Society Conference (FLAIRS 2000)
Track:
All Papers
Abstract:
Neural networks have the potential to extend data compression algorithms beyond the character-level n-gram models now in use, but have usually been avoided because they are too slow to be practical. We introduce a model that produces better compression than popular Lempel-Ziv compressors (zip, gzip, compress), and is competitive in time, space, and compression ratio with PPM and Burrows-Wheeler algorithms, currently the best known. The compressor, a bit-level predictive arithmetic encoder, is fast because only 4-5 connections are simultaneously active and because it uses a variable learning rate optimized for one-pass training.
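The abstract describes a bit-level predictive arithmetic encoder: an adaptive model assigns a probability to each upcoming bit, and the coder spends about -log2 of the probability it gave to the bit that actually occurred. The following minimal Python sketch is not the paper's neural network; it uses a simple count-based predictor and an ideal code-length calculation (names such as BitPredictor and ideal_code_length are illustrative assumptions) just to show that principle.

```python
# Minimal sketch of bit-level predictive coding (not the paper's method).
# An adaptive predictor supplies P(next bit = 1); the ideal arithmetic-coding
# cost of each bit is -log2(p), where p is the probability assigned to the
# bit that actually occurred.
import math

class BitPredictor:
    """Order-0 adaptive predictor based on smoothed bit counts."""
    def __init__(self):
        self.counts = [1, 1]  # Laplace-smoothed counts of 0s and 1s

    def predict(self):
        # Probability that the next bit is 1.
        return self.counts[1] / (self.counts[0] + self.counts[1])

    def update(self, bit):
        self.counts[bit] += 1

def ideal_code_length(bits):
    """Total bits an ideal arithmetic coder would need with this predictor."""
    model = BitPredictor()
    total = 0.0
    for b in bits:
        p1 = model.predict()
        p = p1 if b == 1 else 1.0 - p1
        total += -math.log2(p)   # cost of coding bit b at probability p
        model.update(b)          # one-pass, online adaptation
    return total

if __name__ == "__main__":
    data = [1, 1, 0, 1, 1, 1, 0, 1] * 32   # a mostly-ones stream compresses well
    print(f"{ideal_code_length(data):.1f} bits to code {len(data)} input bits")
```

In the paper's setting, the count-based predictor above is replaced by a neural network whose sparse activation (only a handful of connections active per bit) and tuned learning rate keep the prediction step fast enough for practical compression.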
FLAIRS
Proceedings of the Thirteenth International Florida Artificial Intelligence Research Society Conference (FLAIRS 2000)
ISBN 978-1-57735-113-9
Published by The AAAI Press, Menlo Park, California.