Abstract
In this thesis, a minimum-redundancy prefix coding scheme with a higher compression ratio and lower time complexity is proposed for lossless compression of HD images. The compression algorithm is based on canonical Huffman coding: it preprocesses the source data according to the characteristics of the image data and then compresses the data in batches, exploiting locally non-uniform statistics, which improves the compression ratio by a factor of 1.678 compared with traditional canonical Huffman coding. In the implementation, counting sort, with its lower time complexity, and a code-length table construction method that does not rely on binary trees are used to reduce the complexity of the algorithm so that the system can process data in real time. Finally, the proposed compression algorithm is deployed on an FPGA, where a parallel hardware circuit and pipelined design improve the encoding rate.
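The abstract refers to constructing the code-length table without storing a binary tree. As a point of reference only, the sketch below illustrates the standard canonical-code assignment step (as described, for example, in RFC 1951): given a table of code lengths, the codewords are derived purely by counting lengths, with no tree in memory. The alphabet size, maximum length, and example lengths are illustrative assumptions and are not taken from the thesis.

```c
#include <stdio.h>

#define NUM_SYMBOLS 8   /* illustrative alphabet size (assumption)   */
#define MAX_LEN     15  /* illustrative maximum code length          */

/* Assign canonical Huffman codewords directly from a code-length
 * table, so no binary tree needs to be built or stored. */
static void assign_canonical_codes(const int len[NUM_SYMBOLS],
                                   unsigned code[NUM_SYMBOLS])
{
    int bl_count[MAX_LEN + 1] = {0};       /* codes per length */
    unsigned next_code[MAX_LEN + 1] = {0}; /* first code per length */

    for (int s = 0; s < NUM_SYMBOLS; s++)
        bl_count[len[s]]++;
    bl_count[0] = 0;

    /* First code of each length: previous first code plus the count
     * of shorter codes, shifted left by one bit. */
    unsigned c = 0;
    for (int bits = 1; bits <= MAX_LEN; bits++) {
        c = (c + bl_count[bits - 1]) << 1;
        next_code[bits] = c;
    }

    /* Symbols of equal length receive consecutive codewords. */
    for (int s = 0; s < NUM_SYMBOLS; s++)
        if (len[s] != 0)
            code[s] = next_code[len[s]]++;
}

int main(void)
{
    /* Example code lengths satisfying the Kraft inequality (assumed). */
    int len[NUM_SYMBOLS] = {2, 2, 3, 3, 3, 4, 5, 5};
    unsigned code[NUM_SYMBOLS];

    assign_canonical_codes(len, code);
    for (int s = 0; s < NUM_SYMBOLS; s++) {
        printf("symbol %d: len=%d code=", s, len[s]);
        for (int b = len[s] - 1; b >= 0; b--)
            putchar(((code[s] >> b) & 1) ? '1' : '0');
        putchar('\n');
    }
    return 0;
}
```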
Publisher
Darcy & Roy Press Co. Ltd.
Cited by
2 articles.