Saturday, May 11, 2019
Compression Algorithms Research Paper Example | Topics and Well Written Essays - 1750 words
This process of reducing the size of data is popularly known as data compression, though it is formally known as source coding. Compression is important because it helps cut down the use of resources, such as data storage space or transmission capacity. Since compressed data must be decompressed before it can be used, the extra processing and cost that arise from decompression mean the scheme is far from a free lunch. Data compression is therefore subject to a space-time complexity trade-off. For example, a video compression scheme may need costly hardware to decompress the video fast enough for it to be watched while it is being decompressed, while decompressing the video fully before watching it may be inconvenient or may require additional storage. The design of data compression schemes involves trade-offs among several factors, including the degree of compression, the distortion introduced, and the computational resources required to compress and decompress the data. There are also new alternatives to traditional systems, which sample fully and then compress, that provide more efficient resource usage based on the principles of compressed sensing. Compressed sensing techniques circumvent the need for data compression by sampling on a carefully selected basis.

Origin

Compression is either lossless or lossy. ... Data compression has played an important role in information technology since the 1970s, when the internet was growing in popularity and the Lempel-Ziv algorithms were introduced. Compression itself, however, has a much longer history outside of computing. An early example is Morse code, invented in 1838, which compresses data by assigning shorter codes to the letters that occur most often in English, such as "e" and "t". Later, as mainframe computers started taking hold around 1949, Claude Shannon and Robert Fano invented a coding scheme that came to be called Shannon-Fano coding. Their algorithm assigns codes to the symbols in a block of data according to the probability of each symbol occurring: the more probable a symbol, the shorter its code, which yields a more compact representation of the data (Wolfram, 2002).

Two years later, David Huffman, who was studying information theory, took a class with Robert Fano. Fano gave the class the choice of either taking the final exam or writing a research paper. Huffman opted for the research paper, whose topic was finding the most efficient method of binary coding. After months of research that proved fruitless, Huffman was about to give up on the work and study for the final exam instead of submitting the paper. It was at that point that Huffman had an epiphany and devised a technique that was similar to, yet more efficient than, Shannon-Fano coding. The major difference between Huffman coding and Shannon-Fano coding is that Huffman builds its probability tree bottom-up rather than top-down.
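To make the coding idea concrete: in an optimal prefix code, a symbol that occurs with probability p gets a code of length close to -log2(p), so the more frequent a symbol, the shorter its code. The sketch below is a minimal Huffman coder in Python, added purely for illustration; the function name huffman_codes and the sample string are my own and do not come from the paper. It shows the bottom-up construction described above: the two least frequent subtrees are repeatedly merged until a single tree remains, and the codes are then read off the tree.

import heapq
from collections import Counter

def huffman_codes(text):
    # Count symbol frequencies, then build the code tree bottom-up by
    # repeatedly merging the two least frequent subtrees.
    freq = Counter(text)
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        return {heap[0][2]: "0"}           # degenerate case: one distinct symbol
    tie = len(heap)                        # tie-breaker so trees are never compared
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)  # least frequent subtree
        f2, _, right = heapq.heappop(heap) # next least frequent subtree
        heapq.heappush(heap, (f1 + f2, tie, (left, right)))
        tie += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):        # internal node: descend into both children
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                              # leaf: the accumulated prefix is the code
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

sample = "this is an example of a huffman tree"
for sym, code in sorted(huffman_codes(sample).items(), key=lambda kv: len(kv[1])):
    print(repr(sym), code)                 # frequent symbols get the shorter codes

Shannon-Fano coding, by contrast, works top-down: it sorts the symbols by frequency and recursively splits the list into two parts of roughly equal total frequency, assigning 0 to one part and 1 to the other.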