Lossless compression

The term delta encoding comes from the Greek letter Δ, which in mathematics denotes a difference, but the term is typically only used if both versions are meaningful outside compression and decompression. For example, while the process of compressing the error in the above-mentioned lossless audio compression scheme could be described as delta encoding from the approximated sound wave to the original sound wave, the approximated version of the sound wave is not meaningful in any other context.

There are also techniques that do not work for typical text but are useful for some images (particularly simple bitmaps), and other techniques that take advantage of the specific characteristics of images (such as the common phenomenon of contiguous 2-D areas of similar tones, and the fact that color images usually have a preponderance of a limited range of colors out of those representable in the color space).

Genetics compression algorithms (not to be confused with genetic algorithms) are the latest generation of lossless algorithms that compress data (typically sequences of nucleotides) using both conventional compression algorithms and specific algorithms adapted to genetic data. In 2012, a team of scientists from Johns Hopkins University published the first genetic compression algorithm that does not rely on external genetic databases for compression. HAPZIPPER was tailored for HapMap data.

Some benchmarks cover only the data compression ratio, so winners in these benchmarks may be unsuitable for everyday use due to the slow speed of the top performers. Another drawback of some benchmarks is that their data files are known, so some program writers may optimize their programs for best performance on a particular data set. The winners on these benchmarks often come from the class of context-mixing compression software.

Sometimes detection is made by heuristics; for example, a compression application may consider files whose names end in ".zip", ".arj" or ".lha" uncompressible without any more sophisticated detection. A common way of handling this situation is quoting input, or uncompressible parts of the input, in the output, minimizing the compression overhead.

Claims of "perfect compression" can be safely discarded without even looking at any further details regarding the purported compression scheme. Such an algorithm contradicts fundamental laws of mathematics because, if it existed, it could be applied repeatedly to losslessly reduce any file to length 1.
Most practical compression algorithms provide an "escape" facility that can turn off the normal coding for files that would become longer by being encoded. In theory, only a single additional bit is required to tell the decoder that the normal coding has been turned off for the entire input; however, most encoding algorithms use at least one full byte (and typically more than one) for this purpose.
Lossless data compression algorithms cannot guarantee compression for all input data sets. In other words, for any lossless data compression algorithm, there will be an input data set that does not get smaller when processed by the algorithm, and for any lossless data compression algorithm that makes at least one file smaller, there will be at least one file that it makes larger.
Sami Runsas (the author of NanoZip) maintained Compression Ratings, a benchmark similar to the Maximum Compression multiple-file test, but with minimum speed requirements. It offered a calculator that allowed the user to weight the importance of speed and compression ratio. The top programs were fairly different due to the speed requirement.
In a static model, the data is analyzed and a model is constructed, then this model is stored with the compressed data. This approach is simple and modular, but has the disadvantage that the model itself can be expensive to store, and also that it forces using a single model for all data being compressed, and so performs poorly on files that contain heterogeneous data.
Real compression algorithm designers accept that streams of high information entropy cannot be compressed, and accordingly include facilities for detecting and handling this condition. An obvious way of detection is applying a raw compression algorithm and testing whether its output is smaller than its input.
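A minimal sketch of such an escape mechanism, assuming Python's zlib as the ordinary coder; the one-byte flag and the function names are illustrative choices, not the layout of any particular format:

    import zlib

    FLAG_RAW, FLAG_COMPRESSED = b"\x00", b"\x01"

    def pack(data: bytes) -> bytes:
        candidate = zlib.compress(data, 9)     # try the normal coding first
        if len(candidate) < len(data):         # both paths pay the same 1-byte flag
            return FLAG_COMPRESSED + candidate
        return FLAG_RAW + data                 # otherwise store the input verbatim

    def unpack(blob: bytes) -> bytes:
        body = blob[1:]
        return zlib.decompress(body) if blob[:1] == FLAG_COMPRESSED else body

On high-entropy input the packed result is at most one byte longer than the original, which is the kind of bounded overhead the escape facility is meant to guarantee.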
The Compression Analysis Tool is a Windows application that enables end users to benchmark the performance characteristics of streaming implementations of LZF4, Deflate, ZLIB, GZIP, BZIP2 and LZMA using their own data. It produces measurements and charts with which users can compare the compression speed, decompression speed and compression ratio of the different compression methods.
Adaptive models dynamically update the model as the data is compressed. Both the encoder and decoder begin with a trivial model, yielding poor compression of initial data, but as they learn more about the data, performance improves. Most popular types of compression used in practice now use adaptive coders.
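A minimal sketch of the adaptive idea (illustrative names, coder omitted): encoder and decoder both start from flat byte counts and update them after every symbol, so the probability estimate that would drive an arithmetic or Huffman coder improves as more data is seen.

    def adaptive_probabilities(data: bytes):
        """Yield the model's estimate for each byte *before* it is seen, then learn from it."""
        counts = [1] * 256          # trivial initial model: every byte equally likely
        total = 256
        for byte in data:
            yield counts[byte] / total   # estimate the coder would use for this byte
            counts[byte] += 1            # update the model with the byte just processed
            total += 1

Because the decoder performs exactly the same updates on the bytes it has already reconstructed, no model ever needs to be transmitted.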
So if we know nothing about the properties of the data we are compressing, we might as well not compress it at all. A lossless compression algorithm is useful only when we are more likely to compress certain types of files than others; then the algorithm could be designed to compress those types of data better.
JPEG2000 additionally uses data points from other pairs and multiplication factors to mix them into the difference. These factors must be integers, so that the result is an integer under all circumstances. So the values are increased, increasing file size, but hopefully the distribution of values is more peaked.
Mark Nelson, in response to claims of "magic" compression algorithms appearing in comp.compression, constructed a 415,241 byte binary file of highly entropic content and issued a public challenge of $100 to anyone to write a program that, together with its input, would be smaller than his provided binary data yet be able to reconstitute it without error.
These techniques take advantage of the specific characteristics of images such as the common phenomenon of contiguous 2-D areas of similar tones. Every pixel but the first is replaced by the difference to its left neighbor. This leads to small values having a much higher probability than large values.
On the other hand, it has also been proven that there is no algorithm to determine whether a file is incompressible in the sense of Kolmogorov complexity. Hence it is possible that any particular file, even if it appears random, may be significantly compressed, even including the size of the decompressor.
This redundancy is what the algorithm is designed to remove; such files thus belong to the subset of files that that algorithm can make shorter, whereas other files would not get compressed or would even get bigger. Algorithms are generally quite specifically tuned to a particular type of file: for example, lossless audio compression programs do not work well on text files, and vice versa.
While, in principle, any general-purpose lossless compression algorithm (general-purpose meaning that it can accept any bitstring) can be used on any type of data, many are unable to achieve significant compression on data that are not of the form for which they were designed to compress. Many of the lossless compression techniques used for text also work reasonably well for indexed images.
This is often also applied to sound files, and can compress files that contain mostly low frequencies and low volumes. For images, this step can be repeated by taking the difference to the top pixel, and then in videos, the difference to the pixel in the next frame can be taken.
Lossless compression is used in cases where it is important that the original and the decompressed data be identical, or where deviations from the original data would be unfavourable. Common examples are executable programs, text documents, and source code. Some image file formats, like PNG or GIF, use only lossless compression, while others like TIFF and MNG may use either lossless or lossy methods.
The adaptive encoding uses the probabilities from the previous sample in sound encoding, from the left and upper pixel in image encoding, and additionally from the previous frame in video encoding. In the wavelet transformation, the probabilities are also passed through the hierarchy.
However, many ordinary lossless compression algorithms produce headers, wrappers, tables, or other predictable output that might instead make cryptanalysis easier. Thus, cryptosystems must utilize compression algorithms whose output does not contain these predictable patterns.
In fact, if we consider files of length N and assume all files are equally probable, then for any lossless compression that reduces the size of some file, the expected length of a compressed file (averaged over all possible files of length N) must necessarily be greater than N.
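The effect can be observed directly with any real compressor. The small experiment below (illustrative only, using Python's zlib) averages the compressed size over every possible 2-byte file and finds a mean well above 2 bytes, because format overhead dominates when there is no redundancy to exploit:

    import itertools
    import zlib

    total = 0
    for pair in itertools.product(range(256), repeat=2):   # all 65,536 two-byte files
        total += len(zlib.compress(bytes(pair), 9))

    print("average compressed size:", total / 65_536, "bytes for 2-byte inputs")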
The "trick" that allows lossless compression algorithms, used on the type of data they were designed for, to consistently compress such files to a shorter form is that the files the algorithms are designed to act on all have some form of easily modeled redundancy.
The Monster of Compression benchmark by Nania Francesco Antonio tested compression on 1 GB of public data with a 40-minute time limit. In December 2009, the top ranked archiver was NanoZip 0.07a and the top ranked single file compressor was ccmx 1.30c.
Suppose that there is a compression algorithm that transforms every file into an output file that is no longer than the original file, and that at least one file will be compressed into an output file that is shorter than the original file.
Some file must then be the output of the compression function on two different inputs. That file cannot be decompressed reliably (which of the two originals should that yield?), which contradicts the assumption that the algorithm was lossless.
This is the theoretical reason why we need to have different compression algorithms for different kinds of files: there cannot be any algorithm that is good for all kinds of data.
Genomic sequence compression algorithms, also known as DNA sequence compressors, exploit the fact that DNA sequences have characteristic properties, such as inverted repeats. The most successful compressors are XM and GeCo.
For this reason, many different algorithms exist that are designed either with a specific type of input data in mind or with specific assumptions about what kinds of redundancy the uncompressed data are likely to contain.
It is provably impossible to create an algorithm that can losslessly compress any data. While there have been many claims through the years of companies achieving "perfect compression", in which an arbitrary number N of random bits can always be compressed to N − 1 bits, no such algorithm can exist.
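If such an algorithm existed, feeding its output back into itself would shrink any file down to nothing. A quick experiment with a real compressor (sketch below, illustrative, using Python's zlib on random data) shows the opposite behaviour: high-entropy data stops shrinking immediately and each pass only adds overhead.

    import os
    import zlib

    data = os.urandom(100_000)            # high-entropy input
    for round_no in range(1, 5):
        data = zlib.compress(data, 9)
        print(f"after round {round_no}: {len(data)} bytes")
    # Typically the first round already grows the data slightly,
    # and every further round adds a little more overhead.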
Abstractly, a compression algorithm can be viewed as a function on sequences (normally of octets). Compression is successful if the resulting sequence is shorter than the original sequence (and the instructions for the decompression map). For a compression algorithm to be lossless, the compression map must form an injection from "plain" to "compressed" bit sequences.
Self-extracting executables contain a compressed application and a decompressor. When executed, the decompressor transparently decompresses and runs the original application. This is especially often used in demo coding, where competitions are held for demos with strict size limits.
Most lossless compression programs do two things in sequence: the first step generates a statistical model for the input data, and the second step uses this model to map input data to bit sequences in such a way that "probable" (i.e. frequently encountered) data will produce shorter output than "improbable" data.
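A compact sketch of the two steps for a byte-oriented coder (illustrative only): the "model" is just a frequency count, and the mapping step builds a prefix code — here a Huffman code constructed with a heap — in which frequent byte values receive shorter codewords.

    import heapq
    from collections import Counter

    def build_code(data: bytes) -> dict:
        freq = Counter(data)                     # step 1: the statistical model
        heap = [[n, i, {sym: ""}] for i, (sym, n) in enumerate(freq.items())]
        heapq.heapify(heap)
        if len(heap) == 1:                       # degenerate single-symbol input
            return {next(iter(freq)): "0"}
        tie = len(heap)
        while len(heap) > 1:                     # step 2: derive the bit mapping
            lo = heapq.heappop(heap)
            hi = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in lo[2].items()}
            merged.update({s: "1" + c for s, c in hi[2].items()})
            heapq.heappush(heap, [lo[0] + hi[0], tie, merged])
            tie += 1
        return heap[0][2]                        # maps byte value -> bit string

    code = build_code(b"abracadabra")
    print(sorted(code.items(), key=lambda kv: len(kv[1])))  # 'a' gets the shortest code

The integer tie-breaker in each heap entry only keeps the comparison well defined; the resulting code lengths follow the symbol frequencies, which is the point of the second step.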
The Calgary Corpus, dating back to 1987, is no longer widely used due to its small size. Matt Mahoney maintained the Calgary Compression Challenge, created and maintained from May 21, 1996, through May 21, 2016, by Leonid A. Broukhis.
As mentioned previously, lossless sound compression is a somewhat specialized area. Lossless sound compression algorithms can take advantage of the repeating patterns shown by the wave-like nature of the data.
Different algorithms exist that are designed either with a specific type of input data in mind or with specific assumptions about what kinds of redundancy the uncompressed data are likely to contain.
These algorithms essentially use autoregressive models to predict the "next" value and encode the (hopefully small) difference between the expected value and the actual data.
A hierarchical version of this technique takes neighboring pairs of data points, stores their difference and sum, and on a higher level with lower resolution continues with the sums. This is called discrete wavelet transform.
Users can also examine how the compression level, buffer size and flushing operations affect the results.
Lossless compression methods may be categorized according to the type of data they are designed to compress.
Because of patents on certain kinds of LZW compression, and in particular licensing practices by patent holder Unisys that many developers considered abusive, some open source proponents encouraged people to avoid using the Graphics Interchange Format (GIF) for compressing still image files in favor of Portable Network Graphics (PNG).
HAPZIPPER compresses HapMap data over 20-fold (95% reduction in file size), providing 2- to 4-fold better compression much faster than leading general-purpose compression utilities.
If the difference between the predicted and the actual data (called the error) tends to be small, then certain difference values (like 0, +1, −1 etc. on sample values) become very frequent, which can be exploited by encoding them in few output bits.
Thus, the main lesson from the argument is not that one risks big losses, but merely that one cannot always win. To choose an algorithm always means implicitly to select a subset of all files that will become usefully shorter.
For any algorithm that makes at least one file smaller, there must be at least one file that it makes larger. This is easily proven with elementary mathematics using a counting argument known as the pigeonhole principle.
Many of these methods are implemented in open-source and proprietary tools, particularly LZW and its variants. Some algorithms are patented in the United States and other countries.
Smaller lossy audio files are typically used on portable players and in other cases where storage space is limited or exact replication of the audio is unnecessary.
A similar challenge, with $5,000 as reward, was issued by Mike Goldman.
The pigeonhole principle prohibits a bijection between the collection of sequences of length N and any subset of the collection of sequences of length N − 1.
160:, whereas Huffman compression is simpler and faster but produces poor results for models that deal with symbol probabilities close to 1. 852: 66:, no lossless compression algorithm can shrink the size of all possible data: Some data will get longer by at least one symbol or bit. 1033:
However, a simple theorem about incompressible strings shows that over 99% of files of any given length cannot be compressed by more than one byte (including the size of the decompressor).
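The counting behind the "over 99%" figure is short enough to do directly; the sketch below (illustrative) counts how many n-bit files could possibly be given a description that saves more than one byte, simply by counting the shorter bit strings available.

    # Of the 2**n files of n bits, at most 2**(n - 8) - 1 can be mapped to outputs
    # shorter than n - 8 bits (a saving of more than one byte), because that is the
    # total number of bit strings of length less than n - 8.
    n = 100
    compressible = 2 ** (n - 8) - 1
    fraction = compressible / 2 ** n
    print(f"at most {fraction:.4%} of {n}-bit files can shrink by more than one byte")
    # -> at most 0.3906%, so more than 99.6% cannot.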
For eukaryotes, XM is slightly better in compression ratio, though for sequences larger than 100 MB its computational requirements are impractical.
Arithmetic coding achieves compression rates close to the best possible for a particular statistical model, which is given by the information entropy.
We must therefore conclude that our original hypothesis (that the compression function makes no file longer) is necessarily untrue.
296: 2321: 396: 300: 1470: 1430:
In particular, files of random data cannot be consistently compressed by any conceivable lossless data compression algorithm; indeed, this result is used to define the concept of randomness in Kolmogorov complexity.
The zip data format, for example, specifies the 'compression method' of 'Stored' for input files that have been copied into the archive verbatim.
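A minimal illustration with Python's standard zipfile module (file names and payload are hypothetical): an already-compressed or random payload can be added with ZIP_STORED so it is copied into the archive verbatim, avoiding the overhead of trying to deflate it again.

    import os
    import zipfile

    payload = os.urandom(50_000)        # stands in for an already-compressed file
    with zipfile.ZipFile("example.zip", "w") as zf:
        zf.writestr("raw.bin", payload, compress_type=zipfile.ZIP_STORED)    # stored verbatim
        zf.writestr("try.bin", payload, compress_type=zipfile.ZIP_DEFLATED)  # deflate attempt

    with zipfile.ZipFile("example.zip") as zf:
        for info in zf.infolist():
            print(info.filename, "stored bytes:", info.compress_size,
                  "original:", info.file_size)

For incompressible input the deflated entry ends up slightly larger than the original, while the stored entry stays exactly the same size.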
2748: 1988: 1964: 1633: 1147: 125: 1066:−1. Therefore, it is not possible to produce a lossless algorithm that reduces the size of every possible input sequence. 222:
Their legal usage requires licensing by the patent holder.
This type of compression is not strictly limited to binary executables, but can also be applied to scripts, such as JavaScript.
Compression algorithms are usually effective for human- and machine-readable documents and cannot shrink the size of random data that contain no redundancy.
3238: 3248: 3086: 2937: 2856: 2650: 1330: 988: 962:
Most encoding algorithms use at least one full byte for this escape flag; for example, deflate compressed files never need to grow by more than 5 bytes per 65,535 bytes of input.
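This worst-case overhead can be seen by compressing incompressible input. A rough check with zlib's deflate implementation (sketch below, illustrative; zlib adds its own small stream wrapper on top of raw deflate) shows the growth staying around a handful of bytes per 64 KiB block.

    import os
    import zlib

    original = os.urandom(1_000_000)              # effectively incompressible input
    compressed = zlib.compress(original, 9)
    blocks = -(-len(original) // 65_535)          # stored blocks needed, rounded up
    print(len(compressed) - len(original), "bytes of overhead across", blocks, "blocks")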
PNG combines the LZ77-based deflate algorithm with a selection of domain-specific prediction filters. However, the patents on LZW expired on June 20, 2003.
3318: 2354: 1737: 1684: 1210: 835:
The Generic Compression Benchmark, maintained by Matt Mahoney, tests compression of data generated by random
344: 2983: 121: 3308: 3211: 2753: 2311: 2125: 1117: 1102: 871:
The Compression Ratings website published a chart summary of the "frontier" in compression ratio and time.
738: 565: 697: 2301: 2296: 1030: 765: 227: 3243: 3170: 3008: 2988: 2932: 2590: 2381: 2184: 1775: 1261: 325: 198: 1911: 1545:
Pratas, D.; Pinho, A. J.; Ferreira, P. J. S. G. (2016). "Efficient compression of genomic sequences".
3344: 3253: 3194: 3120: 2968: 2558: 2553: 2408: 2251: 1251: 653: 630: 626: 486: 480: 377: 231: 39:
Lossless compression allows the original data to be perfectly reconstructed from the compressed data with no loss of information.
3258: 2831: 2625: 2326: 641: 436:(RLE) – Simple scheme that provides good compression of data containing many runs of the same value 406: 275:
It is sometimes beneficial to compress only the differences between two versions of a file (or, in video compression, of successive images within a sequence). This is called delta encoding.
223: 17: 1651: 3199: 2570: 2457: 2413: 2226: 2209: 2199: 2050: 843:
different due to the speed requirement. In January 2010, the top program was NanoZip followed by
578: 93: 92:. It is also often used as a component within lossy data compression technologies (e.g. lossless 2824: 2575: 2359: 2204: 1046: 789: 784:
Lossless compression algorithms and their implementations are routinely tested in head-to-head
758: 509: 3096: 1689:
8th International Conference on Informatics in Schools: Situation, Evolution, and Perspectives
1348: 3228: 1837: 1137: 1042: 1004: 785: 665: 521: 458: 1724: 949:
But 2 is smaller than 2+1, so by the pigeonhole principle there must be some file of length
788:. There are a number of better-known compression benchmarks. Some benchmarks cover only the 714:
Cryptosystems often compress data (the "plaintext") before encryption for added security. When properly implemented, compression greatly increases the unicity distance by removing patterns that might facilitate cryptanalysis.
362:(LZ77 and LZ78) – Dictionary-based algorithm that forms the basis for many other algorithms 2912: 2374: 2336: 1390: 1298: 885: 63: 1892: 1795: 56: 8: 3143: 3034: 2993: 2978: 2947: 2942: 2851: 2758: 2691: 2660: 2645: 2428: 475: 433: 157: 136:
Most lossless compression programs do two things in sequence: the first step generates a
1394: 1302: 245:
Many of the lossless compression techniques used for text also work reasonably well for
3216: 3186: 3165: 3071: 3003: 2897: 2585: 2401: 2391: 2286: 2266: 2261: 1867: 1714: 1559: 1522: 1497: 1451: 1406: 1322: 1132: 1122: 1055: 647: 80:
Lossless data compression is used in many applications. For example, it is used in the
2797: 1926: 1468: 549: 3160: 3148: 3130: 2998: 2882: 2819: 2665: 2580: 2536: 2497: 2179: 1984: 1960: 1871: 1857: 1781: 1704: 1527: 1476: 1441: 1358: 1314: 1252:"General characteristics and design considerations for temporal subband video coding" 1153: 1080: 734: 600: 594: 593:– (includes lossless compression method via Le Gall–Tabatabai 5/3 reversible integer 492: 369: 338: 276: 153: 81: 48: 2070: 1718: 1580: 1546: 1455: 1410: 966:
compressed files never need to grow by more than 5 bytes per 65,535 bytes of input.
3135: 3091: 3064: 3059: 2917: 2902: 2812: 2721: 2716: 2545: 2278: 2256: 2148: 1849: 1696: 1517: 1509: 1433: 1425: 1398: 1326: 1306: 1112: 1107: 768:
coding, where competitions are held for demos with strict size limits, as small as
715: 464: 36: 938:
keeps its size during compression. There are 2 such files possible. Together with
892:
Assume that each file is represented as a string of bits of some arbitrary length.
3054: 2868: 2792: 2773: 2743: 2711: 2677: 2236: 2174: 2109: 1956: 1955:. The Morgan Kaufmann Series in Multimedia Information and Systems (5 ed.). 1845: 1692: 1853: 1700: 1637: 124:
formats are most often used for archiving or production purposes, while smaller
2846: 2640: 2369: 2364: 2221: 2194: 2166: 1980: 1469:
Alfred J. Menezes; Paul C. van Oorschot; Scott A. Vanstone (October 16, 1996).
1437: 1354: 1142: 836: 811: 793: 659: 452: 353: 284: 280: 265: 145: 2034: 27:
Data compression approach allowing perfect reconstruction of the original data
3338: 3153: 3101: 2768: 2763: 2738: 2670: 2291: 2189: 2006: 1833: 1158: 719: 359: 305:
No lossless compression algorithm can efficiently compress all possible data
246: 235: 219: 182: 1310: 3274: 2241: 2216: 2117: 1896: 1531: 1318: 1127: 942:, this makes 2+1 files that all compress into one of the 2 files of length 819: 707: 470: 1236: 564:– High Efficiency Image File Format (lossless or lossy compression, using 316:
Some of the most common lossless compression algorithms are listed below.
3233: 3111: 2907: 2783: 2733: 1977:
Lossless Compression Handbook (Communications, Networking and Multimedia)
1513: 43:. Lossless compression is possible because most real-world data exhibits 40: 347:
reversible transform for making textual data more compressible, used by
3290: 3081: 3076: 2963: 2922: 2728: 1076: 773: 746: 515: 498: 428: 1597: 1402: 2090: 992:
compression programs do not work well on text files, and vice versa.
823: 590: 504: 333: 1023:
decompressor. An example is the digits of the mathematical constant
163:
There are two primary ways of constructing statistical models: in a
3204: 3049: 2706: 1378: 1069: 1051: 769: 415: 391: 202: 144:
The primary encoding algorithms used to produce bit sequences are
2973: 2447: 2396: 1687:(September 28 – October 1, 2015). "Surprising Computer Science". 1615: 963: 844: 684: 612: 606: 530: 365: 239: 149: 51:
permits reconstruction only of an approximation of the original
2487: 1495: 996: 400: 70: 3322: 2927: 2520: 2467: 1777:
An Introduction to Kolmogorov Complexity and its Applications
1256: 856: 829: 584: 575: 348: 329: 2477: 2331: 2316: 2306: 2049: 1738:"Lossless Compression - an overview | ScienceDirect Topics" 1389:. International Society for Optics and Photonics: 474–478. 1383:
Digital Video Compression: Algorithms and Technologies 1995
1185:"Unit 4 Lab 4: Data Representation and Compression, Page 6" 671: 571: 555: 387: 373: 113: 89: 52: 644:– Portable Document Format (lossless or lossy compression) 279:, of successive images within a sequence). This is called 168:
performs poorly on files that contain heterogeneous data.
2452: 2418: 1284:"Mathematical properties of the JPEG2000 wavelet filters" 826: 635: 410: 368:– Combines LZ77 compression with Huffman coding, used by 109: 105: 97: 85: 1062:
and any subset of the collection of sequences of length
674:– (lossless or lossy compression of RGB and RGBA images) 2089: 1498:"HapZipper: sharing HapMap populations just got easier" 1025: 668:– Tag Image File Format (lossless or lossy compression) 1798: 1691:. Lecture Notes in Computer Science. Vol. 9378. 1086: 915:
be the length (in bits) of the compressed version of
818:
The Large Text Compression Benchmark and the similar
1755: 1662: 356:– Entropy encoding, pairs well with other algorithms 230:(GIF) for compressing still image files in favor of 1544: 1377: 1211:"Lossless Streaming – the future of high res audio" 802:, in his February 2010 edition of the free booklet 587:– (lossless or lossy compression of B&W images) 112:, use only lossless compression, while others like 1813: 3336: 2069: 1070:Points of application in real compression theory 911:bits that compresses to something shorter. Let 603:– (lossless/near-lossless compression standard) 1912:"The Million Random Digit Challenge Revisited" 1832: 903:be the least number such that there is a file 386:(LZMA) – Very high compression ratio, used by 2133: 1679: 1677: 1423: 2147: 2071:"Lossless and lossy audio formats for music" 1578: 1496:Chanda, P.; Elhaik, E.; Bader, J.S. (2012). 1462: 687:– Lossless compression of 3D triangle meshes 1031:simple theorem about incompressible strings 1015:of random bits can always be compressed to 718:by removing patterns that might facilitate 2140: 2126: 1674: 1564:: CS1 maint: location missing publisher ( 1036: 213: 120:may use either lossless or lossy methods. 100:encoders and other lossy audio encoders). 1975:Sayood, Khalid, ed. (December 18, 2002). 1773: 1683: 1521: 1432:. Vol. 3. pp. 1769–1772 vol.3. 1281: 1249: 1075:input. Sometimes, detection is made by 725: 623:, includes a lossless compression method 297:Category:Lossless compression algorithms 2097:from the original on February 10, 2013. 1350:The Essential Guide to Video Processing 301:List of lossless compression algorithms 55:, though usually with greatly improved 14: 3337: 1974: 1950: 1909: 1761: 1668: 1538: 1426:"Reversible discrete cosine transform" 1250:Sullivan, Gary (December 8–12, 2003). 710:often compress data (the "plaintext") 2121: 1346: 1291:IEEE Transactions on Image Processing 1208: 59:(and therefore reduced media sizes). 1275: 1148:Lossless Transform Audio Compression 806:, additionally lists the following: 1951:Sayood, Khalid (October 27, 2017). 1780:. New York: Springer. p. 102. 1636:. September 1, 2016. Archived from 1424:Komatsu, K.; Sezaki, Kaoru (1998). 1054:, the compression map must form an 455:(ALAC – Apple Lossless Audio Codec) 24: 2024: 1944: 1927:"The $ 5000 Compression Challenge" 1598:"Large Text Compression Benchmark" 1087:The Million Random Digit Challenge 543: 447:Adaptive Transform Acoustic Coding 427:(PPM) – Optimized for compressing 319: 25: 3361: 1999: 1924: 1164:Universal code (data compression) 609:– (lossless or lossy compression) 384:Lempel–Ziv–Markov chain algorithm 3314: 3313: 3304: 3303: 1953:Introduction to Data Compression 1893:".ZIP File Format Specification" 1774:Li, Ming; Vitányi, Paul (1993). 1472:Handbook of Applied Cryptography 308: 3350:Lossless compression algorithms 1918: 1903: 1885: 1826: 1767: 1730: 1644: 1626: 1616:"Generic Compression Benchmark" 1608: 1590: 1572: 1489: 731:Genetics compression algorithms 702: 574:– (lossless RLE compression of 1910:Nelson, Mark (June 20, 2006). 1808: 1802: 1417: 1371: 1340: 1243: 1221: 1202: 1177: 878: 752: 678: 425:Prediction by partial matching 13: 1: 2091:"Image Compression Benchmark" 1209:Price, Andy (March 3, 2022). 
1170: 1003:the concept of randomness in 779: 698:list of lossless video codecs 403:in tandem with Huffman coding 188: 131: 2027:"Theory of Data Compression" 1581:"Data Compression Explained" 1118:Entropy (information theory) 1103:Comparison of file archivers 558:– Free Lossless Image Format 328:– Entropy encoding, used by 7: 2055:Hydrogenaudio Knowledgebase 1854:10.1007/978-3-319-16250-8_3 1701:10.1007/978-3-319-25396-1_1 1652:"Compression Analysis Tool" 1548:Data Compression Conference 1282:Unser, M.; Blu, T. (2003). 1095: 656:– Portable Network Graphics 397:Lempel–Ziv–Storer–Szymanski 306: 228:Graphics Interchange Format 10: 3366: 3195:Compressed data structures 2517:RLE + BWT + MTF + Huffman 2185:Asymmetric numeral systems 1838:"The Pigeonhole Principle" 1438:10.1109/ICASSP.1998.681802 1262:Video Coding Experts Group 804:Data Compression Explained 756: 461:(also known as MPEG-4 ALS) 294: 290: 234:(PNG), which combines the 199:discrete wavelet transform 3299: 3283: 3267: 3185: 3110: 3042: 3033: 2956: 2890: 2881: 2782: 2699: 2690: 2606: 2554:Discrete cosine transform 2544: 2535: 2484:LZ77 + Huffman + context 2437: 2347: 2277: 2165: 2156: 2108:February 2, 2017, at the 1821:is not partial recursive. 1792:Theorem 2.6 The function 1215:Audio Media International 733:(not to be confused with 631:Discrete Cosine Transform 527:TTA (True Audio Lossless) 487:Meridian Lossless Packing 481:Free Lossless Audio Codec 345:Burrows–Wheeler transform 232:Portable Network Graphics 3259:Smallest grammar problem 2007:"LZF compression format" 1235:. Unisys. Archived from 1229:"LZW Patent Information" 995:In particular, files of 691: 539:(Windows Media Lossless) 440: 3200:Compressed suffix array 2749:Nyquist–Shannon theorem 1347:Bovik, Alan C. (2009). 1311:10.1109/TIP.2003.812329 1037:Mathematical background 650:– Quite OK Image Format 552:– AV1 Image File Format 283:(from the Greek letter 214:Historical legal issues 84:file format and in the 1815: 1658:. Noemax Technologies. 975:types of data better. 796:compression software. 790:data compression ratio 759:Executable compression 510:Original Sound Quality 501:(also known as HD-AAC) 465:Direct Stream Transfer 360:Lempel-Ziv compression 45:statistical redundancy 3229:Kolmogorov complexity 3097:Video characteristics 2474:LZ77 + Huffman + ANS 2051:"Lossless comparison" 1899:chapter V, section J. 1816: 1742:www.sciencedirect.com 1640:on September 1, 2016. 1579:Matt Mahoney (2010). 1138:Kolmogorov complexity 1043:compression algorithm 1005:Kolmogorov complexity 726:Genetics and genomics 459:Audio Lossless Coding 94:mid/side joint stereo 73:data that contain no 3319:Compression software 2913:Compression artifact 2869:Psychoacoustic model 2103:US patent #7,096,360 1814:{\displaystyle C(x)} 1796: 1336:on October 13, 2019. 886:pigeonhole principle 518:(RealAudio Lossless) 495:(Monkey's Audio APE) 64:pigeonhole principle 62:By operation of the 33:Lossless compression 3309:Compression formats 2948:Texture compression 2943:Standard test image 2759:Silence compression 1395:1995SPIE.2419..474M 1303:2003ITIP...12.1080U 1045:can be viewed as a 822:both use a trimmed 476:DTS-HD Master Audio 434:Run-length encoding 158:information entropy 3217:Information theory 3072:Display resolution 2898:Chroma subsampling 2287:Byte pair encoding 2232:Shannon–Fano–Elias 2077:. November 6, 2003 1836:(March 18, 2015). 
1811: 1723:See in particular 1514:10.1093/nar/gks709 1133:Information theory 1123:Grammar-based code 735:genetic algorithms 638:– PiCture eXchange 533:(WavPack lossless) 413:images and Unix's 341:– Entropy encoding 309:§ Limitations 264:essentially using 148:(also used by the 3332: 3331: 3181: 3180: 3131:Deblocking filter 3029: 3028: 2877: 2876: 2686: 2685: 2531: 2530: 2057:. January 5, 2015 1863:978-3-319-16250-8 1710:978-3-319-25396-1 1695:. pp. 1–11. 1554:. Snowbird, Utah. 1502:Nucleic Acids Res 1482:978-1-4398-2191-6 1403:10.1117/12.206386 1154:Lossy compression 595:wavelet transform 399:(LZSS) – Used by 339:Arithmetic coding 311:for more on this) 277:video compression 240:deflate algorithm 154:arithmetic coding 150:deflate algorithm 138:statistical model 96:preprocessing by 57:compression rates 49:lossy compression 16:(Redirected from 3357: 3345:Data compression 3317: 3316: 3307: 3306: 3136:Lapped transform 3040: 3039: 2918:Image resolution 2903:Coding tree unit 2888: 2887: 2697: 2696: 2542: 2541: 2163: 2162: 2149:Data compression 2142: 2135: 2128: 2119: 2118: 2098: 2086: 2084: 2082: 2075:Bobulous Central 2066: 2064: 2062: 2046: 2044: 2042: 2033:. Archived from 2031:Data Compression 2021: 2019: 2017: 1994: 1990:978-0-12390754-7 1970: 1966:978-0-12809474-7 1938: 1937: 1935: 1933: 1925:Craig, Patrick. 1922: 1916: 1915: 1907: 1901: 1900: 1889: 1883: 1882: 1880: 1878: 1830: 1824: 1823: 1820: 1818: 1817: 1812: 1771: 1765: 1759: 1753: 1752: 1750: 1748: 1734: 1728: 1722: 1681: 1672: 1666: 1660: 1659: 1648: 1642: 1641: 1630: 1624: 1623: 1612: 1606: 1605: 1594: 1588: 1587: 1585: 1576: 1570: 1569: 1563: 1555: 1553: 1542: 1536: 1535: 1525: 1493: 1487: 1486: 1466: 1460: 1459: 1421: 1415: 1414: 1375: 1369: 1368: 1344: 1338: 1337: 1335: 1329:. Archived from 1297:(9): 1080–1090. 1288: 1279: 1273: 1272: 1270: 1268: 1247: 1241: 1240: 1239:on June 2, 2009. 1225: 1219: 1218: 1206: 1200: 1199: 1197: 1195: 1181: 1113:David A. Huffman 1108:Data compression 716:unicity distance 662:– Truevision TGA 418: 409:(LZW) – Used by 407:Lempel–Ziv–Welch 312: 263: 262: 258: 47:. By contrast, 37:data compression 21: 3365: 3364: 3360: 3359: 3358: 3356: 3355: 3354: 3335: 3334: 3333: 3328: 3295: 3279: 3263: 3244:Rate–distortion 3177: 3106: 3025: 2952: 2873: 2778: 2774:Sub-band coding 2682: 2607:Predictive type 2602: 2527: 2494:LZSS + Huffman 2444:LZ77 + Huffman 2433: 2343: 2279:Dictionary type 2273: 2175:Adaptive coding 2152: 2146: 2110:Wayback Machine 2080: 2078: 2060: 2058: 2040: 2038: 2015: 2013: 2005: 2002: 1991: 1967: 1957:Morgan Kaufmann 1947: 1945:Further reading 1942: 1941: 1931: 1929: 1923: 1919: 1908: 1904: 1891: 1890: 1886: 1876: 1874: 1864: 1831: 1827: 1797: 1794: 1793: 1788: 1772: 1768: 1760: 1756: 1746: 1744: 1736: 1735: 1731: 1711: 1682: 1675: 1667: 1663: 1650: 1649: 1645: 1632: 1631: 1627: 1620:mattmahoney.net 1614: 1613: 1609: 1602:mattmahoney.net 1596: 1595: 1591: 1586:. pp. 3–5. 1583: 1577: 1573: 1557: 1556: 1551: 1543: 1539: 1494: 1490: 1483: 1467: 1463: 1448: 1422: 1418: 1376: 1372: 1365: 1357:. p. 355. 
1345: 1341: 1333: 1286: 1280: 1276: 1266: 1264: 1248: 1244: 1227: 1226: 1222: 1207: 1203: 1193: 1191: 1183: 1182: 1178: 1173: 1168: 1098: 1089: 1072: 1039: 934:file of length 881: 837:Turing machines 782: 761: 755: 728: 705: 694: 681: 546: 544:Raster graphics 443: 414: 322: 320:General purpose 303: 293: 260: 256: 254: 216: 191: 178:general-purpose 134: 28: 23: 22: 15: 12: 11: 5: 3363: 3353: 3352: 3347: 3330: 3329: 3327: 3326: 3311: 3300: 3297: 3296: 3294: 3293: 3287: 3285: 3281: 3280: 3278: 3277: 3271: 3269: 3265: 3264: 3262: 3261: 3256: 3251: 3246: 3241: 3236: 3231: 3226: 3225: 3224: 3214: 3209: 3208: 3207: 3202: 3191: 3189: 3183: 3182: 3179: 3178: 3176: 3175: 3174: 3173: 3168: 3158: 3157: 3156: 3151: 3146: 3138: 3133: 3128: 3123: 3117: 3115: 3108: 3107: 3105: 3104: 3099: 3094: 3089: 3084: 3079: 3074: 3069: 3068: 3067: 3062: 3057: 3046: 3044: 3037: 3031: 3030: 3027: 3026: 3024: 3023: 3022: 3021: 3016: 3011: 3006: 2996: 2991: 2986: 2981: 2976: 2971: 2966: 2960: 2958: 2954: 2953: 2951: 2950: 2945: 2940: 2935: 2930: 2925: 2920: 2915: 2910: 2905: 2900: 2894: 2892: 2885: 2879: 2878: 2875: 2874: 2872: 2871: 2866: 2861: 2860: 2859: 2854: 2849: 2844: 2839: 2829: 2828: 2827: 2817: 2816: 2815: 2810: 2800: 2795: 2789: 2787: 2780: 2779: 2777: 2776: 2771: 2766: 2761: 2756: 2751: 2746: 2741: 2736: 2731: 2726: 2725: 2724: 2719: 2714: 2703: 2701: 2694: 2688: 2687: 2684: 2683: 2681: 2680: 2678:Psychoacoustic 2675: 2674: 2673: 2668: 2663: 2655: 2654: 2653: 2648: 2643: 2638: 2633: 2623: 2622: 2621: 2610: 2608: 2604: 2603: 2601: 2600: 2599: 2598: 2593: 2588: 2578: 2573: 2568: 2567: 2566: 2561: 2550: 2548: 2546:Transform type 2539: 2533: 2532: 2529: 2528: 2526: 2525: 2524: 2523: 2515: 2514: 2513: 2510: 2502: 2501: 2500: 2492: 2491: 2490: 2482: 2481: 2480: 2472: 2471: 2470: 2462: 2461: 2460: 2455: 2450: 2441: 2439: 2435: 2434: 2432: 2431: 2426: 2421: 2416: 2411: 2406: 2405: 2404: 2399: 2389: 2384: 2379: 2378: 2377: 2367: 2362: 2357: 2351: 2349: 2345: 2344: 2342: 2341: 2340: 2339: 2334: 2329: 2324: 2319: 2314: 2309: 2304: 2299: 2289: 2283: 2281: 2275: 2274: 2272: 2271: 2270: 2269: 2264: 2259: 2254: 2244: 2239: 2234: 2229: 2224: 2219: 2214: 2213: 2212: 2207: 2202: 2192: 2187: 2182: 2177: 2171: 2169: 2160: 2154: 2153: 2145: 2144: 2137: 2130: 2122: 2116: 2115: 2114: 2113: 2087: 2067: 2047: 2037:on May 8, 2016 2022: 2001: 2000:External links 1998: 1997: 1996: 1989: 1981:Academic Press 1979:(1 ed.). 1972: 1965: 1946: 1943: 1940: 1939: 1917: 1902: 1884: 1862: 1848:. p. 21. 1842:Proof Patterns 1834:Joshi, Mark S. 
1825: 1810: 1807: 1804: 1801: 1786: 1766: 1754: 1729: 1709: 1673: 1661: 1643: 1625: 1607: 1589: 1571: 1537: 1488: 1481: 1461: 1446: 1416: 1370: 1363: 1355:Academic Press 1339: 1274: 1242: 1220: 1201: 1175: 1174: 1172: 1169: 1167: 1166: 1161: 1156: 1151: 1145: 1143:List of codecs 1140: 1135: 1130: 1125: 1120: 1115: 1110: 1105: 1099: 1097: 1094: 1088: 1085: 1071: 1068: 1041:Abstractly, a 1038: 1035: 959: 958: 955: 947: 920: 897: 893: 888:, as follows: 880: 877: 869: 868: 860: 840: 833: 816: 812:Calgary Corpus 794:context-mixing 781: 778: 757:Main article: 754: 751: 727: 724: 704: 701: 693: 690: 689: 688: 680: 677: 676: 675: 669: 663: 657: 651: 645: 639: 633: 624: 610: 604: 598: 588: 582: 569: 559: 553: 545: 542: 541: 540: 534: 528: 525: 519: 513: 507: 502: 496: 493:Monkey's Audio 490: 484: 478: 473: 468: 462: 456: 453:Apple Lossless 450: 442: 439: 438: 437: 431: 422: 421: 420: 404: 394: 381: 357: 354:Huffman coding 351: 342: 336: 321: 318: 292: 289: 281:delta encoding 266:autoregressive 247:indexed images 215: 212: 190: 187: 183:indexed images 146:Huffman coding 133: 130: 122:Lossless audio 35:is a class of 26: 9: 6: 4: 3: 2: 3362: 3351: 3348: 3346: 3343: 3342: 3340: 3324: 3320: 3312: 3310: 3302: 3301: 3298: 3292: 3289: 3288: 3286: 3282: 3276: 3273: 3272: 3270: 3266: 3260: 3257: 3255: 3252: 3250: 3247: 3245: 3242: 3240: 3237: 3235: 3232: 3230: 3227: 3223: 3220: 3219: 3218: 3215: 3213: 3210: 3206: 3203: 3201: 3198: 3197: 3196: 3193: 3192: 3190: 3188: 3184: 3172: 3169: 3167: 3164: 3163: 3162: 3159: 3155: 3152: 3150: 3147: 3145: 3142: 3141: 3139: 3137: 3134: 3132: 3129: 3127: 3124: 3122: 3119: 3118: 3116: 3113: 3109: 3103: 3102:Video quality 3100: 3098: 3095: 3093: 3090: 3088: 3085: 3083: 3080: 3078: 3075: 3073: 3070: 3066: 3063: 3061: 3058: 3056: 3053: 3052: 3051: 3048: 3047: 3045: 3041: 3038: 3036: 3032: 3020: 3017: 3015: 3012: 3010: 3007: 3005: 3002: 3001: 3000: 2997: 2995: 2992: 2990: 2987: 2985: 2982: 2980: 2977: 2975: 2972: 2970: 2967: 2965: 2962: 2961: 2959: 2955: 2949: 2946: 2944: 2941: 2939: 2936: 2934: 2931: 2929: 2926: 2924: 2921: 2919: 2916: 2914: 2911: 2909: 2906: 2904: 2901: 2899: 2896: 2895: 2893: 2889: 2886: 2884: 2880: 2870: 2867: 2865: 2862: 2858: 2855: 2853: 2850: 2848: 2845: 2843: 2840: 2838: 2835: 2834: 2833: 2830: 2826: 2823: 2822: 2821: 2818: 2814: 2811: 2809: 2806: 2805: 2804: 2801: 2799: 2796: 2794: 2791: 2790: 2788: 2785: 2781: 2775: 2772: 2770: 2769:Speech coding 2767: 2765: 2764:Sound quality 2762: 2760: 2757: 2755: 2752: 2750: 2747: 2745: 2742: 2740: 2739:Dynamic range 2737: 2735: 2732: 2730: 2727: 2723: 2720: 2718: 2715: 2713: 2710: 2709: 2708: 2705: 2704: 2702: 2698: 2695: 2693: 2689: 2679: 2676: 2672: 2669: 2667: 2664: 2662: 2659: 2658: 2656: 2652: 2649: 2647: 2644: 2642: 2639: 2637: 2634: 2632: 2629: 2628: 2627: 2624: 2620: 2617: 2616: 2615: 2612: 2611: 2609: 2605: 2597: 2594: 2592: 2589: 2587: 2584: 2583: 2582: 2579: 2577: 2574: 2572: 2569: 2565: 2562: 2560: 2557: 2556: 2555: 2552: 2551: 2549: 2547: 2543: 2540: 2538: 2534: 2522: 2519: 2518: 2516: 2511: 2509: 2506: 2505: 2504:LZ77 + Range 2503: 2499: 2496: 2495: 2493: 2489: 2486: 2485: 2483: 2479: 2476: 2475: 2473: 2469: 2466: 2465: 2463: 2459: 2456: 2454: 2451: 2449: 2446: 2445: 2443: 2442: 2440: 2436: 2430: 2427: 2425: 2422: 2420: 2417: 2415: 2412: 2410: 2407: 2403: 2400: 2398: 2395: 2394: 2393: 2390: 2388: 2385: 2383: 2380: 2376: 2373: 2372: 2371: 2368: 2366: 2363: 2361: 2358: 2356: 2353: 2352: 2350: 2346: 2338: 2335: 2333: 2330: 2328: 2325: 2323: 2320: 2318: 2315: 2313: 2310: 2308: 2305: 2303: 2300: 2298: 
2295: 2294: 2293: 2290: 2288: 2285: 2284: 2282: 2280: 2276: 2268: 2265: 2263: 2260: 2258: 2255: 2253: 2250: 2249: 2248: 2245: 2243: 2240: 2238: 2235: 2233: 2230: 2228: 2225: 2223: 2220: 2218: 2215: 2211: 2208: 2206: 2203: 2201: 2198: 2197: 2196: 2193: 2191: 2188: 2186: 2183: 2181: 2178: 2176: 2173: 2172: 2170: 2168: 2164: 2161: 2159: 2155: 2150: 2143: 2138: 2136: 2131: 2129: 2124: 2123: 2120: 2111: 2107: 2104: 2101: 2100: 2096: 2092: 2088: 2076: 2072: 2068: 2056: 2052: 2048: 2036: 2032: 2028: 2025:Phamdo, Nam. 2023: 2012: 2008: 2004: 2003: 1992: 1986: 1982: 1978: 1973: 1968: 1962: 1958: 1954: 1949: 1948: 1928: 1921: 1913: 1906: 1898: 1894: 1888: 1873: 1869: 1865: 1859: 1855: 1851: 1847: 1843: 1839: 1835: 1829: 1822: 1805: 1799: 1789: 1787:0-387-94053-7 1783: 1779: 1778: 1770: 1764:, p. 38. 1763: 1758: 1743: 1739: 1733: 1726: 1720: 1716: 1712: 1706: 1702: 1698: 1694: 1690: 1686: 1680: 1678: 1671:, p. 41. 1670: 1665: 1657: 1653: 1647: 1639: 1635: 1629: 1621: 1617: 1611: 1603: 1599: 1593: 1582: 1575: 1567: 1561: 1550: 1549: 1541: 1533: 1529: 1524: 1519: 1515: 1511: 1507: 1503: 1499: 1492: 1484: 1478: 1475:. CRC Press. 1474: 1473: 1465: 1457: 1453: 1449: 1447:0-7803-4428-6 1443: 1439: 1435: 1431: 1427: 1420: 1412: 1408: 1404: 1400: 1396: 1392: 1388: 1384: 1380: 1374: 1366: 1364:9780080922508 1360: 1356: 1352: 1351: 1343: 1332: 1328: 1324: 1320: 1316: 1312: 1308: 1304: 1300: 1296: 1292: 1285: 1278: 1267:September 13, 1263: 1259: 1258: 1253: 1246: 1238: 1234: 1230: 1224: 1216: 1212: 1205: 1190: 1186: 1180: 1176: 1165: 1162: 1160: 1159:Normal number 1157: 1155: 1152: 1149: 1146: 1144: 1141: 1139: 1136: 1134: 1131: 1129: 1126: 1124: 1121: 1119: 1116: 1114: 1111: 1109: 1106: 1104: 1101: 1100: 1093: 1084: 1082: 1078: 1067: 1065: 1061: 1057: 1053: 1048: 1044: 1034: 1032: 1028: 1027: 1020: 1018: 1014: 1008: 1006: 1002: 998: 993: 990: 984: 981: 976: 973: 967: 965: 956: 952: 948: 945: 941: 937: 933: 929: 925: 921: 918: 914: 910: 906: 902: 898: 894: 891: 890: 889: 887: 876: 872: 866: 861: 858: 854: 850: 846: 841: 838: 834: 831: 828: 825: 821: 817: 813: 809: 808: 807: 805: 801: 797: 795: 791: 787: 777: 775: 771: 767: 760: 750: 748: 742: 740: 736: 732: 723: 721: 720:cryptanalysis 717: 713: 709: 708:Cryptosystems 700: 699: 686: 683: 682: 673: 670: 667: 664: 661: 658: 655: 652: 649: 646: 643: 640: 637: 634: 632: 628: 625: 622: 618: 614: 611: 608: 605: 602: 599: 596: 592: 589: 586: 583: 580: 577: 573: 570: 567: 563: 560: 557: 554: 551: 548: 547: 538: 535: 532: 529: 526: 523: 520: 517: 514: 511: 508: 506: 503: 500: 497: 494: 491: 488: 485: 482: 479: 477: 474: 472: 469: 466: 463: 460: 457: 454: 451: 448: 445: 444: 435: 432: 430: 426: 423: 417: 412: 408: 405: 402: 398: 395: 393: 389: 385: 382: 379: 375: 371: 367: 364: 363: 361: 358: 355: 352: 350: 346: 343: 340: 337: 335: 331: 327: 324: 323: 317: 314: 310: 302: 298: 288: 286: 282: 278: 273: 271: 267: 250: 248: 243: 241: 237: 233: 229: 225: 221: 220:United States 211: 207: 204: 200: 195: 186: 184: 179: 174: 171: 166: 161: 159: 155: 151: 147: 142: 139: 129: 127: 123: 119: 115: 111: 107: 101: 99: 95: 91: 87: 83: 78: 76: 72: 67: 65: 60: 58: 54: 50: 46: 42: 38: 34: 30: 19: 3275:Hutter Prize 3239:Quantization 3144:Compensation 2938:Quantization 2661:Compensation 2227:Shannon–Fano 2167:Entropy type 2157: 2099:overview of 2079:. Retrieved 2074: 2059:. Retrieved 2054: 2039:. Retrieved 2035:the original 2030: 2014:. Retrieved 2010: 1976: 1952: 1930:. Retrieved 1920: 1905: 1897:PKWARE, Inc. 1887: 1875:. Retrieved 1841: 1828: 1791: 1776: 1769: 1757: 1745:. 
Retrieved 1741: 1732: 1688: 1664: 1655: 1646: 1638:the original 1628: 1619: 1610: 1601: 1592: 1574: 1547: 1540: 1505: 1501: 1491: 1471: 1464: 1429: 1419: 1386: 1382: 1379:Ahmed, Nasir 1373: 1349: 1342: 1331:the original 1294: 1290: 1277: 1265:. Retrieved 1255: 1245: 1237:the original 1233:About Unisys 1232: 1223: 1214: 1204: 1192:. Retrieved 1188: 1179: 1128:Hutter Prize 1090: 1073: 1063: 1059: 1040: 1024: 1021: 1016: 1012: 1009: 1000: 994: 985: 979: 977: 971: 968: 960: 950: 943: 939: 935: 931: 927: 923: 916: 912: 908: 907:with length 904: 900: 882: 873: 870: 820:Hutter Prize 803: 800:Matt Mahoney 798: 783: 762: 743: 729: 711: 706: 703:Cryptography 695: 620: 616: 537:WMA Lossless 471:Dolby TrueHD 315: 304: 274: 269: 251: 244: 217: 208: 196: 192: 177: 175: 169: 164: 162: 143: 137: 135: 102: 79: 68: 61: 32: 31: 29: 3234:Prefix code 3087:Frame types 2908:Color space 2734:Convolution 2464:LZ77 + ANS 2375:Incremental 2348:Other types 2267:Levenshtein 2081:October 17, 2061:October 17, 2041:October 17, 2016:October 17, 1995:(488 pages) 1971:(790 pages) 1762:Sayood 2002 1747:October 30, 1669:Sayood 2002 1508:(20): 1–7. 1189:bjc.edc.org 879:Limitations 753:Executables 679:3D Graphics 629:– Lossless 615:– formerly 126:lossy audio 41:information 3339:Categories 3291:Mark Adler 3249:Redundancy 3166:Daubechies 3149:Estimation 3082:Frame rate 3004:Daubechies 2964:Chain code 2923:Macroblock 2729:Companding 2666:Estimation 2586:Daubechies 2292:Lempel–Ziv 2252:Exp-Golomb 2180:Arithmetic 1877:August 24, 1656:Free Tools 1171:References 1077:heuristics 989:redundancy 786:benchmarks 780:Benchmarks 774:JavaScript 747:eukaryotes 516:RealPlayer 499:MPEG-4 SLS 429:plain text 295:See also: 206:peaked. 189:Multimedia 132:Techniques 75:redundancy 3268:Community 3092:Interlace 2478:Zstandard 2257:Fibonacci 2247:Universal 2205:Canonical 1872:116983697 1685:Bell, Tim 1634:"Summary" 1560:cite book 1056:injection 954:lossless. 832:data set. 824:Knowledge 815:Broukhis. 591:JPEG 2000 505:OptimFROG 334:Zstandard 3254:Symmetry 3222:Timeline 3205:FM-index 3050:Bit rate 3043:Concepts 2891:Concepts 2754:Sampling 2707:Bit rate 2700:Concepts 2402:Sequitur 2237:Tunstall 2210:Modified 2200:Adaptive 2158:Lossless 2106:Archived 2095:Archived 1846:Springer 1719:26313283 1693:Springer 1532:22844100 1456:17045923 1411:13894279 1319:18237979 1194:April 9, 1096:See also 1052:lossless 1047:function 922:Because 853:flashzip 621:HD Photo 416:compress 203:JPEG2000 170:Adaptive 18:Lossless 3212:Entropy 3161:Wavelet 3140:Motion 2999:Wavelet 2979:Fractal 2974:Deflate 2957:Methods 2744:Latency 2657:Motion 2581:Wavelet 2498:LHA/LZH 2448:Deflate 2397:Re-Pair 2392:Grammar 2222:Shannon 2195:Huffman 2151:methods 1932:June 8, 1725:pp. 8–9 1523:3488212 1391:Bibcode 1327:2765169 1299:Bibcode 972:greater 964:deflate 845:FreeArc 685:OpenCTM 617:WMPhoto 613:JPEG XR 607:JPEG XL 601:JPEG-LS 581:images) 531:WavPack 522:Shorten 449:(ATRAC) 419:utility 366:Deflate 291:Methods 261:‍ 257:‍ 238:-based 3323:codecs 3284:People 3187:Theory 3154:Vector 2671:Vector 2488:Brotli 2438:Hybrid 2337:Snappy 2190:Golomb 2011:github 1987:  1963:  1870:  1860:  1784:  1717:  1707:  1530:  1520:  1479:  1454:  1444:  1409:  1361:  1325:  1317:  1150:(LTAC) 1001:define 997:random 980:subset 867:1.30c. 
855:, and 739:HapMap 712:before 483:(FLAC) 401:WinRAR 380:images 376:, and 165:static 152:) and 71:random 3114:parts 3112:Codec 3077:Frame 3035:Video 3019:SPIHT 2928:Pixel 2883:Image 2837:ACELP 2808:ADPCM 2798:μ-law 2793:A-law 2786:parts 2784:Codec 2692:Audio 2631:ACELP 2619:ADPCM 2596:SPIHT 2537:Lossy 2521:bzip2 2512:LZHAM 2468:LZFSE 2370:Delta 2262:Gamma 2242:Unary 2217:Range 1868:S2CID 1715:S2CID 1584:(PDF) 1552:(PDF) 1452:S2CID 1407:S2CID 1334:(PDF) 1323:S2CID 1287:(PDF) 1257:ITU-T 932:every 896:file. 857:7-Zip 830:UTF-8 692:Video 585:JBIG2 576:Amiga 524:(SHN) 512:(OSQ) 489:(MLP) 467:(DST) 441:Audio 349:bzip2 330:LZFSE 307:(see 270:error 88:tool 3126:DPCM 2933:PSNR 2864:MDCT 2857:WLPC 2842:CELP 2803:DPCM 2651:WLPC 2636:CELP 2614:DPCM 2564:MDCT 2508:LZMA 2409:LDCT 2387:DPCM 2332:LZWL 2322:LZSS 2317:LZRW 2307:LZJB 2083:2017 2063:2017 2043:2017 2018:2017 1985:ISBN 1961:ISBN 1934:2009 1879:2021 1858:ISBN 1782:ISBN 1749:2022 1705:ISBN 1566:link 1528:PMID 1477:ISBN 1442:ISBN 1387:2419 1359:ISBN 1315:PMID 1269:2019 1196:2022 926:< 899:Let 865:ccmx 810:The 766:demo 696:See 672:WebP 666:TIFF 627:LDCT 619:and 572:ILBM 566:HEVC 562:HEIF 556:FLIF 550:AVIF 390:and 388:7zip 374:gzip 332:and 299:and 255:data 236:LZ77 116:and 114:TIFF 90:gzip 53:data 3171:DWT 3121:DCT 3065:VBR 3060:CBR 3055:ABR 3014:EZW 3009:DWT 2994:RLE 2984:KLT 2969:DCT 2852:LSP 2847:LAR 2832:LPC 2825:FFT 2722:VBR 2717:CBR 2712:ABR 2646:LSP 2641:LAR 2626:LPC 2591:DWT 2576:FFT 2571:DST 2559:DCT 2458:LZS 2453:LZX 2429:RLE 2424:PPM 2419:PAQ 2414:MTF 2382:DMC 2360:CTW 2355:BWT 2327:LZW 2312:LZO 2302:LZ4 2297:842 1850:doi 1697:doi 1518:PMC 1510:doi 1434:doi 1399:doi 1307:doi 1081:zip 849:CCM 827:XML 660:TGA 654:PNG 648:QOI 642:PDF 636:PCX 579:IFF 411:GIF 378:PNG 370:ZIP 326:ANS 224:LZW 118:MNG 110:GIF 108:or 106:PNG 98:MP3 86:GNU 82:ZIP 3341:: 2989:LP 2820:FT 2813:DM 2365:CM 2093:. 2073:. 2053:. 2029:. 2009:. 1983:. 1959:. 1895:. 1866:. 1856:. 1844:. 1840:. 1790:. 1740:. 1713:. 1703:. 1676:^ 1654:. 1618:. 1600:. 1562:}} 1558:{{ 1526:. 1516:. 1506:40 1504:. 1500:. 1450:. 1440:. 1428:. 1405:. 1397:. 1385:. 1353:. 1321:. 1313:. 1305:. 1295:12 1293:. 1289:. 1260:. 1254:. 1231:. 1213:. 1187:. 1026:pi 1007:. 930:, 851:, 847:, 776:. 770:1k 392:xz 372:, 201:. 185:. 3325:) 3321:( 2141:e 2134:t 2127:v 2085:. 2065:. 2045:. 2020:. 1993:. 1969:. 1936:. 1914:. 1881:. 1852:: 1809:) 1806:x 1803:( 1800:C 1751:. 1727:. 1721:. 1699:: 1622:. 1604:. 1568:) 1534:. 1512:: 1485:. 1458:. 1436:: 1413:. 1401:: 1393:: 1367:. 1309:: 1301:: 1271:. 1217:. 1198:. 1064:N 1060:N 1017:N 1013:N 951:N 946:. 944:N 940:F 936:N 928:M 924:N 919:. 917:F 913:N 909:M 905:F 901:M 859:. 839:. 597:) 568:) 285:Δ 259:— 20:)
