Compressibility and Entropy

Compressibility and entropy are closely related. Kolmogorov complexity is a theoretical measure of the complexity of a given string, defined as the length of the shortest program that outputs that string. This measure is non-computable, a consequence of the halting problem. Although non-computable, Kolmogorov complexity can be approximated from above by compression algorithms, and it forms a foundation for practical complexity metrics.
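As a concrete illustration, the compressed length of a string under an off-the-shelf compressor gives a crude upper bound on its Kolmogorov complexity. A minimal sketch, assuming Python's standard zlib module as the compressor:

```python
import zlib

def approx_complexity(data: bytes) -> int:
    """Upper-bound the Kolmogorov complexity of `data` by the length of its
    zlib-compressed form (the true complexity is non-computable)."""
    return len(zlib.compress(data, 9))

# A highly regular string compresses far below its raw length.
regular = b"ab" * 500  # 1000 bytes of pure repetition
print(len(regular), approx_complexity(regular))
```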

Compression algorithms aim to remove redundancy and produce a more compact representation of the input. Low-entropy inputs compress to smaller sizes that better reflect their intrinsic entropy, and the compressed representation tends to have higher entropy per symbol because most of the regularity has been squeezed out. A random input string already has high entropy and is therefore essentially incompressible. In fluid dynamics, the compressibility of a fluid describes its volumetric response to changes in pressure, and compressing a fluid reduces its overall entropy because the available phase space shrinks. The compressibility of information, by contrast, is determined by its entropy, and compressing it typically yields output with higher entropy per symbol than the original, the opposite of the fluid-dynamical case.
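To make the contrast concrete, the sketch below (again assuming Python's zlib, plus an empirical byte-level Shannon entropy) compares a highly regular string with random bytes: the regular string has low entropy and compresses dramatically, while the random bytes sit near 8 bits per byte and barely shrink at all.

```python
import math
import os
import zlib
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Empirical Shannon entropy of the byte distribution, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def compression_ratio(data: bytes) -> float:
    """Compressed size divided by original size (zlib, maximum compression)."""
    return len(zlib.compress(data, 9)) / len(data)

low_entropy = b"abcd" * 2500       # highly regular: at most 2 bits/byte
high_entropy = os.urandom(10_000)  # random bytes: close to 8 bits/byte

for name, data in [("regular", low_entropy), ("random", high_entropy)]:
    print(f"{name}: entropy={shannon_entropy(data):.2f} bits/byte, "
          f"ratio={compression_ratio(data):.2f}")
```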