Anonymous ID: 38e9c4 April 3, 2024, 12:52 a.m. No.20670963 >>0965

>>20670408

 

A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy).

 

The entropy of a system is related to the amount of information it contains. A highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)], whereas a string of random symbols (e.g. bits, or characters) will be much harder, if not impossible, to compress in this way.
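
The run-length idea above can be sketched in a few lines of Python (the function name and the random-string comparison are illustrative, not part of the original entry):

```python
import random
from itertools import groupby

def run_length_encode(s):
    """Collapse runs of identical symbols into (symbol, count) pairs."""
    return [(sym, len(list(run))) for sym, run in groupby(s)]

# Highly ordered: one pair describes the entire million-character string.
ordered = "0" * 1_000_000
print(run_length_encode(ordered))  # [('0', 1000000)]

# Random bits: runs are short, so the encoding is roughly as long as the input.
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(1000))
print(len(run_length_encode(noisy)))  # hundreds of pairs -- little or no compression
```

The ordered string compresses to a single (symbol, count) pair, while the random string's encoding grows with its length, matching the claim that disordered data resists this kind of compression.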

 

Shannon's formula gives the entropy H(M) of a message M in bits:

H(M) = -log2 p(M)

where p(M) is the probability of message M.
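
The formula can be evaluated directly; a minimal sketch in Python (the function name is illustrative):

```python
import math

def info_bits(p):
    """Information carried by a message of probability p, in bits: -log2 p."""
    return -math.log2(p)

print(info_bits(0.5))      # 1.0 -- a fair coin flip carries one bit
print(info_bits(1 / 256))  # 8.0 -- one of 256 equally likely bytes carries eight bits
```

Rarer messages carry more bits: halving a message's probability adds exactly one bit to its information content.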

 

(1998-11-23)