Entropy is a measure of how many configurations could yield the same macrostate, and thus how probable the macrostate is. It can be a measure of information, or a measure of disorder in a physical …
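The counting definition in that first sentence can be made concrete with a small sketch, assuming the textbook coin-flip example and Boltzmann's relation S = ln W (with k_B set to 1); the example is illustrative, not from the article:

```python
from math import comb, log

# Macrostate: the number of heads k in N fair coin flips.
# Microstates: the individual flip sequences realizing that count.
N = 100

def multiplicity(k: int) -> int:
    """Number of microstates (flip sequences) with exactly k heads: C(N, k)."""
    return comb(N, k)

def entropy(k: int) -> float:
    """Boltzmann entropy S = ln W of the macrostate, with k_B = 1."""
    return log(multiplicity(k))

# The macrostate realized by the most configurations is the most probable,
# and it is precisely the one with maximal entropy: the 50/50 split.
most_probable = max(range(N + 1), key=multiplicity)
print(most_probable)                      # 50
print(entropy(50) > entropy(10))          # True
```

Here higher entropy just means "more ways to get this outcome", which is why the even split dominates, with no appeal to "disorder" needed.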
Very refreshing to see the technical content of the notion of entropy succinctly summarized in the first sentence. Too many articles about entropy fixate on interpreting entropy as “order” or “disorder” without ever giving a precise account of what entropy is.