**Information ≡ that which reduces uncertainty about the state of a system.**
It is correct that Shannon entropy is syntax-only, and [[003_A1.3_Information-Primacy|A1.3]] establishes that χ carries semantic content. This definition captures the minimal operational criterion: semantic information is uncertainty reduction about meaning, which still fits. The χ-field adds the semantic layer; this definition supplies the quantitative foundation.
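A minimal numeric sketch of the operational criterion (plain Python; the 8-sided-die example is illustrative): information received equals entropy before minus entropy after.

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Illustrative example: a fair 8-sided die carries 3 bits of uncertainty.
prior = [1/8] * 8
# The message "the outcome is even" halves the possibilities.
posterior = [1/4] * 4

# Information received = uncertainty before minus uncertainty after.
print(shannon_entropy(prior) - shannon_entropy(posterior))  # 1.0 bit
```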
Von Neumann entropy S(ρ) = -Tr(ρ log ρ) is the quantum generalization and still measures uncertainty reduction. The definition holds; only the mathematical formalism changes. A qubit's information content is still "that which reduces uncertainty about the quantum state."
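A minimal sketch of S(ρ), assuming NumPy and measuring in bits (log₂ rather than the natural log written above):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 * log 0 = 0
    s = -np.sum(evals * np.log2(evals))
    return float(max(s, 0.0))             # clip the floating-point -0.0

# A pure state has zero entropy; a maximally mixed qubit has one full bit.
pure  = np.array([[1.0, 0.0], [0.0, 0.0]])
mixed = np.eye(2) / 2
print(von_neumann_entropy(pure))   # 0.0: the state is fully known
print(von_neumann_entropy(mixed))  # 1.0: one bit of uncertainty remains
```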
Kolmogorov complexity K(x) measures the minimal description length of x—which is itself a measure of how much uncertainty is reduced by receiving x. They are complementary, not competing. Both confirm information as uncertainty reduction.
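K(x) is uncomputable, but compressed length gives a computable upper bound on it (up to an additive constant); a sketch using Python's zlib, where the choice of compressor is arbitrary:

```python
import os
import zlib

def compressed_size(x: bytes) -> int:
    """zlib-compressed length: a computable upper bound (up to an
    additive constant) on the Kolmogorov complexity K(x)."""
    return len(zlib.compress(x, 9))

patterned  = b"ab" * 500       # a short description exists
random_ish = os.urandom(1000)  # incompressible with high probability

print(compressed_size(patterned))   # small: the string carries little information
print(compressed_size(random_ish))  # ~1000+: every byte reduces uncertainty
```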
Maxwell's demon resolution (Szilard 1929, Bennett 1982):
The demon must acquire information about particle positions. Bennett showed the measurement itself can be made thermodynamically reversible; the unavoidable cost is Landauer's: erasing the demon's memory dissipates k_B T ln 2 per bit, exactly compensating the work extracted. Information acquisition and erasure are physical processes.
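A back-of-the-envelope check of the Landauer cost, with 300 K chosen as an illustrative room temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def landauer_cost(bits: float, temperature: float) -> float:
    """Minimum heat dissipated (joules) to erase `bits` of information
    at the given temperature, per Landauer's principle."""
    return bits * K_B * temperature * math.log(2)

# Erasing one bit at room temperature (300 K):
print(landauer_cost(1, 300.0))  # ~2.87e-21 J
```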
Entropy-information correspondence:
S = k_B ln(2) · H   (H in bits; with natural-log H, simply S = k_B H)
Boltzmann entropy S relates to Shannon entropy H by Boltzmann's constant and a log-base factor. Thermodynamic entropy IS information, measured in different units.
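A unit-conversion sketch; the mole of maximally uncertain two-state systems is an illustrative example (numerically R ln 2 ≈ 5.76 J/K):

```python
import math

K_B = 1.380649e-23   # J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol

def thermodynamic_entropy(H_bits: float) -> float:
    """Convert Shannon entropy in bits to thermodynamic entropy in J/K:
    S = k_B ln(2) * H  (with natural-log H, simply S = k_B * H)."""
    return K_B * math.log(2) * H_bits

# One mole of two-state systems, each maximally uncertain (1 bit apiece):
print(thermodynamic_entropy(N_A))  # ~5.76 J/K
```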
Black hole thermodynamics:
Bekenstein-Hawking entropy S_BH = k_B c³ A / (4 G ħ) is proportional to horizon area A, not volume: a black hole's information content is counted in Planck-area units on its boundary, and the holographic bound caps the information in any region by the area enclosing it. Again, information behaves as a physical quantity with a definite measure.
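To put a number on it, a sketch computing the Bekenstein-Hawking entropy of a Schwarzschild black hole in bits; the solar-mass input is illustrative:

```python
import math

# CODATA constants (SI units)
G    = 6.67430e-11      # m^3 kg^-1 s^-2
c    = 2.99792458e8     # m/s
hbar = 1.054571817e-34  # J s
k_B  = 1.380649e-23     # J/K

def bh_entropy_bits(mass_kg: float) -> float:
    """Bekenstein-Hawking entropy S = k_B c^3 A / (4 G hbar) of a
    Schwarzschild black hole, converted to bits via S / (k_B ln 2)."""
    r_s  = 2 * G * mass_kg / c**2   # Schwarzschild radius
    area = 4 * math.pi * r_s**2     # horizon area
    S = k_B * c**3 * area / (4 * G * hbar)
    return S / (k_B * math.log(2))

M_sun = 1.989e30  # kg
print(f"{bh_entropy_bits(M_sun):.2e} bits")  # ~1.5e77 bits
```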
Shannon entropy is uniquely characterized by these axioms (Khinchin 1957):
1. Continuity: H(p_1, ..., p_n) is continuous in all p_i
2. Maximum: H(1/n, ..., 1/n) = f(n) is monotonically increasing in n
3. Recursivity: H(p_1, ..., p_n) = H(p_1 + p_2, p_3, ..., p_n) + (p_1 + p_2) H(p_1/(p_1 + p_2), p_2/(p_1 + p_2))
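The recursivity axiom can be verified numerically; a sketch with an arbitrarily chosen distribution:

```python
import math

def H(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Check the recursivity (grouping) axiom on an arbitrary distribution:
p = [0.5, 0.2, 0.2, 0.1]
w = p[0] + p[1]
lhs = H(p)
rhs = H([w, p[2], p[3]]) + w * H([p[0] / w, p[1] / w])
print(abs(lhs - rhs) < 1e-12)  # True: grouping leaves entropy invariant
```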
Uniqueness theorem: Any function satisfying these axioms has the form:
H(p_1, ..., p_n) = -k ∑_i p_i log p_i
for some constant k > 0; the choice of k and of log base only fixes the units (bits for log₂, nats for ln). Shannon entropy is the ONLY measure of uncertainty consistent with these axioms, up to units.
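A brute-force check of the maximum axiom: no randomly sampled distribution on five outcomes exceeds the uniform one's entropy (the sample count and n = 5 are arbitrary choices):

```python
import math
import random

def H(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

n = 5
uniform = H([1/n] * n)  # log2(5) ≈ 2.3219 bits: the axiom-2 maximum

# No randomly sampled distribution on n outcomes exceeds it:
for _ in range(10_000):
    raw = [random.random() for _ in range(n)]
    total = sum(raw)
    assert H([x / total for x in raw]) <= uniform + 1e-12
print("uniform maximizes H:", uniform)
```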