Definition · Chain Position 4 of 346

INFORMATION DEFINITION

**Information ≡ that which reduces uncertainty about the state of a system.**
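The definition can be made quantitative with Shannon entropy: information received equals prior uncertainty minus posterior uncertainty. A minimal sketch (fair die, then the message "the outcome is even"):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Prior: a fair six-sided die, state fully unknown.
prior = entropy_bits([1/6] * 6)       # log2(6) ≈ 2.585 bits

# Message received: "the outcome is even" -> three equiprobable states remain.
posterior = entropy_bits([1/3] * 3)   # log2(3) ≈ 1.585 bits

info = prior - posterior              # uncertainty reduced by exactly 1 bit
print(f"information gained: {info:.3f} bits")
```

The message carries exactly one bit because it halves the number of equally likely states.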

Objections & Responses
Objection: Semantic vs Syntactic Information
"Shannon information ignores meaning. Your definition is purely syntactic."
Response

Correct that Shannon entropy is syntax-only. But [[003_A1.3_Information-Primacy|A1.3]] establishes that χ carries semantic content. This definition captures the minimal operational criterion. Semantic information is uncertainty reduction about meaning, which still fits the definition. The χ-field adds the semantic layer; this definition provides the quantitative foundation.

Objection: Quantum Information Is Different
"Quantum information (qubits) doesn't fit classical Shannon theory"
Response

Von Neumann entropy S(ρ) = -Tr(ρ log ρ) is the quantum generalization and still measures uncertainty reduction. The definition holds; only the mathematical formalism changes. A qubit's information content is still "that which reduces uncertainty about the quantum state."
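A quick numerical illustration of this point: computing S(ρ) from eigenvalues shows that a pure state carries zero uncertainty while a maximally mixed qubit carries exactly one bit. A sketch using NumPy:

```python
import numpy as np

def von_neumann_entropy(rho, base=2):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]    # convention: 0 log 0 := 0
    return float(-np.sum(evals * np.log(evals)) / np.log(base))

pure = np.array([[1, 0], [0, 0]], dtype=float)   # |0><0|: state known exactly
mixed = np.eye(2) / 2                            # maximally mixed qubit

print(von_neumann_entropy(pure))    # 0.0 bits: no uncertainty about the state
print(von_neumann_entropy(mixed))   # 1.0 bit: maximal uncertainty for one qubit
```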

Objection: [[029_D4.1_Kolmogorov-Complexity|Kolmogorov Complexity]] Alternative
"Algorithmic information (K-complexity) is more fundamental"
Response

Kolmogorov complexity K(x) measures the minimal description length of x—which is itself a measure of how much uncertainty is reduced by receiving x. They are complementary, not competing. Both confirm information as uncertainty reduction.
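One way to see the complementarity concretely: K(x) is uncomputable, but any real compressor gives a computable upper bound on it (up to an additive constant). A rough sketch using zlib as the stand-in compressor:

```python
import os
import zlib

def k_upper_bound(s: bytes) -> int:
    """Compressed length: a crude, computable upper bound on K(s)."""
    return len(zlib.compress(s, 9))

structured = b"ab" * 500       # 1000 bytes with a short description ("ab, 500 times")
random_ish = os.urandom(1000)  # 1000 bytes, incompressible with high probability

print(k_upper_bound(structured))  # small: little uncertainty reduced per byte received
print(k_upper_bound(random_ish))  # ~1000+: every byte reduces fresh uncertainty
```

The regular string has low K (a short program regenerates it), so receiving it resolves little uncertainty per byte; the random string has K near its own length, matching maximal Shannon entropy per symbol.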

Physics Layer

Thermodynamic Information

Maxwell's demon resolution (Szilard 1929, Bennett 1982):

The demon must acquire information about particle positions. Landauer's principle: erasing this information costs k_B T ln 2 per bit, exactly compensating the work extracted. Information acquisition/erasure is a physical process.
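Landauer's bound is easy to evaluate numerically. A small sketch using the exact SI value of k_B (T = 300 K is an assumed room temperature):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # assumed room temperature, K

# Minimum thermodynamic cost of erasing one bit of information:
cost_per_bit = k_B * T * math.log(2)
print(f"{cost_per_bit:.3e} J per bit")   # ≈ 2.87e-21 J
```

This is the exact amount of work the demon's measurement-and-erasure cycle gives back, closing the loophole in the second law.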

Entropy-information correspondence:

S = k_B H

Boltzmann entropy S relates to Shannon entropy H by Boltzmann's constant (with H in nats; for H in bits, S = k_B ln 2 · H). Thermodynamic entropy IS information, measured in different units.
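To make the correspondence concrete, a back-of-envelope conversion (assuming the textbook latent heat of fusion of ice, ≈334 J/g): the entropy of melting one gram of ice, re-expressed as Shannon information via S = k_B ln 2 · H_bits.

```python
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K

# Thermodynamic entropy gained melting 1 g of ice at 273.15 K:
delta_S = 334.0 / 273.15      # latent heat (J) / temperature (K) ≈ 1.22 J/K

# The same quantity expressed in bits of microstate uncertainty:
bits = delta_S / (k_B * math.log(2))
print(f"{bits:.2e} bits")     # ~1.3e23 bits
```

Roughly 10²³ bits: the observer's uncertainty about the microstate grows by about one bit per molecule-scale degree of freedom freed in melting.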

Black hole thermodynamics:

  • Bekenstein entropy: S = k_B A/(4ℓ_P²)
  • Hawking radiation carries information out
  • Information paradox resolution (via AdS/CFT): information is conserved, not destroyed
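The Bekenstein formula can be evaluated for a solar-mass black hole. A sketch using standard CODATA-level constants and ℓ_P² = ħG/c³:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
M_sun = 1.989e30   # solar mass, kg

l_P2 = hbar * G / c**3          # Planck length squared, m^2
r_s = 2 * G * M_sun / c**2      # Schwarzschild radius ≈ 2.95 km
A = 4 * math.pi * r_s**2        # horizon area, m^2

S_over_kB = A / (4 * l_P2)      # Bekenstein-Hawking entropy in units of k_B
bits = S_over_kB / math.log(2)  # the same entropy in bits
print(f"S/k_B ≈ {S_over_kB:.2e} ≈ {bits:.2e} bits")
```

About 10⁷⁷ bits for one solar mass, vastly more than the ~10⁵⁸ k_B of the star that collapsed, which is what makes the horizon the densest possible information store.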

Mathematical Layer

Shannon's Axiomatic Foundation

Shannon entropy is uniquely characterized by these axioms (Khinchin 1957):

1. Continuity: H(p₁,...,p_n) is continuous in all p_i

2. Maximum: H(1/n,...,1/n) = f(n) is monotonically increasing in n

3. Recursivity: H(p₁,...,p_n) = H(p₁+p₂,p₃,...,p_n) + (p₁+p₂)H(p₁/(p₁+p₂), p₂/(p₁+p₂))

Uniqueness theorem: Any function satisfying these axioms has the form:

H(p₁,...,p_n) = -k Σ_i p_i log p_i

for some constant k > 0. Shannon entropy is the ONLY measure of uncertainty consistent with these axioms.
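The grouping (recursivity) axiom can be checked numerically against the closed form. A short sketch, using natural log with k = 1:

```python
import math

def H(probs, k=1.0):
    """H = -k * sum p_i log p_i (natural log; k fixes the units)."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Check the recursivity axiom on a sample distribution:
p = [0.5, 0.25, 0.125, 0.125]
p1, p2 = p[0], p[1]
lhs = H(p)
rhs = H([p1 + p2] + p[2:]) + (p1 + p2) * H([p1/(p1 + p2), p2/(p1 + p2)])
print(abs(lhs - rhs) < 1e-12)   # True: grouping leaves total uncertainty unchanged
```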

Evidence

Empirical Grounding

This isn't philosophy. This is measured.

  • Landauer's Principle
  • Holevo Bound

Defeat Conditions

To Falsify This

  1. Provide an alternative definition of information that does not reference uncertainty reduction
  2. Show a case where "information" exists but uncertainty is not reduced
  3. Demonstrate that Shannon's formalization is fundamentally flawed