Classical physics treats information as a quantifiable reduction of uncertainty: a system has a definite state, and information is what we lack about it. In quantum physics, the situation is subtler. The state itself may be indeterminate, and “information” seems to occupy a space between ontology and epistemology — sometimes objective, sometimes observer-dependent, sometimes both.
In a relational ontology, this ambiguity is unnecessary. Information is not an entity or a quantity stored in things. It is a construal of constraint — a measure of how a system's internal tensions delimit the space of possible actualisations. Information is not what a system “contains,” but how it is structured for transformation.
1. Information as Relational Constraint
- In this model, information is not a token passed between systems, but a measure of coherence under constraint.
- It quantifies how a system constrains itself — how its internal relations pattern the field of potential outcomes.
- There is no “amount” of information in a particle — there is only the degree of differentiation a configuration permits under a given construal.
2. No Carriers, No Containers
- Classical and quantum information theories alike often rely on the metaphor of information-as-substance — something that can be encoded, stored, transmitted, and decoded.
- But a relational view sees no containers and no carriers: there are only relational fields resolving under shifting conditions.
- “Transmission of information” is not movement, but coherent transformation across subsystems under shared constraint (the sketch after this list illustrates one way to read that).
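To make “shared constraint” slightly more concrete, here is a minimal sketch, assuming a classical toy model of two perfectly correlated subsystems; the mutual information it computes is a property of the joint pattern, not of anything that travels from A to B. The function and variable names are my own illustration, not terminology from this post.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution over subsystems A and B (rows: A, columns: B).
# A and B are perfectly correlated: a shared constraint, not a transfer.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])

p_a = joint.sum(axis=1)  # marginal of A
p_b = joint.sum(axis=0)  # marginal of B

# Mutual information I(A;B) = H(A) + H(B) - H(A,B).
mi = entropy(p_a) + entropy(p_b) - entropy(joint.flatten())
print(mi)  # 1.0 bit, carried by the joint pattern rather than by a signal
```

Nothing in the calculation refers to a carrier; the one bit of mutual information exists only at the level of the joint distribution.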
3. Measurement as Interpretive Resolution
- A quantum measurement is not the extraction of pre-existing information from a system.
- It is the resolution of tension: the system and apparatus co-constraining each other to produce a new configuration of coherence.
- What we call “gaining information” is in fact punctualising a relational field — producing new differential structure under new constraints (see the sketch after this list).
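As a hedged illustration, the sketch below runs the textbook Born-rule-plus-projective-update recipe on a qubit in an equal superposition. The mathematics is standard; reading the update as co-constraint rather than extraction is this post's interpretive move, and the variable names are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit in an equal superposition: no pre-existing 0/1 value waiting to be extracted.
state = np.array([1.0, 1.0]) / np.sqrt(2)

# Born rule: probabilities of the two possible resolutions.
probs = np.abs(state) ** 2
outcome = rng.choice([0, 1], p=probs)

# Projective update: the measurement episode leaves behind a new,
# more tightly constrained configuration, not a report on an old one.
post_state = np.zeros(2)
post_state[outcome] = 1.0
print(outcome, post_state)
```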
4. Entropy as Potential, Not Disorder
- Information entropy, in relational terms, is not a measure of randomness or ignorance.
- It is a measure of openness — the extent to which potential remains unresolved within a given configuration.
- High entropy does not signal chaos; it signals greater relational flexibility, more available paths to coherence (the sketch after this list contrasts the two readings).
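A small numerical sketch, again with illustrative names of my own, shows the contrast: a sharply peaked distribution has already resolved most of its possibilities, while a uniform one leaves every path open. The standard Shannon formula H(p) = -∑ p log₂ p is doing the counting in both cases.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum(p * log2(p)) in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

resolved   = [0.97, 0.01, 0.01, 0.01]   # almost everything settled: little openness
open_field = [0.25, 0.25, 0.25, 0.25]   # every outcome still live: maximal openness

print(entropy(resolved))    # ~0.24 bits
print(entropy(open_field))  # 2.0 bits
```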
5. Quantum Information and Relational Dynamics
- Quantum information theory shows that entanglement, superposition, and coherence can be harnessed in computation and communication.
- But from a relational standpoint, this is not because particles “store” more bits.
- It is because relational systems can support more nuanced and distributed constraints, allowing new kinds of transformational grammar (the Bell-state sketch after this list is one standard example).
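The textbook Bell-state calculation, sketched below with my own illustrative names, shows one sense in which the constraint is distributed rather than stored locally: the two-qubit state is pure (fully specified), yet each qubit on its own is maximally mixed, so the structure sits in the relation between the parts rather than in either part.

```python
import numpy as np

# Bell state |Φ+> = (|00> + |11>) / sqrt(2), written so that psi[i, j]
# is the amplitude of |i>|j>.
psi = np.array([[1.0, 0.0],
                [0.0, 1.0]]) / np.sqrt(2)

# Reduced state of the first qubit: trace out the second.
rho_a = psi @ psi.conj().T
print(rho_a)  # 0.5 * identity: the part, taken alone, is maximally mixed

# Entropy of the part is 1 bit while the whole is pure (0 bits):
# the differential structure lives in the relation, not in either qubit.
eigs = np.linalg.eigvalsh(rho_a)
eigs = eigs[eigs > 1e-12]
print(-np.sum(eigs * np.log2(eigs)))  # 1.0
```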
Closing
Information, in a relational ontology, is not a substance, not a message, not a commodity. It is the differential structure of constrained possibility — a signature of how a system is disposed to resolve itself under the tensions that define it. It measures coherence, not content; transformability, not transmissibility.
In the next post, we will explore symmetry and invariance — how physics encodes conserved quantities and transformation rules, and how a relational perspective reframes these as patterns in the grammar of affordance rather than properties of objects.