neuralcosmology
Essays
May 3, 2026 · 5 min

Superposition as `while(true)`: quantum mechanics through an engineer's eyes

A generative loop in code gives a surprisingly precise intuition for wave function collapse and the delayed-choice quantum eraser. Where the metaphor stops and physics begins.

Anyone who has ever run a generative loop knows the feeling. While the loop runs, every possible continuation lives in it at the same time. Step in — drop a print, take a return, read a value — and from the whole bundle of branches only one survives. The others simply do not get recorded as "what happened."
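That feeling can be put in a few lines of Python. Everything here is illustrative (the generator, the branch names, the collapse-by-random-choice rule); it is a sketch of the intuition, not any real quantum API:

```python
import random

# A toy generative loop. Until someone reads it, `state` holds every
# candidate continuation at once; stepping in commits exactly one of them.
def generative_loop(seed_branches):
    state = list(seed_branches)          # all branches live simultaneously
    while True:
        probe = yield state              # nobody looked: yield the whole bundle
        if probe is not None:            # someone stepped in to read a value
            state = [random.choice(state)]  # one branch survives as "what happened"

loop = generative_loop(["spin up", "spin down"])
bundle = next(loop)            # peek: both branches still live
assert len(bundle) == 2
collapsed = loop.send("measure")  # step in: only one branch gets recorded
assert len(collapsed) == 1
```

The point of the sketch is the asymmetry: iterating costs nothing and keeps every branch, while reading is the one operation that discards.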

That picture sits so close to what physics calls wave function collapse that it is worth asking: metaphor, or a hint about the architecture?

What the formalism actually says

The Schrödinger equation evolves a state unitarily. Meaning: as long as nobody "looks," the system sits in a linear combination of possible outcomes with known complex amplitudes. Measurement is the only step that converts amplitudes into probabilities (the Born rule: the probability of an outcome is the squared magnitude of its amplitude) and lets one branch persist.
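That division of labor fits in a short numpy sketch: unitary evolution only rotates amplitudes and preserves their norm, and only the final sampling step produces probabilities. The rotation matrix here is an arbitrary stand-in for real dynamics:

```python
import numpy as np

# A single qubit in superposition: amplitudes, not probabilities.
psi = np.array([1, 1j]) / np.sqrt(2)

# Unitary evolution (here an arbitrary rotation) preserves the total norm:
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
psi2 = U @ psi
assert np.isclose(np.linalg.norm(psi2), 1.0)

# Measurement is the only step that turns amplitudes into probabilities
# (Born rule): p(outcome) = |amplitude|^2.
probs = np.abs(psi2) ** 2
assert np.isclose(probs.sum(), 1.0)
outcome = np.random.choice([0, 1], p=probs)  # one branch persists
```

Note that nothing inside the unitary step knows anything about `np.random.choice`; that line is the part the equation does not describe.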

That step is not described inside the equation. This is exactly what is called the measurement problem. The Copenhagen interpretation (Bohr, Heisenberg) takes collapse as a primitive. Many-worlds (Everett, 1957) claims there is no collapse, only branching of the observer. Decoherence (Zeh, Zurek, 1970s onward) gives a quantitative mechanism: interaction with the environment suppresses the off-diagonal entries of the density matrix and effectively "selects" a measurement basis. It explains why we see definite states rather than their superpositions; on the question of which branch is actually realized, it is silent.
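The off-diagonal suppression is easy to show directly. In this sketch the damping rate `gamma` is a made-up constant standing in for a real system-environment coupling:

```python
import numpy as np

# Density matrix of the equal superposition (|0> + |1>)/sqrt(2):
psi = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
# [[0.5, 0.5],
#  [0.5, 0.5]]  -- the off-diagonal entries carry the coherence

# Decoherence: environmental coupling exponentially damps the off-diagonal
# entries while leaving the diagonal (the "classical" probabilities) alone.
def decohere(rho, gamma, t):
    out = rho.copy()
    damp = np.exp(-gamma * t)
    out[0, 1] *= damp
    out[1, 0] *= damp
    return out

rho_late = decohere(rho, gamma=1.0, t=10.0)
assert np.isclose(rho_late[0, 0], 0.5)   # diagonal untouched
assert abs(rho_late[0, 1]) < 1e-4        # coherence gone: a classical mixture
```

What is left is a classical probabilistic mixture; the model says nothing about which of the two diagonal entries "wins," which is exactly where decoherence goes silent.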

In code terms this is exactly the mechanic an engineer meets in distributed systems. While a transaction is in flight, its state is indeterminate to outside readers. The commit fixes one slice. All the other branches opened along the way are discarded and never reach the final database. Decoherence is the analogue of losing transactional isolation: an external observer "sees" one branch because the intermediate states have ceased to be reachable.

That is still metaphor. From here it gets more interesting.

The delayed-choice quantum eraser

The experiment where the metaphor starts to crowd reality.

Scully and Drühl proposed the scheme in 1982. It runs on entangled photon pairs: one flies to the main screen; the second carries a "tag," information about which slit the first went through. While the tag exists, there is no interference at the main screen. Erase the tag and the interference returns. Crucially, the erasure can be performed after the main photon has already hit the screen.

Kim et al. (2000, Phys. Rev. Lett. 84, 1) ran the basic version. A Korean group in 2023 (Optica 10:1, 12) reproduced the scheme on coherent photon pairs, with delays clearly outside the time window of the main measurement. In 2024 came implementations on programmable superconducting qubits: the same Scully–Drühl logic is built directly into a quantum circuit, and the choice to erase the tag is made algorithmically after the "measurement" of the main qubit (Wang et al., Phys. Rev. A 109, 2024).

The experiment does not send information backward in time. This is worth stressing, because popular retellings get it wrong again and again. Erasing the tag regroups already-recorded data: it changes the joint distribution between the main screen and the which-path channel, and interference appears in the subset selected by the new condition.

In machine-learning terms: measurement is the loss. Erasing the tag is a backward pass: it does not "come from the future," it recomputes joint statistics over a different dependency graph. The forward pass already happened; its raw outputs do not change. What changes is how we group them.
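The regrouping point can be made concrete with a toy simulation: record screen hits first, record a correlated tag, then watch fringes appear only when you condition on the tag. The cosine correlation below is an illustrative stand-in for real eraser statistics, not data from any of the experiments above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each recorded event: a screen position x plus a correlated tag (the
# idler outcome). The marginal over x is flat -- no fringes on the raw
# screen. Conditioning on the tag regroups the SAME recorded events
# into a fringe pattern. Nothing recorded ever changes.
n = 200_000
x = rng.uniform(0, 2 * np.pi, n)           # raw screen hits, recorded first
tag = rng.random(n) < np.cos(x / 2) ** 2   # correlated eraser outcome

hist_all, _ = np.histogram(x, bins=20, range=(0, 2 * np.pi))       # raw marginal
hist_plus, _ = np.histogram(x[tag], bins=20, range=(0, 2 * np.pi))  # conditioned subset

def contrast(h):
    return (h.max() - h.min()) / (h.max() + h.min())

assert contrast(hist_all) < 0.1    # no fringes in the unconditioned data
assert contrast(hist_plus) > 0.8   # fringes appear in the regrouped subset
```

The `x` array is written before `tag` is ever used, and it is never modified; the "delayed choice" only decides which histogram you draw from it.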

That is the engineering content of collapse. A quantum system is not "reality waiting for an observer." It is a distributed statistical structure in which the choice of measurement determines which joint distributions are accessible to us and which are not.

"It from bit" as a blueprint, not a slogan

John Wheeler in 1989 wrote down the line "it from bit": every physical quantity, at bottom, derives its value from binary yes/no answers — that is, from informational events. For years it sounded like a philosophical flourish without technical content.

Holographic codes (Pastawski, Yoshida, Harlow, Preskill, 2015 — the so-called HaPPY code) gave the first concrete model: spacetime as a network of tensors, with "deep" degrees of freedom reconstructible from "boundary" ones via specified unitary rules. If that picture is right, the physics of bulk objects is literally operations on a graph of pointers: add a node, rewrite an edge, repackage a codeword.

Holographic codes are not a metaphor anymore. They are a concrete mathematical object on which testable predictions hold — about reconstruction, about error tolerance, about entropy relations. That is real engineering physics. The universe is not "like" a computer. Part of its formalism coincides with the formalism of a specific class of computations.

Where the metaphor stops

No, your `while(true)` does not literally generate quantum states. On an ordinary chip, decoherence is effectively instantaneous: the environment is strongly coupled to it through thermal contact. To preserve coherence long enough for a measurable quantum effect you need cryogenic temperatures, isolation, and specially engineered degrees of freedom, exactly what quantum-computing labs spend their lives on.

And no metaphor turns measurement into magic. Bell experiments (Hanson 2015, Zeilinger 2017, NIST 2018) ruled out local realism. That means a quantum correlation is not a "hidden parameter tucked inside the system": it is something structurally stranger, with no local hidden-variable account at all. The generative-loop metaphor helps catch the shape; it does not substitute for the physics.
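The bound those experiments tested is easy to state in code. Any local hidden-variable model obeys the CHSH inequality |S| ≤ 2; quantum mechanics on a singlet pair predicts the correlation E(a, b) = -cos(a - b), which at the standard angle choices reaches 2√2:

```python
import numpy as np

# Singlet-state correlation between measurement angles a and b:
def E(a, b):
    return -np.cos(a - b)

# Standard CHSH angle choices:
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
assert abs(S) > 2                          # violates the local-realist bound
assert np.isclose(abs(S), 2 * np.sqrt(2))  # Tsirelson's bound, the quantum max
```

No assignment of pre-existing answers to all four angle pairs can reproduce that value, which is the precise sense in which the correlation is not a hidden parameter.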

Why this parallel is worth drawing

The usefulness of the engineering view is not that it "explains quantum mechanics." It is that it makes quantum mechanics discussable for people who already have working intuition about distributed systems, transaction semantics, and gradient descent. There are millions of those people now. Foundational physics has not had a pool of fresh readers like this since the decade after WWII.

The pointer architecture programme behind this site sits exactly at that seam. Its claim is narrow: certain formal structures describe computation and physics simultaneously, and that overlap yields testable predictions. No "everything is code."

The preprint reports the first such test on galaxies. The companion book, Celestial Code, walks the intuition from `while(true)` to information geometry. This essay is the cheapest bridge between the two languages. Anyone who wants to argue seriously about how reality is built should be willing to walk that bridge in both directions.

quantum-mechanics · information · measurement · Wheeler