March 14, 2026 · 3 min

A Loss Function for the Universe

Five independent anomalies, five different fields, one shape — and what it tells us to measure next.

Here is a small experiment. Line up five of the most stubborn anomalies in contemporary science side by side.

The first is galactic rotation. Stars at the edges of spiral galaxies move faster than Newton says they should, given the mass we can see. The standard fix is to postulate large amounts of invisible matter — dark matter — distributed in halos around each galaxy. This works, in the sense that models with the right halo reproduce the curves, but it has not quite settled into the kind of discovery that ends an argument.

The second is the matter–antimatter asymmetry. The Standard Model predicts that the Big Bang should have produced equal amounts of both; the universe we live in is made of matter, and the small amount of antimatter we detect is consistent with cosmic rays and collider physics, not primordial asymmetry. There are several proposed mechanisms — leptogenesis, electroweak baryogenesis — but none have been confirmed.

The third is the measurement problem in quantum mechanics. The Schrödinger equation evolves states unitarily until we measure them, at which point something happens — wave function collapse, or an effective collapse — that is not itself part of the equation. Interpretations multiply; the experimental record keeps refusing to single one out.

The fourth is consciousness. There is no accepted account of why a particular configuration of physical matter gives rise to subjective experience. Integrated Information Theory, Global Workspace, Predictive Processing — all plausible, none decisive.

The fifth is cellular bioelectricity. In the last decade it has become clear that cells compute collectively via bioelectric gradients in ways that look uncomfortably like the kind of coordination you would expect in a distributed system with shared state. Michael Levin's lab has kept producing results that are hard to reconcile with purely local chemistry.

These five live in five different journals. The usual thing to do with a list like this is to take each in turn and let the specialists fight it out.

Let me propose a different move. Let's ask what happens if we try to fit them with one loss function.

Why a loss function is a different kind of object

In machine learning, a loss function is the thing you minimise. It takes the current state of a system and says: here is how far from the target we are. Gradient descent is the procedure that nudges the system in the direction that reduces the loss.
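To make the mechanics concrete, here is a minimal sketch of a loss function and the descent that minimises it. The quadratic loss, target, and learning rate are illustrative choices of mine, not anything from the research programme discussed below.

```python
# Minimal sketch: a loss function and gradient descent on it.
# The quadratic loss and the learning rate are illustrative only.

def loss(x, target=3.0):
    """How far the current state is from the target."""
    return (x - target) ** 2

def grad(x, target=3.0):
    """Derivative of the loss with respect to the state."""
    return 2 * (x - target)

def descend(x, steps=100, lr=0.1):
    """Repeatedly nudge the state in the direction that reduces the loss."""
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

x_final = descend(0.0)
# After enough steps, x_final sits close to the target of 3.0.
```

The point of the toy is only that the loss is imposed from outside: nothing in `descend` knows why 3.0 is the target.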

Physics already uses objects like this — the action, the free energy, the variational principle. But there is a subtle difference. In ML, the loss is defined by what we want. The system doesn't know the loss exists; we impose it from outside. In physics, the analogous principle is that the system is already minimising something — action, entropy, free energy — and the universe falls out as the trajectory that does the minimising.

What if the five anomalies above share one loss function, and the reason they look separate is that we are looking at different coordinates of the same descent?

This is the move that Pointer Architecture, the research programme behind this site, takes seriously. The hypothesis is testable: if there is a single underlying minimisation, then certain residuals at galactic scales should correlate with certain residuals at biological scales. The preprint reports the first such test on SPARC, and the code is public.

I am not asking you to buy the hypothesis. I am asking you to notice that this is the kind of hypothesis that pays rent. If it is wrong, it is wrong in specific, measurable ways. If it is right, it rewrites a lot of textbooks.

What to measure next

Three things.

  1. Residual-structure features correlated with galactic age. Already tested in the preprint, which finds four of the six predicted features.
  2. Bioelectric coordination under perturbation consistent with a shared minimum. Hard to do without collaborators; the Levin-lab techniques are the right starting point.
  3. Conditions under which the measurement problem looks different. Quantum Zeno-like setups tuned to the predicted regime; this is where the experimental cost is highest.

Each of these is a bet. Each bet is made publicly, in advance, which is the only way to tell a scientific claim from a plausible story.

The companion book, The Celestial Code, walks through the argument in full. The preprint gives the numbers. This essay is the one-paragraph version I wished I had before writing any of it.

pointer-architecture · cosmology · consciousness