When I was in high school, one of my Physics professors (Prof Ananthan) said something profound. He was a big fan of Newton, and when we were studying light, he explained Newton’s corpuscular theory of light in great detail, including Newton’s particle-theory explanations for typically wave-like phenomena such as interference and diffraction. After he’d finished giving us a complete tour of Newton’s theories, over nearly a week, he told us: “That was the genius of Newton. He could explain everything using the wrong theory.”
That quote has stuck in my mind for the longest time. I’m a bit of a science-skeptic nowadays (story for another day), but one thing I find fascinating about science in general (and physics in particular) is that there are so many different ways to formulate a solution to the same problem. Of late, I’ve been thinking about a formulation in terms of something different: information.
To start a little tangentially: in school physics, we typically learn mechanics in terms of Newton’s laws. Newton’s laws are time-centric; they describe how things change with time. Using Newton’s differential calculus, it becomes possible to calculate, given any initial configuration of particles (their masses, positions, velocities and accelerations known), what state this configuration will be in at any point in time. The equations are differential equations with time as the independent variable and the positions of the different particles as the dependent variables.
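To make the time-centric picture concrete, here is a minimal sketch of solving Newton’s second law numerically. It assumes, purely for illustration, a single particle falling under constant gravity; the function name and parameters are my own invention, not anything from a standard library.

```python
def simulate_fall(x0, v0, g=-9.8, dt=0.001, t_end=1.0):
    """Step x'' = g forward in time with the semi-implicit Euler method.

    Time is the independent variable; position x and velocity v are the
    dependent variables, exactly as in the Newtonian formulation.
    """
    x, v, t = x0, v0, 0.0
    while t < t_end:
        v += g * dt  # update velocity from the (constant) acceleration
        x += v * dt  # update position from the new velocity
        t += dt
    return x, v

# After 1 s of free fall from rest, x is close to g/2, i.e. about -4.9 m.
x, v = simulate_fall(0.0, 0.0)
```

The point of the sketch is the shape of the computation: specify the initial state, then march it forward step by step in time.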
In one of Feynman’s lectures, he takes an interlude to talk about a different formulation of mechanics: not the differential equations describing particles’ positions, but something else. He talks about what is called the calculus of variations. In this formulation, we assume that all matter, passing through space, leaves ‘trails’ in space-time. Instead of concentrating on the particles, we concentrate on the trail. And Newton’s three laws become one very elegant law called the Law of Action: that of all the hypothetical trails that a particle can make in space, the one that ends up being made is the one that minimizes ‘action’, where action is a technical term for the time integral of the difference between kinetic and potential energy.
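In symbols (this is the standard textbook statement of the principle, not anything specific to Feynman’s lecture): the action S of a trail q(t) is the time integral of the difference between kinetic energy T and potential energy V, and the physical trail is the one for which S is stationary:

```latex
S[q] = \int_{t_1}^{t_2} \left( T - V \right)\, dt, \qquad \delta S = 0
```

Demanding that small variations of the trail leave S unchanged reproduces exactly the motion that Newton’s laws predict.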
In collegiate physics, I studied yet another formulation of mechanics, viz. the Hamiltonian. The Hamiltonian of a physical system is a function (in quantum mechanics, an operator, often written as a matrix) that represents the total energy of the system. In the Hamiltonian formulation, we are able to solve for the positions, velocities and accelerations of matter in a system by solving Hamilton’s equations of motion, into which the conservation of the system’s energy is built.
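For completeness, the textbook form (again standard, not specific to any one course): writing the Hamiltonian H(q, p) in terms of a generalized coordinate q and its conjugate momentum p, the motion follows from a pair of first-order equations,

```latex
\dot{q} = \frac{\partial H}{\partial p}, \qquad \dot{p} = -\frac{\partial H}{\partial q}
```

and when H does not depend explicitly on time, its value, the total energy, is conserved along the motion.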
For the last several days, I have been wondering whether there has been a formulation of physics that conserves (doesn’t conserve?) information. Information is well-defined in the communication space; Shannon’s ‘negative log probability’ is the starting point, though the field has moved well beyond it since.
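Shannon’s ‘negative log probability’ is concrete enough to compute. A minimal sketch (the function name is my own, for illustration):

```python
import math

def self_information(p, base=2):
    """Shannon self-information of an event with probability p: -log(p).

    With base 2 the unit is bits. Rarer events carry more information;
    a certain event (p = 1) carries none.
    """
    return -math.log(p, base)

# A fair coin flip carries 1 bit; a one-in-a-million event carries
# about 19.9 bits.
coin = self_information(0.5)
rare = self_information(1e-6)
```

Averaging this quantity over a probability distribution gives Shannon entropy, the expected information per observation.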
One interesting thing about looking at information flow in a system is that it provides a neat solution for the paradox of Maxwell’s demon. We know from the resolution of that paradox (Landauer’s principle, strictly a statement about erasing information) that handling information carries an unavoidable energy cost. So is there an equivalence there that we could pose as a conservation principle of information?
We look at the physics of the world around us and see things moving and changing. A more ‘Zen’ way of looking at it is that things are separate, and the agency that animates them is something different, called energy. A third way of looking at it would be that matter exists, energy exists, but there is also the ‘creation of information’, which doesn’t come for free but at a price, and that price may very well be energy.