The true sign of intelligence is not knowledge but imagination (Albert Einstein)
As you no doubt know, this year marks the hundredth anniversary of Einstein’s theory of general relativity, first presented by him in 1915 to the Prussian Academy of Sciences.
Even among physics enthusiasts, general relativity has a daunting reputation, largely because of its mathematical difficulty, and for several decades it was very much a minority interest. It has come to the foreground recently because of its relevance to advances in cosmology, which shed light on the origins and development of the universe, and also because of its predictions of exotic phenomena such as wormholes and black holes, which have entered popular consciousness as staples of science fiction. It also provided one of the entry points through which the concept of information became central to physics; the other being Boltzmann’s ideas on statistical mechanics and entropy.
Einstein, of course, had two theories of relativity: the special theory and the general theory. He derived both from a very simple starting point: the idea that the laws of physics should be found to be the same in any laboratory, regardless of where that laboratory is or how it is moving; ‘in all frames of reference’, to put it in the usual technical way.
The special theory, dating from 1905, is what most people understand by ‘relativity’. This is the theory that tells us that space and time are linked together in a four-dimensional ‘spacetime’, that nothing can exceed the speed of light, that time appears to pass more slowly for someone travelling at high speed from the perspective of someone at rest, and that mass and energy are related by the familiar equation E=mc², so that we should speak of ‘massenergy’ as a single concept.
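To put rough numbers on these claims (these are the standard textbook relations, quoted here purely for illustration): a clock moving at speed v runs slow by the factor √(1 − v²/c²), and the energy locked up in a mass m is given by the equation above. In short:

time recorded by moving clock = time at rest × √(1 − v²/c²)
E = mc²

So at 87% of the speed of light a traveller’s clock ticks at roughly half the rate of one left at home, and a single gram of matter corresponds to about 9 × 10¹³ joules of energy.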
The special theory is ‘special’ because it works only in the special circumstances where the laboratory is keeping still (at rest) or moving with a constant speed in a constant direction; put technically, in an ‘inertial frame of reference’. It doesn’t work when the laboratory is accelerating (or decelerating, or changing direction, which amounts to the same thing). And this means that, although the special theory deals with the mechanics of moving bodies and with electromagnetism, it cannot account for the force of gravity, because gravity is closely associated with acceleration.
The general theory overcame this problem by being ‘general’ in the sense of being applicable to any circumstances, involving acceleration or not, and hence being able to include gravity. It shows that the four-dimensional spacetime of special relativity is distorted, or curved, by the presence of matter, and that this curvature in turn affects the movement of matter, which follows the shortest path in curved spacetime. This gives rise to the effects described since Newton’s time as the ‘force of gravity’.
Developments over the last few decades have been rapid and led to dramatic changes in our view of the universe. There are now numerous accessible accounts, for example from Plus Maths, and in the books by Pedro Ferreira and by Pankaj Joshi.
So what has this to do with information? The concept came in through the study of one of the predictions of general relativity: ‘black holes’, created when a massive star collapses to such an extent that all its mass is concentrated at a single point, warping spacetime around it so that nothing, not even light, can escape. Anything which falls into a black hole is effectively lost from the universe. According to the ‘no hair’ theorem, established in the early 1970s, black holes retain no ‘memory’ of how they were formed: all black holes with the same mass (and spin and charge) look exactly alike, regardless of what material went into them. Stephen Hawking also showed that no physical change can ever decrease the surface area of a black hole; it always stays the same or (usually) increases. This has a clear resonance with the second law of thermodynamics, by which entropy always increases.
This was problematic, in that the same mass of material going into a black hole could be very ordered or very disordered, with very different entropies; and entropy is closely connected to information. All information/entropy is lost to the universe in a black hole: if very high entropy material is dropped in, the entropy of the universe decreases, in contradiction of the second law. Jacob Bekenstein proposed that black holes do have entropy, proportional to their surface area. But this would mean that black holes would have a temperature, and would radiate; they would not be black. This was highly controversial, but Stephen Hawking, who initially hated the idea, showed it to be true. Through quantum processes at the event horizon, black holes do emit a very faint ‘Hawking radiation’; as the black hole shines it loses mass, and eventually evaporates, returning the ‘lost’ massenergy and entropy.
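In symbols, the relations that emerged (the standard Bekenstein-Hawking formulas, quoted here rather than derived) are, for a black hole of horizon area A and mass M:

entropy:      S = kc³A / (4Għ)
temperature:  T = ħc³ / (8πGMk)

where k is Boltzmann’s constant, G the gravitational constant and ħ Planck’s constant divided by 2π. The entropy grows with the area of the horizon, while the temperature falls as the mass grows, which is why the radiation is so faint: a black hole with the mass of the sun would sit only about sixty billionths of a degree above absolute zero.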
But this still leaves an information paradox – although the massenergy is returned, the information about what went in is lost. And how is information/entropy stored in a black hole anyway? This problem is far from fully solved, but the solution is generally believed to be tied up with the development of a theory of ‘quantum gravity’. Einstein’s relativity theories, which deal with the universe at its largest scales, are classical theories, with a very different conceptual and mathematical structure to the quantum theory which applies in the realm of the very small. Combining the two, and creating a theory of quantum gravity, has so far eluded physicists.
There have been various ideas as to how black holes can store and process information. Lee Smolin and colleagues suggest that the surface of a black hole can be divided into microscopic pieces, each of which stores a bit of information, rather like a screen of digitised information. Andrew Strominger and Cumrun Vafa have suggested that, using a form of string theory, a relationship between the entropy, information and surface area of black holes can be derived. The interactions of strings, and the structures formed from them, on the horizon of a black hole can then be used to reconstruct all the information contained in the black hole.
According to these views, the amount of information that a black hole can store – related to its entropy – is a function of its surface area, not of its volume. This means that the amount of information that can be stored in any region of space has an upper limit. This can be found by considering a hypothetical black hole just filling that region of space, and working out how much information can be stored on its surface. What happens to the information on the surface determines all the physics happening in the space it encloses, in the same way as a two-dimensional hologram can encode all the information of a three-dimensional scene. If this is true for a piece of space, it may even be true everywhere, for the whole of the universe. This leads to the idea of a ‘holographic universe’, in which information exchange is the most fundamental process, with space, time, matter and physical laws as its consequences.
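To give a feel for the numbers, here is a minimal back-of-the-envelope sketch in Python (my own illustration, not a calculation from any of the works mentioned above). It applies the rule of one bit per 4 ln 2 Planck areas of bounding surface to a sphere one metre in radius; the radius is chosen arbitrarily, and the constants are the usual physical values.

import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s

planck_area = hbar * G / c**3             # square of the Planck length, about 2.6e-70 m^2

radius = 1.0                              # metres; an arbitrary example region
surface_area = 4 * math.pi * radius**2    # area of the bounding sphere

# Holographic limit: one bit per 4*ln(2) Planck areas of surface
max_bits = surface_area / (4 * planck_area * math.log(2))
print(f"Upper limit for a 1 m sphere: about {max_bits:.1e} bits")   # roughly 1.7e70

Even for this modest one-metre region the limit is a staggering 10⁷⁰ bits or so; the point is that it is finite, and that it grows with the square of the radius (the surface) rather than the cube (the volume).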
This challenging idea is by no means accepted by all physicists, but the idea of information as a fundamental feature of the physical world is certainly gaining traction. Albert’s imagination has brought us a long way.