The two pillars of twentieth-century physics, general relativity and quantum mechanics, could not be more different from each other.

Both theories teach us that the fine structure of nature is more subtle than it appears. But general relativity is a compact gem: conceived by a single mind, that of Albert Einstein, it’s a simple and coherent vision of gravity, space and time. Quantum mechanics, or ‘quantum theory’, on the other hand, has gained unequalled experimental success and led to applications which have transformed our everyday lives (the computer on which I write, for example); yet more than a century after its birth it remains shrouded in mystery and incomprehensibility.

It’s said that quantum mechanics was born precisely in the year 1900, virtually ushering in a century of intense thought. The German physicist Max Planck calculated the electric field in equilibrium in a hot box. To do this he used a trick: he imagined that the energy of the field is distributed in ‘quanta’, that is, in packets or lumps of energy. The procedure led to a result which perfectly reproduced what was measured (and therefore must be in some way correct) but clashed with everything that was known at the time. Energy was considered to be something which varied continuously, and there was no reason to treat it as if it were made up of small building blocks. To treat energy as if it were made up of finite packages had been, for Planck, a peculiar trick of calculation, and he did not himself fully understand the reason for its effectiveness. It was to be Einstein once again who, five years later, came to understand that the ‘packets of energy’ were real.
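Planck’s ‘trick’ can be written in a single line, not given in the text above but worth seeing: the energy exchanged by radiation of frequency $\nu$ comes only in whole multiples of a minimal packet, with $h$ the new constant of nature that now bears Planck’s name.

```latex
E = n\,h\nu, \qquad n = 0, 1, 2, \dots
```

Einstein’s insight of five years later was that each of these packets of energy $h\nu$ is a real grain of light: a photon.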

The work of Einstein was initially treated by colleagues as the nonsensical juvenilia of an exceptionally brilliant youth. Subsequently it was for the same work that he received the Nobel Prize. If Planck is the father of the theory, Einstein is the parent who nurtured it.

But like all offspring, the theory then went its own way, unrecognized by Einstein himself, who could never reconcile himself to its abandonment of determinism. In the second and third decades of the twentieth century it was the Dane Niels Bohr who pioneered its development. It was Bohr who understood that the energy of electrons in atoms, like the energy of light, can take on only certain values, and crucially that electrons can only ‘jump’ between one atomic orbit and another with fixed energies, emitting or absorbing a photon when they jump. These are the famous ‘quantum leaps’. And it was in his institute in Copenhagen that the most brilliant young minds of the century gathered together to investigate and try to bring order to these baffling aspects of behaviour in the atomic world, and to build from it a coherent theory. In 1925 the equations of the theory finally appeared, replacing the entire mechanics of Newton. It’s difficult to imagine a greater achievement. At one stroke, everything makes sense, and you can calculate everything.
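Bohr’s rule for the quantum leap can also be put in one line: when an electron drops from an orbit of energy $E_n$ to one of lower energy $E_m$, the emitted photon carries away exactly the difference, at a frequency fixed by Planck’s constant.

```latex
h\nu = E_n - E_m
```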

Take one example: do you remember the periodic table of elements, devised by Mendeleev, which lists all the possible elementary substances of which the universe is made, from hydrogen to uranium, and which was hung on so many classroom walls?

Why are precisely these elements listed there, and why does the periodic table have this particular structure, with these periods, and with the elements having these specific properties?

The answer is that each element corresponds to one solution of the main equation of quantum mechanics. The whole of chemistry emerges from a single equation.
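The essay does not name this ‘main equation’, but the standard way to make the claim concrete is Schrödinger’s equation of 1926, here in its time-independent form for an electron of mass $m$ in a potential $V$:

```latex
-\frac{\hbar^2}{2m}\nabla^2\psi(\mathbf{r}) + V(\mathbf{r})\,\psi(\mathbf{r}) = E\,\psi(\mathbf{r})
```

Solutions exist only for certain discrete energies $E$; solving the equation for each nuclear charge yields the shell structure of the electrons, and with it the periods and properties of Mendeleev’s table.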

The first to write the equations of the new theory, basing them on dizzying ideas, was a young German of genius, Werner Heisenberg. Heisenberg imagined that electrons do not always exist. They only exist when someone or something watches them, or better, when they are interacting with something else. They materialize in a place, with a calculable probability, when colliding with something else. The ‘quantum leaps’ from one orbit to another are the only means they have of being ‘real’: an electron is a set of jumps from one interaction to another. When nothing disturbs it, it is not in any precise place. It is not in a ‘place’ at all.

In quantum mechanics no object has a definite position, except when colliding headlong with something else. In order to describe it in mid-flight, between one interaction and another, we use an abstract mathematical formula which has no existence in real space, only in abstract mathematical space. But there’s worse to come: these interactive leaps with which each object passes from one place to another do not occur in a predictable way but largely at random. It is not possible to predict where an electron will reappear, but only to calculate the probability that it will pop up here or there. The question of probability goes to the heart of physics, where everything had seemed to be regulated by firm laws which were universal and irrevocable.
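The ‘abstract mathematical formula’ alluded to here is the wave function $\psi$, and the rule connecting it to probability, Born’s rule, is disarmingly simple to state: the probability of finding the electron at a point is the squared magnitude of $\psi$ there.

```latex
P(\mathbf{r}) = |\psi(\mathbf{r})|^2
```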

The equations of quantum mechanics and their consequences are used daily in widely varying fields: by physicists, engineers, chemists and biologists. They are extremely useful in all contemporary technology. Without quantum mechanics there would be no transistors. Yet they remain mysterious. For they do not describe what happens to a physical system, but only how a physical system affects another physical system.

Our knowledge grows, in real terms. It allows us to do new things that we had previously not even imagined. But that growth has opened up new questions. New mysteries. Those who use the equations of the theory in the laboratory carry on regardless, but in articles and conferences that have been increasingly numerous in recent years physicists and philosophers continue to search. What is quantum theory, a century after its birth? An extraordinary dive deep into the nature of reality? A blunder that works, by chance? Part of an incomplete puzzle? Or a clue to something profound regarding the structure of the world which we have not yet properly digested?...