A few more of these dangerous items were found during repairs to Westminster Abbey in 1909, and they were put safely into a museum, where they could do less damage.
Notches probably came before language. Prehistoric people probably had words like ‘one’, ‘two’ and ‘three’, and used ‘many’ for anything more complicated. In fact, sometimes ‘three’ might mean ‘many’. Take the French, for example: ‘trois’ (three) and ‘très’ (very). Or the Latin: ‘tres’ (three) and ‘trans’ (beyond). A tribe of cave dwellers discovered in the Philippines in 1972 could not answer the question ‘How many people are there in your tribe?’ – yet they could write down a list of all 24.

But then counting is a philosophical problem, because you have to categorize. You have to be able to see the similarities in things and their differences, and decide which are important, before you can count them. You have to be able to do Venn diagrams in your head. ‘It must have required many ages to discover that a brace of pheasants and a couple of days were both instances of the number two,’ said the philosopher Bertrand Russell. And once you have grasped that concept, there are still many other categories to settle before you can count how many people there are in your tribe. Do you count children? Do you count foreigners who happen to live with you? Do you count people who look completely different from everybody else?

Counting means definition and control. To count something, you have to name it and define it. It is no coincidence that it was the ancient Sumerian civilization, the first real empire, which first developed the idea of writing down numbers. They had to, if they were going to manage an imperial culture of herds, crops and people. Yet any definition you make is necessarily a compromise with the truth. And the easier it is to count – the more words give way to figures – the more counting simplifies things which are not simple. You can count sheep until you are blue in the face, but no two sheep are the same.
The old world did not need precision. If Christ’s resurrection was important, it was not terribly vital to know the actual date. Instead Europeans used numbers for effect – King Arthur was described as killing tens of thousands in battle all by himself. Politicians are the last remaining profession to do this, claiming improbable figures as personal achievements and borrowing the language of statistics to lend them a spurious accuracy, when actually they are using numbers for impact just like a medieval chronicler. Nor were the numbers they used much good for calculation. Nowadays Roman numerals survive only on things which powerful people want to look permanent – like television programmes or the US Super Bowl – but which are actually very impermanent indeed.
The new world needed accuracy and simplicity for its commerce. Although they were briefly banned by an edict at Florence in 1299, the new Arabic numerals – brought back from the Middle East by the crusaders – began to be spread by the new mercantile classes. These were the literate and numerate people, their quill pens tracing the exchange of vast sums, plotting the despatch of fleets for kings, managing the processing of wool with the new counting boards.
And soon everybody was counting with the same precision. King John’s Archbishop of Canterbury, Stephen Langton, had already organized a system of chapters and verses for the Bible, all numbered and meticulously indexed, which by the following century used the new Arabic numerals. Soon the new numbers were being used to measure much more elusive things. By 1245, Gossoin of Metz worked out that if Adam had set off the moment he was created, walking at the rate of 25 miles a day, he would still have to walk for another 713 years if he was going to reach the stars. The great alchemist Roger Bacon, who tried to measure the exact arc of a rainbow from his laboratory above Oxford’s Folly Bridge, calculated shortly afterwards that someone walking 20 miles a day would take 14 years, seven months and just over 29 days to get to the moon.
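Bacon’s figure can be checked with simple arithmetic: distance is just miles per day multiplied by the number of days. A minimal sketch in Python (using modern calendar lengths, which are my assumption, not Bacon’s own reckoning) shows the Earth-to-moon distance his walking time implies:

```python
# Check the arithmetic behind Roger Bacon's figure: 20 miles a day
# for 14 years, 7 months and just over 29 days.
# Assumptions (modern, not Bacon's): 365.25-day years, 30-day months.
MILES_PER_DAY = 20
days = 14 * 365.25 + 7 * 30 + 29
distance = MILES_PER_DAY * days  # implied Earth-to-moon distance
print(f"about {distance:,.0f} miles")  # roughly 107,000 miles
```

On those assumptions, Bacon’s walking time implies a moon a little over 100,000 miles away – far closer than the true figure, but a measurement all the same.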
It’s a wonderful thought, somehow akin to Peter Pan’s famous directions for flying to Never Never Land: ‘second to the right, and straight on till morning’. But it was a different time then, when land was measured by the area that could be ploughed in a day and time was dominated by the unavoidable shift between day and night. There were 12 hours in the medieval day, and 12 hours in the night too, but without proper tools for measuring time these were expanded and compressed to make sure the 12 hours fitted into the light and the dark. An hour in summer was much longer than an hour in winter, and the ‘hours’ actually referred to the times when prayers should be said.
Nobody knows who invented clocks, though legend has it that it was the mysterious Gerbert of Aurillac, another medieval monk who spent some time in Spain learning from the wisdom of the Arabs, and who, as Sylvester II, was the Pope who saw in the last millennium. He was said to be so good at maths that contemporaries believed he was in league with the Devil. It was not for another 250 years that clocks reached the mass market, but once they had, you could not argue with their accuracy. From the 1270s they dominated European townscapes, insisting that hours were all the same length and that trading and working times should be strictly regulated. Counting in public is, after all, a controlling force, as the people of Amiens discovered in 1335 when the mayor regulated their working and eating times with a bell attached to a clock.
Clocks had bells before they had faces, and were machines of neat precision, as you can see from the fourteenth-century one still working in the nave of Salisbury Cathedral, its careful black cogs swinging backwards and forwards, the very model of the new medieval exactitude. Soon every big city was imposing heavy taxes on itself to afford the clock machinery, adding mechanical hymns, Magi bouncing in and out and – like the clock in Strasbourg in 1352 – a mechanical cockerel which crowed and waggled its wings.
Where would they stop, these medieval calculators? Scholars at Merton College, Oxford, in the fourteenth century considered how you might measure not just size, taste, motion, heat and colour, but also qualities like virtue and grace. And these were the days when even heat had to be quantified without a thermometer, which had yet to be invented. They must have been heady days, when the whole of quality – the whole of arts and perception – seemed to be collapsing neatly into science.
Renaissance humanity was putting some distance between itself and the animals, or so it believed. Anyone still dragging their feet really was holding back history. Some dyed-in-the-wool conservatives insisted that people knew pretty well when it was day and night, and when the seasons changed, without the aid of the new counting devices. But anyone who thought that, said the Protestant reformer Philip Melanchthon, deserved to have someone ‘shit a turd’ in his hat. The new world of number-crunchers had arrived.
III
To really get down to the business of measuring life, two important ideas about numbers were still needed – a concept of zero and a concept of negative numbers. But to emerge into common use, both had to run the gauntlet of the old battle lines about numbers drawn across medieval Europe. On one side were the adherents of the old ways of the abacus, whose computations were not written down, and whose ritual movements as they calculated were inspired by the old wisdom of Pythagoras. On the other side were the new computations, which were all written down. They had no mystery. There was something open and almost democratic about them, and they needed no priests to interpret them. Calculation was no longer a mysterious art carried out by skilled initiates.
And the big difference between them was zero. Its arrival in Europe was thanks to a monk, Raoul de Laon – a particularly skilful exponent of the art of the abacus – who used a character he called sipos to mark an empty column. The word came from the Arabic sifr, meaning ‘empty’, which is also the origin of the word ‘cypher’. Whatever its route into Europe, the old abacus could now be put away in the medieval equivalent of the loft.
Inventing zero turned numbers into an idea, according to the child psychologist Jean Piaget. It is a difficult idea too: up to the age of six and a half, a quarter of all children write 0 + 0 + 0 = 3. But once people had begun to grasp it, they tended to regard zeros with suspicion. Division by zero meant infinity, and infinity meant God – yet there it was, bandied about in the humblest trade calculations for fish or sheep, for everyone to see. Even more potent were the objections of the Italian bankers, who were afraid this little symbol would lead to fraud. It