We’re so familiar with the standard set of symbols for arithmetic operations that appears on every calculator keyboard that we hardly ever stop to think who created them, or when.
Considering that people have been keeping records on everything from wax and clay tablets to animal skins and tree bark for at least 4000 years, it’s a bit of a shock to discover that our symbols for operations like addition and subtraction are less than 500 years old. But it’s only in such comparatively recent times that most calculations have been done by making marks on paper.
An older method was to use a counting frame such as the abacus. There was a long-running controversy in medieval times about which was faster, the counting frame or pencil and paper, and competitions were held between the two systems to try to decide the matter. There were even names for the disputing groups, abacists and algorists. The second word is closely related to our modern algorithm for a step-by-step recipe for carrying out some calculation. Even now, the abacus is far from defunct in many societies, because in practised hands it’s very quick for simple calculations — and unlike the calculator, it doesn’t need batteries.
For those who did calculations using symbols, it was common in medieval times to indicate plus and minus by the letters p and m, each with a bar or a wavy line over the top, a system that grew up in Italy. Our modern symbols for these operations didn’t appear until the late fifteenth century. They first turn up in a textbook on commercial arithmetic which Johann Widman published at Leipzig in 1489 under the title Rechnung uff allen Kauffmanschafften. He didn’t use them as we do now, however, but as symbols for surpluses or deficits in business problems (though some historians still argue about what he did mean by them). There’s some evidence they were around in the commercial world before Widman adopted them, for example as a quick way for merchants to mark barrels to indicate whether they were full or not. It seems that the plus sign started out as an abbreviated scribes’ way of writing the Latin et, “and”, but nobody seems to know for sure where the minus sign came from. They were gradually adopted as standard symbols throughout Europe in the century after Widman’s book came out.
The person most responsible for introducing them to England, and so eventually to the English-speaking world, was Robert Recorde, a mathematician of the sixteenth century, perhaps the only one of any stature in a century that saw very few English workers of note in the field. His mathematical works were written in English. At the time this was still extremely uncommon (more than a hundred years later, Newton automatically wrote his books in Latin, as did many scholars even after him). As a result, Recorde’s books stayed in print as standard texts for at least a century after his death, and they were correspondingly influential. His most famous work was the Whetstone of Witte of 1557. In it he introduced these newfangled signs: “There be other 2 signes in often use of which the first is made thus + and betokeneth more: the other is thus made - and betokeneth lesse”.
In the same book he introduced our modern equals sign: “I will sette as I doe often in woorke use, a paire of paralleles, or Gemowe [that is, twin] lines of one length, thus: ======, bicause noe 2. thynges, can be moare equalle”. His equal signs were actually quite big, about five or six times the length of ours today. They varied in length, seemingly being sized to fit the space available, and were made up of shorter type characters, which look very like our modern equals sign (it seems the printer had this character in his case, perhaps as decorative type, or as a variant on a hyphen). It took more than a century for Recorde’s sign to oust rival schemes, such as the curly symbol of Descartes (which was probably the astrological sign for Taurus turned on its side), and for it to be shortened to match the lengths of the other symbols.
The word whetstone in the title of Recorde’s book, by the way, was a pun on the word coss, then used in English for the unknown thing in algebra (and hence the cossic art or the rule of coss for algebra). This word had come through French from the Italian cosa as a translation of the Arabic shai, “a thing”, but Recorde probably got it from German, where it was also used. The pun arises because in Latin cos means a whetstone. Recorde may have written in English, but he still expected his audience to appreciate a trilingual pun!
Our modern multiplication sign (×) seems to have been invented by the British mathematician William Oughtred. He used it in his Clavis Mathematicae (Key to Mathematics), which was written about 1628 and published in London in 1631. It had turned up in an appendix to a posthumous work by the Scottish mathematician John Napier a decade before, but the suggestion is that Oughtred wrote that, too.
The symbol for division, ÷, was first used by Johann Rahn in Teutsche Algebra in 1659, though John Pell, who edited Rahn’s book, may really be responsible for introducing it. The symbol wasn’t new. It had been used to mark passages in writings that were considered dubious, corrupt or spurious. Sometimes a dash had been used instead, so in historical origin the division sign could be a dash with a dot above and below it. It has also been suggested that it’s a minus sign with added dots; this is supported by its surviving in Denmark until recently, very confusingly, as a symbol for subtraction, not division, though it has fallen out of use as Danes adopt our symbols under pressure to conform with international usage.
The division sign was in Rahn’s time known either as the obelus or sometimes the obelisk, from a Greek word meaning a roasting spit. The idea seems to have been that such dubious matter was thrust through, as with a spit; the word is the same as that for a tapering pillar, another object with a pointed end. Confusingly, the word obelus was later used for the printer’s character we often call a dagger, another symbol with a point.
As an aside, when people began to write computer languages from the 1950s on, they were hampered by the absence of mathematical symbols in the character sets of the time: they had the plus, minus and equals signs, but not those for multiplication or division. So computer scientists had to improvise by borrowing the asterisk for multiplication and the forward slash for division. This latter mark has a number of aliases, being known also as the solidus, oblique or virgule, among other names. So now we have two sets of symbols for these operations, though the computer characters hardly impinge on the lives of most of us.
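By way of illustration (using Python here simply as a familiar modern language; the point applies to almost any programming language descended from those 1950s character-set compromises), the borrowed symbols look like this:

```python
# The improvised computer-era operators: the asterisk stands in
# for the multiplication sign ×, and the forward slash (solidus)
# for the division sign ÷. Plus, minus and equals survived intact.
product = 6 * 7      # multiplication, written * rather than ×
quotient = 10 / 4    # division, written / rather than ÷
total = 3 + 4 - 2    # plus and minus, unchanged since Widman's day

print(product, quotient, total)
```

The asterisk and slash were chosen because they were among the few non-alphabetic characters reliably present on the keypunches and teleprinters of the day.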
Like the words in our language, the signs for arithmetic have been invented and have become standard through historical accident, influenced by a very few pioneers. They could so easily have been otherwise.
Mark Brader’s help is gratefully acknowledged. He supplied extra information from his own research, and commented on a draft of this piece. Thanks also to Dermod Quirke and Brian Holser for additional comments. Any mistakes are mine, of course.