Animals and so-called primitive peoples do not bother to make nice distinctions between entities on the basis of number, and even today, when deprived of technological aids, we are not at all good at it (**Note 1**). What people do ‘naturally’ is to make distinctions of *type*, not *number*, and the favourite principle of division by type is the two-valued either/or principle. Plato thought that this principle, *dichotomy*, was so fundamental that all knowledge was based on it, the reason supposedly being that the brain itself works in this way: a nerve synapse is either *‘on’* or *‘off’*. Psychologically, human beings have a very strong inclination to proceed by straight two-valued distinctions (*light/dark, this/that, on/off, sacred/profane, Greek/Barbarian, Jew/Gentile, good/evil* and so on); more complex gradations are introduced only later, and usually with great reluctance. Science has eventually recognized the complexity of nature, and apart from gender there are not many true scientific *dichotomies* left, though we still have the classification of animals into *vertebrates* and *invertebrates*.

Numbers themselves were classified very early on into *even* and *odd*, the most fundamental numerical distinction after the classification *one* and *many*, which is even more basic.

The classification *even/odd* is radical: it provides what modern mathematicians call a partition of the whole set. That is, the classification principle is *exhaustive*: with the possible exception of the unit, *all* numbers fall into one or other of the two categories. Moreover, the two classes are mutually exclusive: no number appearing in the list of *evens* will appear in the list of *odds*. This is by no means true of all classification principles for numbers, as one might perhaps at first assume. Numbers can be classified, for example, as *triangular* or as *rectangular* according to whether they can be (literally) made into equilateral triangles or rectangles. But ΟΟΟΟΟΟ turns out to be both, since it can be formed either into a triangle or a rectangle:

ΟΟΟ
ΟΟ
Ο

ΟΟΟ
ΟΟΟ
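The two classifications can be sketched in code (my own illustration, not from the original text; the function names are hypothetical):

```python
# A sketch of the 'triangular' and 'rectangular' classifications.
def is_triangular(n):
    # n is triangular if n = 1 + 2 + ... + k for some k
    total, k = 0, 0
    while total < n:
        k += 1
        total += k
    return total == n

def is_rectangular(n):
    # n is rectangular if it can form a rectangle with both sides > 1
    return any(n % d == 0 and n // d > 1 for d in range(2, n))

# 6 = 1 + 2 + 3 (triangle) and 6 = 2 x 3 (rectangle): it is both
print(is_triangular(6), is_rectangular(6))  # True True
```

Unlike *even/odd*, these two classes overlap, which is exactly the point made above.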

The Greeks, like practically all cultures in the ancient world, viewed the *odd* and *even* numbers as male and female respectively, presumably because a woman has ‘two’ breasts and a male only one penis. And since *oddness* (though in Greek the term did not have the same associations as in English) was nonetheless defined with respect to *evenness* and not the reverse, this made an *odd* number a sort of female *manqué*. This must have posed a problem for their strongly patriarchal society, but the Greek philosophers and mathematicians got round it by arguing that ‘one’ (and not ‘two’) was the basis of the number system, and that ‘one’ was the ‘father of all numbers’.

On the other hand, a matriarchal society, or a species where females were dominant, would almost certainly, and with better reasoning, have made ‘one’ a *female* number, the primeval egg from which the whole numerical progeny emerged. Those who consider that mathematics is in some sense ‘eternally true’ should reflect on the question of how mathematics would have developed within a hermaphroditic species, or in a world where there were *three* and not *two* humanoid genders, as in Iain M. Banks’s science-fiction novel *The Player of Games*.

*Evenness* is not easy to define, nor for that matter to recognize, as I have just discovered: coming across an earlier version of this section, I found I was momentarily incapable of deciding which of the rows of balls pictured at the head of this chapter represented odd and which even numbers. We have to appeal to some very basic feeling for ‘symmetry’: what is on one side of a dividing line is exactly matched by what is on the other side of it. A definition could thus be

**If you can pair off a collection within itself and nothing remains over, then the collection is called even; if you cannot do this, the collection is termed odd.**
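This pairing-off procedure can be rendered quite literally as a sketch (my illustration; the function name is hypothetical):

```python
# A literal rendering of the pairing-off definition of evenness.
def is_even_by_pairing(collection):
    # pair items off two at a time; even iff nothing remains over
    items = list(collection)
    while len(items) >= 2:
        items.pop()
        items.pop()
    return len(items) == 0

print(is_even_by_pairing(["o"] * 6))  # True
print(is_even_by_pairing(["o"] * 7))  # False
```

Note that the procedure never counts the collection: it only matches items against each other, which is the sense in which the distinction belongs to *pre*-numbering.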

This makes *oddness* anomalous and less basic than *evenness *which intuitively one feels to be right — we would not, I think, ever dream of defining *oddness *and then say *“If a collection is not odd, it is even”*. And although it is only in English and a few other languages that *‘odd’* also means *‘strange’*, the pejorative sense that the word *odd* has picked up suggests that we expect and desire things to match up, i.e. we expect, or at least desire, them to be *‘even’* — the figure of *Justice* holds a pair of evenly balanced scales.

The sense of *even* as ‘level’ may well be the original one. If we have two collections of objects which, individually, are more or less identical, then a pair of scales remains level if the collections are placed on each arm of the lever (at the same distance). One could define *even* and *odd* thus pragmatically:

**“If a collection of identical standard objects can be divided up in a way which keeps the arms of a balance level, then the collection is termed even. If this is not possible it is termed odd.”**

This definition avoids using the word *two *which is preferable since the sense of things being *‘even’ *is much more fundamental than a feeling for *‘twoness’* — for this reason the distinction *even/odd*, like the even more fundamental *‘one/many’ *, belongs to the stage of *pre*-numbering rather than that of numbering.

Early man would not have had a pair of scales, of course, but he would have been familiar with the procedure of ‘equal division’, and the simplest way of dividing up a collection of objects is to separate it into *two* equal parts. If there was an item left over it could simply be thrown away. Evenness is thus not only the simplest way of dividing up a set of objects but the principle of division which makes the remainder a minimum: any other method of division runs the risk of having more objects left over.
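The claim that two-way division keeps the remainder to a minimum can be checked directly (a sketch of my own, not from the original text):

```python
# Dividing a pile into k equal shares can leave up to k - 1
# counters over, so k = 2 keeps the worst-case remainder
# smallest: either nothing or a single item to throw away.
pile = 17
for k in (2, 3, 5):
    share, left_over = divmod(pile, k)
    assert left_over <= k - 1
    print(k, share, left_over)
```

With two shares the leftover is at most one counter; any larger number of shares risks leaving more behind.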

Euclid’s definition is that of equal division. He says “An **even number** is that which is divisible into two equal parts” (*Elements*, **Book VII, Definition 6**) and “An **odd number** is that which is not divisible into two equal parts, or that which differs by a unit from an even number” (*Elements*, **Book VII, Definition 7**). Incidentally, in Euclid ‘number’ not only always has the sense ‘positive integer’ but has a concrete sense: he defines ‘**number**’ as a “multitude composed of units”.

Note that Euclid defines *odd* first privatively (by what it is not) and then as something deficient with reference to an *even* number. The second definition is still with us today: algebraically the formula for the odd numbers is **(2n − 1)** where **n** is given the successive values **1, 2, 3, …** or sometimes (in order to leave **1** out of it) the successive values **2, 3, 4, …**. In concrete terms, we have the sequence

Ο ΟΟ ΟΟΟ …

Duplicating them gives us the ‘doubles’ or even numbers

Ο ΟΟ ΟΟΟ …

Ο ΟΟ ΟΟΟ …

and removing a unit each time gives us the ‘deficient’ odd numbers.
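In symbols, the rows of counters above come to this (a sketch using my own variable names):

```python
# The naturals, their doubles (the evens), and the doubles each
# short of a unit (the 'deficient' odds), as in the figure above.
naturals = list(range(1, 6))
evens = [2 * n for n in naturals]
odds = [2 * n - 1 for n in naturals]
print(evens)  # [2, 4, 6, 8, 10]
print(odds)   # [1, 3, 5, 7, 9]
```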

The unit itself is something out on its own and was traditionally regarded as neither even nor odd. It is certainly not *even *according to the ‘equal division’ definition since it cannot be divided at all (within the context of whole number theory) and it cannot be put on the scales without disturbing equilibrium. In practice it is often convenient to treat the unit as if it were *odd*, just as it is to consider it a *square *number, *cube *number and so forth, otherwise many theorems would have to be stated twice over. Context usually makes it clear whether the term ‘number’ includes the unit or not.

Note that distinguishing between *even* and *odd* has nothing to do with counting, or even with distinguishing between *greater* and *less*: knowing that a number is even tells you nothing about its size. And vice versa, associating a number word or symbol with a collection of objects will not inform you as to whether the quantity is even or odd; there are no ‘even’ or ‘odd’ endings to the spoken word like those showing whether something is singular or plural, masculine or feminine.

It is significant that we do not have words for numbers which, for example, are multiples of four or which leave a remainder of one unit when divided into three. (The Greek mathematicians did, however, speak of ‘even-even’ numbers.) If our species had three genders instead of two, as in the world described in *The Player of Games*, we would maybe tend to divide things into *threes *and classify all numbers according to whether they could be divided into three parts exactly, were a counter short or a counter over. This, however, would have made things so much more complicated that such a species would most likely have taken even longer to develop numbering and arithmetic than in our own case.

The distinction *even/odd* is the first and simplest case of what is today called a congruence. The integers can be separated out into so-called equivalence classes according to the remainder left when they are divided by a given number termed the *modulus*. All numbers are in the same class (**modulus 1**) since, when they are separated out into ones, there is only one possible remainder: nothing at all. In Gauss’s notation the even numbers are those which leave a remainder of zero when divided by **2**, or are **0 (mod 2)** where **mod** is short for **modulus**. And the odd numbers are all **1 (mod 2)**, i.e. they leave a unit when separated into twos. What is striking is that although the distinction between even and odd, i.e. between numbers that are **0** or **1 (mod 2)**, is prehistoric, congruence arithmetic as such was invented by Gauss a mere couple of centuries ago.
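Gauss's notation corresponds directly to the remainder operation in modern programming languages (a sketch; the variable names are my own):

```python
# Congruence classes via the remainder operation:
# evens are 0 (mod 2), odds are 1 (mod 2).
evens_mod2 = [n for n in range(1, 11) if n % 2 == 0]
odds_mod2 = [n for n in range(1, 11) if n % 2 == 1]
print(evens_mod2)  # [2, 4, 6, 8, 10]
print(odds_mod2)   # [1, 3, 5, 7, 9]
# Modulus 1 puts every number into one single class:
print({n % 1 for n in range(1, 11)})  # {0}
```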

In concrete terms we can set up equivalence classes relative to a given modulus by arranging collections of counters (in fact or in imagination) between parallel lines of set width starting with unit width, then a width which allows two counters only, then three and so on. This image enables us to see at once that the sum of any two or more *even* numbers is always *even*.

And since an odd number has an extra Ο, a pair of odd numbers each have an extra unit; if we fit them together so that the extra units face each other, we get an even result. Thus *“Even plus even equals even”* and *“Odd plus odd equals even”* are not just jingles we have to learn at school but correspond to what actually happens if we try to arrange actual counters or squares so that they match up.

We end up with the following two tables which may well have been the earliest ones ever to have been drawn up by mathematicians.

| **+** | **odd** | **even** |
| --- | --- | --- |
| **odd** | even | odd |
| **even** | odd | even |

| **×** | **odd** | **even** |
| --- | --- | --- |
| **odd** | odd | even |
| **even** | even | even |
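Both tables can be confirmed mechanically against ordinary arithmetic (my own check, not part of the original text):

```python
# Check the parity addition and multiplication tables against
# ordinary arithmetic for a range of sample numbers.
for a in range(1, 50):
    for b in range(1, 50):
        # addition table: parities add like 0/1 modulo 2
        assert (a + b) % 2 == (a % 2 + b % 2) % 2
        # multiplication table: product is odd only when both factors are odd
        assert (a * b) % 2 == (a % 2) * (b % 2)
print("parity tables confirmed")
```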

All this may seem so obvious that it is hardly worth stating, but simply by appealing to these tables many results can be deduced that are far from self-evident. For example, we find by experience that certain concrete numbers can be arranged as rectangles and that, amongst these rectangular numbers, there are some that can be separated into two smaller equal rectangles and some that cannot. However, if I am told that a certain collection can be arranged as a rectangle with one side just a unit greater than the other, then I can immediately deduce that it can be separated into two smaller equal rectangles. Why am I so sure of this? Because, referring to the tables above,

**1) the ‘product’ of an even and an odd number is even;**

**2) an even number can by definition always be separated into two equal parts.**

I could deduce this even if I were a member of a society which had no written number system and no more than a handful of number words.
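The deduction can be exhibited concretely (a sketch of my own devising, not from the original text):

```python
# A rectangle with one side a unit longer than the other holds
# n * (n + 1) counters; one of the two factors is always even,
# so the rectangle splits into two equal smaller rectangles.
for n in range(1, 10):
    total = n * (n + 1)
    assert total % 2 == 0               # even times odd is even
    if n % 2 == 0:
        half = (n // 2, n + 1)          # halve along the even side
    else:
        half = (n, (n + 1) // 2)
    assert 2 * half[0] * half[1] == total
print("split works for n = 1..9")
```

The halving step is just rule 2 above made concrete: the even side of the rectangle is cut into two equal parts.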

This is only the beginning: the banal distinction between even and odd, together with the entries in the tables above, crops up in a surprising number of proofs in number theory. The famous proof that the square root of **2** is not a rational number (as we would put it) is based on the fact that no quantity made up of so many equal bits can be at once even and odd. *SH 5/03/15*
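The even/odd step in that proof can be sketched in modern notation (my reconstruction, not part of the original text):

```latex
\sqrt{2} = \tfrac{p}{q}, \quad \gcd(p, q) = 1
\;\Rightarrow\; p^2 = 2q^2
\;\Rightarrow\; p^2 \text{ is even}
\;\Rightarrow\; p \text{ is even (since odd} \times \text{odd is odd, by the table above).}
\text{Writing } p = 2r: \quad 4r^2 = 2q^2
\;\Rightarrow\; q^2 = 2r^2
\;\Rightarrow\; q \text{ is even as well, contradicting } \gcd(p, q) = 1.
```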

**Note 1** This fact (that human beings are not naturally very good at assessing numerical quantity) is paradoxical since mankind is the numerical animal *par excellence*. Mathematics is the classic case of the weakling who makes himself into Arnold Schwarzenegger. It is because we are so bad at quantitative assessment that playing cards are obliged to show the number words in the corner of the card, and why the dots on a die are arranged in set patterns to avoid confusion.