Is there causality in Mathematics?

Perhaps we should start by pondering whether causality exists at all. Hume thought not, and Wittgenstein dismissed the ‘causal connection’ between events as a superstition (Tractatus Logico-Philosophicus). Certainly no one has ever claimed to see or touch the ‘force of causality’, and if the supposed ‘causal connection’ between certain events were self-evident we would not have the difficulties we so manifestly do have in distinguishing bona fide causal pairs from chance associations of events.

            For all that, I have never lost any sleep over Hume’s attack on causality and don’t intend to. “We know there is causality and there’s an end to it”, as Dr. Johnson said about free will. Hume himself, revealingly, admitted that he “dropped his philosophic scepticism when playing backgammon”.

            Belief in causality is undoubtedly a psychological necessity, and thus a biological necessity as well : as a species, we need to believe that we can ‘make a difference’ and, looking at what we have done to the planet, we’ve certainly proved that! If modern philosophers have their doubts about the existence of causality, well, so much the worse for them.

            Science in the West remained happily married to determinism for  three  centuries and Claude Bernard actually went so far as to define science as the application of causality to the material world. But then, in the course of the twentieth century, physics suddenly got infatuated with indeterminism. Why? The official answer is that this was forced on science by experimental discoveries in the atomic and subatomic domains where ‘statistical determinism’ rather than ‘complete individual determinism’ is the norm. Yes, but the phenomenon is far less comprehensive and radical than people think. The individual molecule in a gas is, if you like, allowed freedom of movement — but only because this is very unlikely  to affect the overall result. It is like Saddam Hussein giving Iraqis the vote. Also, it is usually only the order of appearance of the events that is random, not the events themselves. A good example is the process of photographic development which is a chemical amplification of initial atomic events. It is possible, using very weak exposure, to arrange for the  individual photons to arrive one after the other and if this is done the photographic image builds up in a way that is completely unpredictable. But the fact remains that all the micro-events have been completely specified in advance (by the object that is being photographed)1.

            In other cases, ‘random mutation’ for example, the consensus is that the events themselves are basically indeterminate but this remains an untested and probably untestable hypothesis. One suspects that the sudden vogue for indeterminism in physics and elsewhere during the twenties and thirties (strongly resisted by Einstein) was part of the Zeitgeist : the senseless slaughter of the Great War and, later, the Wall Street Crash (which no one had predicted) seemed to many people to demonstrate that the world was not fully comprehensible by rational means after all.  But the real culprit was  logical positivism, a philosophy which has had a crippling effect on the way we think about science and life generally. Whereas common sense always prefers to assume that there is an agent for all changes in the external world — “Every event has a cause” — logical positivism holds fast to the verification principle instead. Since causality cannot be verified directly, it has no right to exist, therefore it doesn’t exist. Stripped of causality, physics becomes an exercise in applied mathematics, while mathematics itself is, according to the moderns, either symbolic logic (Russell) or “a game played according to fixed rules with meaningless marks on paper” (Hilbert). This effectively puts paid not only to determinism but to objective reality itself which has become the unwanted ghost in a wholly symbolic machine. Britain’s most acclaimed theoretical physicist (Hawking) once admitted disingenuously that he was not really concerned about the underlying truth of a theory but only whether it was ‘interesting and fruitful’. 




In the real world, by my book, events cause other events : they do not simply happen to precede them. Coercion is involved, not ‘functional co-variance’. But when we shift to the logical, sanitised universe we find that all we are left with is ‘material implication’, P ⇒ Q.

            I don’t expect I need to remind readers that the validity of P ⇒ Q does not mean we can simply invert the terms and conclude that Q ⇒ P. What is, however, accepted in both logic and mathematics is that the truth of P ⇒ Q entails the truth of the Contrapositive, Not-Q ⇒ Not-P. An example from number theory :


                        “If p is prime and (a, p) = 1, then a^(p-1) ≡ 1 (mod p)”    — True  (Fermat’s Little Theorem)

                        “If a^(p-1) ≡ 1 (mod p) for all a with (a, p) = 1, then p is prime”    — False  (because of Carmichael Numbers such as 561)

                        “If it is not the case that a^(p-1) ≡ 1 (mod p) for all a coprime to p, then p cannot be prime”    — True.
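Fermat’s test, its failed converse, and the valid contrapositive can all be checked mechanically. A minimal Python sketch (the function names are mine, not from the original article) :

```python
from math import gcd

def fermat_holds(n):
    """True if a^(n-1) is congruent to 1 (mod n) for every a coprime to n."""
    return all(pow(a, n - 1, n) == 1 for a in range(2, n) if gcd(a, n) == 1)

def is_prime(n):
    """Trial-division primality test (fine for small n)."""
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# Fermat's Little Theorem: prime implies the test passes
print(fermat_holds(7), is_prime(7))        # True True
# The converse fails: 561 = 3 * 11 * 17 passes the test yet is composite
print(fermat_holds(561), is_prime(561))    # True False
# The contrapositive holds: test fails, so not prime
print(fermat_holds(9), is_prime(9))        # False False
```

The Carmichael number 561 is precisely a counterexample to the inverted implication, while no number anywhere fails the contrapositive.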


            Logically speaking the Contrapositive is equivalent to the original statement because the truth tables are identical. However, if the original statement has a causal basis, this feature disappears when we form the contrapositive. Negating an event is not the same thing as negating an assumption, since something that does not occur can neither cause something else to occur nor positively prevent its occurrence.


            “I shot my noisy next door neighbour in the head ten minutes ago, so he is now dead.”

            This statement is valid because the underlying causal connection is valid (a shot in the head causes death) whether or not it corresponds to the facts.  


            “If my next door neighbour is currently alive, I cannot have shot him in the head ten minutes ago.”


is, I suppose, valid reasoning but sounds most peculiar — as if I were a psychopath suffering from recurrent bouts of amnesia. This shows what happens when we empty statements of causal content. 


            The point is that Contrapositives are always a good deal weaker than affirmative statements. One of the reasons why Newtonian physics got off to such a flying start was that it was formulated positively : “Every particle attracts every other with a force directly proportional to the product of their masses and inversely proportional to the square of the distance between them”. Practical people like engineers took to Newtonian mechanics because they could visualize what was going on : “If rod A makes B go down, then B will make C go up, and C will make D move to the right….”

            All this has grave consequences for modern mathematics since most modern proofs are indirect (75% it has been estimated) and proceed along the lines, “But if A is not so, then Y, then Z; but Z is nonsense, therefore not-A cannot be true, therefore A”. Stephenson and Brunel would not have been impressed. Modern mathematics is chock-a-block with entities whose only right to exist is that, if they didn’t exist, someone would be contradicting himself somewhere.2 Compare this to direct proofs, which actually show you how to turn up an example of the thing you are looking for.




In logic P ⇒ Q  [P logically implies Q] is always valid except when P is true and Q is false. Thus


“If  all triangles are equilateral, then no square can be inscribed in a circle”


“If 8 is a prime number,  then G.M. ≤ A.M.”


 are both valid (since we have F ⇒ F and F ⇒ T).

            Neither of these sentences is even untrue : they are just rubbish, because there is no connection between the antecedent and the consequent.

            Examples like the above only go to show how foolish it is to completely ignore meaning when we are setting up a logical system. The rules governing, say, embroidery or bridge are neither here nor there : they are ‘meaningless’ and none the worse for it. But logic is not embroidery, since it can, in principle, have considerable bearing on the decisions we actually make, such as, for example, whether a country is a potential threat to us, and in consequence whether we should go to war or not.

            Logic teaches you how not to contradict yourself. But why not contradict yourself if you feel like it? One answer is that this frustrates the main purpose of speech, which is to communicate with other people. But there is a second reason which is much more significant. We insist on non-contradiction in logic and mathematics because Nature actually is non-contradictory (at the macroscopic level anyway) : it (Nature) obeys a very important principle which I have baptised the Axiom of Exclusion, “An event cannot at one and the same time both occur and not occur at the same spot”. Without this assumption science would be impossible, for there would be no point in working out, for example, that an eclipse of the sun was going to take place at such and such a locality if it were simultaneously feasible for it not to take place there3. Logic is, or should be, the faithful servant of reality rather than the legislator of what is and is not : the Axiom of Exclusion is the justification for, not the consequence of, the logical rule (in bivalent logic) that “A proposition cannot at one and the same time be true and false”.


            Of course, if the reality we are modelling is inherently  fluctuating and ambivalent it is a mistake to make the symbolic system too cut and dried because it will not fit the original. This is basically the reason why literature is able to give a far more convincing picture of actual human behaviour — which seems to be incurably irrational — than bio-chemistry. The maddening ambiguity and vagueness of language — as the mathematician sees it — become assets if we are dealing with a shifting,  inconsistent reality.




The Buddhist logician Dharmottara considered that


            “There can be no necessary relation other than one based on Identity or Causality.” 4     


            This is admirably concise so let us apply it to mathematics. If causality has nothing to do with mathematics, which is the usual view, this means that mathematics is entirely based on ‘Identity’, i.e. it is all one vast tautology.

            This seems to be true of mathematical formulae such as those for summing the figurate numbers.  


            0                    0000                 00000
            00         +         000         =       00000
            000                  00                   00000
            0000                 0                    00000


            Since the above is perfectly general, we can conclude that two copies of the sum of the natural numbers commencing with unity fill a rectangle with one side equal to the greatest natural number of the sequence and the other side equal to that number plus an extra unit; the sum itself is thus half the rectangle, i.e. 1 + 2 + 3 + …+ n = n(n+1)/2. Causality as such does not seem to be involved.
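The rectangle argument is easily verified by brute force; a short Python sketch (names are mine, added for illustration) :

```python
def triangular(n):
    """The sum 1 + 2 + ... + n, built up term by term."""
    return sum(range(1, n + 1))

# Two copies of the triangle fill an n-by-(n+1) rectangle,
# so the sum is half the rectangle's area: n(n+1)/2.
assert all(2 * triangular(n) == n * (n + 1) for n in range(1, 1000))
print(triangular(4))  # 10, i.e. half of the 4-by-5 rectangle in the diagram
```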

            But proofs by rearrangement, though the most convincing of all proofs, are not that common in modern mathematics : one reason why infinite series are such a minefield is that rearrangement can radically alter the value of the series, the most notorious case being that of log 2 (see Note 5).
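The danger of rearrangement can be seen numerically. A Python sketch (my own illustration, not from the article) : the alternating harmonic series converges to log 2, but the same terms rearranged, one positive term followed by two negative ones, converge to half that value.

```python
import math

def alt_harmonic(n_terms):
    """Partial sum of 1 - 1/2 + 1/3 - 1/4 + ... (converges to log 2)."""
    return sum((-1) ** (k + 1) / k for k in range(1, n_terms + 1))

def rearranged(n_blocks):
    """The same terms rearranged: 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...
    (one positive term, then two negatives; converges to (log 2)/2)."""
    return sum(1 / (2 * j - 1) - 1 / (4 * j - 2) - 1 / (4 * j)
               for j in range(1, n_blocks + 1))

print(round(alt_harmonic(100000), 4))   # close to 0.6931, i.e. log 2
print(round(rearranged(100000), 4))     # close to 0.3466, i.e. (log 2)/2
```

Since the series is only conditionally convergent, Riemann's rearrangement theorem says it can in fact be made to converge to any value whatever.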

            But what about mathematical induction? Here there is a definite sense of compulsion : if such and such is true for n, it must be true for (n+1). Certainly, mathematical induction is not mere rearrangement : there is a sequential element which reminds us unmistakably of a bona fide causal process, steam forcing a piston along the inside of a cylinder whether it wants to go there or not. A large number of functions — all? — can be defined by recursion rather than analytically, and this often seems a much more natural way of doing things. But definition by recursion is very different from analytical definition n → f(n), because in the recursive process a function is built up piecemeal instead of being there in its entirety from the word go. Philosophically speaking, analytical definition is being, definition by recursion becoming.
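The contrast between the two styles of definition can be made concrete; a hypothetical sketch in Python, using the sum of squares as the example :

```python
# Analytical definition ("being"): a closed form, there in its entirety
# from the word go
def sum_of_squares_closed(n):
    return n * (n + 1) * (2 * n + 1) // 6

# Recursive definition ("becoming"): each value forced into existence by
# its predecessor -- exactly the inductive step from n to n + 1
def sum_of_squares_rec(n):
    return 0 if n == 0 else sum_of_squares_rec(n - 1) + n * n

# The two definitions agree everywhere we care to check
assert all(sum_of_squares_closed(n) == sum_of_squares_rec(n) for n in range(200))
```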



            For Plato actual lines and circles were imitations of ideal states of affairs and the imperfect nature of the sublunary world meant there might  occasionally be some slight deviations and discrepancies (a tangent drawn in the sand or on papyrus might well touch a circle at more than one ‘point’ for example). For Newton and Kepler what happened down here was wholly dependent on the prior decisions of a mathematical God, and we still seem to think like this a lot of the time which is why we still talk of the ‘laws of Nature’ — we do not speak of  ‘the observed regularities of Nature’. God may not have known, i.e. not bothered to work out in detail, all the particular consequences of his original handful of edicts, but then again He didn’t need to. So long as the original laws were basic and far-reaching enough, the world could be left to take care of itself. There is causality of a kind here because there is compulsion : rocks, plants and animals have no choice but to comply with the rules and even man, though he has free will, remains constrained in his physical being. But, according to this paradigm, the causality we find in Nature is not itself ‘natural’ : it has a supernatural origin and purpose.

             In the classical (post-Renaissance) world-view there is no real difference between physical and mathematical law, between pure and applied mathematics, so the same schema applies. God determined the axioms and everything else is theorems. But today we no longer believe in an omnipotent intelligent Creator God (most of us anyway), so the ‘laws of physics’ and ‘laws of mathematics’ revert to being something rather similar to Platonic Forms, existing out on a limb. This does seem to remove causality from mathematics and physics unless we view the way in which phenomena model themselves on ideal states of affairs — how an actual gas approximates to the behaviour of an ideal gas, for example — as a sort of watered down ‘formal causality’. In the Judaeo-Christian world view, which was that of Kepler and Newton, everything hinges on actions and decisions made ‘in the beginning’ : someone (God) had sometime in the distant past ‘divided the light from the darkness’ and distinguished primes from composite numbers. But Platonic Forms and mathematical formulae simply are; they do not do anything.




Supposedly, the whole of mathematics can be derived from the half dozen or so axioms of von Neumann or Zermelo Set Theory (see M500 206 p. 20). But no one ever sat down of a night to see what interesting theorems he or she could derive from them : they are strangely remote, like mountains one sees in the distance but which are utterly unrelated to life down here in the plains.

            But then most modern mathematics has an insubstantial air : the very way in which we are taught to do our mathematics, to consider the basic entities and procedures, inclines us towards a view of the world where nothing really happens. The old-fashioned viewpoint, still very much in evidence in school textbooks of the pre-war era, goes rather like this. We have a numerical or geometric entity, we do something drastic to it, multiply it, chop it up into bits, rotate it &c. &c. and then we see what we are left with. The modern way is to ‘map’ certain values to certain other ones : we make up two sets  (a, b) and (a’, b’) selected according to a rule. Everything exists in a sort of eternal present and we merely move around looking at what’s here and comparing it with what’s there. The idea of an unknown which by dint of intelligent manipulation gets transformed into a known is both intuitively clear and exciting  : it is like  working out the  identity of Mr. X from circumstantial evidence and witness statements. But the idea of a variable is quite different : somehow x has all possible values at once (usually an infinite number) each of which incidentally is a constant. Also, in the real world effects always succeed causes which means, mathematically speaking, that the dependent and independent variables are not freely invertible — precisely what we are told to assume in Calculus. Examples can be multiplied endlessly….

            What nobody seems to have noticed is that the two dominant tendencies in modern mathematics, the axiomatic approach and the analytical, ‘functional’ presentation (which is essentially Platonic), are pulling mathematics in two completely different directions and may well eventually tear it apart. An axiomatic approach means that deduction is all-important since, not only can everything (or nearly everything if we take Gödel into account) be derived from the axioms, but nothing that is not so deducible will crop up (again pace Gödel). But deduction involves step by step argument, thus temporal sequence; also, there is a strict hierarchy, with certain propositions being much higher up the pecking order, as it were, than others. But the ‘functional’, analytical treatment is, implicitly at least, atemporal and non-hierarchical. All the properties of y = f(x) are there as soon as we have written down the expression, and it is ‘our fault’ if we don’t spot them straightaway. Moreover, all the cross-references between different functions also exist as soon as the functions are properly defined, and in fact prior to their being properly defined (by us). As for some propositions being key ones on which others depend, if everything is already out there nothing ‘depends’ on anything else : it either is out there or it isn’t. There either are odd perfect numbers or there are not. This rather cuts the ground from under the feet of the ‘prove-at-all-costs’ lobby and, moreover, because computers can usually decide whether such and such an assertion is true over the domain that concerns us in practice, it ceases to be so important to know if something is ‘always’ true or not — indeed, some philosopher will shortly come along and tell us that this is a ‘metaphysical question’ and thus not worth bothering about.
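The perfect-number case illustrates the point about finite domains. A Python sketch (my illustration; the article itself gives no code) :

```python
def is_perfect(n):
    """True if n equals the sum of its proper divisors."""
    return n > 1 and n == sum(d for d in range(1, n) if n % d == 0)

# A machine settles the question over any finite domain in moments...
perfect = [n for n in range(2, 1000) if is_perfect(n)]
print(perfect)  # [6, 28, 496] -- all even

# ...but whether an odd perfect number exists at all lies beyond any finite search.
assert all(n % 2 == 0 for n in perfect)
```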

            Currently there are only three theories of mathematics left in the running : formalism, logicism and Platonism. Neither of the first two schools of thought can explain the often amazingly good match between mathematics and physical reality, and, while Platonism does explain this, the metaphysical price to pay is a very high one indeed. Even mathematicians who are not afraid to call themselves Platonists (such as Penrose) fight shy of giving any coherent statement of their philosophic position to the general public. Now logicism and formalism do not recognize causal processes at all, while Platonism admits only a very watered down sort of causality at best. This explains the inevitable demise of causality in the scientific world-view.

            But more significantly, none of these mathematical schools can explain the surprising vitality of mathematics, which never ceases to astonish (and sometimes to alarm). Formalism allows for human invention, since that is what, in the last resort, the whole of mathematics is, but it has little to say about how and why inventions come about. I am so far from being a positivist that I see ‘vital forces’ operating everywhere, not only in the biological and physical domains but also in supposedly abstract areas like pure mathematics and even, in a very rarefied form, in logic. There is perhaps a single unified ‘force of necessity’ which is (almost) tangible in an arrangement of rods and levers and which, in a good mathematical proof, can be sensed thrusting the tortuous argument on to its triumphant conclusion.

            Moreover, this élan vital is surely active in mathematics as a whole, ceaselessly pushing it in new and unexpected directions : mathematics, like technology, has a life of its own and individual mathematicians get dragged along whether they want to go in that direction or not. What is absent from the logicist, Platonist and formalist views on mathematics is precisely a recognition of this vital principle. There is just no driving force in Set Theory : it is a steam-engine that has been cleaned up, varnished and put to rest in a science museum. This is why Poincaré, who was a creative mathematician in a sense that Russell and Whitehead were not, dismissed logicism with the crushing retort, “Logic is sterile but mathematics is the most fertile of mothers”.


References and Notes  


1  See French and Taylor, An Introduction to Quantum Physics, pp. 88-89. The remarkable illustrations show the picture of a girl’s face building up from randomly distributed dots.

            I have conjectured that there is some sort of a law involved : if all the events are specified in advance, their order of appearance need not be, if not all of the events are specified in advance, there must be strict order.  


2 Does anyone, for example, really believe that ‘almost all’ numbers are transcendental? (I remind readers that a transcendental number is a real number that is not the root of any polynomial equation with integer coefficients.) Apart from e and π, I doubt if anyone reading this could produce one without consulting a dictionary of mathematics. On doing this I find that Liouville’s number, 10^(-1!) + 10^(-2!) + 10^(-3!) + ….., is also a member of this highly select (but apparently very well attended) club.


3 The trouble with Quantum Mechanics is that it does not verify the Axiom of Exclusion since it permits a physical system to be in incompatible states at the same time. The Many Worlds version of QM does verify the Axiom of Exclusion, of course, but there is a heavy price to pay in universes. 


4  Stcherbatsky, Buddhist Logic Vol. 1. p. 259.  This is, incidentally, an extremely interesting, readable and, I believe, important book despite its abstruse air. It is more concerned with speculative philosophy than logic as such. The world-view of certain Buddhist thinkers in Northern India during the first few centuries of our era, has a distinctly modern feel  — they would have been quite happy with Einstein’s attempt to describe the physical world in terms of causally related events occurring in a single unified Space-Time field.

            This raises the question of why India didn’t get there first in terms of the scientific revolution. Needham, in discussing the question with reference to China, concludes that the key notion of natural law was lacking. But this was certainly not lacking in India (the law of karma). Maybe these Hinayana Buddhists were too advanced in their conceptions : it was necessary to work  through the cruder scientific paradigm of a world made up of ‘hard, massy particles’ interacting with each other by pushes and pulls before moving on to the vision of evanescent bundles of energy evolving in Space/Time. Also, of course, there was little motivation to develop science as such  :  for a Buddhist the physical world was just not important enough to bother about. 



5 Rearranging and regrouping the alternating series (taking one positive term followed by two negative ones) apparently ‘proves’ that log 2 is half itself :

            log 2  =  1 − 1/2 + 1/3 − 1/4 + 1/5 − ………

                     =  (1 − 1/2) − 1/4 + (1/3 − 1/6) − 1/8 + (1/5 − 1/10) − ……

                     =  1/2 − 1/4 + 1/6 − 1/8 + 1/10 − …….

                     =  1/2 { 1 − 1/2 + 1/3 − 1/4 + 1/5 − ……… }

                     =  1/2 log 2


 NOTE. Acknowledgments to M500 Magazine, where this article originally appeared.


