www.TheLogician.net © Avi Sion. All rights reserved.

Logical and Spiritual REFLECTIONS © Avi Sion, 2008. All rights reserved.
Book 2. A Short Critique of Kant’s Unreason
Chapter 7. How numbers arise
If we pay attention to the ratiocinative acts at the foundations of mathematics[1], we notice the following intentions or mental movements:

First, there is the mental isolation of something from its immediate experiential context, intending “this and not the rest of it”. A part of present experience is focused on, mentally delimited and considered virtually apart from the other parts of present experience. This makes possible the formation of the mathematical concept of ‘one’ (symbolized by a ‘1’), which is the elementary unit at the basis of all subsequent rational activities of computation or calculation.

Second, there is the mental conjunction of a selection of two (or more) such units, intending “this and that (or those)”. The units concerned may all be present in current experience, or some or all of them may have to be brought to mind by memory. This is the basis of the mathematical concept of ‘addition’ (symbolized by a ‘+’), which gives rise to compounds of units, i.e. natural numbers greater than ‘one’ (viz. ‘two’, ‘three’, etc.).

Third, there is the mental identification of two (or more) such selected units, intending or declaring them to be ‘effectively the same’, ‘numerically equivalent’, ‘quantitatively equal’ (this being symbolized by the ‘=’ sign). Equivalence or equality between two items means that either item can in practice be substituted for the other with regard to numerical value or quantity, even though symbolically they may be differently constituted (e.g. as two ones and one two)[2].

Thus, first comes the idea of “1”. From this, we build up the series of natural numbers, two, three, etc., by a succession of additions and equations, each of which relies on the preceding, ad infinitum: 1=1; 2=1+1; 3=2+1; 4=3+1; … etc. This gradual and infinite construction is enshrined and recalled in the process of counting: 1, (+1=) 2, (+1=) 3, (…) 4, (…) 5, 6, 7, … etc.
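The three mental movements just described can be sketched in code. The following is a minimal illustration of my own, not the author's notation (the names `Unit`, `Sum` and `equal` are mine): the unit is an isolated “this”, addition is the conjunction of two counts, and equality is the substitutability of numerical value.

```python
class Unit:
    """The mental act of isolation: "this and not the rest" -- the number one."""
    def value(self):
        return 1

class Sum:
    """The mental act of conjunction: "this and that" -- addition of two counts."""
    def __init__(self, left, right):
        self.left, self.right = left, right
    def value(self):
        # A compound's value is just the joint count of its parts' units.
        return self.left.value() + self.right.value()

def equal(a, b):
    """'Effectively the same': either side may substitute for the other in value."""
    return a.value() == b.value()

# Build the series of natural numbers by successive conjunction of units:
one = Unit()
two = Sum(one, one)      # 2 = 1 + 1
three = Sum(two, one)    # 3 = 2 + 1
four = Sum(three, one)   # 4 = 3 + 1

print(equal(four, Sum(two, two)))  # → True, i.e. 2 + 2 = 4
```

Note that `two` and `Sum(one, one)` are differently constituted symbolically, yet `equal` declares them interchangeable in value, mirroring the text's account of the ‘=’ sign.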
These definitions allow us to work out simple arithmetical inferences or proofs, by way of various appropriate substitutions. For example: 2+2 = 4 is derived from 2+2 = 2+(1+1) = (2+1)+1 = 3+1 = 4. The brackets ‘( )’ are here used to signify our changing mental focus on the numbers involved.

We are now able to invent a fourth useful mathematical operation, viz. subtraction. This formalizes the idea of mental exclusion of some unit(s) from a set of units under consideration. Thus, for example: having two things, removing one, leaves one; this is symbolized (using the ‘minus’ sign, written ‘–’) as 2–1=1. Similarly: 3–1=2; 4–1=3; and so forth (compare to the series of additions and their derivatives).

A special application of this idea of mental removal yields our concept of ‘zero’ (symbol ‘0’). “Given just one thing under consideration, if we remove this one thing, what are we left with?” The answer we give to this question is “zero”, which construct we interpret as the negation of all other numerical concepts. That is, we intend by this ‘number’ an absence of any unit or collection of units (and later even of fractions of units, etc.); this is how we define zero.

In the same manner, the operations of ‘multiplication’ and then ‘division’ can be introduced, and mathematics as we know it can be gradually built up. All this is simple and straightforward enough, and generally well known.

Granting this obvious account, our next question has to be: “is mathematics, then, something ‘subjective’ or ‘objective’?” That is, what is the ontological status of numbers? Historically, mathematics no doubt stems from the material experiences and needs of humans. We can well imagine how necessary economic activities like isolating a cow from a herd[3], or gathering cattle in a pen, or exchanging a cow for some sheep, would in time (as life increased in complexity) give rise to the more abstract ideas of mathematics described above. Raw experience is evidently non-numerical.
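The substitution proof of 2+2 = 4, and subtraction down to zero, can likewise be sketched by treating the defining equations 2=1+1, 3=2+1, 4=3+1 as rewrite rules. This is a hedged illustration of my own (the names `DEFS`, `succ`, `add` and `sub` are assumptions, not the author's):

```python
# Each natural number is defined by its predecessor: n = (n-1) + 1.
DEFS = {2: (1, 1), 3: (2, 1), 4: (3, 1), 5: (4, 1)}

def succ(n):
    """Look up the m defined as n + 1 -- one step of counting."""
    for m, (a, b) in DEFS.items():
        if (a, b) == (n, 1):
            return m
    raise ValueError(f"no definition for {n} + 1")

def add(a, b):
    """a + b by substitution: a + b = a + ((b-1) + 1) = (a + (b-1)) + 1,
    exactly the chain 2+2 = 2+(1+1) = (2+1)+1 = 3+1 = 4."""
    if b == 1:
        return succ(a)
    left, _one = DEFS[b]         # unfold b into (b-1) + 1
    return succ(add(a, left))    # refocus: (a + (b-1)) + 1

def sub(a, b):
    """Subtraction as mental removal; removing everything leaves 'zero'."""
    if a == b:
        return 0                 # "what are we left with?" -- zero
    if b == 1:
        return DEFS[a][0]        # a = (a-1) + 1, so a - 1 = a-1
    return sub(sub(a, 1), sub(b, 1))   # a - b = (a-1) - (b-1)

print(add(2, 2))  # → 4
print(sub(2, 1))  # → 1
print(sub(3, 3))  # → 0
```

The point of the sketch is that every step is a substitution licensed by a prior definition, which is all the “proof” of 2+2 = 4 in the text amounts to.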
It is just a whole, without parts. Nature itself is continuous. It is we, who experience it, who mentally cut the whole into parts, which we mentally regroup in various ways[4]. We do not, however, engage in such acts randomly and arbitrarily, but (more or less) intelligently and rationally.

Without humans (or beings with similar cognitive and volitional powers), there would be no numbers, since the unity of all things would never be put in question. It takes humans to discriminate between various aspects of the variegated whole, and thus distinguish ones within a background of many, and so forth.

But this construction of number is neither entirely objective nor entirely subjective. That is, it is not given in experience in the way of raw data, but it still relies on experience for its formation. Thus, mathematics is not a mere convention, but a mental organization of data in accordance with its observable features. That is to say: without humans (or the like), events like “one plus one equals two” would not actually arise in the world; yet humans cannot say “one plus one equals three” without thereby contradicting their experience. Thus, mathematics is a product of the experience and understanding of human beings, and not some wild fantasy of theirs.

I submit that the above presentation of the theory of numbers is an accurate account of how numbers actually arise in the minds of individuals, and how they actually arose in the course of human history. It is not intended as a mathematician’s description of events, or a psychologist’s, but as an epistemological account. The latter is more fundamental, and fits numerical concepts in with non-numerical ones in the wider theory of knowledge.

Some comments on “modern maths”. I say this of course with “mathematical logic” in mind. In modern times, starting in the 19th century, attempts were made to logically streamline arithmetic, better explain it and fit it into the larger context of our knowledge.
This is naturally a worthy undertaking, but some erroneous presuppositions were made in the course of it and some unjustifiable inferences were drawn from some of the results obtained. We may here mention, as a major example, the work of Gottlob Frege[5].
His goal was to make more explicit all the logical steps involved in the
development of arithmetic. But in pursuing this commendable goal, he thought he
was engaged in purely formal and deductive acts, and did not realize the extent
to which he was actually depending on experience and conceptual insight.

This is evident, for example, in the way he developed cardinal numbers,
defining them by reference to disjunctions of members of a set (if I am not
mistaken). A set with one disjunction would have two members, one with two
disjunctions would have three members, and so forth. Now, this is ingenious, and
seems to reduce numerical development to a series of purely logical statements,
relying only on abstract concepts like identity and difference, sets and
members, and disjunction (i.e. negation of conjunction).

But we can ask many questions. First, does this way of presenting things
correspond to the way humans actually conceptualize numbers, as individuals and
collectively in history? The answer is, I suggest, no. Frege’s system may well
be interesting to logicians and mathematicians as an abstract ex post facto
ordering of mathematical knowledge acquired till then, and as a way to develop
new mathematical knowledge, but it is essentially an artificial and recent
construction.

It is more an abstract game than a description of how human knowledge of
numbers occurs in practice. Such a construct cannot be said to radically
discredit and displace the arithmetic notions that precede it. This would be
committing the genetic fallacy, i.e. forgetting the debt owed to what came
before in the human mind. It is only because we have already assimilated numbers
that we are now able to play around with them the way Frege does.

Secondly, to fully understand this, and agree with it, consider when,
where and how the concepts Frege uses in his system arise. It is a
misrepresentation to think that he is functioning on some entirely abstract and
mechanical plane, without appeal to experience or inductive reasoning. Does he
anywhere reflect on how we have knowledge of identity and difference, sets and
members, and disjunction (i.e. negation of conjunction), in specific situations
and in general?

If we do reflect on these issues, a bit more deeply than Frege ever did,
we quickly realize that these concepts are not so simple and primary. Identity
and difference involve the cognitive acts of comparison and contrast, which rely
on experience and on its subdivision. This indeed occurs before the number one
(1) is first grasped, as above mentioned; but the point made here is that it is
not as instantaneous and mechanical as Frege imagines.

The concepts of sets and members, and even that of negation, are
effectively taken by him as primaries, whereas they require much, much study and
reflection to understand and use. It is true that children routinely grasp them
enough to use them, but that only goes to show the native intelligence of the
human species. In truth, if we want to “formalize” their thinking in the way
Frege tries, we have to first clarify the genesis of such abstractions in
detail.

Why, for instance, refer to disjunction, when conjunction precedes it in
knowledge? We only understand disjunction by negation of conjunction. So to
explain numbers through conjunction as above proposed, is the more natural way
– and this is the way real humans proceed. I do not pretend to be a
mathematician, but I feel safe in saying that a mathematical system based on
natural numbers developed by successive conjunction of units is just as good as, if not better than, Frege’s contraption.

It does not follow from such reflections that mathematics is something
psychological, i.e. that we need to study psychology to get to mathematics, or
anything of the sort. What it does mean is that if you study “mathematical
logic” without asking and answering the deeper epistemological questions, all
you will have is an abstract construct.

You should not, thereafter, boast of having done away with or replaced
epistemology; all you have done is ignored it. You have flexed your muscles in
manipulation of symbols, but you have not demonstrated insight into how you
actually functioned while doing so. Your “proofs” are then just superficial
processes, which do not take into account every assumption hidden within them.

This is not intended to mean that so-called mathematical logic is not objective, i.e. is necessarily divorced from reality in some significant way. What it is intended to mean is that such studies occur in a sandbox, without proper awareness of the wider world.

[1]
See also my earlier comments on this topic, in Phenomenology (II.4 and VIII) and in Ruminations (9.10).

[2] And of course, even though the things we may have in mind behind the figures, e.g. apples and oranges, may be different in various respects.

[3]
To eat it, or give it away, or trade it.

[4] When I say mentally, I mean by projecting divisions and groupings; i.e. imagination or even hallucination is used.

[5] Germany, 1848–1925. Though Frege did not himself draw any larger philosophical conclusions from his results, his contemporary Bertrand Russell did so.
