A Philosophical Introduction to The Foundations of Elementary Arithmetic
by Andrew Boucher
v1.03 Last updated: 1 Jan 2001 Created: 1 Sept 2000
Please send your comments to abo

Introduction

As it is currently used, "foundations of arithmetic" can be a misleading expression. It is not always, as the name might indicate, being used as a plural term meaning X = {x : x is a foundation of arithmetic}. Instead it has come to stand for a philosophico-logical domain of knowledge, concerned with axiom systems, structures, and analyses of arithmetic concepts. It is a bit as if "rock" had come to mean "geology." The conflation of subject matter and its study is a serious one, because in the end, one can lose sight of what one should be doing in the first place. Perhaps it is taking matters too literally, but it seems that there is something to be said for taking the term to represent X. Doing so and accepting the term to have some kind of significance, it is then natural to focus on the question of what a foundation of arithmetic should be; and, if one exists, what one is. Whatever the case, that is what shall be done in this paper.

Part I: What should a foundation of arithmetic be?

Like most concepts, that of a foundation of arithmetic never received some clear, initial definition before it came into wide currency. Thus there is no standard or model, and admittedly it is unlikely that any exegesis of such a term can be perfect, much less universally accepted. Nonetheless, even without resorting to historical verification, it is plausible that the notion of a foundation of a mathematical discipline began as a metaphor, with the analogy of a building. Metaphors, like concepts, are rarely clear-cut, but again it is probably not too far off to point to two characteristics of a building's foundation as being crucial: it supports what is on top of it; and it is the most fundamental part of the building, with nothing else supporting it. Whatever the case, assume these two qualities are the essential ones.

In mathematics what gives support are the axioms and rules of deduction. For they are directly or indirectly responsible for every proof, and so to speak are holding up the rest of the building. Even the primitive concepts and grammar of one's language, which are obviously basic and important to a theory, do not seem to be actually part of the foundation; they are more like the glue, nails, and structure--clearly crucial, but not relevant to the particular subject matter at hand.

So a support for arithmetic must be a set of axioms (and rules) sufficient to prove all its theorems. But not all such sets are supports. For one thing the set should not be inconsistent, nor should it contain a falsehood. Indeed, returning to the metaphor, a set built on sand (as Weyl put it) does not support. A foundation must be firmly established on solid ground. That is, for a set to be a support, every one of its axioms must have some justification, some reason why it should be asserted.

Remark that there is a certain amount of tension between the two qualities just identified as essential for a set of axioms to be a foundation: a support must have a justification, and it must be fundamental. To be fundamental, an axiom's justification cannot itself be another assertion, since otherwise this new assertion would be an axiom providing a deeper foundation. That is, if one were to justify an axiom A by saying proposition B implies it, then one should simply replace A with B to have a deeper set. Fortunately, there are other ways of providing justifications. For instance, to say that an axiom is intuitively obvious provides it with a justification but not a support, since there is no proof intended or implied.

Before saying what holds arithmetic up, it is obviously necessary to decide what arithmetic is. One could take a large view, and say that it consists of every true sentence which can be written using arithmetic terms (such as addition and multiplication), and that a foundation must prove every such truth. But in so doing, one of course runs up against Godel's Incompleteness Theorem, with the resulting implication that there can be no foundation for arithmetic in systems of a certain, mechanical nature. Since such a system is a fair way of ensuring that all assumptions are clearly specified, there is no way to avoid the conclusion. Hence a large view of arithmetic, even if it may be the most natural, is also a sterile one for a foundationalist.

However, it does seem a stretch to classify Godel's theorem, even when it is transcribed in arithmetic notation, as arithmetic. It seems more appropriate to limit the subject to--shall we say it?--simpler propositions, such as 2 + 2 = 4, the Commutative Law of Multiplication, and--even here it seems one is trespassing on a domain more properly called "number theory"--the existence of an infinite number of prime numbers. In any case, to obviate sterile semantic arguments, let us call this more limited (admittedly not well-defined) domain elementary arithmetic. It can be asked, without now receiving an immediately negative reply, whether a foundation for elementary arithmetic exists and, if so, what one is.

It hardly seems likely that the existence of a foundation can be held to be intuitively obvious. If it were, the same intuition would probably say as well that there is a foundation for all of arithmetic, which again by Godel is false. The only way to justify the affirmation, it seems, is actually to construct and produce one. At least, construction would clearly settle the question.

Remark on the indefinite article of the original interrogation: the question is what a, and not the, foundation is. Granted this goes contrary to the metaphor, since a building has one and only one foundation. But given the manner in which the term has been explained--support without support--, there does not appear to be any argument for uniqueness. Certainly there are different ways of formulating what is basically the same assertion, so at the least uniqueness holds only if one has a notion of when axioms are "basically similar." In brief the manner in which the query has been posed is intentional and appears correct. And it is the question of existence to which the present investigation will be directed.

Admittedly "fundamental" is inherently a woolly and vague concept. Even explaining that a set is fundamental if it cannot be proven from a deeper set helps only a little, since "deeper" is itself also ambiguous. Clearly a set with superfluous members is not fundamental. But simply the lack of superfluity does not guarantee depth. Perhaps the best way of putting it, as was already suggested above, is whether one would justify axioms in the set by saying other propositions imply them--if so, they are not fundamental.

As we have just put it, being fundamental is not related to being more certain. "2 + 2 = 4," which presumably appears certain to almost everyone, is nonetheless not fundamental. At least, it feels on reflection that it is true because other propositions entail it. The aim of a foundation is to provide understanding, not certainty.

Whatever the precise sense of the terms, it does seem that nothing can be more fundamental than an axiom which just explains what an arithmetic term means. There can't be anything more basic than, for instance, the assertion tying zero to the situation where there are no objects, or addition to the putting together of objects. By explaining the essential idea, one can be assured that no assumptions about zero or addition can be deeper. Moreover, it is reasonable to suppose that a foundation must be statable in such terms, that somehow zero must be tied to the situation where there are no objects and addition to the putting together of objects, because otherwise the supposed foundation could presumably be proven from axioms which did make the connection, and so which would be considered deeper.

Unhappily, it may turn out that some concepts--for example, that of "natural number" itself--do not lend themselves to such essential explanations. One then states an axiom or axioms, which are taken to be fundamental only because no alternatives appear more so. So there can never be confirmation that certain axioms are in fact the deepest. Refutation is always possible, as there is always the chance that a new, deeper axiom will be proposed. While less than ideal, such a situation should not detract from the attempt to state a foundation, since even one step on a long journey is nonetheless an advance.

Part II: What are not foundations?

Peano's Axioms themselves, by their very simplicity, suggest a foundation. Alas, they are not fundamental enough, since they do not tie 0 to the situation of there being no objects, nor do they explain addition by reference to the putting together of objects. In brief they are stated on a higher level than a foundationalist desires. They support, but they can be supported by something deeper.

The best known alternatives are ZF set theory and Frege Arithmetic. ZF defines 0 as the empty set, 1 as the set of the empty set, addition as the union of disjoint sets, and so on. Since the empty set represents no objects and set union is a way of putting things together, ZF satisfies, for zero and addition at least, the demand to be fundamental.
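
To make the point concrete, here is a minimal sketch (in Python, purely illustrative; the names are mine, not the paper's) of the usual set-theoretic codings, with addition modelled by counting the union of disjoint, tagged copies:

# Illustrative only: finite set-theoretic codings, using frozensets for sets.
empty = frozenset()                 # 0 as the empty set
one = frozenset({empty})            # 1 as the set of the empty set
two = frozenset({empty, one})       # 2 in the von Neumann style: {0, 1}

def card_add(a, b):
    """Addition in the cardinal style: count the union of disjoint (tagged) copies."""
    return len({(0, x) for x in a} | {(1, y) for y in b})

assert card_add(one, one) == len(two)   # 1 + 1 = 2, at the level of counting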

The technique of Frege Arithmetic, although older than ZF since it originated (as its name suggests) with Frege, experienced a long period of neglect and was only recently revived, by Crispin Wright in the early 1980s. Central to the theory is a proposition known as Hume's Principle (a historically inaccurate name, according to Michael Dummett), namely that

"The number of x = the number of y if and only if there is a 1-1 correspondence between x and y."
Following Charles Parsons, Wright observed the remarkable fact that the inconsistent axiom in Frege's system (Axiom V) is only needed to prove HP. Afterwards, Axiom V may be forgotten, since HP alone, with the rest of second-order logic, suffices to prove Peano's Axioms. Frege Arithmetic (FA) is just second-order logic plus HP, and George Boolos, who along with Wright and Richard Heck led the research in this field, baptized as "Frege's Theorem" the fact that FA implies Peano's Axioms. The name is well chosen, since Frege can in fact be considered the first to have proven it. Although to my knowledge Crispin Wright never advocated the thesis that, using our terminology, FA is a foundation for elementary arithmetic, it is still a natural conjecture, and has many common points with Wright's own concerns.
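
In the finite case, at least, what HP asserts can be checked directly. The sketch below (illustrative Python; the helper name is my own) searches by brute force for a 1-1 onto pairing and confirms, on small examples, that one exists exactly when the counts agree:

from itertools import permutations

def has_bijection(xs, ys):
    """Brute-force search for a 1-1 correspondence from xs onto ys (finite case only)."""
    xs, ys = list(xs), list(ys)
    for ordering in permutations(ys):
        pairs = list(zip(xs, ordering))
        one_one = len({b for _, b in pairs}) == len(pairs)
        onto = {b for _, b in pairs} == set(ys)
        if len(pairs) == len(xs) and one_one and onto:
            return True
    return False

# HP, restricted to small finite collections: the counts agree exactly when
# a 1-1 onto correspondence exists.
assert has_bijection({'a', 'b', 'c'}, {1, 2, 3})
assert not has_bijection({'a', 'b'}, {1, 2, 3})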

So here are two strong candidates to be a foundation: ZF on the one hand, FA on the other.

As usually stated, however, neither is in fact a foundation, because despite appearances neither is even about arithmetic. In order to have the look of arithmetic, both stipulate that arithmetic symbols are to mean set-theoretic or logical concepts. But one shouldn't and can't be deceived by appearances, and ZF is nothing more than a type of set theory, and FA nothing more than a variety of second-order logic.

Indeed, it is a philosophical truism (and a point of Paul Benacerraf, among others) that you can't stipulate what a concept means, but only what a symbol or a word means. A concept is what it already is. It may be explicated, but cannot be dictated to mean something (else). While we can stipulate that the word 'pig' will mean "creatures with wings," we can't stipulate that what the word 'pig' actually now means (the concept pig) is to be the same as "creatures with wings." More baldly, while the word 'pig' can be stipulated to mean "creature with wings," pigs cannot be stipulated to fly. If one really wants the word 'pig' to mean "creature with wings," then of course one can insist that it be so, but then one must forget the usual meaning of the word. In particular, statements about these so-called pigs are no longer about animals which go oink. The stipulation has changed the subject matter.

Stipulations should only be used when the term to be defined is inconsequential. A good test for this is if it can be replaced by a nonsense word, such as 'plok.' For instance, instead of 'prime number', we might define 'plok' to mean "natural number other than 1 divisible only by 1 and itself". When one goes on to prove there are an infinite number of ploks, no one will be seriously alarmed, because the fundamental content of the assertion has not changed. Having passed the plok test, the definition of 'prime number' is a legitimate stipulation. However, ZF's definition of the term '1' depends crucially on how the symbol is currently used. It needs the symbol to be '1' in order to mislead its audience into thinking that the resulting assertions are actually about the number one and are part of arithmetic. Failing the plok test, it is therefore not a stipulation, but can only, at best, be considered an analysis of a concept. The difference is important, because while stipulations cannot be challenged--they are neither true nor false--, analyses can be wrong, and misanalysis is possible. ZF's definitions--analyses of mathematical concepts in terms of set theory--can therefore be contested. At the least, they should be stated as additional axioms, so those who do want to discuss, question, or disagree with them are allowed this option.

The same is true of FA. While the status of Hume's Principle is the subject of philosophic dispute, Wright does sometimes speak of it as if it were a stipulation. Worse, when one defines what it means to be a natural number (using the Fregean technique of the ancestral of the successor relation), it is not stated as an assumption. But again, it can be contested that the analysis is correct, and so the definition should be an axiom.

For the moment we are quibbling, because of course it is possible simply to add axioms which have hitherto been held to be stipulative definitions, and reclassify them, this time correctly, as analyses which can be disputed. Two questions then appear: Are the analyses correct, and can we justify them? Answers to these will go a long way to addressing our original concern, namely whether these extensions of traditional ZF and FA (call them ZF* and FA*), which contain axioms rather than stipulative definitions, are foundations of elementary arithmetic.

In the case of ZF*, there are good reasons to say the analysis fails. There is, first of all, the dilemma noticed by Benacerraf. ZF* has different options as to how to define the number 2, for instance either as {0,{0}} or as {{0}}. How can one choose when both appear equally good candidates? Since the two options are provably different (in ZF!), 2 cannot be both. It seems, then, that 2 cannot be either.

This is not quite a knock-down argument, in the sense that it is conceivable that 2 just is (say) {0,{0}}, even if there is no way to explain why it is so. But since there is no explanation, there is no justification, and so 2 = {0,{0}} cannot serve as part of a foundation.
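
The dilemma is easy to exhibit concretely. In the illustrative sketch below (Python, names my own), the two candidate codings of 2 are simply different objects:

# The two rival set-theoretic codings of 2 (illustrative only).
empty = frozenset()
one = frozenset({empty})
two_von_neumann = frozenset({empty, one})   # {0, {0}}
two_zermelo = frozenset({one})              # {{0}}

# Both have a claim to "be" 2, yet they are checkably distinct objects.
assert two_von_neumann != two_zermelo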

Secondly, Occam's razor is not kind to ZF* (and just plain old ZF), whose ontological implications are, simply put, quite monstrous. It populates the universe with an unbelievable number of entities, far more than what a foundation of elementary arithmetic seems to warrant. If we need all that to provide a justification for arithmetic, then it hardly seems worth the bother.

Thirdly, we may ask what justifies the axioms of ZF. There are three major proposals:

1) the "naive" idea of "set", restricted to sets which are not too "big";
2) the iterative concept of "set"; and
3) what is probably best called ZF-works-and-that's-all-that-matters.
Let's take each in turn.

The first essentially accepts the "naive" concept of set but locates the error which leads to the paradoxes in the fact that sets which are too big are not really sets, or perhaps are sets which cannot belong to other sets, as Michael Hallett explains in his fine book on the subject. What bigness has to do with it is not really explained, except perhaps in Cantor's theological meditations about the "absolutely infinite", which hardly seem to count. To point out that otherwise there is a contradiction is of course just the case of the dog chasing its tail. What, after all, is it in the concept of "set" that excludes the set of everything? And why shouldn't the set of everything belong to itself? The existence of a universal set is, of course, not self-contradictory; contradiction results only when ZF's Axiom Schema of Separation, or some such, is added. Although it takes us too far from our course to argue here, philosophical arguments tend to show that the paradoxes result from Separation--and impredicativity and viciousness--and not the universal set. (See "A Comprehensive Solution of the Paradoxes.")

The second theory advances the idea that sets are formed in stages; at any particular stage, one puts together a set from sets already formed in previous stages. For instance, Dana Scott famously showed that the axioms of ZF (minus Replacement) can be proved in a system which essentially formalizes this idea of iteration, and some philosophers advance this as a justification for ZF. Note that if this is so, then ZF cannot be a foundation for arithmetic. For its axioms would be justified by other propositions, so they are not fundamental, and only Scott's system could be a foundation. Be that as it may, we still would need to know what justifies the iterative theory. While it can perhaps be called "natural" (as e.g. Boolos does), it hardly seems evident or implicit in the concept "set". The naive concept is evident, or would be evident if it were only consistent--that, after all, is why we call it "naive".

Another difficulty with the iterative concept is that it depends for its intuition on the idea of "forming" sets at particular stages. And, while finite stages seem to be permissible, there seems to be no iron-clad argument to explain why there should be a stage after all the finite stages.

So, even if the iterative concept is not ad hoc in the strictest sense of the term, it does seem to be ad hoc in spirit, by not having any sort of independent justification. Indeed, since the iterative concept effectively bans the universal set as well, the same criticisms which were lodged against the first justification apply here as well.

Which brings us to the third type of justification--ZF works. This argument goes as follows: ZF proves all or almost all of mathematics, and since mathematics is used in the sciences, the success of science serves as a justification of the axioms. There may be some sense to these observations in other contexts, but they have none in the search for a foundation of arithmetic. If arithmetic is the justification for its own foundation, then we are caught in the vicious circle of the house supporting the foundation, which is supposed to support the house. Secondly, if science is not supposed to hypothesize unnecessary entities, neither, it would seem, should the mathematics which is used in the sciences. And as we have already noted, ZF's ontological needs are vast.

FA* fares better than ZF* on the first objection raised above. For while it does permit multiple definitions of 0, they are different without being inconsistent. To see this, let "(Nx : phi)" be read as "the number of x such that phi." Then FA* might state as an axiom either

(a) 0 = (Nx : ~x = x); or
(b) 0 = (Nx : ~ there exists y s.t. y = x).
Unlike the analogous situation with ZF*, (a) and (b) are not inconsistent with one another; indeed, in FA* they are provably equivalent. So with FA*, we are not faced with an undecidable choice, and consequently we are not forced to eliminate both options.
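
The point can be illustrated over any finite domain. In the toy check below (illustrative Python), the two defining conditions both have an empty extension, so both pick out the number of nothing:

# Toy check over a small domain: the conditions in (a) and (b) are both empty.
domain = ['a', 'b', 'c', 1, 2]
ext_a = [x for x in domain if x != x]                              # ~ x = x
ext_b = [x for x in domain if not any(y == x for y in domain)]     # ~ [y] y = x
assert ext_a == [] and ext_b == []   # both conditions are satisfied by nothing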

FA* uses traditional second-order logic, which, while never explicitly advanced as such, can be seen as a solution to the paradoxes. After all, completely unrestricted, a logic with predicates suffers from a version of Russell's paradox, via the predicate which does not satisfy itself. Formally,

Q[P] <=> not P[P].
Second-order logic avoids contradiction by dichotomizing the universe into thing and predicate. Obviously, there is a grammatical difference between a predicate ("is young") and a referring term ("Thomas"), but (following Quine) second-order logic effectively eradicates this dichotomy when it allows quantification over predicates. "For all predicates" sounds an awful lot like "for all things." Indeed, in order to quantify over "P" in formulas like "Px", the latter must be read as something like "x satisfies the predicate P," so that, for instance, "(x)(P)Px" can be read as "for all x and all P, x satisfies P." That is to say, the only grammatical predicate in second-order logic is "satisfies," and the predicates P are just a special type of thing. So big letters (predicate variables) are in effect a subclass of small letters (thing variables), and one should always be able to substitute big letters for small. For instance, "everything is identical to itself" should normally be permitted to imply that "P is identical to itself," for any particular predicate P. This is not a minor point, since then full-scale comprehension, which second-order logic assumes, leads to a contradiction. In summary, there is an incoherence in second-order logic: between its need to quantify over predicates and its dichotomy of the universe into thing and predicate. Resolving this in the most natural way leads back to the paradoxes. So traditional second-order logic finds itself between a rock and a hard place.
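
A naive rendering of the worry treats predicates as one-place Boolean functions (illustrative Python, not a formalization): the predicate "does not satisfy itself" has no stable truth value when applied to itself.

# Q[P] <=> not P[P], read naively with predicates as Boolean functions.
def Q(P):
    """Holds of a predicate exactly when that predicate does not hold of itself."""
    return not P(P)

# Asking whether Q holds of Q would require Q(Q) to be True exactly when it is
# False; operationally, the call Q(Q) just recurses without ever settling.
# (Uncomment to watch Python give up with a RecursionError.)
# Q(Q)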

Finally, FA* does make a certain number of ontological commitments, obviously far fewer than ZF*, but still significant (for instance, it says that the natural numbers themselves have a number), and still apparently more than arithmetic warrants.

However, there is a yet more critical hurdle to FA*'s claim to be a foundation: HP does not even appear to be true, or at the least, its truth is so questionable that it is perverse to say it can serve as a justification for elementary arithmetic. To be clear, HP is, as stated above,

"the number of x is the number of y if and only if there exists a 1-1 relation from x onto y" (HP).
Anyone living before the Cantorian revolution, on encountering HP, would have scratched his head and reacted as Galileo did, by saying this couldn't be true, because a counterexample is x = the natural numbers and y = the even numbers. For on the one hand doubling is a 1-1 onto relation between them, and on the other the two sets cannot be equinumerous since the even numbers are properly contained in the naturals. Indeed, there is what might be called a standard (I guess, to be fair, the Cantorians would call it the "naive"!) view of number, that the only how-many numbers are the natural numbers (and infinite sets do not have a number). While one can perhaps argue against this standard view, it is hardly evidently false. After all, people seem to have had an adequate idea of number before Cantor. And if it is not evidently false, then HP, which immediately implies that the natural numbers have a number, is not evidently true.
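
The tension is visible already on an initial segment. The sketch below (illustrative Python) pairs each natural below 10 with its double, one object to one object, even though the evens form a proper part of the naturals:

# Galileo's correspondence, displayed on an initial segment.
naturals = list(range(10))
doubling = {n: 2 * n for n in naturals}    # n -> 2n, a 1-1 pairing
evens_hit = set(doubling.values())

assert len(evens_hit) == len(naturals)     # the pairing is one-to-one
assert evens_hit < set(range(20))          # yet the evens are a proper part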

Even in the finite case, HP does not seem so sure. What happens if there aren't any relationships, or any big relationships, or if there's a funny sort of relationship that we haven't thought about? Of course, one can prove, in second-order logic, that relationships--and the answers to these questions--are as one expects. But it seems unwarranted to assume facts which need to be proved, as part of the justification for one's axiom. At the least, it makes the axiom out to be less than fundamental.

Given that HP does not seem intuitively obvious, one should ask where else its justification may lie. There seem to be three possibilities: (1) it is a generalization from the finite case; (2) it uses a principle of abstraction; or (3) it is a stipulation. Let's take each in turn.

Of course generalizing from the finite to the infinite is fraught with uncertainty, and there are many instances where it simply does not hold. So before one generalizes to get HP, one must somehow provide a justification that it can be done. That is, the request is simply pushed back one step. And, to my knowledge, there is no ready answer to this second demand, which does not also beg the question.

Justifying HP by a principle of abstraction is, in itself, attractive. However, doing so implies that HP is provable from another proposition, which being more general would also be deeper. So FA* cannot be a foundation, only FA + {some principle of abstraction}. The question then becomes, is this latter a foundation?

Now as Dummett has pointed out, abstraction is also the basis of Frege's inconsistent Axiom V, which indeed is the fully general version. So not all abstractions evidently work, and there must be some fine-tuning as to exactly which principle is being contemplated, and why. This Wright has gamely tried to provide. First, the general principle of abstraction is given as,

(F)(G) (Sx : Fx) = (Sx : Gx) iff F ~ G, where ~ is some equivalence relation.
Wright explains--and this is an exact transcription, except for the change in notation--that one is not allowed to make this assertion precisely when an S object falling under some concept, F, of which it is Sx : Fx entails that it falls under every F of which it is Sx : Fx. If the reader comprehends this in one go, he is evidently more capable than I. Since the rule is difficult to understand, and so can only with difficulty claim the status of an intuitively obvious assertion, what can justify our accepting it and thus Wright's (restricted) abstraction principle? That is to say, Wright should not merely construct a rule which separates the "good" from the "bad." The technical accomplishment is only half the work, and largely the easier part. By all rights--even if he is not interested in the foundation question--he should provide an explanation why his rule should be adopted, other than the question-begging reply, "Without it, we get a contradiction." Since Wright entitles his paper "The Harmless Impredicativity of N=," he apparently would not explain his rule by appealing in the direction of impredicativity, and so it is mysterious just how he can pull it out of his philosophical hat.

But suppose even that Wright can correctly explain his rule. Nonetheless, it is inconsistent! As Dummett points out, it lets through Frege's own fix of Axiom V, which is true only in singleton domains. And Boolos has other examples of applications of the rule which force the universe to be finite. Since the abstraction principle also implies HP, which in turn implies the Peano Axioms and hence that the universe is infinite, the restricted principle is inconsistent as it stands.

But even grant Wright his abstraction principle. He still cannot justify that HP, deduced from an abstraction principle, is about natural numbers, because he has no control over what the abstracted entity is. That is to say, the principle allows one to abstract from the equivalence relation "x is equinumerous to y", to an object "the plok of x." But there is no guarantee that the plok of x will be the natural number of x! This is an additional assertion, which somehow must be stated as an axiom in any system which wants to be a foundation for elementary arithmetic or (as Wright merely wants) prove the Peano Axioms. Since it can hardly be asserted to be intuitively obvious, we can again ask what justifies our saying it is so. Indeed, given that it goes against the standard view of number, it seems that it is not even so.

The third possible justification, that HP can be stipulated, certainly takes the principle out of harm's way, because, as we have already remarked, stipulations cannot be challenged. However, as we have argued, the concept "number" cannot be stipulated, only the word. And, if the word is involved, the definition does not change what the concept number is, or what arithmetic is. The definiendum might as well have been "plok" as "number". And had it been "plok", we would all probably agree that the stipulation, of itself, has nothing to do with arithmetic.

Now, as Frege formulates it and Dummett stresses, it is possible to break up HP into two steps:

"the number of x is the number of y if and only if x and y are equinumerous"--
call this HPa--and
"x and y are equinumerous if and only if there exists a 1-1 relation from x onto y"--
call this HPb. HPa appears relatively harmless. HPb, on the other hand, seems to have a certain amount of meat to it, and so again we are entitled to ask what justifies it. Not surprisingly, we are likely to get back the same answers as we did for HP. Generalization from the finite to the infinite no more works here than it did before, for HP itself. Moreover, as this is not a case of abstraction, there can be no appeal to such a principle.

By process of elimination, that leaves the third possibility. And at first glance it seems possible that HPb is just a stipulation. After all, since HP has been broken up into two steps, HPb can be a stipulation without implying that number itself has been stipulated, because HPa intervenes. This is all to the good, since again, we cannot stipulate what number means, unless we want to give up on talking about arithmetic. However, if HPb is just a stipulation of what the word "equinumerous" means, then its appearance in any proposition can simply be removed in favor of what it has been fixed to mean. So HPa then simply becomes HP! And so breaking HP into two serves no purpose, because it hasn't really been divided at all.

Part III: What is a Foundation?

Negativism is all well and good, but let us try to be constructive and sketch--with no claim of originality--what appears to be a foundation of elementary arithmetic. First, as was suggested above, the solution to the paradoxes lies in locating their source in some version of impredicativity or viciousness. An analysis of the situation (see A Comprehensive Solution to the Paradoxes) leads to a statement of comprehension--call it Pred--similar to Weyl's formulation and to what is found in systems like ACA. (Solomon Feferman deserves a great deal of credit for his study of such predicative systems.) To state it, first introduce terms {x,y,z,... : phi} to stand for the predicate of those x,y,z,... which satisfy phi. That is, {x : phi} is a term, {x,y : phi} is a term, and so forth. Then

(x)(y)(z)... ( {x,y,z,... : phi}x,y,z,... <=> phi ) provided phi does not contain either any bound variable or x,y,z,... in the predicate position.

For those who are worried about ontological commitments, the introduction of the terms {x,y,z,... : phi} makes no new claims. It essentially asserts there is a predicate in the case when there is already a predicate (phi), so there are no "extra" entities required. Pred, which makes no ontological assertion, is itself essentially about when a predicate can be assured to be well-behaved.

Now a common objection to a schema like Pred is that it is too weak to allow a definition of the natural numbers N. For instance, Pred cannot be used if phi corresponds to the ancestral of the successor relation, as in

Nx iff (P) ( P0 & (y)(Py => P(y+1)) => Px )
(Induct)
because there is a bound variable, P, in the predicate place. And of course it is precisely Induct which FA uses to define the natural numbers. However, it should be clear to the reader who has accepted the argument in Part II that this is not a real drawback. True, the natural numbers N cannot be defined in this way, but N cannot and should not be defined in any way, shape, or form. It has to be taken as a given, because if it is not, the resulting system is not even about arithmetic. N cannot be defined, and axioms about N cannot be avoided, in any system hoping to be a foundation of elementary arithmetic.
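
For orientation only: what the ancestral clause is meant to capture can be pictured "from below" as the closure of {0} under successor, as in the finite sketch here (illustrative Python, names my own). This is a picture, not a definition within the system, and the point that N must be taken as primitive stands.

def ancestral_closure(start, successor, bound):
    """Close {start} under the successor function, stopping at a finite bound."""
    reached = {start}
    frontier = {start}
    while frontier:
        frontier = {successor(x) for x in frontier if successor(x) < bound} - reached
        reached |= frontier
    return reached

# The things reachable from 0 by successor steps, cut off at 10 for display.
assert ancestral_closure(0, lambda n: n + 1, 10) == set(range(10))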

Another objection is that the system is not sufficiently strong to "do" real analysis. Now granted, the aim of those who study ACA is to see what mathematics can be developed from it, so evidently such a remark may be germane in that context. But it would not make sense in the present one, unless one assumes that somehow real analysis does or should have the same justification as elementary arithmetic. This seems implausible at best, since even on a superficial glance the status of real analysis looks different from the status of elementary arithmetic. Besides, without entities which are real numbers (which, according to the philosophy expressed here, should be assumed rather than "constructed"), it is hard to see how a foundation of arithmetic could be one of real analysis as well. Of course, if real analysis were to fall out of the same wash as elementary arithmetic for free, then I as much as the next person would be perfectly willing to accept the gift. But it should not be a condition of a foundation of arithmetic that it also be a foundation for real analysis. In brief, I know of no argument supporting the great catholic imperative--that all of mathematics must or should emanate from the same single universal source. And until that time (if ever) it is firmly established as necessary, it cannot fairly be used as a bludgeon.

In spite of seeing things very nearly aright on the paradoxes, ACA assumes the Peano axioms. Thus it never ties 0 to the situation where there are no objects, and so on. This is easily rectified, so let's do it! The following system, while not exactly that used in The Foundations of Elementary Arithmetic, is nearly equivalent to it.

Big letters, small letters: big letters can always be substituted for universally quantified small letters. One can suppose the reverse--that small letters can always be substituted for universally quantified big letters (i.e. there is no difference between small and big, and the use of the one or the other is just a stylistic choice)--but it is not necessary and probably better not to for philosophic reasons.

Mn,P means "n numbers the things satisfying P"
(x) means "for all x"
[x] means "there exists x"
{x : phi} means "the predicate of those x satisfying phi" (and is to be considered a "big letter")
Sn,m means "m is the successor of n"
Nn means "n is a natural number"
phi(i\j) substitutes the term i in place of all free occurrences of j. (We presuppose that phi is such that i, upon substitution, does not become bound.)

COMPREHENSION

(x)(y)(z)... ( {x,y,z,... : phi}x,y,z,... <=> phi )
provided phi does not contain either any bound variable or x,y,z,... in the predicate position.

NUMBER OF
(n)(m)(P) ( Mn,P & Mm,P => n = m)
(n)(P)(a) ( Nn & Mn,P & ~Pa => [m] (Nm & Mm,{x : Px v x = a}) )
ZERO
N0
(P) ( M0,P <=> (x) ~Px )
ONE
(P) ( [a](x)(Px <=> x = a) => M1,P )
TWO, THREE, FOUR (up to 4 so that "2 + 2 = 4" is provable)
(P) ( [a][b](~ a = b & (x)(Px <=> x = a v x = b)) => M2,P )
etc.
SUCCESSOR
(n)(m)(P)(Q)(a) ( Nn & Sn,m & ~Pa & (x)(Qx <=> Px v x = a) => (Mn,P <=> Mm,Q) )
INDUCTION
phi(0\n)
(n)(m) ( Nn & Sn,m & phi => phi (m\n) )
----------------------------------------
(n) ( Nn => phi )
ADDITION (+)
(n)(m)(k)(A)(B) ( Nn & Nm & Mn,A & Mm,B & ~[x](Ax & Bx) => (+n,m,k <=> Mk,{x : Ax v Bx}) )
SUBTRACTION (-)
(n)(m)(k)(A)(B) ( Nn & Mn,A & Mm,B & (x)(Bx => Ax) => (-n,m,k <=> Mk,{x : Ax & ~Bx}) )
MULTIPLICATION (*)
(n)(m)(k)(A)(B) ( Nn & Nm & Mn,A & (x) (Ax => Mm,{y : Bx,y}) & (x)(y)(z)(Bx,y & Bz,y => x = z) => ( *n,m,k <=> Mk,{y : [z](Bz,y & Az)} ) )

Remark that the Subtraction axiom does not say what happens when n < m. That is, -1,3,5 (aka 1 - 3 = 5) does not result in a contradiction. Indeed, we can neither prove -1,3,x nor prove ~-1,3,x for any x. Of course, we could state another axiom which would clarify this, but it would not have any important consequences.
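
To fix ideas, here is a toy finite reading of the primitives (illustrative Python, not the formal system itself): Mn,P is read as "the collection P has exactly n members", and the Addition axiom then determines +n,m,k from any pair of disjoint witnesses.

# Toy finite semantics, purely illustrative.
def M(n, P):
    """Read "Mn,P" as: the finite collection P has exactly n members."""
    return len(P) == n

def plus(n, m, A, B):
    """Given disjoint witnesses A, B with Mn,A and Mm,B, return the k for which Mk,{x : Ax v Bx}."""
    assert M(n, A) and M(m, B) and not (A & B), "hypotheses of the Addition axiom"
    return len(A | B)

assert M(0, set())                                 # ZERO: 0 numbers the empty collection
assert plus(2, 2, {'a', 'b'}, {'c', 'd'}) == 4     # 2 + 2 = 4, witnessed by two disjoint pairs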

A crucial proof (as it is in Frege Arithmetic) is the one that there are an infinite number of things, and its cousin, that for any n, there exists a P such that Mn,P. It follows the line of Dedekind's argument: the empty predicate is one thing, the empty predicate together with the predicate "x = the empty predicate" are two things, and so on. That is, the theorem essentially follows from the infinitary presuppositions of the underlying language. Once it is proven that there are n objects for every n, the hypotheses in the axioms for Successor, Addition, Subtraction, and Multiplication become satisfiable.
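
The flavour of the argument can be mimicked with sets standing in for predicates (illustrative Python): each stage adds one new object, namely the collection of everything built so far, so stage n exhibits n pairwise distinct things.

def stage(n):
    """Build n pairwise distinct objects by repeatedly collecting what has been built so far."""
    objects = set()
    for _ in range(n):
        objects = objects | {frozenset(objects)}   # the new object differs from every earlier one
    return objects

# There are n objects at stage n, for each small n.
assert all(len(stage(n)) == n for n in range(6))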

It is possible to state and prove the following two theorems (which usually are definitions): that addition is just repetition of the successor relationship; and that multiplication is just repetition of addition. So significant are they to the theory that they merit the title of the Fundamental Theorems of Elementary Arithmetic. One advantage of taking subtraction as primitive is that it allows one to prove (rather than stipulate) that it is the reverse of addition; this is called the Fundamental Theorem of Subtraction.
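
Read informally, the two theorems say what the familiar recursive clauses say. The sketch below (illustrative Python, not a proof in the system) simply replays those clauses:

def succ(n):
    return n + 1

def add(n, m):
    """Addition as m-fold repetition of the successor step."""
    return n if m == 0 else succ(add(n, m - 1))

def mul(n, m):
    """Multiplication as m-fold repetition of addition."""
    return 0 if m == 0 else add(n, mul(n, m - 1))

assert add(2, 2) == 4 and mul(3, 4) == 12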

Generally, all the theorems of elementary arithmetic, and in particular the Peano Axioms, can be proven in this system. And so here is a foundation.