

Henry Flynt

(c) 1996 Henry A. Flynt, Jr.


For many years, my work has called for a reorientation of mathematics and logic which diverges from the way these disciplines have been directed since the ancient Egyptians--or the old stone age, for that matter. My campaign has proceeded on many fronts; this is not the place for a full inventory of my work. Because I advocate a reorientation of the entire historical direction of the disciplines, I do not consider the future of mathematics and logic to be primarily a professional question.

The scientific community is deeply committed to a view of its own destiny which is well articulated by theoretical physicists. Historically, science is a series of commitments to mathematical apparatuses which, once they are established, are endlessly elaborated, but never discarded. One builds on Newton, Maxwell, etc., by recycling them; one never repudiates them.

In pure mathematics, the equivalent to this stance is that nobody wants to change the decision for the infinity of primes or the irrationality of √2 which was made at the outset of rational mathematics. These tenets are held to be valid by the latest, "Left-wing" standards--and to be the source and guiding light for all that followed them in mathematical history. The profession does not want the Greeks--who adopted the elementary theorems on the basis of elementary proofs--to have taken any other course.
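The first of those decisions can at least be exhibited as a finite computation: from any finite list of primes, Euclid's construction manufactures a prime missing from the list. A minimal sketch (the function names and the trial-division helper are my own illustration, not anything in the historical sources):

```python
def smallest_prime_factor(n):
    """Return the smallest prime factor of n >= 2 by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n has no factor up to its square root, so n is prime

def prime_not_in(primes):
    """Euclid's construction: any prime factor of (product of the list) + 1
    leaves remainder 1 on division by each listed prime, so it is new."""
    product = 1
    for p in primes:
        product *= p
    return smallest_prime_factor(product + 1)

print(prime_not_in([2, 3, 5]))     # 31, since 2*3*5 + 1 = 31
print(prime_not_in([2, 3, 5, 7]))  # 211, since 2*3*5*7 + 1 = 211
```

Running the construction on its own output forever is exactly the "elementary proof" that the primes cannot be exhausted.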

Citations concerning physics will serve to illustrate the scientific consensus. First, Charles W. Misner.

Physicists expect to continue teaching the old, immutable, infallible myths of Newtonian mechanics, special relativity, and Maxwell's equations while searching for failings in the theories that have supplanted these. I want no suggestion here that the myth of Newtonian mechanics is false or inadequate. It is an example of the most certain and permanent truth man has ever achieved. Its only failing is its scope: it does not cover everything. But within its now well-recognized domain it conceptualizes its portion of external physical reality better than its successors, and that is why it continues in use.[1]

The second illustration comes from Jeremy Bernstein, writing on how to spot a crank physicist.[2] One of the signs of a crank submission is that it does not uphold the sequence of previous mathematical apparatuses insofar as they were right (in just the sense we have seen in the Misner quote). Noting that 1905 relativity was manifestly not a crank theory, Bernstein says:

It connects solidly with the physics that preceded it. It explains, among other things, the connection between electricity and magnetism ...

(What Bernstein means is that relativity mathematically united the received mathematical pictures of electricity and magnetism. That is what mathematical physicists do.)

In this perspective, novelties in science arise only on the platform of previous tenets and mathematical models, and are only the affair of the most specialized professionals.

I hold that this scientific consensus is a prescription for an enslavement which can only become more and more deleterious. As opposed to new gimmicks in the high superstructure, I call for replacement: reaching all the way to the common sense which precedes and orients the mathematical sciences. To put it casually so as not to stray into another discussion, common sense is to be replaced with an outlook which regains our whole humanness.

But "the people" cannot launch this development. The people cannot invent a "people's outlook." So far from being autonomously humble, the people merely transmit the bourgeois knowledge of the previous century: the doctrines learned by the high-school student who never reaches college. The people cannot get beyond simplifying what they have learned as common sense.

More than that: science is repeatedly challenged demagogically; then laypeople want to believe things that simply don't work. (In fact, they come to treat scientific notions as symbols for political relationships.) All the while, technology is expanding at an exponential rate. "The people's" expectations are running opposite to what any democracy of obscurantism could deliver.

We can banter about what would be a desirable scenario of cultural transformation--but of course I cannot direct events. On the other hand, I am already in possession of the intellectual solutions which I envision. Negatively, that means that I see the intellectual disciplines as wholes, and attack them wholesale. Affirmatively, my proposals have to do, for example, with the invention of new mental abilities. Such benchmarks as

The Apprehension of Plurality
Logically Impossible Space
The Counting Stands

will serve as examples.[3]

The upshot is that I need a public which does not really exist, a public which is broadly educated and "scientifically mature" without having any loyalty to the professions. In the past, the endeavor would have been called natural philosophy. But the decline has gone too far for that. There is no longer any slogan such as "philosophic" which the popular mind understands to transcend careerist ploys, and also to separate integrity and discipline from mercenary expediency. One wonders whether there is anyone today whose intentions toward the future are other than unprincipled. (But so it was always.)


The present essay evolved in discussions with Christer Hennix about junctures in mathematics and logic in the middle of the twentieth century. We must suppose that even people who can imagine that received mathematics could be rejected wholesale would still assume that academic mathematics is true for what it is, or as far as it goes. In other words, once you consent to play the game, then the infinitude and uniqueness of the series of positive whole numbers, or the irrationality of √2, is beyond dispute. "These tenets are outcomes which were arrived at fairly, and no purpose would be served by trying to disprove these results to a professional mathematician. Once you agree to the game, then these answers eventuate in a straightforward and unique way."

Hennix and I claim that this impression of professional mathematics as responsible and magisterial is erroneous. The purpose of this study is to review content-decisions in mathematics and logic so as to show the educated laity that the professional process is that of a pseudo-science. In the first part of this essay, I will briefly review junctures in the history of mathematics which are commonplaces to those at the research frontier in mathematical logic, but which are probably unknown to the laity. These junctures can be interpreted as evincing casuistry--not to say deeply deluded insincerity--in the negotiation of the content of mathematics. Interestingly enough, while the famous battles between schools, from the early twentieth century, are involved here--because of the way actual research results were guided by them--the material is not at all the same as simply arguing one school's case against another, as Gabriel Stolzenberg and Michael Dummett do on behalf of Intuitionism.[4]

In the second part of the essay, I will turn to case studies from the mid-twentieth century. Hennix furnished the balance of the cases and suggested what they mean. The views expressed here are mine.

There is a problem with the methodology here. I am assimilating junctures-of-record in mathematical history in a new kind of interpretation. That means that I will rely on the work of historians of science and sociologists of knowledge.[5] Such reliance on authorities who do not match me for iconoclasm is untypical of me. It carries two risks. First, the entire academic world has been infected with "radical" self-publicity. (A notable example was Thomas Kuhn and the "scientific revolution" fad of the late Sixties. The problem was exacerbated by Feyerabend, Watzlawick, etc.) As a result, some of the in-house scholars who might be sources for this investigation make apocalyptic-sounding claims. I have to take care not to compromise what I am doing by carelessly siphoning support from narrowly careerist revolutionaries. Not only are they not allies of mine; they do not wish me to be allowed to speak.[6]

The other risk is that in some cases I will rely on the secondary authors' judgments of the primary literature, without having made a direct study of the primary sources. There is always the chance that what the camp followers have seen in the primary sources misses the point, or even infers the opposite of what is going on.

These limitations of the project cannot be avoided at the present time. Hopefully, if this investigation can continue, it will be possible to give original critical treatments of all of the twentieth-century junctures in question.

There is plenty of in-house evidence that a cynical view of exact science is justified. On the other hand, as long as the in-house scuttlebutt is construed at its manifest level, it can never be piercing enough to show that mathematics is a pseudo-science.[7] What is important is that one does not have to choose between the present quasi-historical review and the inventive studies which I and Hennix have provided. The two avenues support each other.


The announced principle which spread its wings over mathematics and logic in the twentieth century was the principle that the ultimate meaning which could be given to truth or creditability in mathematics was consistency. The Holy Grail of mathematical truth was the proof of the consistency of exact knowledge. Hilbert demanded that theories be brought before the Bar of Justice and subjected to a once-for-all test of their consistency. If found inconsistent, they should be cast into the outer darkness.

We cannot overestimate how pervasive this ideology was in the self-understanding of Foundations of Mathematics, and how great its role was in motivating the milestones of research. Russell on Frege, type theory, Hilbert, Gödel on Hilbert, relative consistency proofs, ω-inconsistency: the list of professional results motivated by the consistency ideology is interminable. Both the Russell paradox and the Gödel theorems have been called the most important discoveries in exact science in thousands of years.

Probably the most unusual figure to be accepted in the professional fraternity in the twentieth century has been Yessenin-Volpin.[8] Volpin's whole purpose, the locomotive to which his whole doctrine was hitched, was the drive to overrun Gödel and to fulfill Hilbert's dream of delivering the great all-embracing consistency proof.

There has been a fringe in mathematical logic, called the paraconsistent school, which has proposed that consistency is a fetish; that there should be some way to preserve a theory with a little bit of inconsistency. From that point of view, for example, one can accept naive set theory in spite of its little bit of inconsistency (Russell sets), and in that way one can gain the advantage of naturalness and simplicity which naive set theory possesses.
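The "little bit of inconsistency" can be made concrete. Naive comprehension lets any predicate determine a set; Russell's predicate "is not a member of itself" then yields a set R with R ∈ R exactly when R ∉ R. One way to sketch the dilemma is to model a set by its membership predicate -- a modelling of my own, purely illustrative; in an executable language the contradiction surfaces as a definition that never bottoms out:

```python
# Model a "set" as its membership predicate: s(x) == True means x is in s.
def russell(s):
    """Russell's predicate: s is not a member of itself."""
    return not s(s)

# Naive comprehension would make this predicate itself a set R.
# Then "R in R" must equal russell(R), i.e. not "R in R".
R = russell
try:
    R(R)  # asks: not R(R), which asks: not not R(R), and so on forever
except RecursionError:
    print("R in R never resolves: the definition chases its own tail")
```

No truth value can be assigned: the evaluation regresses without end, which is the computational face of the paradox.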

Whether we may lately allow a little bit of inconsistency, and rehabilitate naive set theory, is beside the point. Exact science in the twentieth century is a gigantic tribute to the mystique of consistency. If consistency never mattered, then both the self-understanding of mathematics and the whole research program were a complete lie.

The paraconsistent school misrepresents how the development of the discipline was guided. The ideology of consistency was the whip which kept the troops marching toward the front. Lower the whip, and the troops would have deserted in every direction.

Allowing a little bit of inconsistency will not solve the problem of how meaning arises. You cannot prove that you have an inconsistency by allowing one.

Mathematics cannot even be recognizable without the posture of probity, of rigor and rational superiority. No matter how much the content of mathematics exploits paradox, mathematicians express dedication to policing their doctrine against inconsistency.[9] Mathematicians do not welcome those who attempt inconsistency proofs of favored theories. The casuistry which I detect in mathematics is encouraged because notions to which the profession is committed need propping up.

Beyond consistency, mathematics depends on a posture of supramundane rationality. An often-quoted passage by F.P. Ramsey on the set-theoretic paradoxes proclaims that mathematics has nothing to do with language, thought, or symbolism. What we can learn from this statement is that Ramsey held mathematics to be superior to these human foibles. Brouwer proclaimed that mathematics is absolutely superior to language and logic. The mathematics profession is not especially permissive. Occasionally it rebukes a charlatan severely, ending a career; cf. Kreisel on Zinoviev and Fermat's Last Theorem.

Correlatively, mathematics is a fanaticism of mechanistic objectivity and objectification. Genuinely "subjective" agents are not acknowledged in hard science--not because they aren't palpable, but because there is an agreement, unstated or stated, not to mention them. Ramsey's position, as just noted, is anti-verbal, anti-mental. Notwithstanding quasi-paradoxes and pathological cases, mathematics remains mechanomorphic. The mathematical object remains an abstract, permanent, qualityless thing. Number remains fixed, abstract, qualityless plurality. Such is the ideological-cultural import of the reality-type of the mathematical object. This import of mathematics is further buttressed by the claim that finite mechanomorphic systems can be ascertained to be consistent by inspection.


The preceding observations notwithstanding, the notion of dragging theories before the Bar of Justice and subjecting them to a once-for-all test of consistency was a sham. Only in a small number of cases were prominent theories embarrassed by inconsistency proofs; and in every case, the theory was not discarded, but rather patched over--typically in a way which betrayed the traditional intentions associated with the mathematical structure in question.

After the early episode of Frege's system and the Russell paradox, the only accepted inconsistency proofs, to speak of, were Kleene and Rosser re Church and Curry, and Rosser re Quine. The impact of these proofs on the reputation of mathematical knowledge was negligible. Quick repairs obviated the embarrassments.

To understand the course of mathematics, we need to adopt a different angle of view. What was profound was that results were contextualized so that they ceased to be inconsistency threats. The Löwenheim-Skolem paradox; Skolem's ω-inconsistent model of Peano arithmetic (also the conjunction of Gödel's completeness and incompleteness theorems); ω-inconsistency of Quine's "New Foundations" set theory; independence of the Axiom of Choice; etc. But mathematics had always proceeded like this: e.g. Dedekind had taken Galileo's paradox as the definition of infinity.

Here, in fact, is one reason why the paraconsistent school is lacking. It completely misses what actually happened to contradictions when they cropped up: they were reclassified from contradictions to Axioms.

Let us reflect on this. It is known that any contradiction can be neutralized by making a distinction which previously nobody had encountered or wanted. What is not understood is that any outcome in mathematics which violates or masks original intentions regarding a structure can be construed as a contradiction if a way is found to express those intentions as an axiom. (E.g. uniqueness of the nonnegative whole numbers.) But: instead of availing themselves of the opportunities to convict themselves of inconsistency, mathematicians abandoned aspects of the original intentions.

A promised vindication which fails can be considered as a sign of charlatanism, of course.


We need a candid definition of Foundations of Mathematics. Foundations of Mathematics is the claim to validate mathematics by adding another layer of mechanical computations to mathematics; it is the claim to substantiate received structures by expressing them with new computational machinery. Mathematical truths are substantiated with schemes of calculation: repeated, rule-governed logical operations.

As an example, I might mention the proof of 2 x 2 = 4 in Wittgenstein's Tractatus, §6.241. Or, Kleene's proof that "a = a." The latter consists of seventeen steps, beginning

a = b ⊃ (a = c ⊃ b = c)

0 = 0 ⊃ (0 = 0 ⊃ 0 = 0)


Shoenfield's celebrated textbook serves to present the method explicitly, even if it is not the best possible object-lesson.[11] Actually, in spite of what I have just said, the slogan "mechanical" is an ideology. Foundations of Mathematics is by no means free of controversy about being mechanical. (Let us also note that Shoenfield uses set theory to define mathematical logic long before he expounds set theory. Foundations of Mathematics is not only a vicious circle; the vicious circle lies wholly within its sectarian syllabus.)

Returning to the actual development, the outcome was typically that the computational machinery failed the original intentions. The intentions could have been expressed as axioms. In that case, the machinery would have evinced contradiction.

But the opposite course was taken. The profession committed to the machinery. The machinery missed inconsistency only by a technicality. That was considered sound enough. Aspects of the original intentions were abandoned. As a result, modern mathematics has become a world of "pathologies."

The premier case is so obvious that a professional can't see it. It is professionally obligatory to call the nonnegative whole numbers the natural numbers--implying that anyone who doesn't venerate these numbers is an idiot or crazy.[12] In distinction from the positive integers, the natural numbers begin with zero--a point which is not at all trivial. The effect of Foundations of Mathematics has been to destroy the status of these numbers. And yet the name which sanctifies them is still obligatory.

A second structure is naive set theory (with the Comprehension Axiom), or Cantorian set theory. Even though naive set theory has come to be thought of as natural in the twentieth century, it was unseasoned and artificial and went through a brief period of bitter opposition. (Cf. the attack by Frege and Husserl on the empty set, reservations about completed infinity, the Axiom of Choice, etc.)

The consistency issue became a debacle of insincerity. The new territory was entered not because the discovery of the territory was an achievement of such brilliance, but because mathematicians preferred to embrace reductionism and to displace the traditional intentions to the level of unmentionable competences.


The modern attempts to provide an absolute justification of elementary exact knowledge have created an increasingly arcane involution, a casuistry, a shell game. The layperson is allowed to believe that 2 x 2 = 4 is "still" unexceptionable, while the inner circle makes this bit of knowledge more and more obscure. And yet the inner circle never questions whether we should have believed 2 x 2 = 4 in the first place.

If there is anyone whose intentions toward the future are not opportunistic, that person needs to traverse the involution process, or shell game, in a syllabus which is not apologetic. The main target of my attack at this stage has to be the impression cultivated by the profession, the loyalists, that mathematics is unassailable for what it is. The layperson has to learn that 2 x 2 = 4 is not a self-contained certainty, but is a proximate node in an impenetrably obscure structure.

Then the question can be asked: should "knowledge of space and quantity" have been launched in a different way at the beginning of civilization--or of culture? When a profession develops over millennia as an apologism, then the initiates, the adepts, cannot understand that a responsible layperson might want to restart the game. (As opposed to protecting the path already taken, the realized series of casuistries.)

New cultures come from broad elementary strokes which are radically disloyal to apologetic cliques.

So it is that Part I of this essay has the character of a history aimed at a laity which would seemingly not know enough to be an audience for active issues in Foundations of Mathematics. Part II, which covers the contemporary case studies, will argue that evolution of the professional consensus amounts to successively patching a hoax. [I attach an appendix about how a paranoid shores up his delusional system ad infinitum. It amounts to a sketch of the history of mathematics.]


We may ask again: Is academic mathematics "sound" "as far as it goes" or "for what it is"? If it were, no endogenous critique of the published theorems would be possible. One could escape the results of academic mathematics only by bringing in considerations outside those which comprise the declared subject-matter.[13]

The alternative would be that the content of academic mathematics eventuates from sophistry and majority preference. As to the latter, Paul Lorenzen says,

You will become famous if you please famous people -- and all famous mathematicians like axiomatic set theory.[14]

I will refer to such considerations as professional discipline (or even coercion). If this is the situation, then the profession will protect itself from heresies and criticism not by superior reasoning, but by additional professional discipline in conjunction with additional casuistry. (Casuistry must be secondary; because casuistry, a dialectic involving formulas, cannot win in a vacuum.) Then an endogenous critique is possible: we can scrutinize the process to identify the majority pressure and the casuistry.

* *

Part I

In "Anti-Mathematics" (1980), I argued that if mathematical truth is delusive in some deep way (so that mathematics amounts to a culture-wide superstition), this cannot be exposed by procedures internal to mathematics. Specifically, inconsistency proofs internal to professional mathematics -- which the profession pretends are the main form of critique in mathematics (e.g. the Russell paradox) -- cannot furnish any decisive exposé of mathematics' delusive character. One reason is that what a contradiction is, in mathematics, is continually being redefined by the profession via negotiation and interpretation. Another factor is that mathematical thought changes to neutralize inconsistencies. When new results are obtained which are problematic because they conflict with traditional intentions concerning mathematical structures, then aspects of the traditional intentions are suppressed. Hennix calls this the reconceiving of the history of mathematics. I propose: the reductionist re-founding of all previous results. So it is that inconsistency results are often neutralized or co-opted; becoming part of the affirmative content of mathematics.

At the beginning of rational mathematics stands the result concerning the incommensurability of the side and diagonal of a square. This severe embarrassment, which can be conceived as an elementary refutation of mathematics, was instead co-opted as a program for the development of mathematics. (With legitimacy of proof by contradiction as a crucial consideration allowing this.) And yet the status of irrational numbers and the continuum of reals has been questionable to this very day. Cf. Brouwer, Arend Heyting, R.L. Goodstein, Hao Wang, Jan Mycielski, Stan Wagon, Wim Veldman, James Geiser, etc.[15] [the Souslin line.]
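The embarrassment in question is the classical reductio, which can be set down in a few lines (in modern notation, not the Greeks' own):

```latex
% Incommensurability of side and diagonal: \sqrt{2} is irrational.
% Suppose \sqrt{2} = p/q with p, q positive integers and \gcd(p,q) = 1.
\sqrt{2} = \frac{p}{q} \;\Longrightarrow\; p^2 = 2q^2
% Then p^2 is even, hence p is even: write p = 2r.
p = 2r \;\Longrightarrow\; 4r^2 = 2q^2 \;\Longrightarrow\; q^2 = 2r^2
% So q is even as well, contradicting \gcd(p,q) = 1.
```

The whole force of the result depends on accepting proof by contradiction, which is just the consideration noted above.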

To repeat, Galileo's paradox became Dedekind's definition of an infinite set.
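Spelled out: Dedekind declared a set infinite exactly when it can be matched one-to-one with a proper part of itself, so Galileo's pairing of the whole numbers with their squares becomes the defining property rather than a scandal. A finite-segment sketch (any finite segment can only suggest the bijection, of course):

```python
# Dedekind: a set is infinite iff it admits a bijection with a proper subset.
# Galileo's pairing n <-> n^2 is such a bijection for the positive integers.
def square(n):
    return n * n

segment = range(1, 11)  # 1..10 stands in for the whole numbers
squares = [square(n) for n in segment]

# The pairing is one-to-one on every initial segment...
assert len(set(squares)) == len(list(segment))
# ...yet the squares form a proper subset of the integers they are drawn from:
assert set(squares) < set(range(1, square(10) + 1))
print(list(zip(segment, squares)))
```

Galileo took the pairing as showing that "equal," "greater," and "less" fail for infinite multitudes; Dedekind took the very same pairing as the definition of the infinite.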

For the inconsistency of the calculus in its early presentations, see I. Grattan-Guinness.

Or consider the rise of the topic of divergent series. (Morris Kline) Etc.

The view that mathematics obtains its content by coopting inconsistency was already hinted at in the historical study authored by Morris Kline, and the sociology of mathematics of David Bloor.

The purpose of the present study is ultimately to update this perspective to the research frontier of the twentieth century. The balance of the examples treated here are from metamathematics. By no means is the phenomenon unique to metamathematics, however. The same sort of casuistry can be discerned in the history of treatment of irrational numbers, the history of treatment of divergent series, the history of treatment of infinitesimals, etc. Since I am not a professional, the boundary between mathematics and metamathematics does not matter to me. (Consisting as it does of computations, metamathematics can be replicated within mathematics, as was noted by Tarski, and Rasiowa and Sikorski.)

Already a precedent for this approach has been provided by historical studies of the Axiom of Choice (with the Banach-Tarski paradox) by Gregory Moore and Stan Wagon. From their excavations, we learn that Émile Borel published a book, Les Paradoxes de l'Infini, which on p. 210 said that the Banach-Tarski paradox amounts to an inconsistency proof of the Axiom of Choice.


Let it be clear, however, that I am distant from Kline, Bloor, Moore, Wagon, and others. I do not take a compliant view of mathematics. I consider it as a historically given doctrinal institution; placing in suspension any claim that there is an ideal perfect mathematics. Given Hennix's combined historical and rational scrutiny of the development of mathematical doctrine, I will propose that the main factor in the establishment of "truth" in mathematics is professional procedure and discipline. In this connection, we should note that in the early twentieth century, there was a sharp counterposition of hostile schools of mathematics.[16] Beyond that, this is something other than a technical paper. I wish to offer a message which goes beyond the content of any theorem, and speaks to non-mathematicians about the civilization in general.

As should be familiar by now, I have an independent iconoclastic perspective.[17] I surmise that mathematical knowledge amounts to the crystallization of officially endorsed delusions in an intellectual quicksand--in consequence of insights specific to me. Namely, the "Is there language?" trap; my work on the hypnotic-subjective formation of "mechanical" logical norms; the realization of non-vacuous contradictions in perception; etc. It's not that mathematics is "worse" than verbal thought or empirical science. My perspective arises from universal negative results on cognition (or insights regarding cognition); and its thrust is culturally disintegrative. (This will be evident from what I will say below about seeking inconsistency proofs for "transparent," finite structures.) So this discussion is a somewhat hypocritical exercise. To convey my insights in a manner that is not formally self-defeating, and to manage their culturally disintegrative consequences, requires passage to a sophisticated and unfamiliar manner of presentation. (Swimming in an ocean of chaos.) But it's not required to do all that here.


There already are many results which are inconsistency proofs in effect. Only professional discipline forestalls the obvious interpretation of these proofs as inconsistency proofs. Quasi-inconsistency proofs are neutralized or co-opted by negotiation and the addition of layers of interpretation.

The most celebrated results of the twentieth century (probably earlier centuries as well) came from skirting paradox while claiming not to land in it. A paradoxical positing is made deliberately, and then is deflected so that, as interpreted, it is not a contradiction.

Truth is negotiated on the basis of manipulation of import by distorting interpretations. Interpretation takes the form of discarding traditional intentions concerning mathematical structure: the privileged position of Euclidean geometry; the invariance of dimension; the association between integer and magnitude; uniqueness of the natural number series; etc.

From time to time, results are discovered which patently embarrass the conventional wisdom, or controvert popular tenets. [The Gödel theorems.] Then follows a political manipulation, to distort the unwanted result by interpretation so that it is seen to "enhance" the popular tenet rather than to controvert it.

Every time a major new theory is propounded in mathematics, it causes all previous mathematics to be put on a different foundation. Such re-founding of prior theories has systematically been reductionist. The question of parallel lines is divorced from a bona fide plane. Abandonment of invariance of dimension. Logic becomes a projective geometry, orthomodular lattice, Boolean algebra, or topos. Hilbert was said to reduce mathematics to a game of chess. Natural numbers get divorced from magnitude. Thus, new theories in mathematics continually re-found old theories in reductionist ways.

Contestable theories and proofs in mathematics are accredited because the dominant faction favors them. Proofs have individual steps whose creditability is the topic of an unresolvable disagreement between factions. There are posited notions whose cogency has become the topic of an unresolvable disagreement between factions. In proofs of favored results, steps which are contestable are stipulated or accepted, by the most authoritative members of the profession.

I don't want to launch into a lengthy substantiation of these observations, because that would be to treat them as much more surprising than they really are. The disputes between Platonists and Intuitionists furnish some modern examples. Let me mention in particular Brouwer's inconsistency proof of classical analysis;[18] and van Dantzig's commentary on it, which I will return to later in this essay. As for personal surmises, I will offer only one--that the Diagonalization Lemma is a weak link in metamathematics. (Bona fide self-reference would be much more problematic per se than the profession realized.)

There is another facet of this state of affairs which is not so much a matter of principle, but which should be mentioned to disillusion the layperson or neophyte. If a theorem is favored by the profession, then mathematical custom permits its announced proofs to contain palpable faults without losing credit.[19] In particular:

- It is legitimate for textbook proofs to have known errors of principle (on the grounds that the correct proofs are too tedious for students).
- A favored theorem may be upheld and repeatedly printed even though it has a history of errors being found in its proofs. (The Jordan curve lemma.)


In my view (not Hennix's), for an inconsistency proof of a favorite system such as ZF, ZFC, GB, NF, or arithmetic to gain general acceptance would be a notable development in mathematics. (It is evidence against me that certain decisive inconsistency proofs in the past, such as Rosser's re Quine's ML, were absorbed with little fuss.)

Even if my sense of the situation is right, the appearance of such a professionally compelling proof would be more a matter of packaging and selling than anything else. The evidence that there are danger-points in these structures which might be exploited in inconsistency proofs is already here. [Again, the Diagonalization Lemma.] The biggest hurdle such an attempted proof faces is professional discipline. Whether inconsistency proofs are recognized to have occurred is subject to entirely "political" manipulation.

These circumstances have the effect of rendering the boundary between

- proofs,
- specious proofs, and
- disproofs

meaningless. That is another reason for using the term pseudo-science for mathematics.


And yet, the posture of probity remains crucial to mathematics. Again, we pay heed to the mystique of the objective absoluteness of mathematical knowledge.

Seemingly it would not help my claim that mathematics is a superstition to introduce a tone of reproach into the appraisal. But this inquiry is not banter about expediency. The mode of life is at stake. The present civilization can be pictured as a pursuit of mechanistic objectivity and objectification. Genuinely "subjective" agents are not acknowledged in hard science--not because they aren't palpable, but because there is an agreement, unstated or stated, not to mention them. I look beyond the present civilization to a civilization in which our mechanomorphic mathematics will be discarded. Given this perspective, it should change our lives to learn that received mathematics profoundly misrepresents its functioning, and is anything but an objective absolute.

To expound the "new reality" is not the task of this essay. But the study is needed to prepare the way, by enabling professional mathematics to be perceived differently. The great issues, the contradictions, have been the same for thousands of years. Mathematics generates its content by grinding those contradictions into pseudo-consistent details shaped by fad and by long-range ideological trends -- such as the twentieth-century drive for computability.


Since I want some laypeople among my readers, I must devote some space to notions they may be expected to have about the intellectual status of mathematics. A lay person may wonder how I can possibly suggest that mathematics is an intellectual quicksand, a pseudo-science. To the layperson, a representative result in mathematics is 2 x 2 = 4. Do I suggest that 2 x 2 = 4 is false? Is 2 x 2 = 4 going to change?

It is instructive to give a full answer to these questions, even though doing so delays our consideration of the arcane case studies. In the first place, most mathematicians would say that the interesting or professional part of mathematics is not about 2 x 2 = 4, because 2 x 2 = 4 is a finite and small problem, and so the answer (and the consistency of mathematics at this level) can be ascertained by inspection. As we read in Recursive Aspects of Descriptive Set Theory, by Richard Mansfield and Galen Weitkamp, p. v, "The primary concern of mathematics has been to use the infinite to elucidate this world. ... The primary concern of mathematical logic[20] has been to explore the nature of infinity in order to classify and explain its mathematical applications." So, to elaborate, most mathematicians trust that the consistency of finite structures is ascertainable by inspection. Only infinite structures are problematic: the history of explicit controversy in mathematics is largely a history of notions of the infinitely large and infinitely small. Thus, 2 x 2 = 4 would become problematic only if there were a change in the whole of arithmetic, which is an endless or infinitary structure, such that a dispute about the infinitude of arithmetic reacted back on 2 x 2 = 4.

The layperson needs to understand that, after all, 2 x 2 = 4 is a piece of a larger structure with general rules of multiplication, etc. -- indeed, an infinite structure. The whole structure is characterized by such problems as these.

- Are there an infinite number of prime whole numbers?
- Is the expression of any whole number as a product of prime factors unique?
- Is there a whole number which can be divided and redivided by 2 endlessly?
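
The traditional affirmative answer to the first of these questions rests on Euclid's argument, which can even be run mechanically. A Python sketch (the function name and packaging are mine, for orientation only):

```python
# Euclid's argument: given any finite list of primes, the product of
# the list plus 1 has a prime factor lying outside the list.

def new_prime(primes):
    """Given a finite list of primes, return a prime not in the list."""
    n = 1
    for p in primes:
        n *= p
    n += 1                   # n leaves remainder 1 on division by each p
    d = 2
    while n % d:             # the smallest divisor > 1 of n is prime
        d += 1
    return d                 # d divides n, so d is none of the given primes

print(new_prime([2, 3, 5]))            # 2*3*5 + 1 = 31, itself prime
print(new_prime([2, 3, 5, 7, 11, 13])) # 30031 = 59 * 509
```

The point, for what follows, is that this is exactly the kind of elementary proof whose authority the present study treats as negotiable rather than absolute.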

As we will see, a few figures in academia have not given traditional answers to these questions.

Moreover, let me say that the remark that "only infinite structures are interesting" is a professional overview of the research agenda. It does not mean that there are no conundrums regarding small integers. What about 1 [divided by] 0? What about 0 x 0 = 0 x 1? (Cancel 0 on both sides of the equality and get 0 = 1. Interpretation made this embarrassment disappear--but as Hao Wang notes, it was an embarrassment.) More of this below.
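
Wang's embarrassment can be restated concretely: multiplication by zero is not invertible, so the cancellation law holds only with the proviso that the cancelled factor be nonzero. A minimal Python illustration (the function is a hypothetical device of mine, not drawn from the sources cited):

```python
# 0 x 0 == 0 x 1 holds, yet "cancelling" the 0 on both sides would yield 0 == 1.
assert 0 * 0 == 0 * 1 and 0 != 1

def cancel_factor(a, b, c):
    """From a*b == a*c, conclude b == c. The proviso a != 0 is what the
    later 'interpretation' added to make the embarrassment disappear."""
    if a * b != a * c:
        raise ValueError("premise a*b == a*c fails")
    if a == 0:
        raise ZeroDivisionError("a zero factor cannot be cancelled")
    return b == c  # True for integers whenever the premise holds and a != 0

print(cancel_factor(3, 4, 4))  # 3*4 == 3*4; cancelling 3 gives 4 == 4: True
```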

A great deal remains to be said. Modern mathematical logic has its origins precisely as a reaction to the philosophic discussion of the truth of such propositions as 7+5 = 12 in Hume and Kant; and to the views on elementary mathematics of the philosopher Husserl, whose university degree was in mathematics.

The mathematician Frege demolished the more traditional attempts to explain and establish number and mathematical certainty. (To his satisfaction, at any rate). Frege went on to try to give natural numbers and arithmetic a sound rational basis. To him, this meant in part giving them a basis in a scheme of calculation. Frege faced many obstacles. Brouwer, and Wittgenstein in his later period, proclaimed that mathematics cannot be founded by logic, or by another layer of calculations. Frege's work was ignored in Germany; then, when his system was finally published, it was immediately wrecked by Russell's paradox. In spite of this, Frege's program was ultimately upheld by the professional majority (against Brouwer, for example). The mathematical logic elaborated by Frege, and his cohorts Boole, Cantor, Peano, Skolem, Herbrand, Hilbert, etc., became the prevailing view, or contextualization, of elementary arithmetic.

In no way is that a trivial remark or outcome. Calling themselves new hens, the foxes seized control of the henhouse. Arithmetic had been reductionistically re-founded in a way which nullified its traditional consistency and uniqueness.

By no means did the discovery that Frege's system was inconsistent kill it. Instead, it was repaired and adopted: even though the problem of repair was obdurate, and vitiated the tenet of uniqueness of the natural numbers which crystallized Frege's original goal. (The introduction of type theory and the Axiom of Reducibility.) Thus, the majority was prepared to resort to "scandalously artificial" devices to prop up a system which in its straightforward formulation was inconsistent. -- Because the majority loved the new shell game. It must also be said that a significant but weak minority opposed this course, e.g. Brouwer, Heyting, Weyl, Wittgenstein.

The lesson is that when an initially unpopular theory became popular, then prima facie inconsistency did not kill it. It was upheld by casuistry, as it were -- even though it continued to be opposed by a minority.

We enter an ironic -- and, one might say, semi-decadent -- situation. Although professional mathematics upholds 2 x 2 = 4, it does so on a basis which the layperson would find unrecognizable and puzzling. Modern mathematics divorces the natural numbers from all notions of magnitude, so that they become the same as any ordering of distinct symbols with an infinite tail. That means that the natural numbers are indistinguishable from the list

a, b, c, ..., z, a', b', ... , a", ... .

In this list, a must be a zero. Then, the structure is explained in terms of iterated groupings of the empty set [nested bracketed empty set]

ø, {ø}, {ø, {ø}}, ...

And yet, as I said, when the empty set was first proposed by Schröder in 1890, it was roundly denounced. Zermelo called it a convenient fiction. Now Aristotle's precept that units which combine into composite integers should be indistinguishable is violated. (Metaphysics, Book M.) A new development causes everything which has gone before to be re-founded.
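
For the lay reader, the iterated groupings just displayed can be made concrete. A minimal Python sketch (frozensets standing in for sets; the function name is mine):

```python
# The von Neumann integers: 0 = ø, and n+1 = n ∪ {n}.
# Each number is literally a nesting of empty sets, divorced from magnitude.

def von_neumann(n):
    """Return the von Neumann representative of the natural number n."""
    k = frozenset()          # 0 is the empty set
    for _ in range(n):
        k = k | {k}          # successor: n+1 = n ∪ {n}
    return k

print(von_neumann(0))        # frozenset()
print(von_neumann(2))        # frozenset({ø, {ø}}) -- i.e. {ø, {ø}}
print(len(von_neumann(4)))   # 4: the only surviving trace of "size"
```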

When arithmetic is made explicitly dependent on the Principle of Mathematical Induction, then the paradoxes of the heap and of baldness become of serious interest in mathematical logic. Meanwhile, Gentzen gave a proof of the consistency of elementary arithmetic (of mathematical induction) -- but to do so, he used transfinite induction.

Modern mathematics, beginning with Frege, had tried to assure the soundness of arithmetic by reducing it to an explicit game with minimal content. But now it was discovered that there were systems satisfying the new definition of arithmetic which did not satisfy the traditional large-scale properties of arithmetic. (Mathematicians were willing to abandon traditional properties of the natural numbers, at least pro forma. That illustrates my point about the abandonment of traditional intentions regarding mathematical structures.) So in modern mathematics, the very attempt to absolutely substantiate arithmetic led to advanced results, regarded as the affair of the expert only, which contradict the traditional uniqueness and consistency of arithmetic.

I do not wish to write a history of Foundations of Mathematics for laypeople with no technical background. I leave it to the lay reader to confirm my sketch of the situation through supplementary reading. Nevertheless, the circumstance that the professional view of elementary mathematics has become so remote from the lay notion of mathematics is a cultural development which amounts to a new level of bad faith in the mode of life.

Like the existence of God or the divine right of kings, dialogue on the topic may seem appropriately confined to the court's savants; but the issue is implicated in the entire mode of life--and protecting the lie from reasonable criticism, in order to preserve an outmoded regime's power, is not the only possible stand on the matter. The dialogue needs to be opened to philosophic laypeople, subject to my explanations at the beginning.

Admittedly, a professional who reads this may wonder why my (to them) popularization of the truisms of their discourse should have the tone of an exposé. Why should there be the slightest connotation of scandal? The professional is jaded because of what the process of recruitment and indoctrination involved. To become a mathematician, one must absorb all the weird developments, and the technical motivation for them, and the presumption that the weird developments bring us closer to the right mathematics. So, even though, for example, the Hausdorff-Banach-Tarski paradox has been called the most paradoxical result of the twentieth century, classical mathematicians have to convince themselves that it is natural, because it is a consequence of the Axiom of Choice, which classical mathematicians are determined to uphold, because the Axiom of Choice is required for important theorems which classical mathematicians regard as intuitively natural. (The rationalization is already hopelessly contorted here.)

When one indoctrinates oneself to be comfortable with a theory, then one finds it disturbing to see the rise of the theory portrayed as embarrassing and perverse. Yet if Kant, say, could be resuscitated, the contemporary principles of elementary mathematics (the von Neumann integers, Wittgenstein's proof of 2 x 2 = 4, Kleene's proof of "a = a") would seem grotesque to him.

There is still more to be said -- about the lay notion of mathematics versus the contemporary professional view -- relative to elementary propositions such as 2 x 2 = 4.

In the first place, what answer would a layperson receive who asked a professional whether modern mathematics has overturned 2 x 2 = 4? I suggest that the professional consensus can be summarized as follows. Nobody wants children to be taught anything other than traditional arithmetic. Further, nobody says that the Greeks--who first adopted the elementary theorems, on the basis of elementary proofs--should not have taken the course they did.

"The Greeks were right to believe their proofs that (a) [root]2 is irrational, (b) there are an infinite number of prime numbers, etc." The profession does not want Pythagoras or Euclid to have treated [root]2 or prime numbers differently. Michael Beeson, in Foundations of Constructive Mathematics, lets us know how the two great initial theorems are still regarded. These proofs, which are the source and guiding light for all that follows, are still valid by the latest, "Left-wing" standards. No modern professional says the Greeks shouldn't have done what they did.

The profession does not want the ancient mathematicians to have drawn opposite conclusions from the same input. Anyone who today proposes to establish the opposite of the traditional theorems -- via proofs at the Greeks' level of sophistication and rigor (disregarding today's level) -- would be violently resented by professional mathematicians ... even though there is nothing wrong with such a goal. [Its possibility is hinted at by Bloor?]

[Possible alternates:

Interpreting the proof of irrationality of [root]2 and [pi] as proof that perfect geometric figures are inconsistent.

The Euclidian assignment of numbers to spatial magnitudes is erroneous. The Pythagorean theorem has a circular relationship with the discovery of incommensurate magnitudes. So the Pythagorean theorem is false.

Posit that very large integers are both odd and even.]


On the other hand, in the most recherché contemporary work, it becomes a boast that nothing is sacred, that truisms are toppled. We have, for example,

[root]2 is not irrational: (R.L. Goodstein), James Geiser

This is an abstruse result which the student will see, if at all, only after two decades of learning to believe the opposite. And the result dissolves in a welter of detail strapped by professional discipline. For completeness, I should mention that there have been objections to other truisms, which remained at the level of one-liners. Mycielski and Wagon float the idea that the infinite tail of the natural number series relative to the set of all natural numbers as a totality is inconsistent. Wittgenstein denied that there must be an infinity of primes.[21] (All reservations about the primes have been finitist, it seems; the notion is that very large primes would be invisible, so to speak. Isn't that a denial of the Archimedean property?)


We have not exhausted 2 x 2 = 4. What of the validity of 2 x 2 = 4 in the "phenomenological" realm? That is, what of 2 x 2 = 4 in the realm of individuated consciousness and memory and comprehension? What of enumeration as a temporal activity which presupposes perceptual identification of entities in the world as stable and persisting (acknowledging that such discernment in individual consciousness is subjective)?

Certain preconditions must be met for the issue of the abstract certainty of 2 x 2 = 4 even to seem sensible. There must already be the notion of whole numbers as abstract beings, not tied to enumeration of any material species. There must be the logical notion of abstract, transempirical, permanent truth. Also, 2 and 4 must belong to a series allowing continuation to indefinitely large integers. These were not, for the Greeks, truisms as they are for us. They were active issues. (Parmenides; Aristotle's "Metaphysics," book M; Archimedes' "The Sand-Reckoner.")

It is in this area that some of the questions surrounding very elementary steps would belong. Can a child acceptably learn arithmetic from the version of arithmetic propounded in Foundations of Mathematics? Is not competence in naive arithmetic and logic in fact a prerequisite for all sophisticated mathematics? Are not zero and the empty set paradoxical by inspection?

The short answer is that Foundations of Mathematics excludes "phenomenological" questions, precisely in consequence of its definition as I stated earlier. Actually, mathematicians have had more than a little to say about phenomenology, but it has been highly personal and never consolidated. The reader might begin with Dedekind's phenomenological proof of the existence of an infinite set ("the set of all my thoughts is infinite"), which he adapted from Bolzano.

Then, Brouwer's entire intellectual evolution involved a notable performance in this area. On the one hand, Brouwer raised issues which were explicitly spelled out in the professional literature only once, by van Dantzig. The passage in [section]1.4 of his paper is too long to quote, but it considers what it means for mathematical truth if there is only one mathematical knower, who dies. If, on the other hand, the mathematical knower consists of a community, that leaves

open the possibility that there is no unanimity between the mathematicians, as to whether a theorem has or has not been proved, in which case also the definition of the next [term] fails. [italics added]

In general, there is a "semi-empirical hypothesis" that

there will "always" be a human being, willing and able to test the [given] mathematical assertion ...

Van Dantzig goes on to say, for example, that since I do not live through my death, then what is my infinity (inaccessible futurity) is a finitude for anyone who survives me. Van Dantzig notes that Brouwer never acknowledged the relativity of finiteness which follows when phenomenological considerations are allowed. To unfold van Dantzig's remarks fully would require a volume, perhaps. In any case, he curtails these speculations immediately.

Brouwer's putative disciple Heyting struck back vigorously at the master.

... it seems to me to contradict reality, to suppose that intuitionistic mathematics in its pure form consists only of constructions in the thoughts of individual mathematicians, constructions which exist independently of each other and between which language provides no more than a very loose connection. For that purpose, different mathematicians influence each other too much and understand each other too well.[22]

Aside from venturing onto a precipice which his own disciples didn't want, Brouwer made fantastic phenomenological claims in connection with bar-induction. And he glibly maintained that phenomenological considerations uniquely validate the classical natural numbers. As against that, Hao Wang noted that arithmetic on "pieces of cloud" might lead to a quite nontraditional elementary arithmetic. (If Wang had been serious, he could have done much better than that: cf. my "Apprehension of Plurality" and "Counting Stands.") Goodstein made a related remark in the last chapter of Constructive Formalism. And as I mentioned, E.T. Bell decried the notion that every culture must see the classical natural number series. (If one wishes to really pursue this, one ends up in ethnomathematics.)

Then, the beginning of Hilbert-Bernays, Grundlagen der Mathematik, has Hilbert's very limited phenomenology in reaction to Brouwer. The discussion is continued by Kneebone. Hilbert's opponents immediately objected that difficulties arose when stroke-numerals become too long to realize.

In a later decade, Hans Reichenbach proposed to consider the logic of alphabetic tokens, as distinct from types.

Emil Post's section in The Undecidable, ed. Martin Davis, evinces that Post was involved in speculation on phenomenology. But his section also evinces why this line of thought was perceived by the majority as useless. Like Husserl and others of that generation, Post was so well indoctrinated that he could only imagine phenomenology to confirm the customary picture of abstract reality (a picture which might justly be called Platonism[23]). Then, Post wanted to round out this compliant picture with metaphysical speculation:

a vision of the "Birth of Consciousness." ... these things are in the subconscious regions. ... the Psychic Ether[:] In so far as this contains the things which give meaning to symbols it is just the unconscious. ... Clear symbolizations are then to be regarded like atoms in this ether. In fact one may think of them as Kelvin thought of his vortex atoms in connection with the luminiferous ether. ... See clearly the symbols we imagine as floating in the psychic ether etc. Raises the question may not matter similarly be the visions of God? ... by saying above we can think of physical things as though just visions in our psychic ether we have a Psychic Principle of Equivalence ... .

Post's colleagues would have argued, legitimately, that these speculations did not interact with professionally accepted derivations, and that they were as private as a religion.

We see from these examples why the phenomenological avenue was judged futile by the profession. Mathematicians already know what they want to believe. Phenomenological reflections only confirmed it in a dubious way; or else opened up a kind of relativism that threatened to dissolve the entire science. Post, in particular, manufactured a banal, vaguely neo-Platonist mythology to subtend the orthodoxy. Actually, the circumstance that the orthodoxy needs this mythology is profoundly incriminating. However, the profession avoided that lesson by curtailing that avenue of speculation.

So it comes to be that the realm in which the culturally deepest detachment from 2 x 2 = 4 could emerge is forbidden in Foundations of Mathematics. Mathematics stringently forbids, for example, notations which signify differently to different people, or at different times. Such observations bring us to the threshold of my inventive work. However, the task of the present study is a new interpretation of the professionally acknowledged content of mathematics. Thus, the vulnerabilities which show up outside the professional sphere are to be pursued elsewhere.


We saw that to most professionals, the consistency of finite structures is trivial because it is ascertainable by inspection. But let me mention that I entertain the possibility of obtaining inconsistencies in finite systems conventionally regarded as consistent by inspection.

How could an inconsistency be found in a finite structure whose consistency is purported to be ascertainable by inspection? There is already the historical precedent of rules being changed in the middle of a world championship chess match to favor the reigning champion. (So the definition of a win in an actual game is not mechanical.[24]) Then there are zero and the empty set; and Hilbert's stroke-numerals. We have metamathematics to thank for propounding these objects. Can their vulnerabilities be exploited in a mathematically relevant way?

Let us note some considerations which border mathematics but are excluded in professional discussion.

- phenomenological stability of entities for purposes of enumeration
- meaning
- paradoxes of presuming the metalanguage of the finite object-system to be meaningful.


i) Focus on the phenomenological stability of the entities to be enumerated or to serve as notations. Reopen the issues of stable multiplicity, and of space, at the phenomenal level.

ii) Mathematics and the existence of language at all.

iii) Treating the historically oldest mathematical topics in a new way at the beginner's level. I gave suggestions on page 14.

Ideally, I would try to cap off the inconsistency proof of a "transparent" finite structure by emulating Gödel and Tarski: invoke considerations previously excluded as extramathematical (by e.g. Ramsey) and find a way to project them down into mathematics proper.


Let me consider a last claim that mathematics obviously cannot be a pseudo-science. A commentator like Stan Wagon, who considers that real analysis might be false and yet useful, might say that mathematics cannot be a pseudo-science because it is rigorously controlled by technological utility. I have two preliminary replies to this. First, if the claim of rational creditability of mathematics is abandoned overnight--under contemporary scrutiny--and mathematics is justified on purely pragmatic, mercenary grounds, that supports my thesis. (Ancient Egyptian mathematics was more cogent than the three-thousand year Greek detour?) Secondly, those of us who look beyond this civilization to the possibility of a civilization which has passed beyond science (as it has been known) can no longer allow the argument from technological efficacy to shut off discussion (and reflection). How could Wagon and his cohorts suppose that the mercenary apology--the precept that "garbage theories provide superior technology"--is a transparently natural conclusion? The mercenary apology needs to be unraveled painstakingly and imaginatively. While that is far outside the scope of this study, I want to emphasize that to address the question is an integral step in my program.

* * *

Part II

The following are supposed to be leading cases in historically visible sophistry and the negotiation of content in twentieth-century mathematics. Referring back to my terminology of 1980, I would have called them co-opted modern failure theorems. The phrase "co-opted" would indicate that these results have been professionally assimilated. The interpretation of these cases remains to be written.

(The book I planned in 1980, "Anti-Mathematics," was to have a chapter entitled "Failure Theorems at the Research Frontier." That chapter would have concerned mainly problems of finiteness, engaging in speculation or argumentation not absorbed in the consensus. Hennix would again have been the main source of the syllabus.)


Cantor abolishes invariance of dimension


postulation of empty set and empty word

Schröder, Markov

1915, 1919

Löwenheim-Skolem paradox


Diagonalization Lemma


Are proof-steps mechanical?


Skolem: nonstandard integers


Church: recursive sets which are not intuitively decidable



Gödel's 1930-31 results

Quine, Selected Logic Papers, p. 119 for NF


finite nonstandard models of the integers

Kleene, Introduction to Metamathematics, pp. 427-8


relativity of the consistency formula



independence of the Axiom of Choice

Cohen (cf. Gödel)

[there is an asymptotic approach to defining truth within a formal system]


Church 1936 endorsed

"there are recursive sets which are not intuitively decidable, issuing from the nonconstructivity of current definitions of recursive sets" -- Jirí Horejs


in a system of arithmetic which is infeasible, exponentiation is inconsistent

Rohit Parikh, "Existence and Feasibility in Arithmetic"


[root]2 is not irrational

(R.L. Goodstein, Jan Mycielski), James Geiser


Heyting arithmetic and Gödel: only the noneliminability of double negation prevents "the provability of absurdity" from being provable


* * *


Wilhelm Ackermann, "Die Widerspruchsfreiheit der allgemeinen Mengenlehre," Mathematische Annalen, Vol. 114 (1937), pp. 305-315

Marcia and Robert Ascher, "Ethnomathematics," History of Science, June 1986, pp. 25-44

F.G. Asenjo, "Review of Paraconsistent Logic," Journal of Symbolic Logic, 1991, p. 1503

E.W. Beth, The Foundations of Mathematics (1959)

E.W. Beth, Mathematical Thought (1965)

Paul Bernays, Sur le Platonisme

Bernard Bolzano, Paradoxes of the Infinite (tr. 1950), (section)13

David Bloor, Knowledge and Social Imagery (1976)

Eric T. Bell, The Development of Mathematics (1945)

Michael Beeson, Foundations of Constructive Mathematics (1985)

Émile Borel, Les Paradoxes de l'Infini (3rd ed., Paris, 1946)

L.E.J. Brouwer, Collected Works, Vol. 1

Native American Mathematics, ed. Michael Closs (1986)

Alonzo Church, "An Unsolvable Problem of Elementary Number Theory," American Journal of Mathematics, Vol. 58, 1936, p. 351, fn. 10 -- also in The Undecidable, ed. Martin Davis (1965), pp. 88-107

Paul Cohen, "The Independence of the Continuum Hypothesis," I, II, Proc. Nat. Acad. Sci. U.S.A. 50 (1963), pp. 1143-1148; 51 (1964), pp. 105-110

H.B. Curry, "The Paradox of Kleene And Rosser," Transactions, American Mathematical Society, Vol. 50, 1941, pp. 454-516

Charles Chihara, "Priest, the Liar, and Gödel," Journal of Philosophical Logic, 1984, pp. 117-124.

Richard Dedekind, Essays on the Theory of Numbers (1901), pp. 64-65

J.J. de Iongh, "Restricted Forms of Intuitionistic Mathematics," Proceedings of the Xth International Congress of Philosophy, ed. Evert Beth (North-Holland, 1949)

Michael Dummett, Elements of Intuitionism (1977), pp. 10-11

Michael Dummett, "The philosophical basis of intuitionistic logic," Philosophy of Mathematics (second edition), ed. P. Benaceraf & H. Putnam (1983)

Joseph Dauben, Georg Cantor [p. 60]

A. Ehrenfeucht, "Logic Without Iterations," Proceedings of the Tarski Symposium (1974), pp. 265-8

Gottlob Frege, selections from Grundgesetze der Arithmetik, in Translations from the Philosophical Writings of Gottlob Frege (2nd ed., 1960)

Gottlob Frege, The Foundations of Arithmetic (Harper, 1960)

A. Fraenkel, Y. Bar-Hillel, and A. Levy, Foundations of Set Theory, 2nd revised ed., 1973

Solomon Feferman, "Arithmetization of metamathematics in a general setting," Fundamenta Mathematicae, 49 (1960): 35-92, Corollary 5.10

Robert Goldblatt, Topoi: The Categorical Analysis of Logic (1979)

R.L. Goodstein, Constructive Formalism (1951)

R.L. Goodstein, Development of Mathematical Logic (1971), pp. 98-101

James Geiser, "Rational Constructive Analysis," in Constructive Mathematics, pp. 321-347

James Geiser, "Review of `The Ultra-Intuitionistic Criticism ...'" in Mathematical Review #4938 (1973)

I. Grattan-Guinness, The Development of the Foundations of Mathematical Analysis from Euler to Riemann (MIT, 1970)

Nicholas Goodman, "The Logic of Contradiction," Zeitschr. f. math. Logik und Grundlagen d. Math., 1981

David Hilbert and Paul Bernays, Grundlagen der Mathematik I (1968)

Hao Wang, "Eighty Years of Foundational Studies," Dialectica 1958, No. 3-4

Hao Wang, From Mathematics to Philosophy (1974)

Hao Wang, Essays on the Foundations of Mathematics (1966)

Sandra Harding, The Science Question in Feminism (1986), pp. 47, 114

Arend Heyting, "The Intuitionist Foundations of Mathematics," Philosophy of Mathematics, ed. P. Benacerraf and H. Putnam (1964), pp. 42-49

Arend Heyting, Intuitionism: An Introduction (3rd rev. ed., 1971), p. 119

Constructivity in Mathematics, ed. A. Heyting, for:

Arend Heyting, "Some Remarks on Intuitionism"

László Kalmár, "An Argument Against the Plausibility of Church's Thesis"

E.W. Beth, "Remarks on Intuitionistic Logic"

Arend Heyting, "After Thirty Years," Logic, Methodology, and Philosophy of Science, ed. Nagel, Suppes, and Tarski (1962)

Jirí Horejs, "Note on the definition of recursiveness," Zeitschrift für mathematische Logik und Grundlagen der Mathematik 10 (1964): 119-120

David Isles, "On the Notion of Standard Non-Isomorphic Natural Number Series," in Constructive Mathematics, pp. 111-134

G.T. Kneebone, Mathematical Logic and the Foundations of Mathematics (1963)

Stephen C. Kleene, Introduction to Metamathematics (1952)

Stephen C. Kleene, Mathematical Logic (1967), p. 327

Intuitionism and Proof Theory, ed. A. Kino, J. Myhill, and R.E. Vesley (1970)

S.C. Kleene and J.B. Rosser, "The Inconsistency of Certain Formal Logics," Annals of Mathematics 36 (1935): 630-36

E. Kamke, Theory of Sets [p. 31]

Morris Kline, Mathematical Thought from Ancient to Modern Times, 1972

Georg Kreisel, "Comment on Zinoviev's Paper," Logique et Analyse, September 1979

Paul Lorenzen, "Constructive mathematics as a philosophical problem," Logic and Foundations of Mathematics (1968)

Imre Lakatos, Mathematics, Science, and Epistemology (1978)

Lusin (1921) on Axiom of Choice

S. MacLane, ed., Reports of the Midwest Category Seminar IV (1970)

Richard Mansfield and Galen Weitkamp, Recursive Aspects of Descriptive Set Theory (1985)

A. Markov, The Theory of Algorithms (1954)

Gregory Moore, Zermelo's Axiom of Choice (Springer, 1982)

Gregory Moore, "Lebesgue's Measure Problem and Zermelo's Axiom of Choice," Annals of the New York Academy of Sciences, vol. 412, (1983) pp. 129-150

Andrzej Mostowski, Sentences Undecidable in Formalized Arithmetic (Amsterdam, 1952)

Jan Mycielski, "Analysis Without Actual Infinity," The Journal of Symbolic Logic, Vol. 46 (1981), pp. 625-631

David Makinson, "Review of Rescher and Brandom, The Logic of Inconsistency," Journal of Symbolic Logic, Vol. 47, p. 233

W. V. O. Quine, Selected Logic Papers (1966), pp. 114-120

W. V. O. Quine, "New foundations for mathematical logic," American Mathematical Monthly, Vol. 44, pp. 70-80

Rohit Parikh, "Existence and Feasibility in Arithmetic," Journal of Symbolic Logic 36 (Sept. 1971), pp. 494-508

Graham Priest, "The Logic of Paradox," Journal of Philosophical Logic, 1979, pp. 219-241.

Graham Priest, In Contradiction, 1987

Paraconsistent Logic, ed. Graham Priest et al., 1989

Hans Reichenbach, Elements of Symbolic Logic (1947), Ch. VII

Michael Resnik, Frege and the Philosophy of Mathematics (1980)

Frank P. Ramsey, The Foundations of Mathematics (1931)

J. B. Rosser, "The Burali-Forti Paradox," Journal of Symbolic Logic 7: 1-17

J.B. Rosser, Logic for Mathematicians, 1953

Helen Rasiowa and Roman Sikorski, The Mathematics of Metamathematics, 1963

Nicholas Rescher and Robert Brandom, The Logic of Inconsistency, 1979

Craig Smorynski, Self-Reference and Modal Logic (1985), p. 6

T. Skolem, "Peano's axioms and models of arithmetic," Mathematical Interpretations of Formal Systems (1955), pp. 1-14

T. Skolem, "Über die Nicht-Charakterisierbarkeit der Zahlenreihe mittels endlich oder abzählbar unendlich vieler Aussagen mit ausschliesslich Zahlenvariablen," Fundamenta Mathematicae Vol. 23 (1934), pp. 150-161

Specker, 1953 in Proc. Ac. Sci. USA 39: 972

Gabriel Stolzenberg, "Can an Inquiry into the Foundations of Mathematics Tell Us Anything Interesting about Mind?" in The Invented Reality, ed. Paul Watzlawick

Hristo Smolenov, "Paraconsistency, Paracompleteness and Intentional Contradictions" in Epistemology and Philosophy of Science, 1982

Alfred Tarski, Logic, Semantics, Metamathematics (1956)

Alfred Tarski, "The Semantic Conception of Truth," Philosophy and Phenomenological Research (1944), pp. 347, 349, 355, etc.

Robert Tragesser, "Godel's Paradox" (February 1978)

Wim Veldman, Investigations in Intuitionistic Hierarchy Theory (1981), pp. 1-3

Albert Visser, "On the Completeness Principle ..." in Annals of Mathematical Logic, Vol. 22 (1982), pp. xxx

From Frege to Gödel, ed. Jean van Heijenoort (1967)

D. van Dantzig, "Comments on Brouwer's Theorem on Essentially-negative predicates," Indagationes Mathematicae, vol. 11, pp. 347-355

Hermann Weyl, Philosophy of Mathematics and Natural Science (1963)

Ludwig Wittgenstein, Tractatus Logico-Philosophicus (1921; tr. 1961), (section)6.241

Ludwig Wittgenstein, Remarks on the Foundations of Mathematics (revised, 1978)

Ludwig Wittgenstein, Philosophical Grammar (1974)

Ludwig Wittgenstein, Wittgenstein's Lectures on the Foundations of Mathematics (1976)

Ludwig Wittgenstein, Philosophical Remarks (1975)

Stan Wagon, The Banach-Tarski Paradox (1985)

A.S. Yessenin-Volpin, "Le programme ultra-intuitioniste des fondements des mathematiques," in Infinitistic Methods (1961), pp. 201-223

A.S. Yessenin-Volpin, "About Infinity, Finiteness, and Finitization," in Constructive Mathematics, ed. Fred Richman (1981)

John Wesley Young, Lectures on Fundamental Concepts of Algebra and Geometry (1911), pp. 143-5 [inconsistency of Euclid's geometry]

E. Zermelo, "Grundlagen der Mengenlehre I" (1908), in From Frege to Gödel, p. 202