Times Higher Education Supplement, 13 August, 2004

As Tom Lehrer famously remarked in the introduction to his song about the New Math of the 1960s with its excessive abstraction, ``in the new approach, the important thing is to understand what you are doing, rather than to get the right answer''. The latest trends in mathematics education, be it calculus reform in the US or the ``anacademisation'' politicians demand of German universities, err to the opposite extreme, predictably with similar results.

These two books by Brian Clegg and David Foster Wallace exemplify this trend's logical extension to popular science writing. Both authors disparage formal mathematical training. Clegg refers to ``all those tedious theorems anyone who has suffered traditional geometry had to go through''. Wallace ``disliked and did poorly in every math course he ever took, save one''. He may also be credited with introducing the expression ``a bunch of abstract math-class vomitus'' into polite conversation.

Although Clegg's book occasionally hits a wrong note, it is on the whole a success. Wallace, on the other hand, is guilty of numerous egregious blunders.

The history of infinity as a mathematical rather than a metaphysical concept arguably begins with the paradoxes of Zeno of Elea in the 5th century BC. The futile race of Achilles against the tortoise and similar conundrums deal with the infinite divisibility of space. This was a problem pre-Hellenistic mathematics could not handle. Aristotle permitted talk only of ``potential'' infinities. His authority effectively ended all serious thought on the subject.

It was not until 1638 that the taboo about mentioning ``actual''
infinities was broken. In the *Discorsi*, Galileo Galilei again
takes up paradoxes surrounding infinity. He discovers a
characteristic of infinite sets: a proper part can be as large as the whole.
Consider all natural numbers 1, 2, 3,... and the subset of squares
1, 4, 9,... Associating 1 with 1, 2 with 4, 3 with 9 and so on, one
sees that there are as many squares as natural numbers.
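Galileo's correspondence is easy to make concrete: the pairing of each natural number n with its square n² matches the two sets member for member in both directions. A minimal sketch in Python (purely illustrative; the function names are mine):

```python
# Galileo's pairing: each natural number n corresponds to its square n**2.
# Distinct n give distinct squares, and every square arises from some n,
# so the squares -- a proper subset of the naturals -- nevertheless have
# the same cardinality as the naturals themselves.

def to_square(n):
    """Pair the natural number n with the square n**2."""
    return n * n

def from_square(s):
    """Invert the pairing: recover n from the perfect square s."""
    n = round(s ** 0.5)
    assert n * n == s, "not a perfect square"
    return n

# Check the pairing round-trips on an initial segment of the naturals.
naturals = range(1, 11)
squares = [to_square(n) for n in naturals]   # 1, 4, 9, ..., 100
assert all(from_square(s) == n for n, s in zip(naturals, squares))
```

Of course no finite check can capture the infinite pairing; the point is only that the rule n ↔ n² is explicit and invertible.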

In the late 19th century, German mathematician Georg Cantor put that phenomenon on a sound footing. Two sets are said to have the same cardinality if one can describe a one-to-one correspondence between the members of these two sets as in the example above.

Cantor's major advance in the theory of the infinite was the proof that the set of all real numbers (numbers with arbitrary decimal expansion) has a larger cardinality than the natural numbers. There are, as it were, different infinities. Clegg and Wallace describe this ingenious proof. Clegg ignores a subtle fact involving the ambiguity of decimal expansions (for example 0.1 is the same as 0.0999...). Wallace addresses the issue, but then fails to see where in the proof that ambiguity matters. In a footnote to the proof he wanders off into a seemingly erudite remark about the well-ordering principle for infinite sets. Contrary to what he believes, this is irrelevant for the proof.
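The mechanics of the diagonal proof, and the exact place where the ambiguity of decimal expansions matters, can be sketched on a finite list of digit strings standing in for a supposed enumeration of the reals (illustrative Python, not from either book):

```python
# Cantor's diagonal construction: given a list of decimal expansions,
# build a number that differs from the n-th entry in its n-th digit.
# Replacement digits are drawn from {4, 5}, avoiding 0 and 9 -- the
# digits where the ambiguity 0.1 = 0.0999... would bite. The diagonal
# number then has a unique decimal expansion, so it differs from every
# listed entry as a *number*, not merely as a string of digits.

def diagonal(expansions):
    """expansions: digit strings after the decimal point, one per row."""
    digits = []
    for n, row in enumerate(expansions):
        d = int(row[n])
        digits.append('5' if d != 5 else '4')  # differs from row[n]; never 0 or 9
    return '0.' + ''.join(digits)

listing = ['1415926', '7182818', '5000000', '3333333']
print(diagonal(listing))  # prints 0.5555
```

The constructed number disagrees with entry n in digit n, so it appears nowhere in the list; since the list was an arbitrary enumeration, no enumeration of the reals exists.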

Both books, then, lead up to the continuum hypothesis, a question that remained open until the 1960s. Is there an infinite set whose cardinality lies between that of the natural and that of the real numbers? Again, Wallace gets it wrong. He confuses it with a much simpler question, answered by Cantor, about these cardinalities.

*A Brief History of Infinity* is mainly a history of the personae who
struggled with the infinite. At times, Clegg's style is too gossipy, for
instance when discussing Cantor's mental state. The description of
Cantor's relationship with his teacher Leopold Kronecker
degenerates into tabloid journalism: ``Kronecker regarded Cantor
as a mathematical pornographer''. Wallace is even more tactless: adherents
of Kronecker are labelled ``radical Shiite Kroneckerian''.

At any rate, Clegg
is immensely readable and manages to convey to a lay audience some
of the key mathematical ideas concerning infinity.
*Everything and More*, by contrast, is an example of the most pernicious
kind of popular science writing.
Wallace's prose is littered with technical expressions and mathematical
symbols. Wallace maintains that if he were ``after technical rigor rather
than general appreciation'', he would provide more detail. This
suggests he has mastered the subject at hand.
Alas, this is manifestly not the case. It is bad enough that Wallace's
garbled exposition, posing as style, merely serves to
obscure. But even the reader bold enough to untangle
the copious footnotes and asides will fail to be enlightened,
because Wallace gets some of the most basic concepts wrong.

Reviewers who write of Wallace's ``justified confidence in his mastery of the matter'' and call him a ``trustworthy and knowledgeable guide'' probably skipped substantial parts of the book and consort with Wallace in degrading mathematics to magic symbolism.

H. Geiges