Early this year, Washington Post op-ed columnist Richard Cohen weighed in on a subject about which he, by his own admission, knew nothing. The subject was algebra, and Cohen's column took the form of advice to a young woman who had dropped out of high school after failing it. Cohen advised the ex-student and the public at large that algebra's importance was overblown -- and that he, Cohen, "had never once used it and never once even rued" that he could not use it.
"Most of math" Cohen explained, "can now be done by a computer or a calculator," and moreover it is a "lie" that algebra teaches reasoning. "Writing is the highest form of reasoning," Cohen affirmed, stating that the most valuable class he himself had taken in high school was ... typing.
Cohen's dismissal of a central branch of mathematics got some negative attention from science bloggers. Biologist P.Z. Myers castigated Cohen for complacency and arrogance in advising a young woman to throw away career options and intellectual tools; Myers also noted that the people who design calculators, among many others, need to know algebra. Gary Stix of Scientific American pointed out that Cohen's advice was particularly inapt in light of growing international economic competition. In uncharacteristically heated language, Stix wrote: "No algebra=No calculus=No science=No technology=We're totally *&$#FRTDG!!!!!" (Disclosure: I sometimes freelance for Scientific American and occasionally nod at Stix in the halls.)
Subsequently, science writer Steven Johnson made his own argument that algebra is overrated. In a Time magazine essay, Johnson contrasted studying algebra with time spent online and playing video games, by way of arguing that the latter activities yield more benefits. "In the office of the future," he wrote, "which skill set will today's kids draw upon in their day-to-day tasks? Mastering interfaces, searching for information, maintaining virtual social networks and multitasking? Or doing algebra?" He added: "It's a good bet that 99% of kids will never use algebra again after they graduate from high school." Johnson went on to lament that the testing establishment devotes so much time to algebra and so little to digital skills.
Johnson may have a point that fears about children getting dumber through digital technology are overstated. But his argument neglects some pertinent facts: One is that the online skills he exalts tend not to require much classroom training, while algebra clearly is not something kids will pick up without taking courses and tests on the subject. Moreover, the very technologies in question depend heavily on algebra (and on more advanced math that requires algebra). With a nod to Stix, one might say "No algebra=No computers=No networks=No information age=We're totally *&$#FRTDG!!!!!"
Students (and pundits) who find algebra hard might consider how difficult such math must have been for the people who actually pioneered it. That story is told in a new book, Unknown Quantity: A Real and Imaginary History of Algebra, by John Derbyshire (Joseph Henry Press). Derbyshire, who wrote a previous book on math, Prime Obsession, and is a frequent contributor to National Review, gives an absorbing account of algebra over the millennia, from its rudimentary origins in ancient Mesopotamia and Egypt to its cutting-edge applications in 21st-century physics.
An interesting feature of this history is just how slow progress often was. Babylonians in the 2nd millennium BCE worked out algebraic word problems on cuneiform tablets, and the ancient Greeks handled similar problems with a geometrical approach. But it was not until the time of Diophantus, who lived in Alexandria in roughly the 3rd century CE, that anyone used letter symbols to keep track of unknowns in equations. The brutal death of the female mathematician-philosopher Hypatia in 415 at the hands of a religious mob marked the twilight of math in the declining Roman Empire.
Around 820, the Islamic scholar al-Khwarizmi wrote a book on algebra (the word comes from the Arabic al-jabr, or "completion," his term for adding the same amount to each side of an equation to put it into a standard form). However, al-Khwarizmi and his contemporaries worked on algebra through word problems and geometry. Diophantus' practice of employing letter symbols in equations had vanished into forgotten archives. It was not until the late 1500s, particularly with the work of French mathematician François Viète, that algebraic symbols were reinvented and started to be used in a systematic way.
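To see what al-jabr amounts to in modern notation (my illustration, not an example drawn from Derbyshire's book): given an equation such as x^2 = 40x - 4x^2, al-jabr "restores" the subtracted term by adding 4x^2 to both sides, yielding the standard form 5x^2 = 40x. Al-Khwarizmi described such manipulations entirely in words; the symbols are a latter-day convenience.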
Such a tortuous history, as Derbyshire points out, suggests that symbolic algebra, with its high level of abstraction, does not exactly come naturally to people. He finds this a bit depressing but also inspiring. The remarkable thing is not that it took humanity so long to learn how to do this stuff, but that we can do it at all. No thanks to some pundits, though.
Kenneth Silber is a TCS contributing writer who focuses on science, technology and economics.