TCS Daily

...When I'm (One Hundred and) Sixty-Four

By Glenn Harlan Reynolds - June 1, 2005 12:00 AM

It's conventional wisdom that people tend to do their most creative work when they're young. This has sometimes been raised as an objection to extending human lifespans: If, as the old adage has it, "science advances funeral by funeral," then won't fewer funerals mean less progress?

A while back, I addressed this objection, writing:


One argument is that it would be a curse. In a way, death and upward mobility go hand in hand. Some professions recognize this openly: Junior officers in the military used to toast "a bloody war, or a sickly season" as enhancing their prospects for promotion, while in academia one hears the old chestnut, "science advances funeral by funeral."


With that in mind, perhaps a dramatic lengthening of lifespans would yield stagnation and resentment. Older people would entrench themselves in their positions, while juniors would fester with no real hope of getting ahead. Progress would dry up as creative minds wasted their best years in uncreative apprenticeships, under the sour scrutiny of their elders. The result: a dull, uncreative gerontocracy.


On the other hand, we've pretty much done that experiment already, and it hasn't worked out that way. Lifespans, after all, have been getting steadily longer since the turn of the twentieth century. According to the Centers for Disease Control, "Since 1900, the average lifespan of persons in the United States has lengthened by greater than 30 years." That's an average, of course, and it's made more striking by reductions in death among juveniles. Nonetheless, there are a lot more old people than there used to be, and they're working longer. Indeed, as Discover magazine recently observed, "A century ago, most Americans lived to be about 50. Today people over 100 make up the fastest-growing segment of the population." You can argue about the details, but it's clear that typical adults are living longer than at any time in human history.


I suggested that extending human lifespans might lead to extended creativity. Now there's a study of innovation in the 20th Century that seems to cast a bit of doubt on that suggestion. Entitled Age and Great Invention (full-length PDF here), the study by Benjamin Jones of the Kellogg School finds that the age at which scientists make their great insights has gone up: "The estimates suggest that, on average, the great minds of the 20th Century began innovating at age 23 at the start of the 20th Century, but only at age 31 at the end -- an upward trend of 8 years."


Given that people are living longer, this could be good news: Jones suggests that extended education leads to later innovation. Unfortunately, it doesn't seem to lead to longer periods of creativity. In the late 20th Century people started their periods of peak creativity later, but those periods didn't extend to greater ages, suggesting that the additional time spent studying comes at the cost of time spent being creative:


We see that the peak ability to produce great achievements in knowledge came around age 30 in 1900 but shifted to nearly age 40 by the end of the century. An interesting aspect of this graph is the suggestion that total lifetime innovation potential has declined. Other things equal, if individuals delay the start of their innovative careers without increasing their productivity at older ages, then their life-cycle output will fall. . . .


Great minds produce their greatest insights at substantially older ages today than they did a century ago. This upward age trend is not due simply to an aging population, but comes from a substantial decline in the innovative output of younger innovators. Meanwhile, there is no compensatory expansion of innovative output at later ages. Innovators are the engines of technological change and, other things equal, the less time an innovator spends successfully innovating, the less her lifetime output. The estimates point to a 30% decline in life-cycle innovation potential over the 20th Century.


So starting later doesn't mean finishing later, it just means a shorter run of peak creativity. This might bode poorly for a life-extended society in which people live to be 150, or 300, but finish their creative years at the same ages they do now.
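The arithmetic behind that decline can be sketched with a toy model (my own assumptions, not Jones's actual estimation method): treat an innovator's "creative window" as the span from the age of first great insight to a fixed final creative age, with lifetime output proportional to the window's length. The start ages (23 and 31) come from the study; the end age of 52 is a hypothetical value chosen only so that this constant-rate sketch lands near the study's reported figure.

```python
# Back-of-envelope sketch of "delayed start, fixed finish" (illustrative
# assumptions only -- Jones's 30% estimate comes from a full life-cycle model).

def lifecycle_output(start_age, end_age):
    """Innovation potential, assuming a constant rate over the creative window."""
    return max(end_age - start_age, 0)

# Start ages of 23 (circa 1900) and 31 (circa 2000) are from the study.
# The shared end age of 52 is hypothetical, chosen to illustrate the point.
early_century = lifecycle_output(start_age=23, end_age=52)  # 29-year window
late_century = lifecycle_output(start_age=31, end_age=52)   # 21-year window

decline = 1 - late_century / early_century
print(f"Decline in life-cycle potential: {decline:.0%}")    # prints "Decline in life-cycle potential: 28%"
```

An eight-year later start against an unmoved finish line shrinks the window by more than a quarter, which is roughly the magnitude Jones reports.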


On the other hand -- aside from the obvious dangers in making too much of a single study -- there may be more at work here. I suspect that the requirement for more education and the fading-out of creativity in midlife may stem from the same cause. With the sum of human knowledge expanding rapidly, it takes more time to get up to speed -- hence more years of education are needed before one is in a position to make a contribution. But the same rapid expansion of knowledge probably means that the intellectual capital people acquire in those years of education has a shorter half-life. People may be living longer, but the lifespan of useful knowledge is getting shorter. By the time you're in your 50s, at least in technological fields, your intellectual capital has passed its "use by" date. And by then, it's too late to retool -- the opportunity costs of going back to school in your fifties are huge.


Would people who lived to 150 or 300 take time to retool? And, if they did, would they be as creative as they were when they were fresh out of school?


I'm not sure. On the one hand, people who live to 300 can't expect to coast for a lifetime on the intellectual capital of their youth. And the opportunity costs in terms of lost time would be much lower as a percentage of lifespan than they are for a 55-year-old today.


But what if there's more going on? People who are 55 often (though not always) seem to have less overall energy than they had in their 30s, and less enthusiasm. Perhaps this is a biological phenomenon linked to aging, in which case presumably treatments that give people the physique of a 30-year-old will also restore the joie de vivre of the same age.


But what if it's not biological? I remember one of my friends breaking up with his girlfriend when she declared a moratorium on sex unless he proposed. Remarking on her surprise when this tactic didn't work, he commented, "It might have worked in my teens but not in my thirties: I still like sex, but after the first few thousand times it's not quite as urgent as it used to be." The real problem may not be biological, but experiential. Perhaps, after a while, the been-there-done-that phenomenon asserts itself on all sorts of subjects. Creative work, like sex, may still be satisfying, but not quite as urgent as it was at first.


I don't know the answers to these questions -- and my own take is that longer-lived people will probably have multiple careers over long lifetimes, which ought to mute this effect somewhat. It's also worth noting that scientific creativity, though undoubtedly important to the readers of TCS, is not the sole measure of success. But at the very least, Jones' study suggests that improvements in education that shorten the delay before people are in a position to make great discoveries would be tremendously beneficial, not just in a life-extended society but in today's world. Alas, my gut says that we're likely to see people living to 300 before we see dramatic improvements in education. Where, after all, are the dramatic discoveries in that field?

