In the early 1980s, the philosopher Paul Oskar Kristeller looked up “creativity” in the dictionary. He groused that the word “had been much used and misused in recent literary and popular discussion,” but remained “vague and ill defined.” However, to the Columbia professor’s astonishment, the word could not be found in the Oxford English Dictionary’s 1971 compact edition, nor in the fifth edition of Webster’s Collegiate Dictionary, published in 1939.
If he were a linguist, Kristeller wryly noted, he would have given up at that point: there’s obviously no such thing as “creativity” in the English language. “Fortunately, I am not a linguistic analyst but an intellectual historian,” he said, “hence quite willing to admit that ordinary language is subject to continuing change, and that thinkers and writers as well as other people are perfectly free to coin new words and phrases to express new objects and thoughts.”[1]
He looked further, and eventually found the otherwise ubiquitous word in 1972’s Seventh New Collegiate Dictionary. Creativity meant “the ability to create,” which might seem a rather parsimonious definition to the contemporary ear. Nonetheless, Kristeller was puzzled. Had creativity really entered American English only sometime between the 1930s and the 1960s? He noted that the philosopher Alfred North Whitehead used the term in Religion in the Making in 1927, and speculated that Whitehead might have helped popularize it.
However, the relative scarcity of the term prior to the mid-twentieth century remains striking. A quick Google Ngram search reveals that “creativity” experienced a rapid takeoff between 1940 and 1960 in published English, and then steadily rose in usage through the year 2000, when it may have begun to level off. It is hard not to notice that this ascent leads right up to the publication of Richard Florida’s The Rise of the Creative Class in 2002, which helped spawn the contemporary discussion of “creatives,” that wonderful nominalization of a positive attribute (being “creative”) as a type of person (being “a creative”). In Florida’s conception, the creatives include your barista-cum-DJ, your UX designer, your professor, your advertising executive.
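For the curious, the Ngram check is easy to reproduce. The sketch below is a rough illustration rather than a documented recipe: it assumes the unofficial JSON endpoint behind the Google Books Ngram Viewer (books.google.com/ngrams/json) and its query parameters, which are undocumented and may change without notice.

```python
# Rough sketch: query the (unofficial, undocumented) JSON endpoint behind the
# Google Books Ngram Viewer for the yearly relative frequency of a word.
# The URL, parameters, and response shape are assumptions, not a stable API.
import requests

def ngram_frequency(phrase: str, start: int = 1900, end: int = 2000) -> list[float]:
    """Return the yearly relative frequency of `phrase` in published English."""
    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": phrase,
            "year_start": start,
            "year_end": end,
            "corpus": "en-2019",  # assumed corpus label; adjust if the endpoint expects another id
            "smoothing": 3,
        },
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()  # assumed: a list with one entry per matched ngram
    return data[0]["timeseries"] if data else []

if __name__ == "__main__":
    series = ngram_frequency("creativity", 1900, 2000)
    # Print one value per decade to eyeball the mid-century takeoff.
    for i, year in enumerate(range(1900, 2001)):
        if year % 10 == 0 and i < len(series):
            print(year, f"{series[i]:.2e}")
```

If the endpoint behaves as assumed, plotting the returned series should show the same mid-century takeoff described above.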
“Creative,” of course, was nothing new. Kristeller noted that creativity may have been novel in the mid-twentieth century, but it was simply a new adaptation of older terms, such as “creative” and “create.” Here’s where it gets interesting. Creating, creativity, and being creative are all more complicated than they might seem at first blush—or in that definition from the Seventh New Collegiate Dictionary. As Kristeller observed:
For most of the history of Western thought, the ability to create was attributed exclusively or primarily to God, and to human makers or artists only in a restricted or primarily metaphorical sense… Creation out of nothing was the exclusive prerogative of God, and a human artist who produced a work out of material given to him could not be remotely compared with the divine creator…[2]
In classical Greek, the word to make or create was poiein, the root of our word “poet.” Poets, Kristeller noted, held an exalted place in Greek culture, as they were inspired to create in a way that bordered on the divine. But Plato and others did not think that even the gods created out of a total void—they were believed “to have given shape to a formless matter that preceded their action, and for this very reason the divine creators were often compared to human artisans, architects, or sculptors.”[3]
Subsequently, Latin developed two distinct words—creare and facere—which separated divine creation from ordinary human production, the making of something out of something. St. Augustine also contributed to this evolving language and theology, inferring from the story of Genesis in the Old Testament that God had created something out of nothing, ex nihilo, quite unlike the old Greek gods. From this point forward, for centuries, Christians imagined creation itself as the sole province of the divine. An artisan who made a stained glass window was not a creator—certainly not a “creative,” in today’s parlance—but rather a mere mortal who changed one thing into something else, a maker.
Kristeller points out that attitudes shifted during the Renaissance, when Europeans began to recognize the individual genius of the artist and poet as “creative” in its own right. In a way, the genius did something that mirrored God—using nothing but his brain, he brought something new into being, out of nothing. Hence began the obsession with novelty and the new that carried on through the nineteenth and twentieth centuries in Western culture. Intellectual labor—whether painting or philosophizing, science or poetry—was creative.
It was only in the mid-twentieth century that the idea of creativity gained greater currency—a harbinger, perhaps, of the rise of the knowledge worker as a major figure in the American economy after 1945. The historian Jamie Cohen-Cole has shown how scientists and other intellectuals imagined themselves as “rational, creative, and autonomous” during the Cold War, in contrast, in part, with the repression of thought and expression in Communist societies.[4]
Such self-understanding was, naturally, part of a reaction to the imperatives of the Cold War, but a variety of thinkers began to imagine intellectuals and professionals as a new class with a special role in an emerging postindustrial society, where both advancing technology and rational management were expected to reshape production. As Daniel Bell argued in the 1970s, highly educated knowledge workers would take charge of directing a society where manufacturing declined and automation concomitantly increased. Economist Robert Reich later described this group as “symbolic analysts,” but the basic concept was the same: administrators, policymakers, doctors, lawyers, professors, scientists, writers, and researchers of various kinds with specialized expertise would play a bigger part in the workings of the economy. Such predictions seemed to be borne out, for better or worse, with the deindustrialization of the US economy, the proliferation of professional services, and the growth of sectors such as FIRE (finance, insurance, and real estate) in the late twentieth century. Indeed, the category of “research analyst” is a common one today for many different kinds of jobs in corporate America.
Thus we arrive at the creative class. During the Cold War, “creativity” may have been the hallmark of both intellectual expertise and Western freedom more broadly. But in the post-Communist world of the 1990s, creativity and innovation increasingly became synonymous with progress, especially with the rise of the tech industry. Innovation continued Western society’s long love affair with the New—who could not love it? Creativity, naturally, was an unalloyed good. Meanwhile, those who lacked higher education, who were neither creative nor innovative, risked the fate of autoworkers, travel agents, and guys who used to play saxophone solos in rock songs—total irrelevance.
Indeed, Florida’s influential thesis in The Rise of the Creative Class came at just the right moment to crystallize this emerging conventional wisdom. “Creativity has come to be valued—and systems have evolved to encourage and harness it—because it is increasingly recognized as the font from which new technologies, new industries, new wealth, and all other good economic things flow,” Florida wrote in 2002. “As a result, our lives and society have begun to resonate with a creative ethos.” Creativity was the proverbial fountain of economic youth, the place where the new and innovative came from to improve human life. As economist Paul Romer put it, creativity enhanced growth with “better recipes, not just more cooking”—more invention, the act of creation of something from nothing, not more work, would move us forward as a society.[5]
In the early twenty-first century, this gospel of creativity has been irresistible. Cities have sought the advice of Florida and other consultants to figure out how to become more creative and thereby attract both educated workers and new industries. (My forthcoming article in the Journal of Urban History shows how this process began in North Carolina as early as the 1950s.) Arts and culture, tolerance and openness, the values prized by the creative class, could lead to economic prosperity. Geographer Eliot Tretter has shown how this process unfolded in Austin, Texas, the creative darling par excellence, in his book Shadows of a Sunbelt City.
Lately, though, the importance of the creative or knowledge worker has taken on a slightly more sinister political valence. If innovation is the source of progress, and all good things come out of Silicon Valley, perhaps it is the techies who should be running things. Never mind the staggering gentrification and parallel infrastructure (Google buses, anyone?) that have allowed the tech class to wall themselves off in places such as the San Francisco Bay Area. Cyberlibertarians have long cherished a fantasy of “going Galt,” taking their talents elsewhere to build a better society, since they know best. Others would settle for seizing control of the government and managing it better than the fusty bureaucrats who currently run the show.
In fact, as the Baffler observed in 2014, certain “mouth-breathing Machiavellis dream of a Silicon Reich.” Corey Pein wrote of Justine Tunney, a Google engineer who put forward a simple plan:
1. Retire all government employees with full pensions.
2. Transfer administrative authority to the tech industry.
3. Appoint [Google executive chairman] Eric Schmidt CEO of America.
A high-handed, Randian stunt perhaps. Or perhaps not. In the ensuing controversy, Tunney suggested that her Twitter followers “Read Mencius Moldbug.”[6]
Who?
The curious moniker refers not to a new member of the Wu-Tang Clan or some strange prog rock band with a penchant for Chinese philosophy. Rather, it is the nom de plume of Curtis Yarvin, a computer scientist and “neoreactionary thinker” who earned an avid following among the early adherents of the alt-right movement or “Dark Enlightenment.” In the most charitable reading, Yarvin and his kin believe, at a minimum, in an extreme version of meritocracy and view with suspicion—if not outright hostility—conventional wisdom about the value of an egalitarian society. They are prone to make deplorable statements about racial inequality and even slavery, rejecting the fundamental liberal democratic ideal that everyone should be presumed equal. Some people are just better than others—innately smarter or more talented, or merely more hard-working, though more often the former than the latter. To think otherwise is to ignore reality and hasten the decline of America, they say, and possibly Western civilization as a whole.

In fact, the ideology has prompted online expressions of rank bigotry, as alt-rightists openly condemn non-white races as inferior and fear that the white, Christian West is being overrun by Muslim or Mexican rapists. Others, though, take a more genteel point of view. Shedding the pseudo-scientific trappings of biological racism, they simply believe that those who are smarter or more talented should be in charge—rather like Tunney, in her presumption that an America run by Silicon Valley would work a lot better than one managed by representatives elected by the people.
One especially noteworthy example is Jason Brennan, an associate professor of “strategy, economics, ethics and public policy”—whatever that means—at Georgetown University. Brennan recently published an essay called “The Right to Vote Should Be Restricted to Those with Knowledge” in the otherwise estimable Aeon, which appears to have resorted to publishing a bit of clickbaity crypto-racism in order to stir the pot. Aeon has done us a public service, though, in pulling back the curtain on Silicon Valley-style elitism. In this regard, alt-rightists like Moldbug are rather like the Ku Klux Klan, whom black radicals such as Marcus Garvey and Malcolm X appreciated, in a backhanded way, for making no bones about their racism, in contrast to white liberals who disguised their prejudice. Brennan, meanwhile, hides frank support for inequality behind well-meaning rhetoric about maximizing public welfare.
In some ways, Brennan’s treatise is a logical reductio ad absurdum of Daniel Bell’s notion of the society managed by knowledge workers—or, going further back, Thorstein Veblen’s vision of a system rationally managed by engineers in the early twentieth century. If knowledge means authority, perhaps only those with knowledge ought to have political authority. Or those with more knowledge simply ought to have more power—a gradation by degrees. One can go back much further, of course, to Plato’s philosopher-king: we might be better off if a truly wise ruler could transcend the narrow and petty passions and prejudices of the demos. Plato’s ideal leader, though, was above the grubby truck and barter that our Silicon Valley epistocrats engage in.
Brennan lays out a notion of epistocracy—rule not by king, nobility, or people, but by the educated. The epistocrat’s moral wager is that under democracy the masses end up hurting themselves, because uninformed voters make unfortunate choices at the ballot box, resulting in suboptimal public policy and diminished general well-being. “Concentrating power in the hands of a few—in monarchy, dictatorship or oligarchy—tends to result in power for personal benefit at the expense of others,” Brennan explains in his 2016 essay. “Yet in spreading power among the many—as in a democracy—individual votes no longer matter, and so most voters remain ignorant, biased and misinformed.”
As Americans, we thought we had a solution for this problem—a system of democracy constrained by the rule of law and mechanisms such as judicial review and the Bill of Rights—but Brennan contends that this approach has produced a lower standard of living than another, better system could provide. He proposes instead an epistocracy, an arrangement that “apportion[s] political power, by law, according to knowledge or competence,” while still maintaining basic republican protections for individual rights and equal protection before the law.
Brennan provides no explanation of how epistocracy would respect fundamental democratic norms, even as it either completely excludes the less educated from power or assigns political influence according to one’s relative level of “knowledge or competence,” as determined by some kind of testing or licensing. In doing so, he evinces a near-complete ignorance of history—the fact that equal rights and protection under the law have been won only through assiduous political struggle, both in and outside the voting booth, by women, workers, the civil rights movement, and other disadvantaged groups. Consider the fight for suffrage by white working men in the nineteenth-century United States and Britain, or women’s long struggle for the right to vote around the world, which unfolded from as early as 1893 in New Zealand to as late as 1944 in France or 2003 in Oman.
Historically, people with power or privilege have rarely let go of their advantages unprompted, nor have they respected standards of equality out of the goodness of their own hearts. And it seems naive, to say the least, to believe that a system that sets up a hierarchy based on knowledge would not shade into racial or biological justifications for difference.
Just as important, Brennan has not given a convincing explanation of how impartial, democratically legitimate measures of “knowledge or competence” would be devised. He nods to the United States citizenship test as a possible model, though that metric was worked out within a system that permitted broad democratic participation and was shaped by such pressures. How would a fair measure of the knowledge needed to decide a broad range of political questions be designed—outside the bounds of partisanship, let alone the influence of an epistocratic class that got to decide what counted as the price of admission to its own ranks?

It seems transparently obvious that any system that licenses or restricts access to political power based on the possession of some particular knowledge would lead to social stratification—a hierarchy that would then be compounded, as opportunities for social mobility worsened and those with political power worked to tighten their grip on it. One does not even need to look to history to see this, though examples abound there: the priestly castes who controlled access to knowledge and power in traditional societies where literacy was limited, for instance. Economist Harold Innis aptly described the control of information by priests in ancient Egypt and medieval Europe as “monopolies of knowledge”—held by those who could read, who controlled libraries, and who could afford education and expensive parchment codices in a time before print and mass literacy.
America’s system of Jim Crow segregation offers a much more recent example, with the same essential lesson—barring people from access to power based on an intellectual test creates and reinforces inequity. A test would contribute to widening and hardening class lines even if it were technically “fair,” in the sense that it effectively measured actual knowledge; in a society like our own, where access to education is unequally distributed, it would merely reproduce and multiply existing inequities. And Jim Crow, of course, makes it beyond obvious that such tests are likely to be arbitrary and discriminatory—expressly designed to exclude, not just to sort the informed from the uninformed.
In this light, epistocracy looks like a self-serving version of meritocracy on steroids, even if its proponents are naïve enough to believe that only the truly wise would rule under such a system, if we just worked out the right test.
After all, holding a fancy degree doesn’t guarantee one wisdom. The best and the brightest helped us make it to the Moon; they also got us into the Vietnam War. Donald Trump says he has a “very good brain” and “the best words,” and he has the papers to prove it: he did go to one of the nation’s best business schools, Wharton. But as my Grandpa loves to say, the most dangerous animal on Earth is an educated man with no common sense. Deciding what counts as knowledge and competence is a far more difficult proposition than the proponents of rule by “those with knowledge” allow.
The concept of epistocracy also presumes that the educated and privileged have no class interests of their own. If that language smacks too much of Marxism, think in terms of interest-group pluralism. Everyone has interests—each individual, group, or stratum of society—the retired, tax accountants, firefighters, pornographers, fans of Adele. They care about their taxes, and they care about which people and priorities those taxes are spent on; they hold views about foreign policy and cultural values that reflect their particular worldview. And they do so, at least in part, because policies inevitably benefit some people to the direct or indirect detriment of others.
Indeed, epistocracy carries with it the implicit premise that all political problems are technical in nature, as if those with knowledge or expertise could easily find the ideal solution if all the information were available. We already have a form of this in politicians’ overreliance on the Congressional Budget Office to “score” legislation (like the Affordable Care Act) according to supposedly neutral and unimpeachable statistical models that project an inherently uncertain future.
It’s telling that Brennan makes an analogy between licensing voters based on knowledge and the government licensing plumbers or electricians. Figuring out how to fund schools, parks, or Social Security becomes a matter of technical expertise, like unclogging a drain or wiring a house. One might respectfully submit that ending terrorism, or even a more immediate problem like defeating the Islamic State, is slightly more complicated than installing a new thermostat. Hence the democratic adage that “war is too important to be left to the generals.” Generals might know a lot about war, but if my home or my very own body—as a soldier or civilian—were at stake in the matter, I might want to have a say in what happens in military affairs. Likewise, democracy might be too important to leave to the experts.
Our poor epistocrat might counter that knowledge, broadly construed, is one thing—knowing who the president is, or understanding the basics of how the House and Senate work—whereas specialized expertise in computer programming or military strategy is something else entirely. All we’re asking for, the argument goes, is a minimal threshold for people to be allowed into the decision-making process for society.
In fact, this is rather like the republican ideal of the early United States, which not only limited the franchise to white men but required a minimum amount of property ownership to vote. You needed to have some skin in the game—you didn’t have to own Trump Tower, exactly, but you had to meet at least a certain threshold of investment in society if you were going to make material decisions about its affairs, especially where other people’s property was concerned (which is to say, taxation). Founding father James Madison notoriously feared that a “majority faction” might get together and vote to confiscate the property of the better-off—a rich man’s version of the idea that democracy, left unchecked, is little more than two wolves and a lamb voting on what to have for dinner.
But this assumption leaves open the question of why someone with little or no property or education ought not to have a say in decisions that affect them too. Does a person who is illiterate really lack any moral vision of what is right or wrong, fair or unfair? I think most of us would concede that the answer is no.
In the end, the argument for epistocracy rests largely on a nervous emotional tic—ironically, for people who pledge fealty to dispassionate expertise—the fear that other people will make political decisions that one does not like. Some people who voted for George W. Bush in 2004 still believed we had found weapons of mass destruction in Iraq; some people who voted for Mitt Romney in 2012 believed President Obama was not a native-born citizen. That was not great. Then again, for most of the twentieth century, college-educated voters supported Republicans in presidential elections. Does that mean the GOP was always the right party, and Alf Landon and Gerald Ford and George H.W. Bush deserved to win their elections, because their ideas were better and smarter?

Perhaps. But one suspects that the notion of epistocracy has grown more appealing to socially liberal intellectuals and Silicon Valley types as the most educated voters have increasingly moved toward the Democratic Party in the early twenty-first century, to the point that Hillary Clinton may be the first Democratic presidential candidate to lead among college-educated white voters in a long time—possibly ever. If you’re smart and your candidate wins, all is right in the world; if the dumb-dumbs win, then democracy is a dubious proposition indeed.
In fact, the changing electoral map in 2012 reflects a shifting of social and demographic allegiances in American politics, as states such as Colorado, Virginia, and North Carolina—once solidly Republican, now home to high-tech industries, better-educated voters, and more diverse populations—have drifted toward the Democrats. Meanwhile, states such as Ohio, Iowa, and Wisconsin, with a greater proportion of white, working-class, and often union voters, have lately seen their Democratic sympathies diminish.
What does this mean for the fantasy of epistocracy? Perhaps electoral votes could be apportioned by education level, so that Ohio gets fewer and Virginia gets more than each currently has. Such a proposal would embroider an even more warped and arbitrary system onto one—the Electoral College—that is already warped and arbitrary enough.
One person, one vote may have the unfortunate result of letting your Aunt Bertha have the same say as you, even if she thinks Hillary Clinton is a reptilian alien space lesbian. But it offers a far safer rubric than relying on any other set of qualifications to participate in the process of collectively deciding how to allocate our resources and govern our shared society. Churchill’s oft-repeated statement that democracy is the worst system except for all the other ones comes to mind.
The temptation to set one group—the creative, the knowledgeable, the expert and the informed—over another is strong these days, given the ascendance of the tech sector and a highly educated, professional workforce more generally. We can trace the arc of this story from Daniel Bell to Richard Florida and beyond, with the rise of the knowledge worker in an ever-evolving form of postindustrial capitalism.
Yet we would do well to remember Paul Oskar Kristeller and his musings on creativity. To be creative might seem to be good in the most axiomatic sense. But creativity is a virtue of relatively recent vintage. We once thought that to create was to make something out of nothing. Epistocracy, with its emphasis on privileging the educated and creative, might just make nothing out of the something we already have in representative democracy.
References
[1] Paul Oskar Kristeller, “‘Creativity’ and ‘Tradition’,” Journal of the History of Ideas 44 (Jan.-Mar., 1983), 105.
[2] Kristeller, “‘Creativity’ and ‘Tradition’,” 106.
[3] Ibid.
[4] Jamie Cohen-Cole, The Open Mind: Cold War Politics and the Sciences of Human Nature (Chicago: University of Chicago Press, 2014).
[5] Richard Florida, The Rise of the Creative Class, and How It’s Transforming Work, Leisure, Community, and Everyday Life (New York: Basic Books, 2002), 15.
[6] Corey Pein, “Mouth-Breathing Machiavellis Dream of a Silicon Reich,” The Baffler, May 19, 2014, http://thebaffler.com/blog/mouthbreathing-machiavellis.