https://www.newyorker.com/magazine/2017/09/18/the-case-against-civilization
Science and technology: we tend to think of them as siblings, perhaps even as twins, as parts of STEM (for “science, technology, engineering, and mathematics”). When it comes to the shiniest wonders of the modern world—as the supercomputers in our pockets communicate with satellites—science and technology are indeed hand in glove. For much of human history, though, technology had nothing to do with science. Many of our most significant inventions are pure tools, with no scientific method behind them. Wheels and wells, cranks and mills and gears and ships’ masts, clocks and rudders and crop rotation: all have been crucial to human and economic development, and none historically had any connection with what we think of today as science. Some of the most important things we use every day were invented long before the adoption of the scientific method. I love my laptop and my iPhone and my Echo and my G.P.S., but the piece of technology I would be most reluctant to give up, the one that changed my life from the first day I used it, and that I’m still reliant on every waking hour—am reliant on right now, as I sit typing—dates from the thirteenth century: my glasses. Soap prevented more deaths than penicillin. That’s technology, not science.
In “Against the Grain: A Deep History of the Earliest States,” James C. Scott, a professor of political science at Yale, presents a plausible contender for the most important piece of technology in the history of man. It is a technology so old that it predates Homo sapiens and instead should be credited to our ancestor Homo erectus. That technology is fire. We have used it in two crucial, defining ways. The first and the most obvious of these is cooking. As Richard Wrangham has argued in his book “Catching Fire,” our ability to cook allows us to extract more energy from the food we eat, and also to eat a far wider range of foods. Our closest animal relative, the chimpanzee, has a colon three times as large as ours, because its diet of raw food is so much harder to digest. The extra caloric value we get from cooked food allowed us to develop our big brains, which absorb roughly a fifth of the energy we consume, as opposed to less than a tenth for most mammals’ brains. That difference is what has made us the dominant species on the planet.
The other reason fire was central to our history is less obvious to contemporary eyes: we used it to adapt the landscape around us to our purposes. Hunter-gatherers would set fires as they moved, to clear terrain and make it ready for fast-growing, prey-attracting new plants. They would also drive animals with fire. They used this technology so much that, Scott thinks, we should date the human-dominated phase of earth, the so-called Anthropocene, from the time our forebears mastered this new tool.
We don’t give the technology of fire enough credit, Scott suggests, because we don’t give our ancestors much credit for their ingenuity over the long period—ninety-five per cent of human history—during which most of our species were hunter-gatherers. “Why human fire as landscape architecture doesn’t register as it ought to in our historical accounts is perhaps that its effects were spread over hundreds of millennia and were accomplished by ‘precivilized’ peoples also known as ‘savages,’ ” Scott writes. To demonstrate the significance of fire, he points to what we’ve found in certain caves in southern Africa. The oldest strata of the caves contain whole skeletons of carnivores and many chewed-up bone fragments of the things they were eating, including us. Then comes the layer from when we discovered fire, and ownership of the caves switches: the human skeletons are whole, and the carnivores are bone fragments. Fire is the difference between eating lunch and being lunch.
Anatomically modern humans have been around for roughly two hundred thousand years. For most of that time, we lived as hunter-gatherers. Then, about twelve thousand years ago, came what is generally agreed to be the definitive before-and-after moment in our ascent to planetary dominance: the Neolithic Revolution. This was our adoption of, to use Scott’s word, a “package” of agricultural innovations, notably the domestication of animals such as the cow and the pig, and the transition from hunting and gathering to planting and cultivating crops. The most important of these crops have been the cereals—wheat, barley, rice, and maize—that remain the staples of humanity’s diet. Cereals allowed population growth and the birth of cities, and, hence, the development of states and the rise of complex societies.
The story told in “Against the Grain” heavily revises this widely held account. Scott’s specialty is not early human history. His work has focussed on a skeptical, peasant’s-eye view of state formation; the trajectory of his interests can be traced in the titles of his books, from “The Moral Economy of the Peasant” to “The Art of Not Being Governed.” His best-known book, “Seeing Like a State,” has become a touchstone for political scientists, and amounts to a blistering critique of central planning and “high modernism,” the idea that officials at the center of a state know better than the people they are governing. Scott argues that a state’s interests and the interests of subjects are often not just different but opposite. Stalin’s project of farm collectivization “served well enough as a means whereby the state could determine cropping patterns, fix real rural wages, appropriate a large share of whatever grain was produced, and politically emasculate the countryside”; it also killed many millions of peasants.
Scott’s new book extends these ideas into the deep past, and draws on existing research to argue that ours is not a story of linear progress, that the time line is much more complicated, and that the causal sequences of the standard version are wrong. He focusses his account on Mesopotamia—roughly speaking, modern-day Iraq—because it is “the heartland of the first ‘pristine’ states in the world,” the term “pristine” here meaning that these states bore no watermark from earlier settlements and marked the first time any such social organization had existed. They were the first states to have written records, and they became a template for other states in the Near East and in Egypt, making them doubly relevant to later history.
The big news to emerge from recent archeological research concerns the time lag between “sedentism,” or living in settled communities, and the adoption of agriculture. Previous scholarship held that the invention of agriculture made sedentism possible. The evidence shows that this isn’t true: there’s an enormous gap—four thousand years—separating the “two key domestications,” of animals and cereals, from the first agrarian economies based on them. Our ancestors evidently took a good, hard look at the possibility of agriculture before deciding to adopt this new way of life. They were able to think it over for so long because the life they lived was remarkably abundant. Like the early civilization of China in the Yellow River Valley, Mesopotamia was a wetland territory, as its name (“between the rivers”) suggests. In the Neolithic period, Mesopotamia was a delta wetland, where the sea came many miles inland from its current shore.
This was a generous landscape for humans, offering fish and the animals that preyed on them, fertile soil left behind by regular flooding, migratory birds, and migratory prey travelling near river routes. The first settled communities were established here because the land offered such a diverse web of food sources. If one year a food source failed, another would still be present. The archeology shows, then, that the “Neolithic package” of domestication and agriculture did not lead to settled communities, the ancestors of our modern towns and cities and states. Those communities had been around for thousands of years, living in the bountiful conditions of the wetlands, before humanity committed to intensive agriculture. Reliance on a single, densely planted cereal crop was much riskier, and it’s no wonder people took a few millennia to make the change.
So why did our ancestors switch from this complex web of food supplies to the concentrated production of single crops? We don’t know, although Scott speculates that climatic stress may have been involved. Two things, however, are clear. The first is that, for thousands of years, the agricultural revolution was, for most of the people living through it, a disaster. The fossil record shows that life for agriculturalists was harder than it had been for hunter-gatherers. Their bones show evidence of dietary stress: they were shorter, they were sicker, their mortality rates were higher. Living in close proximity to domesticated animals led to diseases that crossed the species barrier, wreaking havoc in the densely settled communities. Scott calls them not towns but “late-Neolithic multispecies resettlement camps.” Who would choose to live in one of those? Jared Diamond called the Neolithic Revolution “the worst mistake in human history.” The startling thing about this claim is that, among historians of the era, it isn’t very controversial.
The other conclusion we can draw from the evidence, Scott says, is that there is a crucial, direct link between the cultivation of cereal crops and the birth of the first states. It’s not that cereal grains were humankind’s only staples; it’s just that they were the only ones that encouraged the formation of states. “History records no cassava states, no sago, yam, taro, plantain, breadfruit or sweet potato states,” he writes. What was so special about grains? The answer will make sense to anyone who has ever filled out a Form 1040: grain, unlike other crops, is easy to tax. Some crops (potatoes, sweet potatoes, cassava) are buried and so can be hidden from the tax collector, and, even if discovered, they must be dug up individually and laboriously. Other crops (notably, legumes) ripen at different intervals, or yield harvests throughout a growing season rather than along a fixed trajectory of unripe to ripe—in other words, the taxman can’t come once and get his proper due. Only grains are, in Scott’s words, “visible, divisible, assessable, storable, transportable, and ‘rationable.’ ” Other crops have some of these advantages, but only cereal grains have them all, and so grain became “the main food starch, the unit of taxation in kind, and the basis for a hegemonic agrarian calendar.” The taxman can come, assess the fields, set a level of tax, then come back and make sure he’s got his share of the harvest.
It was the ability to tax and to extract a surplus from the produce of agriculture that, in Scott’s account, led to the birth of the state, and also to the creation of complex societies with hierarchies, division of labor, specialist jobs (soldier, priest, servant, administrator), and an élite presiding over them. Because the new states required huge amounts of manual work to irrigate the cereal crops, they also required forms of forced labor, including slavery; because the easiest way to find slaves was to capture them, the states had a new propensity for waging war. Some of the earliest images in human history, from the first Mesopotamian states, are of slaves being marched along in neck shackles. Add this to the frequent epidemics and the general ill health of early settled communities and it is not hard to see why the latest consensus is that the Neolithic Revolution was a disaster for most of the people who lived through it.
War, slavery, rule by élites—all were made easier by another new technology of control: writing. “It is virtually impossible to conceive of even the earliest states without a systematic technology of numerical record keeping,” Scott maintains. All the good things we associate with writing—its use for culture and entertainment and communication and collective memory—were some distance in the future. For half a thousand years after its invention, in Mesopotamia, writing was used exclusively for bookkeeping: “the massive effort through a system of notation to make a society, its manpower, and its production legible to its rulers and temple officials, and to extract grain and labor from it.” Early tablets consist of “lists, lists and lists,” Scott says, and the subjects of that record-keeping are, in order of frequency, “barley (as rations and taxes), war captives, male and female slaves.” Walter Benjamin, the great German Jewish cultural critic, who committed suicide while trying to escape Nazi-controlled Europe, said that “there is no document of civilization which is not at the same time a document of barbarism.” He meant that every complicated and beautiful thing humanity ever made has, if you look at it long enough, a shadow, a history of oppression. As a matter of plain historical fact, that seems right. It was a long and traumatic journey from the invention of writing to your book club’s discussion of Jodi Picoult’s latest.
We need to rethink, accordingly, what we mean when we talk about ancient “dark ages.” Scott’s question is trenchant: “ ‘dark’ for whom and in what respects”? The historical record shows that early cities and states were prone to sudden implosion. “Over the roughly five millennia of sporadic sedentism before states (seven millennia if we include preagriculture sedentism in Japan and the Ukraine),” he writes, “archaeologists have recorded hundreds of locations that were settled, then abandoned, perhaps resettled, and then again abandoned.” These events are usually spoken of as “collapses,” but Scott invites us to scrutinize that term, too. When states collapse, fancy buildings stop being built, the élites no longer run things, written records stop being kept, and the mass of the population goes to live somewhere else. Is that a collapse, in terms of living standards, for most people? Human beings mainly lived outside the purview of states until—by Scott’s reckoning—about the year 1600 A.D. Until that date, which marks the beginning of the last two-tenths of one per cent of humanity’s political life, “much of the world’s population might never have met that hallmark of the state: a tax collector.”
The question of what it was like to live outside the settled culture of a state is therefore an important one for the over-all assessment of human history. If that life was, as Thomas Hobbes described it, “nasty, brutish, and short,” this is a vital piece of information for drawing up the account of how we got to be who we are. In essence, human history would become a straightforward story of progress: most of us were miserable most of the time, we developed civilization, everything got better. If most of us weren’t miserable most of the time, the arrival of civilization is a more ambiguous event. In one column of the ledger, we would have the development of a complex material culture permitting the glories of modern science and medicine and the accumulated wonders of art. In the other column, we would have the less good stuff, such as plague, war, slavery, social stratification, rule by mercilessly appropriating élites, and Simon Cowell.
To know what it is like to live as people lived for most of human history, you would have to find one of the places where traditional hunting-and-gathering practices are still alive. You would have to spend a lot of time there, to make sure that what you were seeing wasn’t just a snapshot, and that you had a real sense of the texture of lived experience; and, ideally, you would need a point of comparison, people with close similarities to your hunter-gatherers, but who lived differently, so that you would have a scientific “control” that allowed you to rule out local accidents of circumstance. Fortunately for us, the anthropologist James Suzman did exactly that: he spent more than two decades visiting, studying, and living among the Bushmen of the Kalahari, in southwest Africa. It’s a story he recounts in his new book, “Affluence Without Abundance: The Disappearing World of the Bushmen.”
The Bushmen have long been of interest to anthropologists and scientists. About a hundred and fifty thousand years ago, fifty thousand years after the emergence of the first anatomically modern humans, one group of Homo sapiens was living in southern Africa. The Bushmen, or Khoisan, are still there: the oldest growth on the human family tree. (The term “Bushman,” once derogatory, is now used by the people themselves, and by N.G.O.s, “invoking as it does a set of positive if romantic stereotypes,” Suzman notes, though some Khoisan prefer to use the term “San.”) The genetic evidence suggests that, for much of that hundred and fifty thousand years, they were the largest population of biologically modern humans. Their languages use palatal clicks, such as a tsk, made by bringing the tongue back from the front teeth while gently sucking in air, and the “click” we make by pushing the tongue against the roof of the mouth, then bringing it suddenly downward. This raises the fascinating possibility that click languages are the oldest surviving variety of speech.
Suzman first visited the Bushmen in 1992, and went to stay with them two years later, as part of the research for his Ph.D. The group he knows best are the Ju/’hoansi, between eight and ten thousand of whom are alive today, occupying the borderlands between Namibia and Botswana. (The phonetic mark /’ represents a tsk.) The Ju/’hoansi are about ten per cent of the total Bushman population in southern Africa, and they are divided into a northern group, who retain significant control over their traditional lands, and who therefore still have the ability to practice hunting and gathering, and a southern group, who were deprived of their lands and “resettled” into modern ways of living.
To a remarkable extent, Suzman’s study of the Bushmen supports the ideas of “Against the Grain.” The encounter with modernity has been disastrous for the Bushmen: Suzman’s portrait of the dispossessed, alienated, suffering Ju/’hoansi in their miserable resettlement camps makes that clear. The two books even confirm each other’s account of that sinister new technology called writing. Suzman’s Bushman mentor, !A/ae, “noted that whenever he started work at any new farm, his name would be entered into an employment ledger, documents that over the decades had assumed great mystical power among Ju/’hoansi on the farms. The secrets held by these ledgers evidently had the power to give or withhold pay, issue rations, and determine an individual’s right to stay on any particular farm.”
It turns out that hunting and gathering is a good way to live. A study from 1966 found that it took a Ju/’hoansi only about seventeen hours a week, on average, to find an adequate supply of food; another nineteen hours were spent on domestic activities and chores. The average caloric intake of the hunter-gatherers was twenty-three hundred a day, close to the recommended amount. At the time these figures were first established, a comparable week in the United States involved forty hours of work and thirty-six of domestic labor. Ju/’hoansi do not accumulate surpluses; they get all the food they need, and then stop. They exhibit what Suzman calls “an unyielding confidence” that their environment will provide for their needs.
The web of food sources that the hunting-and-gathering Ju/’hoansi use is, exactly as Scott argues for Neolithic people, a complex one, with a wide range of animal protein, including porcupines, kudu, wildebeests, and elephants, and a hundred and twenty-five edible plant species, with different seasonal cycles, ecological niches, and responses to weather fluctuations. Hunter-gatherers need not only an unwritten almanac of dietary knowledge but what Scott calls a “library of almanacs.” As he suggests, the step-down in complexity between hunting and gathering and domesticated agriculture is as big as the step-down between domesticated agriculture and routine assembly work on a production line.
The news here is that the lives of most of our progenitors were better than we think. We’re flattering ourselves by believing that their existence was so grim and that our modern, civilized one is, by comparison, so great. Still, we are where we are, and we live the way we live, and it’s possible to wonder whether any of this illuminating knowledge about our hunter-gatherer ancestors can be useful to us. Suzman wonders the same thing. He discusses John Maynard Keynes’s famous 1930 essay “Economic Possibilities for Our Grandchildren.” Keynes speculated that if the world continued to get richer we would naturally end up enjoying a high standard of living while doing much less work. He thought that “the economic problem” of having enough to live on would be solved, and “the struggle for subsistence” would be over:
When the accumulation of wealth is no longer of high social importance, there will be great changes in the code of morals. We shall be able to rid ourselves of many of the pseudo-moral principles which have hag-ridden us for two hundred years, by which we have exalted some of the most distasteful of human qualities into the position of the highest virtues. We shall be able to afford to dare to assess the money-motive at its true value. The love of money as a possession—as distinguished from the love of money as a means to the enjoyments and realities of life—will be recognized for what it is, a somewhat disgusting morbidity, one of those semi-criminal, semi-pathological propensities which one hands over with a shudder to the specialists in mental disease.
The world has indeed got richer, but any such shift in morals and values is hard to detect. Money and the value system around its acquisition are fully intact. Greed is still good.
The study of hunter-gatherers, who live for the day and do not accumulate surpluses, shows that humanity can live more or less as Keynes suggests. It’s just that we’re choosing not to. A key to that lost or forsworn ability, Suzman suggests, lies in the ferocious egalitarianism of hunter-gatherers. For example, the most valuable thing a hunter can do is come back with meat. Unlike gathered plants, whose proceeds are “not subject to any strict conventions on sharing,” hunted meat is very carefully distributed according to protocol, and the people who eat the meat that is given to them go to great trouble to be rude about it. This ritual is called “insulting the meat,” and it is designed to make sure the hunter doesn’t get above himself and start thinking that he’s better than anyone else. “When a young man kills much meat,” a Bushman told the anthropologist Richard B. Lee, “he comes to think of himself as a chief or a big man, and he thinks of the rest of us as his servants or inferiors. . . . We can’t accept this.” The insults are designed to “cool his heart and make him gentle.” For these hunter-gatherers, Suzman writes, “the sum of individual self-interest and the jealousy that policed it was a fiercely egalitarian society where profitable exchange, hierarchy, and significant material inequality were not tolerated.”
This egalitarian impulse, Suzman suggests, is central to the hunter-gatherer’s ability to live a life that is, on its own terms, affluent, but without abundance, without excess, and without competitive acquisition. The secret ingredient seems to be the positive harnessing of the general human impulse to envy. As he says, “If this kind of egalitarianism is a precondition for us to embrace a post-labor world, then I suspect it may prove a very hard nut to crack.” There’s a lot that we could learn from the oldest extant branch of humanity, but that doesn’t mean we’re going to put the knowledge into effect. A socially positive use of envy—now, that would be a technology almost as useful as fire. ♦
This article appears in the September 18, 2017, issue, with the headline “How Civilization Started.”
- John Lanchester, the author of “How to Speak Money,” is a contributing editor at The London Review of Books, and has written for The New Yorker since 1995.