Chapter 1 – Hunter-Gatherers vs Farmers


In the early 2000s a student of mine alerted me to a book that appeared to eulogise hunter-gatherers and demonise farmers. The author, Daniel Quinn, in his 1992 novel Ishmael, characterised the former as Leavers and the latter as Takers. A large proportion of well-educated, middle-class Americans, only too happy to have their guilt about prosperity justified, agreed with the moral inferiority implied by the label Takers. Indeed, as far as Pete Reinwald, a high-ranking journalist with the Chicago Tribune, was concerned, Quinn’s novel “…sounded an alarm on civilization’s war on Earth, embarked on 10,000 years ago at the beginning of the agricultural revolution.” (Chicago Tribune, 2013) This mad, bad, rapacious onslaught-without-end on nature was not the fault of humans in general but of “our culture”, which I take to mean Western culture.

Reinwald and his ilk do not want to revert to a hunter-gatherer lifestyle because they love all the good stuff from ‘our culture’. It’s the “aggressiveness and mindlessness” of this civilisation that is the problem. I can understand a successful comedy writer like Tom Shadyac, who drastically downsized, embracing the anti-competitive ethos of Ishmael. I can also empathise with Aeron Davis, a legal advocate for chemical victims, who liked Ishmael for crystallising his thoughts on how “…sustainability and adaptability were being jeopardized by corrupted group-think in our mainstream culture.” Furthermore, it makes sense for Barbara Ridd, part-time faculty director of the School of Continuing and Professional Studies at DePaul University, to incorporate Ishmael into a course fashionably named Ecology of Personal Life. Of course, the spiritual dimension of the book is going to “resonate” with Laura M. Hartman, a professor of religion at Augustana College in Rock Island. But when Clive Finlayson, adjunct professor at the University of Toronto with a DPhil in evolutionary ecology, ends his book on Neanderthals by denigrating us profligate humans as having become utterly divorced from our biology, I worry. (The Humans Who Went Extinct: Why Neanderthals Died Out and We Survived, 2009) Again the agricultural ‘revolution’ comes in for some stick. Its technologically driven expansion “… marked the start of the illusion of progress towards a world of unsustainable growth, a dream that has turned into a nightmare as we procrastinate today while the current state and the future of our planet hang in the balance as a result of our voracity.” (p.214, Finlayson)

To me it is utterly wrong-headed to ascribe wilful ‘voracity’ to a group of humans who happened to be hunting and gathering in a part of the world which was propitious for farming.

Also, the same impulse to self-flagellate is behind white Australia’s uncritical acceptance of Bruce Pascoe’s elevation of Indigenous hunter-gatherer practices to the status of agriculture. So much so that the information in his book quickly made it onto school curricula on Australian Aboriginal culture. Inconveniently, though, two Australian academics, Peter Sutton and Keryn Walshe, questioned Pascoe’s interpretation of the archaeological and anthropological record. While playing down any need hunter-gatherers had for farming, Sutton and Walshe are keen to acknowledge the complexity of Indigenous social and spiritual connection to nature. A far more rewarding account of Aboriginal life-ways is provided by Tyson Yunkaporta in his book Sand Talk. He manages to weave contemporary Western technology together with ancient practice without a skerrick of moral judgement, and he asks tantalising questions about the possibility of true Aboriginal holistic thinking becoming a heuristic for Western science. While I would welcome a two-way exchange of knowledge between Aboriginal and Western cultures to help mitigate current ecological threats, I hesitate to denigrate the entire trajectory of human culture since settled agriculture.

To properly assess the extent of human agency in the shift from hunting and gathering to farming, I will outline a few milestones in hominid biological and cultural evolution. I aim to show that Sapiens was not in any way more special than the 21 other hominid species. I will also debunk a couple of myths in our evolutionary story: one concerning the relationship between brain enlargement and the controlled use of fire, the other a supposed explosion in human cleverness known as the Cognitive Revolution. The purpose of this explication is to demonstrate the absence of conscious human will or intentionality in how we developed over the last several millennia. So, rather than pouring scorn on humans’ ‘decision’ to work longer hours at back-breaking tasks and to eat a less varied diet, resulting in stunted growth and poorer health compared with their hunter-gatherer forebears, we should cut them some slack.

When we started ploughing, planting seeds and harvesting we were extracting more from the soil than mother nature intended. The noble hunter-gatherers, by contrast, were far more attuned to nature: they blended seamlessly into ecosystems whose fauna and flora could easily recover from roaming bands stepping lightly over their environment. Such was the ‘success’ of their ‘leaving’ ethos that they happily bred, fed, kept warm and had time for a spot of carving and cave painting for 490,000 years. I will tease out the practices that enabled hunter-gatherers to move towards farming over a 10,000-year period beginning 23,000 years ago. This will clearly demonstrate how cultural evolution operates without human volition or intentionality.

In that non-extractive economy, the total number of Homo sapiens around 74,000 years ago is estimated to have been as few as 15,000, possibly due to the catastrophic Mount Toba volcanic eruption in Indonesia. Growing by a mere 78 souls a year for the next 38,000 years resulted in a population of 3 million spread out as far as China and Tasmania. This slow population growth could have had something to do with the last ice age, which began 115,000 years ago and reached its maximum around 22,000 years ago. In northern France, temperate deciduous forests with fertile brown soils were replaced by an arid steppe landscape with few trees but plenty of grassland, which fed millions of reindeer. With an average July temperature of 5ºC this was a harsh environment for the humans who had reached Europe 45,000 years ago. And yet the discovery of 27 skeletons, bone points, stone blades and body ornamentation in a cave near Aurignac in southwestern France led to these Upper Palaeolithic people being named the Aurignacian culture. They earned the title ‘culture’ not just for slightly more sophisticated stone blades but also for wonderful cave paintings in a variety of media. The remarkable Löwenmensch figurine, a lion’s head on a human body carved in ivory, presents evidence of early myth creation; carbon dating puts it bang in the middle of the Aurignacian period. Venus figurines emphasising fertility, and bone flutes, all add up to rich and complex artistic expression. We also know that groups differentiated themselves: different bands used the teeth of different animals for decoration even though the same animals existed throughout the region.
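
For readers who like to see the arithmetic, the growth figure quoted above is easy to check. A minimal sketch, using only the population estimates already given:

```python
# Back-of-envelope check of the population figures quoted above.
start_pop = 15_000       # estimated population ~74,000 years ago
end_pop = 3_000_000      # estimated population 38,000 years later
years = 38_000

# Average absolute increase per year (the "78 souls a year" above).
per_year = (end_pop - start_pop) / years
print(f"average increase: {per_year:.0f} people per year")

# The equivalent compound annual growth rate is vanishingly small.
growth_rate = (end_pop / start_pop) ** (1 / years) - 1
print(f"implied annual growth rate: {growth_rate:.4%}")
```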

A good example of their non-exploitative interaction with the environment, as well as their ingenuity in adapting to ice-age conditions, was their use of antlers. As noted earlier, plentiful herds of reindeer shedding their antlers provided the ideal pre-metal material for whittling into a sharp point for a spear. Wood for the shaft, however, was scarce, so instead of splitting the wood and wedging a stone flint into it, as had been the practice in earlier, forest-covered, pre-Ice-Age Africa, these people split the antler point and wedged the wooden shaft into that. The weak spot of the weapon thus moved from the shaft to the point, which meant that after plunging the weapon 8 inches into the flesh of the prey they could withdraw the spear, leaving the disposable point in the wound and hastening the animal’s demise through rapid blood loss. (YouTube, Life in Palaeolithic Europe, Stefan Milo, 2020)

If longevity is a marker of success in a species, then the Aurignacian hunter-gatherers did well. If you are a Palaeophile you cannot accept that hunter-gatherers mostly died before the age of 30. According to them, the increase in human population would have been a mathematical impossibility if that were the case. Furthermore, they assert, if people died at 30 there would have been no evolutionary benefit to menopause. However, rather than a discrete process with an evolutionary benefit, menopause may simply have been an aspect of senescence for women. A study of 768 hominin fossils, especially teeth, showed that, between 100,000 and 30,000 years ago, there were about 4 older adults for every 10 young adults, but later, during the Upper Palaeolithic, this ratio was reversed. (Findings by Rachel Caspari, Central Michigan University, 2004)

The extent of Aurignacian culture stretched from Crimea in the east to Greece in the south, Spain in the west, northern France and southern England in the northwest, and Silesia in the northeast. Artefacts and human skeletons found in caves across this vast expanse of Europe are dated to between 37,000 and 33,000 years ago. The population over the entire territory is estimated to have been only about 15,000.

A sparse human population meant that we only had to compete for resources with other fauna. This was not hard because, over the 6-7 million years since our lineage split from the other apes, the 21 or so hominid species enjoyed a fortuitous combination of traits other species lacked: opposable thumbs, stereoscopic sight, a mouth, tongue and vocal tract capable of language, a large brain relative to body size, and an upright stance. By 300,000 years ago, fire had been harnessed by the remaining 8-12 species of proto-humans. The chief evolutionary advantage claimed for cooking meat is that easier digestion allowed smaller intestines, which in turn freed up more of the calorific value of nutritionally dense meat for the brain, allowing it to grow in size. The trouble with this theory of fire causing encephalisation is that even if we credit humans’ first controlled use of fire to 780,000 years ago, that is much later than when big brains appear in the fossil record. So what did Homo erectus do in 500,000 years that achieved a brain-size increase from 540cc to 1,200cc? Luckily, a peer-reviewed article in the journal Frontiers in Neuroscience answers this question.

Essentially, mathematics can capture the relationship between six variables: the body’s energetic need; a correction factor for specific physical activity; body mass; basal metabolic rate; and weight, height and age. Fill in some numbers and correlate the results with the known brain volumes of hominid species throughout evolution. Another set of equations then allows the relationship between foraging efficiency and brain size to be calculated. The resulting number of foraging hours per day necessary to take in enough calories to sustain brain growth was five, leaving ample time in the day to develop all those pro-social behaviours which accelerated our bio-cultural evolution.
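
The paper’s actual system of equations is more elaborate than I can reproduce here, but the shape of the reasoning can be sketched with illustrative numbers. In the snippet below the body mass, activity factor and foraging return rate are my own placeholder assumptions, not figures from the paper; the point is only to show how a daily energy budget divided by a foraging return rate yields hours of foraging per day.

```python
# Illustrative sketch only: daily energy need / foraging return rate = hours foraging.
# All parameter values are placeholder assumptions, not the paper's figures.

def basal_metabolic_rate(mass_kg: float) -> float:
    """Rough Kleiber-style allometric estimate, in kcal per day."""
    return 70 * mass_kg ** 0.75

def foraging_hours(mass_kg: float, activity_factor: float,
                   return_rate_kcal_per_hour: float) -> float:
    daily_need = basal_metabolic_rate(mass_kg) * activity_factor
    return daily_need / return_rate_kcal_per_hour

# Hypothetical Homo erectus forager: 60 kg, active lifestyle,
# returning roughly 500 kcal per hour of foraging and hunting.
hours = foraging_hours(mass_kg=60, activity_factor=1.8,
                       return_rate_kcal_per_hour=500)
print(f"foraging hours needed per day: {hours:.1f}")   # roughly 5
```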

All that foraging efficiency was achieved by cooperative hunting and butchery as well as new forms of communitarianism. Two genes associated with language also appeared at the same time. This new, sophisticated communicative skill facilitated trans-egalitarian exchanges for socio-sexual, material and celebratory purposes between different bands at special times of the year. What else is going to enhance brain development? We know neuronal connections are made when we are challenged to do things differently. So you could argue that, because the design of the double-edged Acheulean hand axe did not change for a million years, we needed cooked meat to catapult us out of neuronal inertia. However, the evidence is simply not there. Instead, it was the efficiency of that hand axe, giving access to nutritionally rich food for 500,000 years, that contributed to brain growth.

This same paper from Frontiers in Neuroscience wanted to augment its already robust defence of its thesis – that human brain expansion during evolution is independent of fire control and cooking – by feeding raw and cooked meat to mice. The experiment lasted 4 days, and the results indicated “…that energetic gain in a diet based exclusively on raw meat is similar to, or even higher than, a diet of cooked meat.” I’m not sure I’m comfortable with extrapolating the metabolism of mice to humans. However, an article on steak tartare in the anthropology magazine Sapiens (Raw Deal, Emma Marris, March 2016) offered the salient observation that humans could digest raw meat if it was chopped up. The archaeological record shows that hominins were using sharp stone tools and eating meat 3.3 million years ago. Brain growth was paralleled by a diminution of jaw size and canine teeth. Hominins must therefore have been slicing, chopping and pounding raw meat with stone tools, because we know that chewing whole chunks of raw flesh is not possible with our current dental equipment. Evidence for increased meat-eating dates back to 2.6 million years ago, yet evidence for cooking doesn’t turn up until 300,000 years ago. Two evolutionary biologists, Daniel Lieberman and Katherine Zink, reckon that the changes to jaw size and brain growth became apparent in Homo erectus, from 1.89 million to 143,000 years ago, due to meat preparation without cooking.

Yet another anthropologist, Jessica Thompson, interpreted the same tool and animal-bone evidence differently. Early hominins could well have used stone blades to cut up flesh to eat. However, far less time-consuming, and safer from competing carnivores, was to drag bones and skulls back to relative safety where they could use flint hammerstones to crush them. The resultant marrow and brains provided the rich nutritional “…precursor to the fatty acids involved with brain and eye development.” (Sapiens, Richard Kemeny, March 2019)

So, if larger brains equated to better innovative strategies for survival, how come Neanderthals, with bigger brains, were supplanted by Sapiens? The story that explains this illustrates my thesis that biological and cultural evolution can account for much human behaviour, which undermines any assignation of moral or intellectual judgement. When Sapiens’ herd-following practices led them out of Africa and into the Levant, they met a similar species and decided to have sex with them rather than make dinner out of them. Of course, the mating decisions could have run the other way, with Sapiens’ bodies appearing cute and desirably fragile to the rather more robust Neanderthal. This amorous encounter occurred 100,000 years ago, yet it took another 60,000 years for Neanderthals to disappear. Part of the reason for their demise is accounted for by that multigenerational timespan. I will not dwell on sparsely distributed small populations which led to inbreeding, or on climate change, but will instead focus on the findings of a 2019 Stanford paper which concentrates on differential pathogen immunity.

As you can imagine, the pathogens in tropical Africa would be different from those in a temperate Eurasian climate. Sapiens, whose immune systems evolved in the former climate, carried a higher ‘disease burden’ than Neanderthals, and the asymmetry favoured the newcomers from tropical Africa. According to Gili Greenbaum (a biologist at Stanford), it seems Neanderthals’ fate was sealed from the moment the two groups interbred. “So, by the time modern humans were almost entirely released from the added burden of Neanderthal diseases, Neanderthals were still very much vulnerable to modern humans’ diseases. Moreover, as they expanded deeper into Neanderthal territory, they would have encountered Neanderthal populations that did not receive any protective immune genes via hybridisation.” The idea that we prevailed through sheer smartness is comforting and somehow self-affirming, but it is an illusion.

Now, turning to the other belief – that we became very smart very quickly between 70 and 30 thousand years ago – there is plenty of evidence that appears to support it. It was labelled the Cognitive Revolution. If we accept that humans appeared in Australia 65,000 years ago, during the last glacial period, they must have invented sea-going boats.

Meanwhile, as outlined earlier, the tool kits and art of the Aurignacian culture in Europe supply ample evidence of sophisticated cognitive ability. Sapiens painting in a cave in East Kalimantan 40,000 years ago used animal fat burning in clay lamps to illuminate their work. However, this evidence, which is supposed to prove the Cognitive Revolution, has been labelled Eurocentric.

Sally McBrearty et al point out that aspects of modernity – hafted microliths, bone tools, increased geographic range, specialised hunting in large groups, use of aquatic resources, long-distance trade, pigment, art and decoration – appeared at sites all over Africa thousands of years earlier, in the Middle Stone Age, which began 250,000-300,000 years ago. These feats were achieved by hominins who, on anatomical and behavioural grounds, can be considered our immediate ancestors. Along with creating the first hand axe, two genes associated with language appeared in Homo erectus, without which the social cooperation enabling complex cultural behaviour would not have been possible. The aspect of complex cultural behaviour crucially enabled by language was teaching, and therefore the aggregation of skills.

In a paper titled Late Pleistocene demography and the appearance of modern human behaviour, Adam Powell et al stated that “Genetic estimates of regional population size over time show that densities in Upper Palaeolithic Europe were similar to those in Sub-Saharan Africa where modern behaviour first appeared 90,000 years ago.” It is well known that, for cultural complexity to survive across generations, bands need to exceed a certain size – greater than about 50 people. They conclude that “Demographic factors can thus explain geographic variation in the timing of the first appearance of modern behaviour without invoking increased cognitive capacity.”

As I have mentioned, Neanderthals met their end through a combination of factors – diseases from Sapiens, small isolated populations and environmental change – not through lack of cognitive capacity. So how smart were they? To further refute the idea of a cognitive revolution, here are some capacities Sapiens could have learned from. Long before Sapiens reached Indonesia and followed the kelp trail, island-hopping to Sahul (when Papua New Guinea was joined onto Australia), Neanderthals were crossing the Aegean Sea to Crete. Whether they rowed or sailed we don’t know, but Mousterian tool kits (the tools used by Neanderthals) have been found on Greek islands including Crete, which is 40 kilometres from the mainland. Those tools have been dated to 100,000 years ago. Importantly, beyond tool-making, Neanderthals were scratching on rocks, perforating shells, carving holes into mammoth tusks and blowing pigment onto stalactites, all for the same reasons we make art today. In addition to their ability to reflect on the future as well as the past through visual expression, they also buried their dead, which demonstrates a consciousness of the finality of life. There is also some evidence they cared for the injured. (Stefan Milo, 2020)

Regarding their tool-making, they improved on the Acheulean hand axe by developing Levallois points. This involved a four-step process to achieve a sharp flake which could be attached to a spear, and the new method allowed multiple points to be produced, which demonstrates forward planning. The other salient point – forgive the pun – is that language would have been needed to teach these skills, and we know they possessed the FOXP2 gene, which is important for speech. Recent archaeological work on Neanderthal sites in Germany found a black substance attached to stone blades. Analysis of the black stuff indicated the use of fire to produce pitch from birch bark for gluing. This first complex industrial process, dated to 115,000 years ago, is yet another demonstration of Neanderthal intelligence. (NOVA, Evan Hadingham, Jan. 2013) Even earlier, 176,000 years ago, combustion residues on portable grease lamps in a cave in France demonstrate Neanderthals’ symbolic behaviour. (PLOS ONE, 2021)

Before turning to the proto-agricultural Natufians, here is one last analysis of Neanderthals’ behavioural complexity. The abstract of a paper entitled Behavioural complexity in Eurasian Neanderthal populations: a chronological examination of the archaeological evidence identifies 98 instances of symbolic or complex behaviour between 160,000 and 40,000 years ago. (Michelle Langley et al, Cambridge Archaeological Journal, 2008) That doesn’t sound like much for 120,000 years, but once frequency and probability models are applied – taking into account how rarely such behaviour survives in the record – it is enough to be significant.
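
To get a feel for why a low raw count can still be telling, here is a purely illustrative back-of-envelope calculation. The recovery probability below is my own assumption, not a figure from Langley et al.; it simply shows that if only a tiny fraction of such episodes ever ends up in the published record, 98 recorded instances imply a vastly larger number of original ones.

```python
# Purely illustrative: observed instances divided by an assumed probability of
# an episode surviving, being excavated and being published.
observed_instances = 98
span_years = 120_000                 # 160,000 to 40,000 years ago

recovery_probability = 1e-4          # hypothetical one-in-ten-thousand chance

implied_episodes = observed_instances / recovery_probability
print(f"implied original episodes: {implied_episodes:,.0f}")
print(f"roughly {implied_episodes / span_years:.1f} per year across Eurasia")
```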

It stands to reason that in the more than 10,000 years (longer still if we include Asian Neanderthals) that Neanderthals and Sapiens were cohabiting Eurasia and sometimes successfully interbreeding, there must have been two-way cultural influence.

The point of detailing the complex and symbolic behaviour of Neanderthals is to show the possibility of cultural influence on the Levantine Aurignacians, who in turn passed on their accumulated complex of social and technological practices to the first generation of Natufians 12,500 years ago. An examination of the archaeological record indicating sedentism for the first time shows how natural, logical and incremental these proto-farming practices were. The Natufians preceded the official start of agriculture by some 3,000 years. They built permanent houses and cultivated edible plants, and there is ample archaeological evidence of mortars and pestles used to separate the husks from wild barley and wheat to make a kind of pitta bread.

Millions of years before humans started farming, leafcutter ants had altered their natural environment by selecting farming practices that guaranteed them a food supply from fungus. This was possible because they had evolved complex social networks involving millions of individuals. The analogy with humans helps us understand how we, too, naturally selected the practice without conscious agency. Humans adopted farming independently in 11 regions of the world, which suggests the domestication of plants and animals was an unconscious, natural cultural selection.

The so-called Neolithic Revolution was a good 13,000 years in the making – not a time frame we associate with revolutions. A warmer and wetter climate, which would eventually usher in the Holocene, began to take hold around 23,000 years ago. Climate change led to a depletion of large game for hunting, so humans adapted by ceasing to travel long distances following herds and instead going after rabbits and other smaller animals locally. (Stefan Milo, The Evolution of Farming in the Near East, 2019) They were still hunting and gathering, but more intensively. Regarding plant use, 90,000 seeds from 100 different species were uncovered at a site in Israel. Grinding stones and flint sickles found at the same site also provide evidence for a more intensive use of resources. Some 8,500 years of not travelling so far led to sedentism, a lifestyle that was widespread in the Levant by 14,500 years ago. (Milo, 2019)

For women it must have been a huge relief to be able to heave the sprog off their backs and leave it lying there while they ground seeds to make porridge. With a regular supply of goat’s milk and the security afforded by settled villages, women also had more babies – so much so that the population of the southern Levant increased six-fold between 14,500 and 13,500 years ago.

During the 1,700 years after that, the need to increase yields to feed larger populations drove the specialisation of plants. As noted earlier, the change in climate favoured wheat, peas, lentils and chickpeas, so the use of a wide variety of plants was abandoned to make way for slow plant domestication. As Stefan Milo points out in his video, this practice was not deliberate; the extent of human agency attached to plant domestication is discussed later. For now, Milo notes that grains that were larger and stronger were favoured, which commenced the process of domestication. Thus by 11,700 years ago wheat had 22% modern traits, and 3,000 years later that figure had increased to 90%. Five hundred years later sheep and goats started to be fenced in, or at least herded into a specific area. By 10,000 years ago, with the addition of pigs and cattle, the Levantine farm resembled the present day’s. (Milo, 2019) So these hunter-gatherers decided not to go hunting further afield and stayed home to guard their crops and animals – a totally reasonable decision to take, even though the daily grind was worse than in their previous lifestyle. With this newfound regular supply of digestible baby food, women must have been happy to produce many more offspring, because the population rapidly expanded. Again, naturally, the people doing all the work and guarding all this property wanted to assert some kind of ownership, and so stone buildings began to appear in the record from 11,130 years ago. Because farmers now had a delayed return on their investment in toil, they needed a permanent physical place where demarcation issues could be negotiated. The hunter-gatherer lifestyle allowed you to walk away from your own refuse and from problem people, but a sedentary lifestyle required basic issues like the disposal of human waste to be organised at a communal level.

To summarise so far: I hope you are convinced that the more you study the behaviour of humans from the Palaeolithic through to the Natufians, and their cultural responses to extremely varied environmental conditions, the less you can credit humans with ever having experienced a revolution or invented anything ex nihilo. Moreover, without all those incremental bio-eco-socio-cultural steps, agriculture would not have evolved.

It is worth bearing in mind, when proposing a division between historical epochs, that in this case the transitional changes were incremental, lasted thousands of years and indeed are still occurring in some parts of the world. Another aspect worth noting is that the decision to farm was not made after meetings and consultations with fellow hunter-gatherers weighing up the rival ecological merits of the two systems. As I have shown, each milestone in biological and cultural evolution represents adaptive biology or behaviour selected for some advantage, and each change could not have occurred without the preceding one. There appears to be a logical sequence. So far, I cannot identify any human agency in the process. The evidence for all of these evolutionary plot points is physically extant in the archaeological and biological record. Obviously, there is plenty of intellectual room for controversy when interpreting behaviours from physical artefacts of such antiquity, but the increasing sophistication of multiple dating methods, chemical analysis and genetic knowledge at least ensures a level of accepted scientific fact. The same cannot be said, however, for the application of slippery notions like agency to pre-history. What follows is an attempt to assess the extent, if any, of conscious decision-making by humans in relation to cultural evolution from the late Pleistocene onwards.

Most of the hits you get on the internet when you pose a question about human evolution that includes the word ‘agency’ stem from social science and philosophy departments. I guess we have always wanted to shed light on the extent of our responsibility for our manifold triumphs or disasters from the earliest of times. Essentially, the philosophical approach uses scientific language to detail the mechanisms of cultural evolution in an attempt to contextualise human behaviour within a multidisciplinary framework. It has the same access to archaeological and anthropological evidence and to mathematical models, with which it attempts to ground its broader conceptual speculations in science. Disciplinary cross-fertilisation (or contamination) can work the other way too. For example, an article in a biology journal cited property rights (!) as an element in the success of agriculture in the Fertile Crescent.

Having a naturally curious mind, a single human somewhere in what is now Israel, Jordan, Lebanon or Syria, over 16,500 years ago, noticed something growing out of his poo. After a few similar observations he shared them with friends, who corroborated his story with their own identical experiences. It has been speculated that the ‘something’ was probably wild barley which had been gathered and stored for leaner foraging periods; the seeding could have occurred accidentally. (Michael Gross, The paradoxical evolution of agriculture, Current Biology Vol 23 No 16, 2013) The point Gross makes is not about the mutually beneficial relationship between seed propagation and feeding humans, but that those accidentally growing crops in their ‘back yard’ were encouraged into husbandry, the key element of which was negotiated responsibility for a particular crop by a particular person or persons. Indeed, “Bowles and Choi argue that it was the co-evolution of food production and property rights — rather than technological progress based on inventions — that secured the success of agriculture in the Fertile Crescent…” (Gross, 2013)

Domesticating animals and building grain-storage silos led to concepts of ownership, and those two shifts from the hunter-gatherer lifestyle point to a fundamentally different relationship with the environment. Cave paintings of woolly mammoths, lions, aurochs and gazelles attest to the respect foragers had for their non-human cohabitors. In fact, foraging peoples regarded animals as their equals, according to Tim Ingold’s The Perception of the Environment (2000), whereas “…pastoralists… tend to regard animals as servants, to be mastered and controlled.” So the relationship passes from trust to domination. The crucial aspect of this change, with cascading deleterious consequences, is that once we start treating animals as subordinates it becomes easier to do the same with people. The hard evidence for this unempathetic but understandable shift is provided by the scribes of Sumeria, who “…used the same categories to describe captives and temple workers as they used for state-owned cattle.” (Guillermo Algaze, Initial Social Complexity in Southwestern Asia: The Mesopotamian Advantage, Current Anthropology, University of Chicago Press, 2001) Of course, before we are appalled by this bureaucratic inhumanity, we need to bear in mind that it took six and a half thousand years from those early Natufian settlements before society needed to be organised hierarchically. Again, cultural evolution without human agency.

At this stage in the argument it’s probably a good idea to define what is meant by human agency. Thomas Dietz et al outline the following criteria for the individual: power must be exercised; actions must be intentional; there must have been other possibilities that were not taken; and the actor must be aware of the effects of the actions. (Human Agency and the Evolutionary Dynamics of Culture, T. Dietz and T.R. Burns, Acta Sociologica Vol. 35, No. 3, 1992) Accepting that, pretty much every action taken at the micro level, such as moving plants and animals to new environments, does demonstrate conscious agency as defined above. Just as a lion has the agency to decide to bring down a slightly limping wildebeest lagging behind the herd, so did humans’ everyday decisions display similar conscious thought. The discussion around human agency and domestication centres on where you draw the line between the evolution of the accumulated decisions and the extent of foreknowledge of the likely outcomes of that accumulation. The transition to fully fledged agriculture and all the institutional arrangements that flowed from it took a thousand years. Anticipating the next harvest was one thing, but planning for the next millennium was clearly out of the question. (Core questions in domestication research, Melinda Zeder, PNAS, 2015) Some archaeobotanical studies have concluded that “…the increasing emphasis placed on cereal storage and processing will have been accompanied by an increase in social organisation.” The evidence for this conclusion was unearthed at a site in northern Syria in the form of “…the emergence of specialised storage rooms and food processing rooms with querns to crush the grains in.” All these proto-agricultural practices occurred between 9500 and 9000 BCE. (G. Willcox and D. Stordeur, Antiquity, 2012, 86, 99–114)

The practices outlined above demonstrate the concept of ‘niche construction’. This is a term beloved of interdisciplinary journals devoted to philosophical perspectives on biology that concern themselves with ‘human domestication’. When non-human species like beavers or birds alter the selective environments in which they evolve, they too have constructed a ‘niche’. However, one does not talk about beaver or crow agency, because those dam- or nest-building behaviours have been ‘naturally selected’ and are instinctive, which fits Darwin’s evolutionary theory. Darwin would have had no time for the notion of human self-domestication because he did not believe we had intentionally selected certain traits which domesticate us. It is entirely uncontroversial to say we evolved biologically without conscious decisions to assess the usefulness of, say, opposable thumbs. But doubt inflects the literature on ‘naturally selecting’ cultural behaviours. Is natural selection the same as automatic selection? In other words, will logic determine the selection of a beneficial idea? According to Chris Buskes, niche construction is “culture’s ability to alter the path of biological evolution…”

There is much in the biological, anthropological and socio-cultural literature that accepts that the processes of variation, mutation, selection and adaptation in biological evolution can be applied to culture. The only major difference is that the establishment of novel behaviours is much quicker in the cultural case once populations reach a critical mass. Also, learning successful strategies does not rely solely on vertical transmission down the generations but also on transmission from people beyond kin – peers and societal norms. (PNAS, Cultural evolutionary theory: how culture evolves and why it matters, Nicole Creanza et al, 2017) Richard Dawkins coined the term meme as the cultural counterpart of the gene – a valid analogy, but one that gets complicated when gene-culture co-evolution is considered. Here the idea is that cultural traits can alter selection pressures on genetic traits. A number of separate papers have used the example of the lactase enzyme persisting into adulthood, enabling digestion of the milk made readily available by sedentism and the domestication of animals.
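
A toy model can make that gene-culture feedback concrete. The sketch below follows the textbook dynamics of a dominant advantageous allele; the starting frequency and the selection advantage conferred by dairying are illustrative assumptions, not empirical estimates.

```python
# Toy gene-culture co-evolution: once dairying exists as a cultural practice,
# carriers of a lactase-persistence allele gain a fitness advantage s, and the
# allele spreads generation by generation. p0 and s are illustrative assumptions.

def allele_trajectory(p0: float, s: float, generations: int) -> list[float]:
    """Frequency of a dominant advantageous allele under simple selection."""
    freqs = [p0]
    p = p0
    for _ in range(generations):
        q = 1 - p
        mean_fitness = (p * p + 2 * p * q) * (1 + s) + q * q
        p = (p * p + p * q) * (1 + s) / mean_fitness
        freqs.append(p)
    return freqs

# ~300 generations is on the order of 7,500 years of dairying.
trajectory = allele_trajectory(p0=0.01, s=0.05, generations=300)
print(f"allele frequency after 300 generations: {trajectory[-1]:.2f}")
```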

I agree with Darwin regarding self-domestication. Domus is Latin for home, and surely the home is simply another tool which evolved via a series of environmental push factors and socio-cultural responses to them. Singling out the home to label a process which differentiates humans from all other species seems redundant to me. Far more consequential to the story of how singular humans became are the genes and behaviours that contributed to pro-sociality and cooperation in trans-egalitarian hunter-gatherers. A research paper in evolutionary biology talks about humans self-domesticating, by which it means displaying traits of non-aggression. Pro-sociality and non-aggression are linked to the gene BAZ1B, found in anatomically modern humans but not in Neanderthals or Denisovans. This gene may have provided the biological conditions to facilitate the cultural behaviour labelled ‘reverse dominance hierarchy’. According to Christopher Boehm (Current Anthropology, 1993), our late Pleistocene ancestors formed alliances to punish alpha males over long periods, which produced selection pressures against aggression. Generation by generation, therefore, aggressive behaviour became less prevalent and, conversely, egalitarian political forms became more prevalent. (Human domestication and the role of human agency in human evolution, Lorenzo del Savio and Matteo Mameli, Springer Link, 2020) While some group cooperation can dissolve because of the actions of selfish free riders, “…the frequency of co-operators may still globally increase because…groups of co-operators beat groups of selfish members.” (The Encultured Primate: Thresholds and Transitions in Hominin Cultural Evolution, Chris Buskes, 2019)

Unfortunately, non-aggression, cooperation and trans-egalitarian political choices were not the only cultural traits to have evolved. A colloquium paper, Cultural evolutionary theory: How culture evolves and why it matters (Nicole Creanza et al, PNAS, 2017), attempts to tease out in biological language the similarities and differences between the ways genes and cultural traits are acquired and transmitted. One aspect I have not covered so far appears in a section on non-random assortment and biased transmission. The salient point relates to “…types of transmission biases [that] reflect not how common a trait is in a population, but the characteristics of the people who have the trait.” The particular bias I’m interested in is ‘prestige’, as opposed to ‘conformity’, ‘novelty’ or ‘success’. Individuals displaying this bias, as can be inferred, want to acquire the cultural traits of high-status people in society. (Creanza, 2017) I would like to suggest, though it is not discussed in the paper, the negative implications inherent in prestige bias. If a perceived high-status person achieved that position by foul means, might not characteristics such as ruthlessness also be transmitted culturally? This very human trait was manifest during the Pre-Pottery Neolithic B, 8,500 years ago, according to Brian Hayden, writing about the social consequences of agriculture. (Encyclopedia of Archaeology, 2008) “With each increase in social and political complexity, a concomitant but geometrical increase in surplus production was required to fund the ever-augmenting needs of the political apparatus which continued to be based on feasting, prestige items, and status.” (My emphasis added.) I guess the role of the potlatch today amongst Indigenous Canadians hints at its ancient ancestral origins. Next time you see someone pull up in a Maserati, try not to judge too harshly – we have been evolving this flashy trait since hunter-gatherer days.
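
Prestige bias is easy to caricature in a few lines of code. In this sketch – with parameters invented purely for illustration – learners copy the trait of models chosen in proportion to status, and a trait that boosts status (ruthlessness, say) spreads through the population whether or not it benefits anyone else.

```python
import random

# Toy model of prestige-biased transmission. A trait that raises an individual's
# status gets copied more often, so it spreads even if it has no wider benefit.
# Population size, status boost and starting frequency are illustrative.

N = 1_000
STATUS_BOOST = 2.0      # assumed status multiplier conferred by the trait

def next_generation(traits: list[bool]) -> list[bool]:
    statuses = [STATUS_BOOST if t else 1.0 for t in traits]
    # Each naive learner copies one model, chosen with probability proportional to status.
    return random.choices(traits, weights=statuses, k=N)

traits = [random.random() < 0.05 for _ in range(N)]    # trait starts at ~5%
for _ in range(30):
    traits = next_generation(traits)

print(f"share of the population with the trait after 30 generations: {sum(traits) / N:.0%}")
```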

This idea is taken further by Will Storr in his book The Status Game (2021), which touches on the evolution of misbehaviour. He would agree with Hayden about the longevity of the role of status in shaping societies. Storr contends that humans have an “in-built or subconscious” desire to copy, flatter and conform to high-status individuals’ behaviour. “We mimic not just their behaviour but their beliefs. The better we believe, the higher we rise. And so, faith, not truth, is incentivised. People will believe almost anything if high-status people…suggest them.” He includes priests in his list, which could relate to the success of early power structures that revolved around religion and, particularly, the need to believe in elaborate mumbo-jumbo.

Casting an eye over thousands of years of recorded history, it’s hard to be persuaded by the insight about the evolution of co-operation above. But of course, to organise and equip an army to conquer and occupy neighbouring and not-so-neighbouring cities required the evolution of institutions from those early Natufian settlements. The key feature of any ‘regulator of social interaction’ – an institution – is a body of rules that have evolved to assist co-operation so that the institution can perform its particular function. Apparently, Wittgenstein had plenty to say about rule-following, but all we need to know for the purposes of the origins and evolution of institutions is that “This ability to create [them] helps bond societies together.” They allow “…collective behaviours with genetically unrelated individuals on a scale not seen in other species.” (The cultural evolution and ecology of institutions, T.E. Currie et al, Philosophical Transactions B, Royal Society Publishing, 2021) Without needing to read the whole paper, one can infer that our behaviour, and the technological and artistic artefacts that result from it, evolved, and that institutions depend on a multiplicity of dynamic interactions. In the latter case individual lawmakers had agency but were constrained by pre-existing unwritten rules of co-operation, or institutional norms.

Now, as Jared Diamond pointed out, the part of the Middle East known as the Fertile Crescent had naturally occurring domesticable plants and animals and a temperate climate, ideal for successful agriculture. This fortuitous set of geographical circumstances allowed settled agriculture to flourish. The other major ingredient in Mesopotamia’s growth into cities was clay, which not only provided the means to store products in jars but also a way to record ownership durably, by pressing marks into soft clay and then baking it. The birth of writing led to exponentially more information being stored outside the human brain. It was this ability to accumulate culture, here through the written word, that has been labelled the ‘ratchet effect’, and it is only possible when societies are complex enough and populations big enough. The ratchet was not able to hold for Tasmanian Aborigines, who lost some skills due to isolation and dwindling numbers. As Joseph Henrich was keen to assert in a piece for the American Psychological Association, “The effectiveness of cultural evolutionary processes depends on the size, interconnectedness and social networks of populations.” Being smart is a result of being social, not the other way round – at the population level. (A cultural species: how culture drove human evolution, J. Henrich, 2011) If Henrich’s hypothesis that humans “created a suite of cognitive adaptations we call norm psychology” is correct, and that “…[we] are programmed to attend to cues that activate an expectation of learning normative information”, then agency is further sidelined.
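
Henrich’s point about population size can also be illustrated with a toy simulation, loosely in the spirit of his Tasmania argument; the numbers below are invented for the example. Each generation, every learner tries to copy the most skilled individual, but copying is imperfect and usually degrades the skill, so whether the ‘ratchet’ holds depends on how many learners there are.

```python
import random

# Toy illustration of the cultural 'ratchet': each generation every learner
# copies the most skilled individual, with noisy, on-average-degrading copying.
# With many learners somebody occasionally out-does the model and skill ratchets
# up; with few learners skill decays. Parameters are illustrative only.

def simulate(population: int, generations: int = 200, seed: int = 1) -> float:
    rng = random.Random(seed)
    best_skill = 10.0
    for _ in range(generations):
        copies = [best_skill + rng.gauss(-1.5, 1.0) for _ in range(population)]
        best_skill = max(0.0, max(copies))   # next generation copies the new best
    return best_skill

for n in (5, 50, 500):
    print(f"population {n:>3}: skill level after 200 generations = {simulate(n):.1f}")
```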

Before finishing with a couple of different ways to divide the history of planet Earth, I just want to mention briefly a psychological capacity called ‘attentional capture’. I believe this subconscious tendency to pay attention, unwittingly, to a distraction illustrates yet another non-agentic process in our cultural evolution. The source of this idea is a paper titled Attraction, Distraction and Action. (W. Johnston et al, Advances in Psychology, 2001) One definition of attentional capture is “the involuntary focusing of attention, for example by a change in stimulus, which interrupts other processing.” The authors suggest this cognitive process was significant in mediating cultural change. Obviously, the crucial word here is involuntary. For the pivotal example of this process, we could speculate on our lone hunter-gatherer who was planning the day’s hunt when his attention was captured by the visual stimulus of barley growing out of his poo. We know where that led!

The fewer pivotal moments or transitional epochs you divide the history of the planet into, the less relevant human agency appears. Three is always a good number: two puts far too much significance on the one transition, and with four or more we lose interest. The ultimate beginning of everything, the Big Bang a mere 13.7 billion years ago, featured a bunch of rocks and gases expanding at ridiculous speeds – no human agency there. We could call that period ‘material’. The second, ‘biological’ period sees life form, 4 billion years ago, on the one planet we know a bit about; the Cambrian explosion is a mere detail. For the third epoch, the ‘ideological’, we have to collapse all those billions into mere thousands, because the date is between 40 and 8 thousand years ago, when institutional order began. Clearly we had a lot to do with that, but we were not particularly conscious of what drastic forces we were unleashing.

Another triad, confined to the evolution of the hominid genus, starts with the ‘mimetic culture’ practised by Homo erectus 2 million years ago, in which they imitated with gestures. Then, coinciding with the appearance of Sapiens 200,000 years ago, ‘mythic culture’ ushered in spoken language. This superior communicative method allowed the accumulation of myths and knowledge of the Pleistocene world. Finally, similar to the previous division – only a label change from ‘institutional order’ to ‘theoretic culture’ – the advent of agriculture 10,000 years ago is chosen as the marker. Being able to store information outside the human brain, Neolithic culture could begin science and philosophy. (Merlin Donald’s three stages of cultural evolution)

Thus it makes no sense to morally denigrate humans for naturally evolving systems that grew out of the need to store, protect and administer the distribution of surplus grain. If a group of people can do all that, they must have invented writing, so they qualify for the term civilisation. In addition to writing, the Encyclopaedia Britannica requires settled permanent villages, the planting of one’s own food, domesticated animals and specialised jobs to qualify for that term. Nowhere in that list is there any implied moral superiority.

It could be argued that the need to administer a complex society required the establishment of a hierarchy. For the apex of that hierarchy to gain legitimacy, a higher authority had to be co-opted. Hence places of worship were prioritised over fortifications when allocating resources in the world’s first cities. Kings and pharaohs wielded immense power by claiming to be the incarnation of various deities, or at least to have sole access to their communications. It is safe to say religion remained the chief source of monarchical power – along with conquering armies – until 1500 CE.

The charge that farming begot large-scale violence appears to have some merit. However, we need to note that some 7,466 years of relative passivity elapsed between the official start of agriculture and the first, Akkadian, empire in 2334 BCE – and linking empire to violence assumes empires cannot be peacefully acquired.

“From the first chipped stone to the first smelted iron took nearly three million years; from the first iron to the hydrogen bomb took only three thousand years.” (A Short History of Progress, Ronald Wright, 2004) This observation tells me technology is easy but morality isn’t.

It might be correct to say that in the last 200 years of the Holocene we humans have become so influential on the planet’s ecosystems that we can label this epoch the Anthropocene. As far as our ‘agency’ over the future of those ecosystems is concerned, the best we can do is collectively nudge systems to evolve towards a sustainable equilibrium. The reason we never learn from history is that we have never had agency beyond micro decisions affecting the immediate term; the larger, macro, consequential events evolved unintentionally. We continue to blunder into the human and ecological maelstrom unwittingly, as we always have done. What empire changed course because it learned from the previous one?

In my next chapter I attempt to answer a question about humans’ preparedness to accept the suffering of others from our hunter-gatherer days to today.

Hopefully it will be uploaded in the coming weeks!
