Did the modern brain shape only evolve recently?

Study claims that brain did not reach present-day range of variation until between 100,000 and 35,000 years ago.

A new study (Neubauer et al., 2018) has suggested that the globular form of the human cranial vault did not reach its present-day range of variation until between 100,000 and 35,000 years ago, and that this was linked to continuing evolutionary change affecting the shape and proportions of the brain. Fully modern human behaviour, it is claimed, did not emerge until that time.

Present-day humans are distinguished from archaic humans such as Neanderthals by a globular as opposed to a long, low cranial vault. The earliest representatives of our species (‘archaic Homo sapiens’), who lived around 300,000 years ago, retained the archaic brain shape; but by 200,000 years ago this had given way to the modern, globular form – or had it?

Paleoanthropologists at the Max Planck Institute for Evolutionary Anthropology in Germany used CT scans to generate virtual endocasts of modern human skulls dating to 315,000 to 195,000 years ago, 120,000 years ago, and 35,000 to 8,000 years ago, along with skulls of Neanderthals, Homo heidelbergensis, and Homo erectus. They applied statistical methods to these endocasts and concluded that globularity within the present-day range of variation did not appear until between 100,000 and 35,000 years ago.
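To give a concrete sense of what such an analysis involves, here is a minimal sketch of the standard geometric morphometrics workflow on which this kind of study rests: landmark coordinates are aligned by Procrustes superimposition and then explored with principal component analysis. The code below is illustrative only – the ‘endocasts’ are random placeholder landmark sets, the alignment uses a single reference rather than an iterated mean shape, and refinements such as reflection correction and semilandmark sliding are omitted.

    # Minimal geometric morphometrics sketch: Procrustes alignment + PCA.
    # All data here are random placeholders, not the study's endocast data.
    import numpy as np

    def procrustes_align(ref, lm):
        """Centre and scale landmark set lm, then rotate it onto ref."""
        lm_c = lm - lm.mean(axis=0)
        lm_c /= np.linalg.norm(lm_c)
        u, _, vt = np.linalg.svd(lm_c.T @ ref)  # optimal rotation via SVD
        return lm_c @ (u @ vt)

    rng = np.random.default_rng(0)
    endocasts = rng.normal(size=(10, 20, 3))  # 10 specimens x 20 landmarks x 3D

    # Use the first specimen (centred, scaled) as the reference shape.
    ref = endocasts[0] - endocasts[0].mean(axis=0)
    ref /= np.linalg.norm(ref)
    aligned = np.array([procrustes_align(ref, e) for e in endocasts])

    # PCA on the flattened shape coordinates; in a real analysis the
    # specimens' scores on the leading components would be inspected for a
    # trend from elongated to globular endocast shapes.
    flat = aligned.reshape(len(aligned), -1)
    flat -= flat.mean(axis=0)
    _, _, vt = np.linalg.svd(flat, full_matrices=False)
    pc1_scores = flat @ vt[0]
    print(pc1_scores)

In the study itself the data are dense semilandmark sets over the endocranial surface and the alignment is a full generalised Procrustes analysis, but the principle is the same: each skull becomes a point in a shape space, within which present-day and fossil ranges of variation can be compared.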

The transition from the long, low to the globular condition has long been attributed to changes in the proportions of the brain rather than its size. However, the Max Planck report suggested that this happened in two stages. In the first stage, the cerebellum, parietal, and temporal areas increased in size. This was followed by a second stage in which the cerebellum continued to increase in size, accompanied by size increases in the occipital lobes. This second stage was not completed until between 100,000 and 35,000 years ago. The report suggested that the most important changes were the expansion of the parietal areas and the cerebellum.

The parietal areas are associated with orientation, attention, perception of stimuli, sensorimotor transformations underlying planning, visuospatial integration, imagery, self-awareness, working and long-term memory, numerical processing, and tool use. The cerebellum is associated not only with motor-related functions including coordination of movements and balance but also with spatial processing, working memory, language, social cognition, and affective processing.

The report links these changes with evidence for the emergence of modern human behaviour in the archaeological record. It notes that, firstly, the onset of the Middle Stone Age in Africa 300,000 years ago corresponds closely in time to the earliest known fossils of Homo sapiens (the Jebel Irhoud remains from Morocco). Secondly, behavioural modernity gradually developed over time in concert with increasing globularity. Thirdly, the point at which the modern condition was achieved corresponds to the transition from the Middle to the Later Stone Age in Africa and from the Middle to the Upper Palaeolithic in Europe around 50,000 to 40,000 years ago.

The idea that anatomically modern humans were not behaviourally modern in the first instance is an old one, based on the idea that changes in the archaeological records of Europe and Africa 50,000 years ago were linked to a cognitive ‘Great Leap Forward’. This, it was argued, was the result of a favourable genetic mutation that somehow ‘rewired’ the human brain, enabling it to function more efficiently. The Max Planck report rejects this conclusion, suggesting that the Great Leap Forward simply represented the end-point of the globularization process.

The problem is that the notion that changes in the archaeological record could be linked to a cognitive advance 50,000 years ago was thoroughly debunked by anthropologists Sally McBrearty and Alison Brooks almost two decades ago – ironically in a paper cited by the authors of the Max Planck report (McBrearty & Brooks, 2000). In Europe, there is no doubt that a dramatic change is seen with the onset of the Upper Palaeolithic. Cave paintings, carved figurines, and other art appear for the first time. Nobody doubts that these artefacts are products of wholly modern human minds – but they simply herald the arrival of modern humans in Europe, not a cognitive advance by people already living there. Similarly, the transition from the Middle to the Later Stone Age in Africa is more parsimoniously explained by the need of growing populations for better tools and more sophisticated hunting techniques. Many supposed innovations can be found tens of thousands of years earlier at African Middle Stone Age sites. These include:

  • 60,000-year-old ostrich eggshells engraved with graphic patterns from Diepkloof Rock Shelter, South Africa.
  • Evidence for a well-developed catfish harvesting industry at Katanda on the Upper Semliki River in the Democratic Republic of the Congo, 90,000 years ago.
  • Ochre pieces engraved with abstract patterns from Blombos Cave, South Africa, in some cases over 100,000 years old.
  • Microliths from Pinnacle Point, South Africa, dating to 164,000 years ago. Microliths are used in multi-component tools, and they are associated with the most advanced (mode 5) stone tool technologies.

Furthermore, many traits once considered to be markers of late emerging modern human behaviour have now been identified much further back in the archaeological record, and indeed are not restricted to modern humans. These include fowling and use of seafood, both of which have since also been attributed to Neanderthals.

This evidence suggests that modern human behaviour had certainly emerged by 100,000 years ago, and probably by 164,000 years ago. While a link between globularity and modern human behaviour is likely, the associated cognitive changes probably occurred during the first phase of globularization, between 315,000 and 195,000 years ago. Subsequent increases in globularity might be linked to factors other than changes in the brain itself. Early modern humans were far more powerfully built than present-day people, and the more gracile, fully modern form did not appear until after 35,000 years ago. Brains actually show a slight decrease in average size during this period.

References:

McBrearty, S. & Brooks, A., 2000. The revolution that wasn’t: a new interpretation of the origin of modern human behaviour. Journal of Human Evolution, Volume 39, pp. 453-563.

Neubauer, S., Hublin, J. & Gunz, P., 2018. The evolution of modern human brain shape. Science Advances, Volume 4, eaao5961.


Craniofacial changes after 80,000 years ago may reflect increased social tolerance

Did technology boom result from reduced male aggression?

An ambitious study published in the journal Current Anthropology has proposed a link between craniofacial changes and technological advances occurring after around 80,000 years ago. The authors suggest that a reduction in browridge projection and a shortening of the upper facial skeleton (referred to by the authors as ‘feminization’) was the result of either reduced testosterone levels or reduced androgen receptor densities. This reduced male aggression and increased social tolerance, in turn enabling larger groups to live together and long-distance social networks to be built. The consequence was that between around 80,000 and 30,000 years ago, there was an acceleration of cumulative technological evolution, or ‘cultural ratcheting’, in which innovations accumulate over time as a result of cooperation between individuals and groups.

Robert Cieri and his colleagues considered over 1,400 skulls from prehistoric people living before and after 80,000 years ago, and from recent foragers and farmers. They found that a trend towards browridge reduction and facial shortening is apparent from after 80,000 years ago, consistent with the hypothesised higher testosterone levels or androgen receptor densities prior to that time.

The study is the latest attempt to resolve an apparent lag in the archaeological record between the appearance of anatomically modern humans and clear evidence for modern human behaviour in the form of symbolic expression, innovation and planning depth. Such evidence does not become apparent until after 80,000 years ago, leading some anthropologists to postulate a ‘smartness mutation’ occurring at about that time that somehow ‘rewired’ the human brain and enabled it to function far more efficiently.

In the last fifteen years or so, the trend has been to look for demographic rather than cognitive explanations. The African Middle Stone Age has examples of seemingly precocious technological traditions, such as the Stillbay and Howiesons Poort, that were relatively short-lived and were replaced by more conservative traditions. It is argued that, periodically, population levels fell below the level needed to preserve complex technological traditions from one generation to the next. It has also been suggested that evidence for symbolic behaviour in the form of beads and ornaments is indicative of more complex societies rather than more complex brains. Both fit well with the idea of more tolerant societies emerging after 80,000 years ago.

However, the most recent archaeological data have failed to correlate the disappearance of advanced technological traditions with demographic collapse, and indeed the extent to which such technologies were repeatedly lost has been questioned. For example, microliths are now known to have first appeared 164,000 years ago, and their periodic disappearance from the archaeological record may be apparent rather than actual.

References:

1. Cieri, R., Churchill, S., Franciscus, R., Tan, J. & Hare, B., Craniofacial Feminization, Social Tolerance, and the Origins of Behavioral Modernity. Current Anthropology 55 (4), 419-443 (2014).

The engraved ochres of Blombos Cave

World’s earliest abstract art

Seventy-three thousand years ago, an elderly man sat in the mouth of a limestone cave, intently working on a small rectangular piece of reddish-brown ochre. From time to time, he paused in his work and stared out towards the sea that lay a kilometre away. First he scraped and ground the piece flat on two sides and then, with the aid of a sharp stone tool, he engraved a cross-hatched geometrical design upon one of the newly-ground facets. It was the second such piece he had made, but there would be no others because a few weeks later he fell ill with a respiratory disease. A younger man would probably have recovered, but at 48 he was old and worn out. Within a few days, he was dead.

When his companions eventually decided it was time to move on from the cave that had been their home for the last six months, they took most of their few possessions with them. But the old man’s ochres were forgotten and left behind. Within a few months, the ochres and all other traces of the brief occupation were buried beneath wind-blown sand, and they would not see the light of day again for a very long time indeed….

In actuality, we have no idea who engraved the two pieces of ochre found at Blombos Cave in 2002, and although it is likely that they were both the work of the same individual we cannot be certain. The cave is located on the southern coast of South Africa in a limestone cliff 35 m (115 ft.) above sea level, 100 m (330 ft.) from the coast, though when occupied the coastline was further away. Discovered by archaeologist Christopher Henshilwood, it is one of the most extensively researched sites from the African Middle Stone Age.

Henshilwood has been familiar with the site since his childhood, as it is located on land owned by his grandfather. As a boy, he found a number of artefacts dating from comparatively recent prehistoric times. In 1991, he returned there as a PhD student, hoping to discover similar artefacts. Instead, he found a number of bone tools and stone points that dated from a far earlier period, over 70,000 years ago.

After completing his PhD at the University of Cambridge, Henshilwood obtained funding to commence excavations at Blombos Cave, and he continues to lead work at the site to this day. Three Middle Stone Age phases of occupation have been identified. These are known as M1 (73,000 years ago), M2 (subdivided between an upper phase 77,000 years ago and a lower phase 80,000 years ago) and M3 (125,000 years ago). Each phase contains a number of occupation layers, but they are quite shallow, indicating that the cave was only occupied sporadically and for relatively short periods of time.

However, while the cave was in use its occupants enjoyed a varied diet of large fish, shellfish, seals, dolphins and land mammals. Mole rats were often roasted over a fire and eaten; these large burrowing rodents are considered a delicacy by local farm workers to this day. The later occupations of the cave are associated with the Stillbay tool industry, which was first identified in the 1920s and named for the village of Still Bay, which lies a short distance from Blombos Cave.

The Stillbay was one of the cutting-edge industries of the African Middle Stone Age, and was noted for its finely-worked leaf-shaped stone points, which were made from high-quality materials including chert, quartzite and silcrete. The Stillbay people also made and used bone awls and projectile points, similar to those seen in ethnographic collections. Such implements are far more difficult to manufacture than stone tools. Other examples of Stillbay high tech included compound adhesives for hafting stone points to spears, and the use of heat treatment and pressure flaking for finishing stone artefacts. The Stillbay industry was widespread, and not confined to the region around Blombos Cave and Still Bay. Curiously, however, it was short-lived and persisted for less than a thousand years, before being replaced by seemingly less advanced industries of the type more typical of the African Middle Stone Age.

Perhaps the most important discovery at Blombos Cave has been the engraved ochres. Ochres are a group of iron oxide-containing minerals that occur in a range of colours including red, yellow, brown and purple. They have long been used as pigments, and their first known use was at least 266,000 years ago at Twin Rivers, a complex of caves in southern Zambia – before the appearance of modern humans. It is possible that the Twin Rivers ochre was used for utilitarian purposes, such as making adhesives, medicine, hide processing, or even Stone Age sunblock. However, the consistent selection of brightly-coloured and often hard-to-grind materials suggests an ornamental explanation, such as body painting. In South Africa, red ochre appears in the archaeological record from at least 160,000 years ago, and invariably material with the reddest hues seems to have been preferred – again suggesting a non-utilitarian purpose.

Several thousand pieces of ochre have been found at Blombos Cave, and in 2002, Henshilwood reported the two pieces of engraved ochre from the 73,000-year-old M1 phase, which were catalogued as AA 8937 and AA 8938. Both had been engraved with cross-hatched patterns, using a sharp stone tool to make wide grooves upon surfaces previously prepared by grinding. On AA 8938, in addition to cross-hatching, the pattern is bounded top and bottom by parallel lines, with a third parallel line running through the middle. The fact that the two pieces are so similar suggests a deliberate intent, rather than somebody absent-mindedly scratching away at the pieces with a sharp object. Somebody who knew just what they were doing must have sat down and engraved the two pieces. Other engraved ochres were later identified, although they were less spectacular than AA 8937 and AA 8938. They came from all three phases of the site, and some were over 100,000 years old.

The Blombos Cave ochres are central to the debate about the emergence of modern human behaviour, which anthropologists define as the ability of humans to use symbols to organise their thoughts and actions. A symbol is anything that refers to an object or an idea; symbols can take the form of sounds, images or objects. They may refer directly to an object or idea, for example a representational image; or they may be totally abstract, such as spoken or written words. Thus, for example, a drawing of a cat, the sound ‘cat’ or the written letters ‘c-a-t’ may all be used to refer to a cat. We use symbols all the time – whenever we read a newspaper, check the time, consult a map or admire a painting or sculpture. All of these activities involve symbolic behaviour: human society could not function without it. Modern syntactical language is a system of communication that uses symbols in the form of spoken and (in the last six thousand years) written words to enable an effectively infinite range of meanings to be conveyed.

The earliest human species such as Homo habilis and Homo erectus are thought to have had only a very limited ability to use symbols, but it is hotly debated when the fully symbolic behaviour of present-day people emerged. Was it shared with some archaic humans – in particular the Neanderthals – or was it limited to Homo sapiens, emerging at the same time as anatomical modernity around 200,000 years ago? Some go further and argue that even Homo sapiens lacked behavioural modernity until about 40,000 to 50,000 years ago. Proponents of this view believe that a behavioural ‘Great Leap Forward’ occurred at this time as a result of a favourable genetic mutation that rewired the brain and enabled syntactical language and other trappings of behavioural modernity to develop.

They base this view on the observation that artefacts that are unequivocally the products of modern thought processes – cave paintings, figurines and carvings – do not appear in the archaeological record until that time. But Henshilwood argues that the engraved geometrical patterns on the ochres imply the existence of modern syntactical language, and that modern human behaviour must therefore have emerged much earlier.

Henshilwood dismisses other possible explanations for the marks on the ochres. For example, he claims that they could not be the by-product of testing for the quality of powder that could be obtained from the ochres, as only a few lines would be required for this purpose. He also believes that the marks are unlikely to have been the result of absent-minded doodling, because great care was taken in completing the patterns and ensuring the incisions matched up. Furthermore, engraving lines on hard pieces of ochre requires full concentration in order to apply the right pressure and keep the depth of incision constant. He believes that not only were the marks on the ochres made with deliberate intent, but recurring motifs on ochres found in all three of the Middle Stone Age phases are evidence of a tradition of engraved geometric patterns that began more than 100,000 years ago and persisted for tens of millennia.

The most obvious question is: what was the significance of the geometric patterns? Henshilwood notes that the Christian cross would appear equally abstract to somebody unfamiliar with religious iconography. It is possible that the patterns meant something quite specific to the people who made them, though just what, we simply don’t know and probably never will.

Another question is: if humans were able to produce abstract art over 100,000 years ago, why did figurative art not appear until so much later? One possibility is that humans were still not behaviourally fully modern and were not at this stage able to make figurative images, though Henshilwood rejects this possibility. He notes that there are many cultures that do not make figurative art, and many others that do so using perishable materials that would not survive for tens of thousands of years.

Henshilwood believes the fact that the ochre engravings were intentionally created and depict distinctive geometrical patterns is enough to demonstrate that they are the product of a society of behaviourally-modern humans. Given the evidence for technological sophistication during the Stillbay period, we should not find this unduly surprising.

Photo credit:

Image copyright held by author, Chris Henshilwood
Photo by Henning (2007)
Webpage available at: http://commons.wikimedia.org/wiki/File:BBC-artefacts.jpg
Licence: CC by Share Alike 3.0 unported

Comment on Villa & Roebroeks (2014) ‘An Archaeological Analysis of the Modern Human Superiority Complex’.

A paper by Paola Villa and Wil Roebroeks (1) in the open access journal PLOS ONE has reviewed archaeological evidence for the view that the ‘inferiority’ of Neanderthals to modern humans was responsible for their demise. See this post for a quick summary.

Villa and Roebroeks are critical of the view that comparisons between the archaeological records of the African Middle Stone Age and the European Middle Palaeolithic can be used to demonstrate that Neanderthals were ‘inferior’ to modern humans in terms of a wide range of cognitive and technological abilities, and they have made a very good case.

However, they seem to be dismissive of the impact of what they describe as ‘subtle biological differences’ between Neanderthals and modern humans, which they state ‘tend to be overinterpreted’. They cite Pearce, Stringer and Dunbar (2013) as an example.

Pearce, Stringer and Dunbar used eye socket size as a proxy for the size of the eye itself and showed that Neanderthals had larger eyes than modern humans. This is not an unexpected result; living at high latitudes, Neanderthals experienced lower light levels than people living in the tropics, and larger eyes might have been an evolutionary response. The consequence is that, in comparison to a modern human brain, a greater proportion of the Neanderthal brain might have needed to be dedicated to the visual cortex, with the trade-off that less was available for other cognitive functions. Pearce, Stringer and Dunbar suggested that Neanderthals were less able than modern humans to maintain the complex social networks required to manage long-distance trade networks effectively, and to learn about the existence of distant foraging areas unaffected by local shortages. Furthermore, their ability to develop and pass on innovations might have been limited in comparison to modern humans (2).

On a less subtle level, it is only to be expected that the neural organisation of the Neanderthal brain would have differed from that of modern humans. The globular braincase of Homo sapiens differs from the long, low braincase that characterised archaic human species, including Neanderthals, and the change reflects a change in the actual proportions of the brain. In comparison to archaic humans, the parietal lobes of modern humans are expanded, and these are associated with the processing of speech-related sounds (3, 4). It is possible that their expansion played a role in the development of syntactic language in Homo sapiens, and that Neanderthals used different and partially non-verbal forms of communication (5).

While Villa and Roebroeks have demonstrated the risks of over-reliance on archaeological evidence, the biological differences between Neanderthals and modern humans are real. These differences must be included within a holistic approach to understanding the cognitive abilities of the Neanderthals, who as we now know are far from extinct in that they live on in the genome of modern populations.

References:

1. Villa, P. & Roebroeks, W., Neandertal Demise: An Archaeological Analysis of the Modern Human Superiority Complex. PLoS One 9 (4), e96424 (2014).
2. Pearce, E., Stringer, C. & Dunbar, R., New insights into differences in brain organization between Neanderthals and anatomically modern humans. Proceedings of the Royal Society B 280 (1758) (2013).
3. Wynn, T. & Coolidge, F., in Rethinking the human revolution, edited by Mellars, P., Boyle, K., Bar-Yosef, O. & Stringer, C. (McDonald Institute, Cambridge, 2007), pp. 79-90.
4. Coolidge, F. & Wynn, T., The Rise of Homo sapiens (Wiley-Blackwell, Hoboken, NJ, 2009).
5. Mithen, S., The Singing Neanderthals (Weidenfeld & Nicolson, London, 2005).

Projectile weapons invented almost 280,000 years ago by pre-modern humans

Study suggests Ethiopian Rift stone points were used as hafted javelin tips.

The invention of projectile weaponry was clearly an important advance for early humans, enabling large mammals or enemies to be killed or wounded at a distance, without the dangers of a confrontation at close quarters.

The earliest humans probably hunted to an extent, but unequivocal evidence for the hunting of large mammals does not appear in the archaeological record until the Middle Pleistocene. In 1995, four wooden spears were discovered at an open-cast mine near the town of Schöningen in Germany. The 400,000-year-old weapons were found with the carcasses of the horses they had been used to kill: the earliest-known association of hunting weapon with quarry. Each spear was over 2 m (6 ft. 6 in.) long, sharpened at both ends, and scraped smooth with stone tools (Thieme, 1997). However, these were unlikely to have been projectile weapons. They are closer in thickness to ethnographically-recorded thrusting spears than to throwing spears, and if thrown would have had a killing radius of less than 8 m (26 ft.) (Shea, 2006).

Even earlier are the 500,000-year-old stone points from the site of Kathu Pan 1 (KP 1) in South Africa. Some exhibit fractures to their ends, bases and edges that are consistent with a short-ranged weapon striking a target – but not with use for cutting or scraping. The points are shaped near the base in a way that suggests that they were hafted to wooden spears. Experiments with replicas of the KP 1 points, made from similar raw materials, suggest that they made effective spear tips. This makes them the earliest-known multi-component tools; however, they were thrusting spears rather than projectile weapons (Wilkins, et al., 2012).

Throwing spears or javelins were once thought to be a technology unique to modern humans. However, a newly-published study suggests that they predate the emergence of Homo sapiens by 80,000 years. The Gademotta Formation is an archaeological site located on the flanks of an ancient volcanic caldera in the Ethiopian Rift. Investigations since 2010 have yielded over two hundred intact or fragmentary stone points, nearly all of which were made from locally-available obsidian. Obsidian is a naturally-occurring volcanic glass that is well-suited to the production of implements with a sharp cutting edge. Argon-argon dating suggests that the oldest of the artefacts are 279,000 years old. Many of the points were found to bear fracture patterns on their tips consistent with impact damage arising from their use as hafted javelin tips, rather than as thrusting weapons (Sahle, et al., 2013).

The pre-modern humans living in Africa at this time are commonly referred to as Homo heidelbergensis. It is commonly supposed that they lacked the cognitive abilities of modern humans (Klein & Edgar, 2002), but the emerging view is that the sophistication of Middle Pleistocene humans has been severely underestimated. The Gademotta projectile tips are an important piece of evidence in this new picture.

References:

1. Thieme, H., Lower Paleolithic hunting spears from Germany. Nature 385, 807-810 (1997).

2. Shea, J., The origins of lithic projectile point technology: evidence from Africa, the Levant, and Europe. Journal of Archaeological Science 33, 823-846 (2006).

3. Wilkins, J., Schoville, B., Brown, K. & Chazan, M., Evidence for Early Hafted Hunting Technology. Science 338, 942-946 (2012).

4. Sahle, Y. et al., Earliest Stone-Tipped Projectiles from the Ethiopian Rift Date to >279,000 Years Ago. PLoS One 8 (11) (2013).

5. Klein, R. & Edgar, B., The Dawn of Human Culture (John Wiley & Sons, Inc., New York, NY, 2002).

Study highlights differences in brain organisation between Neanderthals and modern humans

Neanderthals focussed on vision at the expense of social networking.

A new study has suggested that there were significant differences in the neurological organisation of Neanderthals and modern humans, reflecting physiological differences between the two species. Neanderthals, as has long been known, were larger and more powerfully-built than modern humans. Consequently, it is suggested that they required proportionately more ‘brain power’ to carry out body maintenance ‘housekeeping’ tasks and control functions. In addition, it is suggested that Neanderthals had larger eyes than modern humans, which also used up brain power. They lived at high latitudes in Eurasia, where they experienced lower light levels than people living in the tropics.

Researchers considered the remains of 21 Neanderthals and 38 modern humans dating from between 27,000 and 200,000 years ago. They adjusted brain sizes to compensate for the greater Neanderthal body size, and estimated the size of the visual cortex from eye socket measurements. The average size of the Neanderthal eye socket was found to be 44 by 36 mm (1.73 by 1.42 in.), compared with 42 by 30 mm (1.65 by 1.18 in.) for the modern humans. This equates to an eyeball volume of 34 cc against 29.5 cc: a difference of about 15 percent.
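As a quick check on that last figure (a sketch of the arithmetic, assuming the percentage is taken relative to the modern human value): (34 − 29.5) / 29.5 ≈ 0.153, i.e. roughly a 15 percent difference in estimated eyeball volume.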

With more brain power required for housekeeping and visual functions, less would have been available for social interactions, and it has been suggested that the Neanderthal maximum social group size was smaller than the ‘Dunbar Number’ of 150 associated with modern humans. The areas covered by extended Neanderthal communities would consequently have been smaller than those of modern humans. Their ability to trade would have been reduced, as would their capacity to learn of distant foraging areas potentially unaffected by local shortages. Furthermore, their ability to acquire and pass on innovations may have been limited in comparison to modern humans.

In the high latitudes of Eurasia, far from their African homeland, modern humans were disadvantaged inasmuch as they lacked the Neanderthals’ enhanced visual acuity, as well as their other adaptations to the colder climate. Unable to adapt their bodies, modern humans adapted their technology, and thus became more reliant on it than were the Neanderthals. However, technological change can greatly outpace evolutionary change. The combination of adaptable technology and enhanced social networks gave the first modern humans in Europe a competitive advantage over the physically-adapted Neanderthals, eventually bringing about the demise of the latter.

References:

1. Pearce, E., Stringer, C. &  Dunbar, R., New insights into differences in brain organization between Neanderthals and anatomically modern humans. Proceedings of the Royal Society B 280 (1758) (2013).

The Singing Neanderthals (2005), by Steven Mithen

Steven Mithen, Professor of Archaeology at the University of Reading, is a leading figure in the field of cognitive archaeology and a Fellow of the British Academy. In 1996, drawing together many diverse strands, he described the possible evolutionary origins of the human mind in his seminal The Prehistory of the Mind: A Search for the Origins of Art, Science and Religion, in which he proposed that full consciousness only arose when the previously-separate cognitive domains that make up the mind became integrated, by a process he described as “cognitive fluidity” (Mithen, 1996). Subsequent archaeological discoveries in Africa forced Mithen to revise some of his timescales, without affecting the validity or otherwise of his theory (McBrearty & Brooks, 2000). However Mithen, who is himself a lover of music, felt that its role in the development of language had been largely dismissed as “auditory cheesecake”, as Steven Pinker had described it.

Mithen pleads guilty to having failed to consider music himself in his 1996 work. Accordingly, in The Singing Neanderthals, he sets out to redress the balance. He begins by considering language.

Language is a very complex system of communication which must have evolved gradually, in a succession of ever more complex steps generally referred to as proto-language. But what was the nature of this proto-language? There are two schools of thought: “compositional” and “holistic”. The compositional theories are championed by Derek Bickerton, who believes that early human species, including the Neanderthals, had a relatively large lexicon of words related to mental concepts such as “meat”, “fire” and “hunt” (Bickerton, 1990). These words could be strung together, but in the absence of syntax, only in a crude fashion. Mithen, however, favours the holistic view, championed by linguist Alison Wray, who believes that proto-language comprised holistic utterances, i.e. utterances that conveyed complete messages. Words – produced when utterances were segmented into shorter elements – only came later.

Mithen presents evidence that there is a neurological basis for music and that this is distinct from language. He draws on a variety of sources: studies of brain-damaged patients, individuals with congenital impairments, brain activity scans and psychological tests carried out on both children and adults.

Just as specific regions of the brain are involved with language, and damage to these regions can selectively or totally impair linguistic skills, so it is with music. The musical regions appear to be primarily located in the right hemisphere of the brain, in regions corresponding to Broca’s area on the left. However, there does seem to be some linkage between the linguistic and musical regions.

Infant directed speech (IDS) – that is to say, the way in which adults and indeed quite young children speak to infants – has a musical quality that infants respond to. Mithen believes that infants have a highly-developed musical ability, but that this is later suppressed in favour of language. For example, infants often have perfect pitch, but very few adults do. Relative pitch is better than perfect pitch for language acquisition, as the latter would result in the same word spoken by two speakers being interpreted as two different words.

This, Mithen argues, may give us an insight into how Early Humans such as Homo erectus and the Neanderthals communicated with one another. He falls back on the notion that “ontogeny recapitulates phylogeny”, i.e. that our developmental history mirrors our evolutionary history. He rejects the notions that music arose from language or that language arose from music. Instead, he argues, music and language both evolved from a single system at some stage in our primate past.

A central point of Mithen’s theory is emotion, which he believes underpins our thoughts and actions. A fear response, for example, was necessary to force flight from a dangerous predator. Conversely, happiness was a “reward” for successfully completing a task. There are four basic emotions – happiness, sadness, fear and anger – with more complex emotions such as shame and jealousy being composites of these four. Emotions were crucial for the development of modern human behaviour, and indeed for the development of any sapient species: beings relying solely on logic, such as Vulcans, could never have evolved.

Experiments suggest that apes, monkeys and humans – and by implication Early Humans – all share the same basic range of emotions. Mithen now pulls together two ideas: firstly, music can be used both to express and to manipulate human emotions; secondly, the vocalizations of primates serve much the same function in these animals. For example, vervet monkeys use predator-specific calls to warn others of their kind. Thus a human might shout “get up the nearest tree, guys, there’s a leopard coming”, but a vervet would utter a single specific “holistic” call conveying the same meaning. The difference is that the human utterance is referential, referring to a specific entity and instructing a specific response – a command. By contrast, the vervet monkey is using its utterance to manipulate the emotions of its fellows – the call is associated with a specific type of danger, inducing fear. The fear achieves the caller’s desired effect by inducing its fellows to climb into the trees for safety.

Mithen believes that in Early Humans, living in groups with extended child-rearing and increased use of gestural communication, the “holistic and manipulative” vocalization of monkeys and other primates was extended into a communication mode he refers to as “Hmmmmm” – “Holistic, manipulative, multi-modal, musical and mimetic” – with dance and mime added to the repertoire. He cites a circular arrangement of animal bones at Bilzingsleben in Germany, a Middle Pleistocene site of Homo heidelbergensis (the common ancestor of both modern humans and the Neanderthals), and claims it was a demarcated space for song and dance routines – in other words, a theatre. As with the vocalizations of vervet monkeys, Hmmmmm was intended to manipulate the actions of others. It was more complex than the vocalizations of any present-day non-human primate, but less so than the language of modern humans. (For another viewpoint on the role of hominin group living in language evolution, see Dunbar (1996).)

The Hmmmmm of the large-brained Neanderthals was richer and more complex than that of earlier humans. It enabled them to survive in the harsh conditions of Ice Age Europe for 200,000 years, but their culture remained static, and their scope for innovation was limited by the lack of a true language in which complex ideas could be framed. Indeed, the sheer conservatism and lack of innovation, symbolic expression and artistic expression in the Neanderthal archaeological record is, to Mithen, proof that they lacked language. He dismisses the “problem” of the Châtelperronian culture, where there is indeed evidence of innovation and symbolic behaviour. Although the archaeological record is ambiguous, with some claiming that the Châtelperronian horizon predates the Aurignacian horizon and the arrival of modern humans (Zilhão et al, 2006), Mithen believes this is incorrect and that the Châtelperronian is a result of Neanderthal acculturation from modern humans. The coincidence of an independent origin just before the arrival of modern humans is too great to be believed, he states.

If Neanderthals lacked language, how did Homo sapiens acquire it? Mithen believes that language as we know it came about through the gradual segmentation of holistic utterances into smaller components. Though initially holistic, utterances could be polysyllabic: suppose, for example, that “giveittome” was a holistic, polysyllabic utterance meaning “give it to me”. If there was also a completely different utterance, “giveittoher”, meaning “give it to her”, then in time the “giveitto” part would become a word in its own right. That two random utterances could have a common syllable or syllables that just happened to mean the same thing, and that this could happen often enough for a meaningful vocabulary to emerge, strikes me as implausible. However, Mithen cites a computer simulation by Simon Kirby of Edinburgh University in support. Mithen also claims that Kirby’s work is turning Chomsky’s theory of a Universal Grammar on its head. Chomsky claimed that it was impossible for children to learn language without hard-wired linguistic abilities already being present, but Kirby’s simulations apparently suggest the task is not as daunting as Chomsky believed.
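To make the segmentation mechanism concrete, here is a toy sketch in Python. It is emphatically not Kirby’s actual simulation: the tiny meaning–utterance lexicon and the shared-affix heuristic are invented purely to illustrate how recurring sound sequences could come to be paired with recurring meaning components.

    # Toy illustration: recurring substrings of holistic utterances that line
    # up with shared meaning components become candidate 'words'.
    # The lexicon is invented for illustration (see 'giveittome' above).
    from itertools import combinations
    from os.path import commonprefix

    lexicon = {
        ("GIVE", "ME"): "giveittome",
        ("GIVE", "HER"): "giveittoher",
        ("SHOW", "ME"): "showittome",
    }

    for (m1, u1), (m2, u2) in combinations(lexicon.items(), 2):
        shared = set(m1) & set(m2)
        if not shared:
            continue
        # Shared leading or trailing sound sequences are candidate segments.
        prefix = commonprefix([u1, u2])
        suffix = commonprefix([u1[::-1], u2[::-1]])[::-1]
        for segment in (prefix, suffix):
            if segment:
                print(f"meanings share {shared}: recurring segment {segment!r}")

Run on this toy lexicon, the script pairs “giveitto” with the shared GIVE meaning and “ittome” with the shared ME meaning – the kind of recurring coincidence that, multiplied over many generations of learners in Kirby’s simulations, can bootstrap a compositional vocabulary.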

Language would have been the key to the “cognitive fluidity” proposed in Mithen’s earlier work (Mithen, 1996) as the basis of modern human behaviour. Language would have enabled concepts held in one cognitive domain to be mapped into another. Derek Bickerton believes that language and the ability for complex thought processes arose as a natural consequence of the human brain acquiring the capacity for syntax and recursion (Bickerton, 1990, 2007), but if these capacities were also required for “Hmmmmm”, then, if the Kirby study is to be believed, a changeover to full language could have occurred gradually and without any rewiring of the brain. Mithen argues that this was the case, and that the first wave of modern humans to leave Africa, who established themselves in Israel 110,000 to 90,000 years ago (Lieberman & Shea, 1994; Oppenheimer, 2003), were still using “Hmmmmm”. By 50,000 years ago, “Hmmmmm” had given way to modern language, and at this point modern humans left Africa, eventually colonising the rest of the world and replacing the Eurasian populations of archaic humans. That language was crucial to the emergence of modern human behaviour has also been suggested by Jared Diamond (Diamond, 1991).

“Hmmmmm”, for its part, did not disappear and music retains many of its features.

To sum up, this is a fascinating theory that clearly demonstrates that music is as much a part of the human condition as is language. Its main weakness as a theory is that it cannot, by definition, be falsified, since all the “Hmmmmm”-using human species such as the Neanderthals are now extinct.

Another problem for me is the idea that anatomically-modern humans got by with “Hmmmmm” for at least 100,000 years and only gradually drifted into full language by the method outlined above. Given that creoles can arise from pidgins in a single generation, this seems implausible, unless we allow for some change in the mental organization of modern humans occurring after that time.

Mithen mentions the FOXP2 gene, which has been shown to have a crucial role in human language. One study suggested that the human version of this gene emerged some time after modern humans diverged from Neanderthals (Enard et al, 2002). Supporters of a “late emergence” of modern human behaviour, such as Richard Klein, have cited this as evidence that otherwise fully-modern humans did in fact undergo some form of “mental rewiring” as late as 50,000 years ago (Klein & Edgar, 2002). However, it has since been shown that the Neanderthals had the same version of the gene that we do (Krause et al, 2007), weakening the “late emergence” argument.

References:

Bickerton D (1990): “Language and Species”, University of Chicago Press, USA.

Bickerton D (2007): “Did Syntax Trigger the Human Revolution?” in Rethinking the human revolution, McDonald Institute Monographs, University of Cambridge.

Diamond J (1991): “The Third Chimpanzee”, Radius, London.

Dunbar R (1996): “Grooming, Gossip and the Evolution of Language”, Faber and Faber, London Boston.

Enard W, Przeworski M, Fisher SE, Lai CSL, Wiebe V, Kitano T, Monaco AP & Pääbo S (2002): "Molecular evolution of FOXP2, a gene involved in speech and language", Nature, Vol. 418, 22 August 2002.

Klein R & Edgar B (2002): “The Dawn of Human Culture”, John Wiley & Sons Inc., New York.

Krause J, Lalueza-Fox C, Orlando L, Enard W, Green R, Burbano H, Hublin J, Hänni C, Fortea J & de la Rasilla M (2007): "The Derived FOXP2 Variant of Modern Humans Was Shared with Neandertals", Current Biology, Volume 17, Issue 21, pp. 1908-1912.

Lieberman DE & Shea JJ (1994): "Behavioral Differences between Archaic and Modern Humans in the Levantine Mousterian", American Anthropologist 96 (2).

McBrearty S & Brooks A (2000): "The revolution that wasn't: a new interpretation of the origin of modern human behaviour", Journal of Human Evolution 39, 453-563.

Mithen S (1996): “The Prehistory of the Mind”, Thames & Hudson.

Mithen S (2005): "The Singing Neanderthals", Weidenfeld & Nicolson.

Mithen S (2007): “Music and the Origin of Modern Humans”, in Rethinking the human revolution, McDonald Institute Monographs, University of Cambridge.

Oppenheimer S (2003): “Out of Eden”, Constable.

Zilhão J, d'Errico F, Bordes JG, Lenoble A, Texier JP & Rigaud JP (2006): "Analysis of Aurignacian interstratification at the Châtelperronian-type site and implications for the behavioral modernity of Neandertals", PNAS, Vol. 103, No. 33, 15 August 2006.

© Christopher Seddon 2009