Waterlow Park

Waterlow Park in North London was Sir Sydney Waterlow’s gift to the public in 1889. Over the last five years I’ve shot innumerable photographs there at different times of the year and day, and under varying weather conditions. These are a selection of shots of the same scene, featuring a covered bench.

3 November 2002

22 March 2003

4 May 2003

28 October 2006

15 September 2007

22 December 2007

© Christopher Seddon 2008


Stanton Drew Stone Circle

Located in a field just outside the village of Stanton Drew near Bristol (OS Map reference: ST 601634) this is the second largest stone circle in Britain, surpassed only by Avebury. The Great Circle is 113 m in diameter and consists of 27 stones. There are two smaller circles – a 30 metre circle to the north-east with 8 stones, and a 40 metre circle to the south-west with 11.

The site is on private land, with no visitor facilities, for which reason it is far less well known than its impressive size warrants. Admission costs one pound, to be placed in an honesty box.

Below are a few of the pictures I took on New Year’s Day, 2008.

© Christopher Seddon 2008

The Ages of Man

The familiar terms “Stone Age”, “Bronze Age” and “Iron Age” are part of the so-called Three Age system, introduced by the Danish archaeologist Christian Jürgensen Thomsen in 1819 when he was curator of the collection of antiquities that subsequently became the National Museum of Denmark in Copenhagen. Thomsen was looking for a simple and logical system by which to arrange the collection, which, in common with those of other museums, was in a chaotic state, overrun with prehistoric artefacts from all over the world. Thomsen was not the first to think of using tool-making materials as a basis for classifying prehistoric cultures, although he was the first to actually do so. He lacked any means of dating his artefacts, but correctly guessed that stone had preceded bronze, which in turn had preceded iron. At the time – forty years before Darwin’s On the Origin of Species – few suspected the true antiquity of mankind, with many still believing that the Earth was just 6,000 years old. Although the Scottish geologist James Hutton and others had begun to call this figure into question, in the early 19th Century it was still widely accepted.

By the 1860s, Thomsen’s original scheme was beginning to look lopsided, and in 1865 the archaeologist Sir John Lubbock, a friend of Charles Darwin, published Pre-historic Times, probably the most influential archaeological textbook of the 19th Century. In it he introduced the terms “Palaeolithic” (Old Stone Age) and “Neolithic” (New Stone Age). We now know that the Palaeolithic encompasses all but a tiny fraction of human prehistory, beginning approximately 2.5 million years ago with the emergence of the first members of Genus Homo – i.e. the first human beings. Accordingly the Palaeolithic is in turn divided into Lower, Middle and Upper. The Lower/Middle transition is taken to be the point at which Mode 3 industries, such as the predominantly Neanderthal Mousterian culture, enter the archaeological record, at very roughly 300,000 years ago. The Middle/Upper transition, approximately 40,000 years ago, is the point at which unequivocal evidence for modern human behaviour is found.

In Africa the terms Early, Middle and Late Stone Age, or ESA, MSA and LSA respectively, are preferred, but the LSA also encompasses the Neolithic and Bronze Age as neither metallurgy nor agriculture reached sub-Saharan Africa until Iron Age times. To avoid confusion, I shall use only the term “Palaeolithic”, with its sub-divisions occurring at different times in different parts of the world. Such a scheme is generally used for later prehistory and I see no reason not to use it here also.

The division between the Palaeolithic and the Neolithic is now taken to be the Pleistocene/Holocene boundary – that is to say, the end of the last ice age, at around 11,550 years ago. This is a somewhat illogical division, equating a purely geological change with a system based on technology. Agriculture was independently adopted in several parts of the world and spread outwards from these nuclear zones, taking many millennia to reach some places, and necessitating the introduction of another division, the Mesolithic (Middle Stone Age), for regions where hunter-gathering persisted. Conversely, in parts of the world where proto-agriculture was practised in late Pleistocene times, such as the Levant, the term Epipalaeolithic is used.

The transition from Neolithic to Bronze Age is equally ill-defined – there is generally a transitional period where stone and native copper tools are in mixed use; this transitional period is referred to variously as Chalcolithic, Eneolithic or simply Copper Age. This transition began at different times in different parts of the world, and was of different duration – the Copper Age began earlier in the Middle East, but in Europe the transition to the fully-fledged Bronze Age was more rapid.

The working of iron begins around 1200 BC in India, the Middle East and Greece, but again took time to spread to other parts of the world. The Iron Age continues on into historical times, not ending in Northern Europe until the Middle Ages.

To all intents and purposes, this gives us a nine-age system:

Table 1.0: The career of Mankind (YA = Years Ago)

Archaeological/Geological time period | Events

Miocene (26m – 5m YA)

Proconsul (27m-17m YA)

Pliocene (5.0m – 1.64m YA)

Ardipithecus ramidus (5m – 4.2m YA)

Australopithecus anamensis (4.2m – 3.9m YA)

A. afarensis (4.0m – 3.0m YA)

A. africanus (3.3m – 2.5m YA)

A. garhi (3.0m – 2.0m YA)

Paranthropus aethiopicus (2.5m – 2.4m YA)

P. robustus (2.4m – 1.2m YA)

P. boisei (2.3m – 1.2m YA)

Lower Palaeolithic

(2.4m – 200,000 YA)

2.4m YA. Earliest true humans appear in Africa, though apparently sympatric with the later “robust” australopithecines (Paranthropus). It is now believed that the early fossil hominids represent at least two synchronous (though not sympatric) human species, Homo habilis (brain size 590-690 cc) and Homo rudolfensis (750 cc). It is not known which, if either, was ancestral to later types.

Tools: Mode 1 Oldowan (2.4m – 1.5m YA) flakes and choppers.

Mode 2 Acheulian (1.4m – 100,000 YA) handaxes and cleavers.

Lower Pleistocene (1.64m – 900,000 YA)

Middle Pleistocene (900,000 – 127,000 YA)

1.9m YA. Homo ergaster (brain size 700-850 cc) appears in Africa; migrates to Far East; migrants now widely regarded as becoming a separate species, Homo erectus (orig. both classed as erectus).

800,000 YA. Homo antecessor. Controversial taxon known only from Atapuerca in Northern Spain, believed by some to be the common ancestor of both modern man and the Neanderthals.

500,000 YA (poss. as early as 1.0m YA). Use of fire.

500,000 YA. Larger-brained (1,200 cc) and bigger-boned hominids are found in the fossil record in Africa, Asia and Europe. Traditionally referred to as “archaic Homo sapiens” but Homo heidelbergensis now favoured. Other types have been proposed such as Homo rhodesiensis and H. helmei. It’s all very confusing!

250,000 YA. Homo neanderthalensis “the Neanderthals” appear in Europe, possibly descended from Homo heidelbergensis. They later spread to the Middle East.

250,000 – 35,000 YA. Mousterian culture in Europe.

Middle Palaeolithic (200,000 – 45,000 YA)

Late Pleistocene (127,000 – 11,600 YA)

Tools: Mode 3. (from 200,000 YA) flaking of prepared cores. Increasing use of the Levallois method to prepare cores, though this method was also used in late Acheulian times.

160,000 YA. Earliest near-anatomically modern humans, Homo sapiens idaltu, Herto, Ethiopia.

150,000 YA. Birth of putative “mitochondrial Eve” in East Africa.

100,000 YA. Homo sapiens in Israel (Skhul and Qafzeh).

50-60,000 YA. H. sapiens in Australia (Lake Mungo).

Upper Palaeolithic (45,000 – 11,600 YA)

43,000 YA. H. sapiens reach Europe.

Tools: Mode 4 (narrow blades struck from prepared cores).

35-29,000 YA. Châtelperronian culture, central and south-western France, final phase of Neanderthal industry.

34-23,000 YA. Aurignacian culture in Europe and south-west Asia.

32,000 YA. Chauvet-Pont-d’Arc cave paintings, southern France.

28,000 YA. Last Neanderthals die out.

28-22,000 YA. Gravettian culture, Dordogne, France. “Venus” figurines.

21-17,000 YA. Solutrean culture, France and Spain.

20-18,000 YA. Last Glacial Maximum (LGM), maximum glacier extent of last Ice Age.

16,500 YA. Lascaux cave paintings, Dordogne, France.

15-11,600 YA. Magdalenian culture in western Europe, final European Palaeolithic culture.

15,000-12,900 YA. Bølling-Allerød interstadial.

12,900 YA. Beginning of the Younger Dryas stadial.

12,000 YA. Jōmon culture in Japan, first use of pottery.

Epipalaeolithic (20,000 – 11,600 YA)

Ohalo II (20-19,000 YA)

Natufian culture (14,000-11,600 YA) in the Levant.

Holocene

Mesolithic (11,600 YA until adoption of agriculture)

11,600 YA. Last Ice Age ends.

11,600-6,000 YA. Hunter-gathering persists in many parts of the world.

Neolithic (11,600 – 6,500 YA and later in various parts of the world)

11,600 YA. Rapid transition to agriculture in Middle East and Anatolia.

Tools: Mode 5 (microliths).

9,500 YA. Çatalhöyük – very large Neolithic settlement in Anatolia, though apparently no more than a very large village.

9,000 YA. Beginning of the “Wave of Advance” – expansion of proto-Indo-European farmers from Anatolia.


8,500 YA. As sea levels rise, Britain becomes an island.

Chalcolithic (6,500 – 4,000 YA in various parts of the world)

Copper and stone tools in mixed use.

6,500 – 3,500 YA. The age of the great megaliths in Europe.

5,100 – 4,000 YA. Construction of Stonehenge.

Bronze Age (5,300 – 2,700 YA in various parts of the world)

4,500 YA. Construction of the pyramids in Egypt.

5,300-2,700 YA. Indus Valley civilization, India.

4,700-3,450 YA. Minoan civilization, Crete.

3,600-2,100 YA. Mycenaean civilization, Greece.

3,200 YA. Mediterranean Bronze Age collapse.

Iron Age (1800 BC into historical times)

1800 BC. First working of iron, in India.

800-450 BC. Hallstatt culture, Central Europe.

450 BC. La Tène culture.

AD 43. Romans invade and conquer Britain.

Taxonomy

Within Class Mammalia (the mammals) humans are grouped with apes, monkeys and prosimians (lemurs, lorises, etc) within the order Primates. The term is due to Linnaeus, representing his view that humanity sat firmly at the top of creation’s tree (the self-styled Prince of Botany was also responsible for the term “mammal”, reflecting his now quite fashionable views about breast-feeding).

The majority of the 200 or so living species of primate are tropical or subtropical, living in rainforests. Most are arboreal (tree-dwelling), or at least spend much of their time in the trees; even those that have forsaken this habit show arboreal adaptations in their ancestry. These include manipulative hands and often feet, with opposable thumbs and big toes; replacement of claws with nails; a reduced sense of smell and enhanced sight, including colour and stereoscopic vision; locomotion based heavily on the hind limbs, with a common adoption of an upright posture; and finally a tendency towards larger brains than comparably-sized mammals of other orders.

The anthropoids or simians (Suborder Anthropoidea) basically comprise the more human-like primates and include Old World monkeys, New World monkeys (including marmosets and tamarins), apes and finally humans. Other primates are traditionally lumped together as prosimians.

Historically, membership of Family Hominidae was restricted to humans and australopithecines, with the Great Apes being banished to a separate family, Pongidae. Both families were grouped with the gibbons, etc. in Superfamily Hominoidea (the Hominoids).

However this scheme is now known to be incorrect as chimps and gorillas are more closely related to humans than they are to orang-utans. Accordingly Pongidae is now “sunk” into Hominidae (it would also be incorrect to give the orang-utans their own family). The term “hominin” (from Tribe Hominini) is now gaining popularity, because it comprises humans and australopithecines, i.e. the “traditional” hominids. The term “hominine” (from Subfamily Homininae) is also sometimes encountered; this grouping adds gorillas and chimps, but not orang-utans. To get back to the original meaning of “hominid” and subtract the chimps we have to go down to the level of Subtribe Hominina. To my mind this is very confusing and pushing the envelope of what we can reasonably ask from Linnaean taxonomy, which is after all firmly rooted in Platonic Realism (Linnaeus was a creationist), rather than Darwinian principles. I see nothing wrong with the use of the term “hominid” so long as we are aware that it includes our cousins, the Great Apes.

Table 2.0 Family Hominidae (The Hominids)

Species | Av. brain size/cc | Dates known/years ago | Distribution

Pongo pygmaeus (Orang-utan) | 400 | Present day | Sumatra, Borneo
Gorilla gorilla (Gorilla) | 500 | Present day | central and west Africa
Pan troglodytes (Chimpanzee) | 400 | Present day | central and west Africa
Pan paniscus (Bonobo) | 400 | Present day | DR Congo
Ardipithecus ramidus | 400 – 500 | 5.8m – 4.4m |
Australopithecus anamensis | 400 – 500 | 5.0m – 4.2m |
A. afarensis | 400 – 500 | 4.0m – 3.0m |
A. africanus | 400 – 500 | 3.3m – 2.5m |
A. garhi | 400 – 500 | 3.0m – 2.0m |
Paranthropus aethiopicus | 400 – 500 | 2.5m – 2.4m |
P. robustus | 410 – 530 | 2.4m – 1.2m |
P. boisei | 410 – 530 | 2.3m – 1.2m |
Homo habilis | 500 – 650 | 2.4m – 1.6m |
H. rudolfensis | 600 – 800 | 2.0m – 1.6m |
H. ergaster | 750 – 1,250 | 1.9m – 1.5m |
H. erectus | 750 – 1,250 | 1.8m – 400,000 (poss. later) |
H. antecessor | >1,000? | 800,000 | Atapuerca, Spain
H. heidelbergensis | 1,100 – 1,400 | 500,000 – 250,000 |
H. neanderthalensis | 1,200 – 1,750 | 250,000 – 30,000 | Europe, Middle East
H. sapiens idaltu | 1,200 – 1,700 | 160,000 | Herto, Ethiopia
H. sapiens sapiens | 1,200 – 1,700 | From 115,000 | Worldwide

© Christopher Seddon 2008

Plato’s Theory of Forms

Plato (circa 427-347 BC) made contributions to practically every field of human interest and is undoubtedly one of the greatest thinkers of all time. However it is just as well that his political ideas didn’t catch on (except possibly in North Korea); additionally, Platonic Realism bogged down biological science until Darwin and Wallace’s time.

Plato was influenced by Pythagoras, Parmenides, Heraclitus and Socrates (Russell (1946)). From Pythagoras he derived the Orphic elements in his philosophy: religion, belief in immortality, other-worldliness, the priestly tone, and all that is involved in the allegory of the cave; mathematics and his intermingling of intellect and mysticism. From Parmenides he derived the view that reality is eternal and timeless and that on logical grounds, all change must be an illusion. From Heraclitus he derived the view that there is nothing permanent in the world of our senses. Combining this with the doctrine of Parmenides led to the conclusion that knowledge is not to be derived from the senses but achieved by intellect – which ties in with Pythagoras. Finally from Socrates came his preoccupation with ethics and his tendency to seek teleological rather than mechanical explanations.

Realism, as opposed to nominalism, refers to the idea that general properties or universals have a mode of existence or form of reality that is independent of the objects that possess them. A universal can be a type, a property or a relation. Types are categories of being, or types of things – e.g. a dog is a type of thing. A specific instance of a type is known as a token, e.g. Rover is a token of a dog. Properties are qualities that describe an object – size, colour, weight, etc, e.g. Rover is a black Labrador. Relations exist between pairs of objects, e.g. if Rover is larger than Gus then there is a relation of is-larger-than between the two dogs. In Platonic Realism universals exist, but only in a broad abstract sense that we cannot come into contact with. The Form is one type of universal.

The Theory of Forms (or Ideas) is referred to in Plato’s Republic and other Socratic Dialogues and follows on from the work of Parmenides and his arguments about the distinction between reality and appearance. The theory states that everything existing in our world is an imperfect copy of a Form (or Idea), which is a perfect object, timeless and unchanging, existing in a higher state of reality. For example, there are many types of bed – double, single, four-poster, etc – but they are only imperfect copies of the Form of the bed, which is the only real bed. Plato frowned upon the idea of painting a bed, because the painting would merely be a copy of a copy, and hence even more flawed. The world of Forms contains not only the bed Form but a form for everything else – tables, wristwatches, dogs, horses, etc.

Forms are related to particulars (instances of objects and properties) in that a particular is regarded as a copy of its form. For example, a particular apple is said to be a copy of the form of Applehood, and the apple’s redness is a copy of the form of Redness. Participation is another relationship between forms and particulars: particulars are said to participate in the forms, and the forms are said to inhere in the particulars, e.g. redness inheres in an apple. Not all forms are instantiated, but all could be. Forms are capable of being instantiated by many different particulars, which would result in the form having many copies, or inhering in many particulars.

Needless to say, the world of the Forms was only accessible to philosophers, a view which justified the Philosopher Kings of the Republic, and casts philosophers in the same role as shamans and priests as people with exclusive access to worlds better than our own, and hence the basis of a ruling elite. That animals have ideal Forms is a view that bogged down biological science for centuries, as it rules out any notion of evolution. (The Republic also advocated such unsavoury practices as eugenics (dressed up as a rigged mating lottery); abolition of the family; censorship of art; and a caste-system based on a “noble lie” of the “myth of metals” (which I suppose is better than a war based on the ignoble lie of the myth of weapons of mass destruction). The Republic seems to have influenced Huxley’s Brave New World, Orwell’s 1984 and the Federation of Heinlein’s Starship Troopers).

The inherence criticism questions what it means to say that the form of something inheres in a particular, or that the particular is a copy of the form. If the form is not spatial, it cannot have a shape, so the particular cannot be the same shape as the form.

Arguments against the inherence criticism claim that a form of something spatial can lack a concrete location and yet have abstract spatial qualities. An apple, for example, can have the same shape as its form. Such arguments typically claim that the relationship between a particular and its form is very intelligible and people apply Platonic theory in everyday life, for example “car”, “aeroplane”, “cat” etc don’t have to refer to specific vehicles, aircraft or cats.

Another criticism of forms relates to the origin of concepts without the benefit of sense-perception. For example, to think of redness-in-general is to think of the form of redness. But how can one have the concept of a form existing in a special realm of the universe, separate from space and time, given that such a concept cannot come from sense-perception? Although one can see an apple and its redness, those things merely participate in, or are copies of, the forms. Thus to conceive of a particular apple and its redness is not to conceive of applehood or redness-in-general.

Platonic epistemology, however, addresses such criticism by saying that knowledge is innate and that souls are born with the concepts of the forms. They just have to be reminded of those concepts from back before birth, when they were in close contact with the forms in the Platonic heaven. Plato believed that each soul existed before birth with “The Form of the Good” and a perfect knowledge of everything. Thus, when something is “learned” it is actually just “recalled.”

Plato stated that knowledge is justified true belief, i.e. if we believe something, have a good reason for doing so, and it is in fact true, then the belief is knowledge. For example, if I believe that the King’s Head sells London Pride (because I looked it up in the Good Beer Guide), I get a bus to the pub and see a Fullers sign outside, then I have knowledge that it sells London Pride. This view has been central to epistemological debate ever since Plato’s time.

Plato drew a sharp distinction between knowledge which is certain, and mere opinion which is not certain. Opinions derive from the shifting world of sensation; knowledge derives from the world of timeless forms, or essences. In the Republic, these concepts were illustrated using the metaphor of the sun, the divided line and the allegory of the cave.

Firstly, the metaphor of the sun is used for the source of “intellectual illumination”, which Plato held to be The Form of the Good. The metaphor is about the nature of ultimate reality and how we come to know it. It starts with the eye, which is unusual among the sense organs in that it needs a medium, namely light, in order to operate. The strongest source of light is the sun; with it, we can discern objects clearly. By analogy, we cannot attempt to understand why intelligible objects are as they are, and what general categories can be used to understand various particulars around us, without reference to the forms. “The domain where truth and reality shine resplendent” is Plato’s world of forms, illuminated by the highest of all the forms – the Form of the Good. Since true being resides in the world of the forms, we must direct our intellects there to have knowledge. Otherwise we have mere opinion, i.e. that which is not certain.

Secondly, the divided line has two parts that represent the intelligible world and the smaller visible world. Each of those two parts is divided, the segments within the intelligible world represent higher and lower forms and the segments within the visible world represent ordinary visible objects and their shadows, reflections, and other representations. The line segments are unequal and their lengths represent “their comparative clearness and obscurity” and their comparative “reality and truth,” as well as whether we have knowledge or instead mere opinion of the objects. Hence, we are said to have relatively clear knowledge of something that is more real and “true” when we attend to ordinary perceptual objects like rocks and trees; by comparison, if we merely attend to their shadows and reflections, we have relatively obscure opinion of something not quite real.

Finally Plato drew an analogy between human sensation and the shadows that pass along the wall of a cave – the allegory of the cave. Prisoners inside a cave see only the shadows of puppets in front of a fire behind them. If a prisoner is freed, he learns that his previous perception of reality was merely a shadow and that the puppets are more real. If the learner moves outside of the cave, they learn that there are real things of which the puppets are themselves mere imitations, again achieving a greater perception of reality. Thus the mere opinion of viewing only shadows is steadily replaced with knowledge by escape from the cave, into the world of the sun and real objects. Eventually, through intellectualisation, the learner reaches the forms of the objects – i.e. their true reality.

© Christopher Seddon 2008

Radiometric dating techniques

A major problem for archaeologists and palaeontologists is the reliable determination of the ages of artefacts and fossils.

As far back as the 17th Century the Danish geologist Nicolas Steno proposed the Law of Superposition for sedimentary rocks, noting that sedimentary layers are deposited in a time sequence, with the oldest at the bottom. Over a hundred years later, the British geologist William Smith noticed that sedimentary rock strata contain fossilised flora and fauna, and that these fossils succeed each other from top to bottom in a consistent order that can be identified over long distances. Thus strata can be identified and dated by their fossil content. This is known as the Principle of Faunal Succession. Archaeologists apply a similar principle: artefacts and remains that are buried deeper are usually older.

Such techniques can provide reliable relative dating along the lines of “x is older than y”, but providing reliable absolute values for the ages of x and y is harder. Before the introduction of radiometric dating in the 1950s, dating was a rather haphazard affair involving assumptions about the diffusion of ideas and artefacts from centres of civilization where written records were kept and reasonably accurate dates were known. For example, it was assumed – quite incorrectly as it later turned out – that Stonehenge was more recent than the great civilization of Mycenaean Greece.

The idea behind radiometric dating is fairly straightforward. The atoms of which ordinary matter is composed each comprise a positively charged nucleus surrounded by a cloud of negatively charged electrons. The nucleus itself is made up of a mixture of positively charged protons and neutral neutrons. The atomic weight is the total number of protons plus neutrons in the nucleus, and the atomic number is the number of protons only. The atom as a whole has the same number of electrons as it does protons, and is thus electrically neutral. It is the number of electrons (and hence the atomic number) that dictates the chemical properties of an atom, and all atoms of a particular chemical element have the same atomic number; thus, for example, all carbon atoms have an atomic number of six. However the atomic weight is not fixed for atoms of a particular element, i.e. the number of neutrons they have can vary. For example carbon can have 6, 7 or 8 neutrons, so carbon atoms with atomic weights of 12, 13 and 14 can exist. Such “varieties” are known as isotopes.

The physical and chemical properties of the various isotopes of a given element vary only very slightly, but the nuclear properties can vary dramatically. For example, naturally-occurring uranium consists largely of U-238, with only a very small proportion of U-235; it is only the latter that can be used as a nuclear fuel – or to make bombs. Many elements have unstable, or radioactive, isotopes. Atoms of an unstable isotope will over time decay into “daughter products” by internal nuclear change, usually involving the emission of charged particles. For a given radioisotope this decay takes place at a consistent rate, which means that the time taken for half the atoms in a sample to decay – the so-called half-life – is fixed for that radioisotope. If an initial sample is 100 grams, then after one half-life there will only be 50 grams left, after two half-lives have elapsed only 25 grams will remain, and so on.
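The worked figures above (100 grams halving to 50 and then 25) follow from simple exponential decay, and can be sketched in a few lines of Python. This is only an illustration; the function name and units are mine, not a standard library:

```python
def remaining_mass(initial_g, elapsed, half_life):
    """Mass of a radioisotope remaining after `elapsed` time.

    Each half-life halves whatever is left, so the surviving
    fraction is 0.5 raised to the number of half-lives elapsed.
    `elapsed` and `half_life` just need to be in the same units.
    """
    return initial_g * 0.5 ** (elapsed / half_life)

# A 100 g sample after one and then two half-lives:
print(remaining_mass(100, 1, 1))  # 50.0
print(remaining_mass(100, 2, 1))  # 25.0
```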

It is upon this principle that radiometric dating is based. Suppose a particular mineral contains an element x which has a number of isotopes, one of which is radioactive and decays to element y with a half-life of t. The mineral when formed does not contain any element y, but as time goes by more and more y will be formed by decay of the radioisotope of x. Analysis of a sample of the mineral for the amount of y contained will enable its age to be determined provided the half-life t and isotopic abundance of the radioisotope is known.
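The calculation described in this paragraph can be made concrete. Assuming, as above, that the mineral contained no element y when it formed, every daughter atom was once a parent atom, so the original amount of the radioisotope was parent + daughter and the age follows from the decay law. A minimal sketch (the function name is illustrative):

```python
import math

def age_from_daughter(parent_atoms, daughter_atoms, half_life):
    """Age of a mineral assumed to have started with no daughter y.

    Original radioisotope = parent + daughter, so the number of
    half-lives elapsed is log2((parent + daughter) / parent).
    """
    return half_life * math.log2((parent_atoms + daughter_atoms) / parent_atoms)

# After exactly one half-life, half the x atoms have become y:
print(age_from_daughter(50, 50, 5730))  # 5730.0
```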

The best-known form of radiometric dating is that involving radiocarbon, or C-14. Carbon – as noted above – has three isotopes. C-12 (the most common form) and C-13 are stable, but C-14 is radioactive, with a half-life of 5730 years, decaying to N-14 (an isotope of nitrogen) and releasing an electron in the process (a process known as beta decay). This is an infinitesimal length of time in comparison to the age of the Earth and one might have expected all the C-14 to have long since decayed. In fact the terrestrial supply is constantly being replenished from the action of interstellar cosmic rays upon the upper atmosphere where moderately energetic neutrons interact with atmospheric nitrogen to produce C-14 and hydrogen. Consequently all atmospheric carbon dioxide (CO2) contains a very small but measurable percentage of C-14 atoms.

The significance of this is that all living organisms absorb this carbon either directly (as plants photosynthesising) or indirectly (as animals feeding on the plants). The percentage of C-14 out of all the carbon atoms in a living organism will be the same as that in the Earth’s atmosphere. The C-14 atoms it contains are decaying all the time, but these are replenished for as long as the organism lives and continues to absorb carbon. But when it dies it stops absorbing carbon, the replenishment ceases and the percentage of C-14 it contains begins to fall. By determining the percentage of C-14 in human or animal remains or indeed anything containing once-living material, such as wood, and comparing this to the atmospheric percentage, the time since death occurred can be established.
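The comparison just described amounts to measuring what fraction of the atmospheric C-14 level survives in the sample and converting that fraction to elapsed half-lives. A sketch of the arithmetic, with illustrative names (real laboratories apply further corrections and calibration, discussed below):

```python
import math

CAMBRIDGE_HALF_LIFE = 5730  # years

def radiocarbon_age(sample_c14_fraction, atmospheric_c14_fraction=1.0):
    """Years since death, from the surviving fraction of C-14.

    The surviving fraction halves every half-life, so the age is
    -half_life * log2(fraction remaining).
    """
    fraction = sample_c14_fraction / atmospheric_c14_fraction
    return -CAMBRIDGE_HALF_LIFE * math.log2(fraction)

# A sample retaining a quarter of the atmospheric C-14 level
# died two half-lives ago:
print(radiocarbon_age(0.25))  # 11460.0
```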

This technique was developed by Willard Libby in 1949 and revolutionised archaeology, earning Libby the Nobel Prize for Chemistry in 1960. The technique does however have its limitations. Firstly, it can only be used for human, animal or plant remains – the ages of tools and other artefacts can only be inferred from datable remains, if any, in the same context. Secondly, it only has a limited “range”: beyond 60,000 years (10 half-lives) the percentage of C-14 remaining is too small to be measured, so the technique cannot be used much further back than the late Middle Palaeolithic. Another problem is that the cosmic ray flux that produces C-14 in the upper atmosphere is not constant, as was once believed. Variations have to be compensated for by calibration curves, based on samples whose age can be attested by independent means such as dendrochronology (counting tree-rings). Finally, great care must be taken to avoid any contamination of the sample in question with later material, as this will introduce errors.

The conventions for quoting dates obtained by radiocarbon dating are a source of considerable confusion. They are generally quoted as Before Present (BP), but “present” in this case is taken to be 1950. Calibrated dates can be quoted, but quite often a quoted date will be left uncalibrated. Uncalibrated dates are given in “radiocarbon years” BP. Calibrated dates are usually suffixed (cal), but “present” is still taken to be 1950. To add to the confusion, Libby’s original value for the half-life of C-14 was later found to be out by 162 years. Libby’s value of 5568 years, now known as the “Libby half-life”, is rather lower than the currently-accepted value of 5730 years, known as the Cambridge half-life. Laboratories, however, continue to use the Libby half-life! In fact this does make sense, because quoting all raw uncalibrated data to a consistent standard means that any uncalibrated radiocarbon date in the literature can be converted to a calibrated date by applying the same set of calculations. Furthermore, the quoted dates are “future-proofed” against any further revision of the C-14 half-life or refinement of the calibration curves.
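Because a radiocarbon age is directly proportional to the half-life used to calculate it, a raw Libby-based age can be rescaled to the Cambridge half-life with a simple ratio (an increase of roughly 3%). A sketch of that conversion, under the stated assumption of direct proportionality; the function name is mine:

```python
LIBBY_HALF_LIFE = 5568      # years, still used for reported raw dates
CAMBRIDGE_HALF_LIFE = 5730  # years, the currently accepted value

def libby_to_cambridge(libby_age_bp):
    """Rescale an uncalibrated Libby-half-life age to the Cambridge
    half-life. Age is proportional to the half-life assumed, so the
    conversion is the ratio 5730/5568 (about 1.029)."""
    return libby_age_bp * CAMBRIDGE_HALF_LIFE / LIBBY_HALF_LIFE

print(round(libby_to_cambridge(10000)))  # 10291
```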

If one needs to go back further than 60,000 years, other techniques must be used. One is potassium-argon dating, which relies on the decay of radioactive potassium (K-40) to Ar-40. Due to the long half-life of K-40, the technique is only useful for dating minerals and rocks that are more than 100,000 years old. It has been used to bracket the age of archaeological deposits at Olduvai Gorge and other east African sites with a history of volcanic activity, by dating lava flows above and below the deposits.

© Christopher Seddon 2008

The Day of the Triffids, by John Wyndham (1951)

Science fiction does not often make an appearance on the school curriculum, but The Day of the Triffids is one work that has been required reading for generations of pupils. I first encountered the book nearly forty years ago, just months after the death of its author at the comparatively early age of 65. At school, I must confess, my enthusiasm for Wuthering Heights, The Return of the Native and I, Claudius was (shamefully!) less than these great works warranted. But The Day of the Triffids was unputdownable. Instead of reading the two chapters set for homework that evening, I read the entire book!

It is reasonable to say that I could have been presented with many other works of science fiction and devoured them with equal gusto. Few of these would be regarded as great works of SF, let alone English Literature. But no other book has ever appealed to two more differing arbiters of what constitutes a good read, myself at the age of fourteen and those seemingly determined to stuff down pupils’ throats the dullest books imaginable.

So why is a somewhat dated science fiction novel, written from a seemingly rather prim post-war middle-class perspective, still popular now – more than half a century after it was written?

Read the first few pages and you will see why. There is something for everybody, from the most inattentive schoolboy to the stodgiest academic. The first line is one of the finest opening sentences of any book ever written, SF or otherwise…

When a day you happen to know is Wednesday starts off by sounding like Sunday, there is something seriously wrong somewhere.

Tension mounts immediately as we sense that the hospitalised narrator, not named until the tenth page as Bill Masen, is helpless. Realisation is slow to come that he is blind – at least temporarily so. His eyes are bandaged following emergency treatment to save his sight. And his plight is nightmarish. Not just the hospital, but the world outside, has apparently ceased to function. Nothing can be heard – not a car, not even a distant tugboat. Nothing but church clocks, with varying degrees of accuracy, announcing first eight o’clock, then quarter-past, then nine…

We learn that the previous night the whole Earth had been treated to a magnificent display of green meteors, believed at first to be comet debris. Masen is bitterly disappointed at being one of the few people to miss the display. He wonders whether the whole hospital, indeed the whole of London, made such a night of it that nobody has yet pulled round. Eventually he removes the bandages himself – they were in any case due to come off – and is greatly relieved to find that he can see. He soon discovers that he is one of the few people left who can.

The hospital has been transformed into a Doréan nightmare of blinded patients milling helplessly around. The only doctor Masen encounters hurls himself from a fifth floor window after finding his telephone is dead. After giving only cursory consideration to trying to help the blinded, he flees the hospital. What, he rationalises, would he do if he did succeed in leading them outside? It is already becoming apparent that the scale of the disaster extends way beyond the hospital. He makes for the nearest pub, desperately in need of a drink. But this is a nightmare from which there is no escape. The pub landlord is also blind, to say nothing of blind drunk. He blames the meteor shower for his condition. He says that having discovered their children were also blinded, his wife gassed them and herself, and he intends to join them once he is drunk enough.

Anybody who describes this as “cosy catastrophism” really needs to re-read just this first chapter to be firmly disabused of the notion.

At a single stroke, mankind’s complex civilisation has been brought down, all but a tiny handful of the world’s population blinded. Nor is this the extent of humanity’s troubles. Within hours, triffids have broken out of captivity and are running amok, and within a week London is smitten by plague. Only near the end of the book do we learn that mankind, in all probability, brought this triple-whammy down upon himself.

The Day of the Triffids is set in the near future, although no date is given. Masen, who is apparently an only child, is in his late twenties when the story begins, and his father had reached adulthood before the war. The catastrophe, which turns out to have been caused by a satellite weapon accidentally set off in space rather than close to the ground, probably occurs around 1980.

Masen lives in a world in which food shortages are the biggest challenge to mankind. The triffid, a mobile carnivorous plant equipped with a lethal sting, is being farmed world-wide as a source of vegetable oil and cattle-food. Originally bred in secret in the Soviet Union, they are distributed world-wide when an attempt to steal a case of fertile triffid seeds backfires. Masen himself is making a successful career in the triffid business and is hospitalised when one stings him in the eyes – thus it is the triffids who are responsible for his escaping the almost universal blindness.

The story follows the adventures of Masen and fellow survivor Josella Playton and explores the differing attempts of various groups to deal with the catastrophe. Some want somehow to cling on to a vestige of the social and moral status quo; others see the situation as an opportunity for personal advancement. The well-meaning but ultimately hopeless attempts of Wilfred Coker to keep as many blind people alive for as long as possible end in failure within a week when the plague strikes. Miss Durrant’s attempt to build a Christian community fares little better, and it too succumbs to the plague. The dictator Torrence tries to set up a feudal state, using the blind as slave labour, fed upon mashed triffid.

From the start, though, Masen and Ms. Playton take the same view as Michael Beadley, the avuncular leader of a group of survivors holed up in Senate House. Nothing can be done for the vast majority of the blind – mankind’s best hope for the future is to set up a community of largely sighted survivors, in a place of comparative safety.

Thus Wyndham explores from different angles the question of how ordinary people face up to the task of trying to run a small community, something that is quite challenging under even normal circumstances, with everybody seemingly having different views on how things should be done.

Coker’s shenanigans see to it that many adventures must pass before Masen and Ms. Playton eventually link up with Beadley’s group, by now ensconced on a triffid-free Isle of Wight.

The Day of the Triffids has been likened to Orwell’s Nineteen Eighty-Four for both its cold-war extrapolations and its gloomy perspective of misery for evermore. But this view is wrong on both counts. Wyndham’s remarks about the Soviet Union could have been written by almost any author between the end of the war and the rise of Mikhail Gorbachev. And despite the magnitude of the disaster to have overtaken mankind, the tone of The Day of the Triffids is an optimistic one. Its recurring message is that a portion of mankind has been spared to begin again, and the human race has in fact escaped the even worse fate that was becoming increasingly inevitable in a world threatened by both global nuclear war and mass starvation. The triffids’ possession of the world will be a temporary thing, and in the last paragraph of the book, Wyndham suggests that research into ways to destroy them is well underway. Within two or three generations at most, mankind will be in a position to strike back and reclaim all he has lost.

It is perhaps the upbeat endings and veneer of British middle-class values, a constant feature of Wyndham’s work, which fools people into labelling him with the “cosy catastrophe” tag. In fact, there is much more to his work than met even my enthusiastic eye when, in the Autumn of 1969, I first encountered an author I still count as one of my great favourites.

The Day of the Triffids was made into a truly appalling Hollywood movie, starring country and western singer Howard Keel (1963), and a superior BBC television series (1981). Simon Clark wrote a sequel, The Night of the Triffids, in 2001. My personal feeling is that another movie version is long overdue.

© Christopher Seddon 2008