Category Archives: Uncategorized

Stop the eco-triumphalism now


There is no excuse for this. Image taken from the Twitter feed of XR East (@xr_east)*. Note that Extinction Rebellion UK have disavowed both this group and the sentiments expressed in the post. Similar opinions have been circulating on social media for some time though.

I’m angry. I was reluctant to write anything on this blog about COVID-19, pandemics or epidemiology, given that none of them are among my specialisms**. But lately I’ve been appalled by a subset of ecologists and environmental campaigners — admittedly small — who have used this crisis to score points for their personal obsessions.

This is not to criticise those who seek positive messages to cling onto when there is so much to mourn or fear. Some of this desire is misdirected, as with the numerous stories of nature returning to areas which have seen dramatic reductions in human activity. Whether it’s drunk elephants in Chinese villages or dolphins in Venice, these have usually turned out to be false reports. Does it matter that many people share stories that are quickly disproven if they offer a brief cheering distraction? Perhaps not so much. There are more dangerous lies out there.

Yet linked to many of these stories is an agenda, sometimes implicit, other times actively advanced, which has at its core a desire to show that something good has come of the pandemic. Few would say so with quite that lack of subtlety, but there is no other way to interpret many of the articles on declines in air pollution, clear skies or the potential for advancing a green agenda during the anticipated recovery. I’m more sympathetic to those who point out that the international response demonstrates a capacity to pull together for the collective good, one that could be harnessed again. There may be opportunities to be taken once we can see the light at the end of the tunnel, but that still seems a long way off.

Others are even less subtle, for example the assertion by the European Agroforestry Federation that all this wouldn’t have happened if we’d had more agroforestry. Even if true (and despite being a longstanding advocate of agroforestry, I’d struggle to make an evidence-based case for it***), now is surely not the time.

The most pernicious narrative, one which lurks in the fringes of the Deep Green movement and other more mainstream environmentalist groups, is that anything that reduces human populations and activities can only be a good thing for the environment. I have even seen it argued (and I refuse to link to it) that COVID-19 is an expression of Gaia immunity, the Earth fighting back against the infestation of humans that is taking it out of some mythical balance****. The refrain ‘we are the virus’ has been expressed repeatedly on social media. The sentiment at the top of this article is not an isolated incident. Nor is it a new idea; in 1988 Prince Philip told a German interviewer that he was ‘tempted to ask for reincarnation as a particularly deadly virus’ so as to aid in population control. Yes, he really did say that.

To believe that a catastrophe causing widespread mortality is somehow beneficial is simple eco-fascism. For privileged people in the Global North, living in splendid isolation, to delight (even indirectly) in the death, privation and suffering of others is not morally acceptable on any grounds. There is no positive side to a pandemic.

There are two strands to eco-fascism. One has a long history in the belief that restricted national resources compel populations to resist immigration in the name of supposed sustainability. This narrative has already inspired acts of terrorism, but has also been advanced in more mainstream settings by France’s National Rally (Rassemblement national) among others. Most reasonable, compassionate people would disavow this set of arguments.

The other narrative, however, holds that any reduction in human population can only help the natural world. This is sometimes expressed as the suggestion that not having (so many) children is the best way of combating climate change, an argument which is both logically and morally suspect. Any case for global population control always ends up being anti-female, racist and ultimately anti-human. Even if we might agree on a ‘sustainable’ human population level — and that’s not an easy figure to decide upon — the process of reaching it is unconscionable by any means, and certainly with the speed required for it to be an effective measure. As Jenny Turner put it, “How can a planet lose seven or eight billion humans… without events of indiscriminate devastation? When people start thinking about getting rid of other people, which sorts of people does history suggest are usually got rid of first?”

On this point, in which countries do you expect the death rate from COVID-19 to be the highest? It will almost certainly be in the Global South where access to medical care, nutrition and government support is most lacking. They are not the ones who bear the greatest responsibility for global change, and once again they will carry the heaviest burden. This pandemic is first and foremost a tragedy. It will not in itself be of any lasting benefit to the environment.

We are rightly offended when politicians use disasters to advance their own partisan agendas. As scientists, including ecologists, we need to step back from the campaigns that are usually at the forefront of our minds and accept that this situation is a distinct crisis of its own. Yes climate change remains a pressing issue; yes extinctions continue apace; yes the international wildlife trade is abhorrent. But these are separate problems to which the world’s attention will return. For now simple compassion and decency require us to stand back and accept that for once it isn’t about us.

 


 

* I suspect (and rather hope) that the post will be taken down, but here’s the evidence. There is a dispute going on about whether this represents a genuine faction of XR, or ‘infiltrators’, which are difficult for an autonomous movement with no central leadership to protect itself against. It wouldn’t be the first time that someone associated with XR has said something offensive though. For the record, I continue to be a strong supporter of XR and their core principles.

[Screenshot of the post, captured 25 March 2020]

** Not that this has stopped quite a number of people.

*** The only relevant project I’ve been involved with, on mosquito-borne diseases in Northern Thailand, actually implicated orchard expansion as a likely cause of greater risk of infection. But this is a complicated subject and no single study can tell the whole story, particularly one which wasn’t designed to test that hypothesis.

**** I don’t have time here to explain in detail why Gaia theory is nonsense and has been rejected by the majority of ecologists. For more see pages 175–176 in my textbook and the papers it refers to. I’m also writing another book about it.


Junk pedagogical research


Can I teach? I think so. Should I aspire to publish papers telling other people how to teach? Probably not. This is me showing students how to estimate tree height.

In my last job I was employed on a teaching-track position*. For many years this worked reasonably well for me. I enjoy teaching, I think I’m quite good at it, and I didn’t mind a slightly higher load as the price of not needing to satisfy arbitrary targets for research grant income or publication in high-impact journals. That’s not to say I stopped doing research, because obviously that didn’t happen, but I accepted that there was a trade-off between the two and that I was closer to one end of the spectrum. It still left me three clear months every summer to get out into the field and collect data.

Many UK universities developed teaching-track positions in response to the national research assessment exercise (the REF**) which incentivised them to concentrate resources in the hands of a smaller number of staff whilst ensuring that someone else got on with the unimportant business of running the university and the distraction of educating undergraduates. Such is the true meaning of research-led teaching.

A problem began to arise when those staff who had been shuffled into teaching-track positions applied for promotion. The conventional signifiers of academic success weren’t relevant; you could hardly expect them to bring in large grants, publish in top-tier journals or deliver keynotes at major conferences if they weren’t being given the time or support to do so.

Some head-scratching took place and alternative means were sought to decide who was performing well. It’s hard enough to determine what quality teaching looks like at an institutional level***, and assessing individuals is harder still.

The first thing to turn to is student evaluations. These largely measure how good a lecturer is at entertaining and pleasing their students, or how much the students enjoy the subject. Evidence suggests that evaluation scores are inversely related to how much students learn, as well as being biased against women and minority groups. In short they’re not just the wrong measure, they’re actively regressive in their effects. Not that this stops many universities using them of course.

What else is there? Well, being academics, the natural form of output to aim for is publications. It’s the only currency some academics understand. Not scientific research papers, of course, because teaching staff aren’t supposed to be active researchers. So instead the expectation became that they would publish papers based on pedagogical research****. This sounds, on the face of it, quite sensible, which is why many universities went down that route. But there are three major problems.

1. Pedagogical research isn’t easy. There are whole fields of study, often based in departments of psychology, which have developed approaches and standards to ensure that work is of appropriate quality. Expecting an academic with a background in biochemistry or condensed matter physics to publish in a competitive journal of pedagogical research without the necessary training is unreasonable. Moreover, it’s an implicit insult to those colleagues for whom such work is their main focus. Demanding that all teachers should publish pedagogical research implies that anyone can do it. They can’t.

2. Very few academics follow pedagogical research. That’s not to say that they shouldn’t. Most academics teach and are genuinely interested in doing so as effectively as possible. But the simple truth is that it’s hard enough to keep track of the literature in our areas of research specialism. Not many can make time to add another, usually unrelated field to their reading list. I consider myself more engaged than most and even I encounter relevant studies only through social media or articles for a general readership.

3. A lot of pedagogical research is junk. Please don’t think I’m talking about the excellent, specialist work done by expert researchers into effective education practice. There is great work out there in internationally respected journals. I’m talking about the many unindexed, low-quality journals that have proliferated over recent years, and which give education research a bad name. Even where some form of peer review takes place, many are effectively pay-to-publish, and some are actively predatory. I won’t name any here because that’s just asking for abusive e-mails.

Why do these weak journals exist? Well, we have created an incentive structure in which a class of academics needs to publish something — anything — in order to gain recognition and progress in their careers. A practice which we would frown upon in ‘normal’ research is actively encouraged by many of the world’s top universities. Junk journals and even junk conferences proliferate as a way to satisfy universities’ own internal contradictions.

What’s the alternative? I have three suggestions:

1. Stop imposing an expectation based on research onto educators. If research and teaching are to be separated (a trend I disagree with anyway) then they can’t continue to be judged by the same metrics. Incentivising publications for their own sake helps no-one. Some educators will of course want to carry out their own independent studies, and this should be encouraged and respected, but it isn’t the right approach for everyone.

2. Put some effort into finding out whether teachers are good at their job. This means peer assessments of teaching, student performance and effective innovation. All this is difficult and time-consuming but if we want to recognise good teachers then we need to take the time to do it properly. Proxy measures are no substitute. Being able to write a paper about teaching doesn’t demonstrate that someone can teach.

3. Support serious pedagogical researchers. If you’re based in a large university then there’s almost certainly a group of specialist researchers already there. How much have you heard about their work? Have you collaborated with them? Universities have native expertise which could be used to improve teaching practice, usually much more efficiently than forcing non-specialists to jump through hoops. If the objective is genuinely to improve teaching standards then ask the people who know how to do it.

If there’s one thing that shows how evaluations of teaching aren’t working or taken seriously it’s that universities don’t make high-level appointments based on teaching. Prestige chairs exist to hire big-hitters in research based on their international profile, grant income and publication record. When was the last time you heard of a university recruiting a senior professor because they were great at teaching? Tell me once you’ve stopped laughing.

 


 

* This is now relatively common among universities in Europe and North America. The basic principle is that some staff are given workloads that allow them to carry out research, whilst others are given heavier teaching and administrative loads but the expectations for their research income and outputs are correspondingly reduced.

** If you don’t know about the Research Excellence Framework and how it has poisoned academic life in the UK then don’t ask. Reactions from those involved may vary from gentle sobs to inchoate screaming.

*** Which gave rise to the Teaching Excellence Framework, or TEF, and yet more anguish for UK academics. Because the obvious way to deal with the distorting effect of one ranking system is to create another. Surely that’s enough assessment of universities based on flawed data? No, of course not, because there’s also the Knowledge Exchange Framework (KEF) coming up. I’m not even joking.

**** Oddly textbooks often don’t count. No, I can’t explain this. But I was told that publishing a textbook didn’t count as scholarship in education.

From tiny acorns

My father planted acorns.

This is one of those recollections that arrives many years after the fact and suddenly strikes me as having been unusual. As a child, however, it seemed perfectly normal that we should go out collecting acorns in the autumn. Compared to my father’s other eccentric habits and hobbies, of which there were many*, gathering acorns didn’t appear to be particularly strange or worthy of note.

In our village during mast years the acorns would rain down from boundary and hedgerow oak trees and sprout in dense carpets along the roadside. This brief flourishing was inevitably curtailed by the arrival of Frankie Ball, the local contractor responsible for mowing the verges. His indiscriminate treatment shredded a summer’s worth of growth and ensured that no seedlings could ever survive.


Sprouting acorn (Quercus robur L.) by Amphis.

Enlightened modern opinion would now declare that mowing roadside verges is ecologically damaging; it removes numerous late-flowering plants and destroys potential habitats for over-wintering insects. I’m not going to pass such judgement here though because it was a purely practical decision. Too much growth would result in blocked ditches, eventually flooding fields and properties. Frankie was just doing his job.

My father, however, couldn’t allow himself to see so many potential oak trees perish. His own grandfather had been a prominent forester back in the old country (one of the smaller European principalities that no longer exists), and with a family name like Eichhorn it’s hard not to feel somehow connected to little oak trees. He took it upon himself to save as many of them as he could.

And so it was that we found ourselves, trowels in hand, digging up sprouting acorns from the roadsides and transporting them by wheelbarrow to the wood on Jackson’s farm. Here they would be gently transplanted into locations that looked promising and revisited periodically to check on their progress. Over the years this involved at least hundreds of little acorns, perhaps thousands.

They all died. This isn’t too surprising: most offspring of most organisms die before they reach adulthood. Trees have a particularly low rate of conversion of seedlings to adults, probably less than one in a thousand. That’s just one of the fundamental facts of life and a driving force of evolution. Why though did my father’s experiment have such a low success rate? He’d apparently done everything right, even choosing to plant them somewhere other trees had succeeded before**. It’s only after becoming a forest ecologist myself that I can look back and see where he was going wrong.
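In fact, total failure is close to what the baseline numbers predict. As a rough check (a minimal sketch; the one-in-a-thousand survival figure is itself only an approximation), even a large planting can easily leave no survivors at all:

```python
# If each transplanted seedling independently reaches adulthood with
# probability p = 1/1000, how likely is it that an entire planting fails?
# Illustrative numbers only.
p = 1 / 1000

for n in (300, 1000, 3000):
    print(f"{n:>4} seedlings: P(no survivors) = {(1 - p) ** n:.2f}")
    # -> 0.74, 0.37 and 0.05 respectively
```

Even a thousand transplants leave a better than one-in-three chance of complete failure, and that is before any of the ecological factors below come into play.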

First, oak trees are among a class of species that we refer to as long-lived pioneers. This group of species is unusual because most pioneers are short-lived. Pioneers typically arrive in open or disturbed habitats, grow quickly, then reproduce and die before more competitive species can drive them out. Weeds are the most obvious cases among plants, but if you’re looking at trees then something like a birch would be the closest comparison.

Oaks are a little different. Their seedlings require open areas with lots of light to grow, which means that they don’t survive well below a dark forest canopy. Having managed to achieve a reasonable stature, however, they stick around for many centuries and are hard to budge. In ecology we know this as the inhibition model of succession. Oaks are great at building forests but not so good at taking them over.

The next problem is that oak seedlings do particularly badly when in the vicinity of other adult oak trees. This is because the pests and diseases associated with large trees quickly transfer themselves to the juveniles. An adult tree might be able to tolerate losing some of its leaves to a herbivore but for a seedling with few resources this can be devastating. This set of forces led to the Janzen-Connell hypothesis which predicts that any single tree species will be prevented from filling a habitat because natural enemies ensure that surviving adults end up being spread apart. A similar pattern can arise because non-oak trees provide a refuge for oak seedlings. Whatever the specific causes, oak seedlings suffer when planted close to existing oaks.
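To make that logic concrete, here is a minimal simulation sketch of distance-dependent survival. The survival curve and all of its parameters (max_p, halfsat_m) are invented for illustration, not estimates for real oaks:

```python
import random

random.seed(1)

def survival_prob(distance_m, max_p=0.05, halfsat_m=20.0):
    # Survival rises with distance from the nearest adult conspecific,
    # saturating towards max_p; directly under the parent it approaches zero.
    return max_p * distance_m / (halfsat_m + distance_m)

for distance in (1, 5, 20, 50):
    n = 10_000  # seedlings transplanted at this distance from an adult oak
    survivors = sum(random.random() < survival_prob(distance) for _ in range(n))
    print(f"{distance:>3} m from adult: {survivors} survivors out of {n}")
```

Under assumptions like these, transplanting sprouted acorns into an existing wood concentrates effort exactly where survival is worst.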

This makes it seem a little peculiar that acorns usually fall so close to their parent trees. The reason acorns are such large nuts*** is that they want to attract animals which will try to move and store them over winter. This strategy works because no matter how many actually get eaten, a large proportion of cached acorns remain unused (either they’re forgotten or the animal that placed them dies) and so they are in prime position to grow the following spring. Being edible and a desirable commodity is actually in the interests of the tree.


A Eurasian jay, Garrulus glandarius. Image credit: Luc Viatour.

Contrary to most expectations, squirrels turn out to be pretty poor dispersers of acorns. Although they move acorns around and bury them nicely, they don’t put them in places where they are likely to survive well. Jays are much better, carrying acorns over long distances and burying them singly in scrubby areas where the new seedlings will receive a reasonable amount of light along with some protection from browsing herbivores. My father’s plantings failed mainly because he wasn’t thinking like a jay.

My father’s efforts weren’t all in vain. The care shown to trees and an experimental approach to understanding where they could grow lodged themselves in my developing mind and no doubt formed part of the inspiration that led me to where I am today****. From tiny acorns, as they say.

 


 

* His lifelong passion is flying, which at various points included building his own plane in the garden shed and flying hang-gliders. It took me a while to realise that not everyone’s father was like this.

** One possible explanation we can rule out is browsing by deer, which often clear vegetation from the ground layer of woodlands. Occasional escaped dairy cows were more of a risk in this particular wood.

*** Yes, botanically speaking they are nuts, which means a hard indehiscent (non-splitting) shell containing a large edible seed. Lots of things that we call nuts aren’t actually nuts. This is one of those quirks of terminology that gives botanists a bad name.

**** Although I’m very sceptical of teleological narratives of how academics came to choose their areas of study.

If you can’t sketch a figure then you don’t have a hypothesis


Caterpillar feeding on Urena lobata leaf in Pakke Tiger Reserve, Arunachal Pradesh, India. What is the response by the plant? Photo by Rohit Naniwadekar (CC BY-SA).

This is one of my mantras that gets repeated time and again to students. Sometimes it feels like a catch-phrase, one of a small number of formulaic pronouncements that come out when you pull the string on my back. Why do I keep saying it, and why should anyone pay attention?

Let me give an example*. Imagine you’re setting up an experiment into how leaf damage by herbivores influences chemical defence in plant tissues. You might start by assuming that the more damage you inflict, the greater the response induced in the plant. This sounds perfectly sensible. So are you ready to get going and launch an experiment? No, absolutely not. First let’s turn your rather flimsy hypothesis into an actual expectation.

[Sketch: a straight-line increase in defence response with increasing leaf damage]

Why is this a straight line? Because you’ve not specified any other shape for the relationship. A straight line has to be the starting point, and if you collect the data and plug it directly into a GLM without thinking about it, this is exactly what you’re assuming. But is there any reason why it should be a straight line? All sorts of other relationships are possible. It could be a saturating curve, where the plant reaches some maximum response; this is more likely than a continuous increase. Alternatives include an accelerating response, or a step change at a certain level of damage.

[Sketch: alternative response shapes (saturating, accelerating and step-change)]
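Sketching these candidates doesn’t even require data; a few lines of code will do. A minimal sketch, in which every functional form and parameter is an arbitrary placeholder rather than a prediction for any real plant:

```python
import numpy as np
import matplotlib.pyplot as plt

# Candidate shapes for the damage-response relationship, drawn BEFORE
# collecting any data. All parameter values are arbitrary placeholders.
damage = np.linspace(0, 100, 200)  # e.g. % leaf area removed

candidates = {
    "linear":       0.5 * damage,
    "saturating":   50 * damage / (20 + damage),   # Michaelis-Menten-like
    "accelerating": 0.005 * damage ** 2,
    "step change":  np.where(damage < 40, 5, 45),
}

fig, ax = plt.subplots()
for label, response in candidates.items():
    ax.plot(damage, response, label=label)
ax.set_xlabel("Leaf damage (%)")
ax.set_ylabel("Induced defence response")
ax.legend()
plt.show()
```

Each curve implies a different experiment, which is exactly the point of drawing them first.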

 

Perhaps what you need to do is take a step back and think about that x-axis. What levels of damage are you measuring, and why? If you expect an asymptotic response then you’re going to need to sample quite a wide range of damage levels, but there’s no need to continue to extremes because after a certain point nothing more is going to happen. If it’s a step change then your whole design should concentrate on sampling intensively around the point where the shift occurs so as to identify that parameter accurately. And so on. Drawing this first sketch has already forced you to think more carefully about your experimental design.
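As an illustration of how the expected shape drives the design, compare two ways of allocating the same twelve damage levels; the breakpoint at 40% here is a made-up hypothesis:

```python
import numpy as np

# Uniform coverage of the damage gradient (% leaf area removed).
uniform = np.linspace(0, 100, 12)

# Concentrate effort around a suspected step change near 40% damage,
# keeping a few anchor points at the extremes to confirm the plateaus.
step_focused = np.concatenate([[0, 10], np.linspace(30, 50, 8), [75, 100]])

print(np.round(uniform, 1))
print(np.round(step_focused, 1))
```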

Let’s not stop there though. Look at the y-axis, which rather blandly promises to measure the induced defence response. This isn’t an easy thing to pinpoint. Presumably you don’t expect an immediate response; it will take time for the plant to metabolise and mobilise its chemical defences. How long will they take to reach their maximum point? After that it’s unlikely that the plant will maintain unnecessarily high levels of defences once the threat of damage recedes. Over time the response might therefore be humped. Will defences increase and decrease at the same rate?

[Sketch: a humped defence response over time since damage]
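Again, the sketch is cheap to make concrete. One simple way to draw a rise-and-decay response is a difference of exponentials; the rate constants below are invented placeholders, not plant physiology:

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 30, 300)  # days since damage

def induced_defence(t, rise=0.5, decay=0.1):
    # Rises quickly after damage, then decays as the threat recedes.
    return np.exp(-decay * t) - np.exp(-rise * t)

for rise, decay in [(0.5, 0.10), (1.0, 0.10), (0.5, 0.05)]:
    plt.plot(t, induced_defence(t, rise, decay),
             label=f"rise={rise}, decay={decay}")
plt.xlabel("Days since damage")
plt.ylabel("Induced defence (arbitrary units)")
plt.legend()
plt.show()
```

Playing with the two rates immediately exposes how many sampling points you would need, and when, to tell the curves apart.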

This then turns into a whole new set of questions for your experimental design. How many sample points do you need to characterise the response of a single plant? What is your actual response variable: the maximum level of defences measured? When does this occur? What if the maximum doesn’t vary between plants but instead the rate of response increases with damage?

This thought process can feel like a step backwards. You started with an idea and were all set to launch into an experiment, but I’ve stopped you and riddled the plan with doubt. That uncertainty was already embedded in the design though, in the form of unrecognised assumptions. In all likelihood these would come back to bite you at a later date. If you were lucky you might spot them while you were collecting data. More likely you would only realise once you came to analyse and write things up**.

This is why I advocate drawing your dream figure right at the outset. Not just before you start analysing the data, but before you even begin collecting it. The principle applies to experimental data, field sampling, even to computer simulations. If you can’t sketch out what you expect to find then you don’t know what you’re doing, and that needs to be resolved before going any further***.

If you’re in the position where you genuinely don’t know the answers to the types of questions above then there are three possible solutions:

  • Read the literature, looking for theoretical predictions that match your system and give you something to aim for. Even if the theory doesn’t end up being supported, you’ve still conducted a valid test.
  • Look at previous studies and see what they found. Note that this isn’t a substitute for a good theoretical prediction; “Author (Date) found this therefore I expect to find it too” is a really bad way to start an investigation. More important is to see why they found what they did and use that insight to inform your own study.
  • Invest some time in preliminary investigations. You still have to avoid the circularity of saying “I found this in preliminary studies and therefore expected to find it again”. If you genuinely don’t know what’s going to happen then try, find out, and think about a robust predictive theory that might account for your observations. Then test that theory in a full and properly-designed experiment.

Scientists are all impatient. Sitting around dreaming about what we hope will happen can sound like an indulgence when there’s the real work of measurement to be done. But specifying exactly what you expect will greatly increase your chances of eventually finding it.

 


 

* This is not an entirely random example. I set up a very similar experiment to this in my PhD which consumed at least a month’s effort in the field. You’ll also find that none of the data are published, nor even feature in my thesis, because I found absolutely nothing. This post is in part an explanation of why.

** There are two types of research students. There are those who realise all-too-late that there were critical design flaws in some of their experiments. The rest are liars.

*** Someone will no doubt ask “what about if you don’t know at all what will happen”. In that case I would query why you’re doing it in the first place. So-called ‘blue skies research’ is never entirely blind but begins with a reasonable expectation of finding something. That might include several possible outcomes that can be predicted then tested against one another. I would argue that truly surprising discoveries arise through serendipity while looking for something else. If you really don’t know what might happen then stop, put the sharp things down and go and ask a grown-up for advice first.

 

Kratom: when ethnobotany goes wrong


Mitragyna speciosa (Korth.) Havil., otherwise known as kratom. Image credit: Uomo vitruviano

Efforts to control the trade and usage of recreational drugs* struggle against human ingenuity, driven by our boundless determination to get loaded. The search for new legal highs has led in two directions. One is the generation of new synthetic drugs which are sufficiently chemically distinct to avoid the regulations but which remain pharmacologically effective. These are often more dangerous than the illegal drugs they replace. The other is to mine the accumulated cultural and medical repositories of herbal lore from around the world to find psychoactive plants which haven’t yet been banned. Species are suddenly raised from obscurity to become the latest rush.

Over recent years a variety of plants have gone through a process of initial global popularity followed by a clamp-down, usually once a death has been associated with their use or abuse (even indirectly). A wave of media attention, hysteria and misinformation typically leads to regulatory action long before the formal medical evidence begins to emerge. One of the most recent episodes was over Salvia divinorum which enjoyed brief popularity as the legal hallucinogen of choice among students, despite being inconsistent in its effects and often quite unpleasant. Wherever you’re reading this, it’s probably already been banned.

Most of these plants have a long history of safe and moderate intake by indigenous populations in the regions where they grow naturally, either for ritual or medical purposes. The same can be said of many of the more common drugs we all know of: opium, coca and cannabis have many traditional uses stretching back for thousands of years. The problems arise when they are taken out of their cultural contexts and used for purely recreational purposes. This is often combined with plant breeding to increase the content of their psychoactive ingredients or chemical treatments that enhance their potency or synthesise their active components (such as in the production of heroin). A relatively benign drug is transformed from its original form into something much more problematic.

The latest plant to emerge as a potential drug in Europe, though already banned in many places, is Mitragyna speciosa, more commonly known as kratom, a native of several countries in Southeast Asia. Here in Ireland its active ingredient has been designated a Schedule 1 drug** since 2017. Pre-emptive legislation in other countries is quickly catching up.

I will confess to not having heard of kratom before it became a Western health concern. This is probably true of most people outside Asia, but more surprising to me given a long-standing interest in ethnobotany in Southeast Asia and having lived in Malaysia. I had previously subscribed to the view expressed in most textbooks that natural painkillers were absent from the regional flora, an opinion confirmed through my own discussions with orang asal shamans***. This may be because kratom grows in drier areas than I’ve worked in; I’ve certainly never come across it in any surveys of traditional agricultural systems in Malaysia. Pain relief is one of the scarcest and most sought-after forms of medicine, so if kratom is effective then I’m now puzzled that it isn’t more widespread.

In its normal range kratom is used as a mild painkiller, stimulant and appetite suppressant in the same way as coca leaves have been used for thousands of years in South America. The dosage obtained from chewing leaves, or even from extracts prepared by traditional methods, is likely to be low. This is very different from its recent use in Western countries where higher dosages and combinations with other drugs (including alcohol and caffeine) are likely to both enhance its effects and increase its toxicity. It is also more commonly sold over the internet as a powder.

Nevertheless, a parallel increase in recreational use within its native range has also been reported, although as of 2016 no deaths had been associated with it. Recreational use also has a long history in Thailand, and habitual use in Malaysia has been known of since at least 1836. It is now thought to be the most popular illegal drug in south Thailand, though arguments continue over whether it ought to be regulated, despite the acknowledged risk of addiction. In this it mirrors the discussion over khat, a botanical stimulant widely used across Arabia and East Africa****. Where cultural associations are long-standing it is seen as less threatening than drugs from overseas.

Along with its long use as a drug in its own right, kratom has also been used as a substitute for opium (when unavailable) or treatment for addiction. It also has a folk usage in treating hypertension. This points towards the potential for beneficial medical uses which would be delayed or unrealised if knee-jerk regulation prevents the necessary research from being conducted. Compare the situation with cannabis: its medical use in China goes back thousands of years, but the stigma associated with its use in the West (which we can partly blame on Marco Polo) delayed its decriminalisation. Cannabis remains illegal in many countries despite steadily accumulating evidence of its medical value.

I’m a firm believer that we should recognise, respect and learn from the botanical knowledge of other cultures. Banning traditional medicines (and even socially important recreational drugs) in their home countries on the basis of their abuse by people in the Global North is morally wrong. Excessive regulation also deprives us of many of the potential benefits which might come from a better understanding of these plants.

All this is not to diminish the real physical and mental damage caused by addiction, and the need to protect people from abuse and the associated social costs. But the urge to get high is nothing new, and the cultures that have formed alongside psychoactive plant chemicals, from morphine to mescaline, usually incorporated ways of controlling their use and appreciating them in safe moderation. In Tibetan regions where I’ve worked cannabis is a frequent garden plant, and teenagers enjoy a quiet rebellious smoke with little stigma attached, while adults eventually grow out of the phase. Our own immature explorations of ethnobotany need to learn that there’s much more to a plant than the titillation provided by its active ingredients.

 


 

UPDATE: after posting, @liana_chua pointed out this fascinating blog post by @PaulThung about the market-driven boom in kratom production in West Kalimantan, where it is known as puri. It’s a great reminder that what’s seen as a growing problem in the Global North can simultaneously be a development opportunity for people in deprived regions.

 


 

* I dislike the term ‘war on drugs’ because I don’t like the tendency to militarise the vocabulary surrounding complex and nuanced issues (see also ‘terror’). Also, as others have pointed out, war is dangerous enough without being on drugs as well.

** Schedule 1 covers drugs which have no accepted medicinal or scientific value. The classification in Ireland is therefore based on uses rather than the potential for harm or addiction. This can become a bit circular. For example, nicotine isn’t on the list, but its only medical usage is in treating nicotine addiction. Cannabis, however, is on the list.

*** Of course I hadn’t considered before now that this is something they might not want to talk about, given that kratom is regulated in Malaysia, where it is known as ketum, although attempts to ban it outright have not yet come to fruition. Still, I would have expected to spot a large shrub in the Rubiaceae if they were growing it deliberately.

**** I’ve often chewed khat in countries where its use is tolerated, and it’s a great way to boost your mood and energy level during fieldwork. It is also addictive, of course.

 

Keep your enemies close to save the planet


This year Toronto Wolfpack join rugby’s Super League. But at what cost to the climate? (Rick Madonik/Toronto Star via Getty Images)

I’m a rugby fan. In the past I played for several amateur teams as I moved between cities for work. In all honesty I was never a particularly good player, but being keen and available is usually enough to secure a place in the squad and a guaranteed game in the reserves. It’s been over ten years since I picked up a ball though. Rugby is definitely a game for the young*. These days I meet my craving for rugby vicariously through supporting my local rugby union club and the rugby league side from where I grew up.

Over the last 30 years or so the game has changed immensely, and mostly for the better. Rugby union turned professional in 1995 while rugby league had begun to do so a century earlier (this was the original reason for the split between the two codes). Overall this has meant an increased profile, a more enjoyable experience for the fans, better protection for players and most of all an improved quality of the game.

Another trend has begun in the last few years though, which is that the sports have become international at club level. In 2006 rugby league’s Super League ceased to be a UK-only affair with the arrival of French side Catalans Dragons. In the 2020 season it has been joined by Toronto Wolfpack from Canada. A league previously confined to a narrow corridor of northern England has suddenly become transatlantic. Meanwhile in Ireland my local rugby union side Munster now play in the elite PRO14 league. Originally the Celtic League and composed of sides from Ireland, Scotland and Wales, these days it takes in two clubs each from Italy and South Africa. In the southern hemisphere Super Rugby unites teams from Argentina to Japan. Hardly next-door neighbours.

Why should this matter? Surely it’s great to see some of the world’s best teams pitted against one another?

Watching the start of the rugby league season made me a little anxious. Toronto, being unable to play in Canada at this time of year (it’s still too cold), began their season with a heavy loss against Yorkshire side Castleford Tigers. Many of their fans were Canadians based in the UK and delighted to have a team from back home to support. But there was also a hard core of travelling fans. And what happens when UK sides start playing games in Canada: how many air miles will be racked up? Few fans can afford such trips, and certainly not often, but there was no reason to even consider them before now.

This internationalisation of sport at the club level is driving a new trend in long-haul leisure travel, whether it’s American Football games in London or South African rugby teams playing in Ireland. Entire teams, their support staff and a dedicated fan base are now making regular trips around the world for one-off fixtures.** These will inevitably involve more flights than would otherwise have occurred. At least if you’re playing teams in your own country you can usually take the train.
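The scale of the difference is worth making explicit. A rough sketch, in which the distances and per-kilometre emission factors are ballpark assumptions rather than measured values:

```python
# Illustrative comparison of per-fan travel emissions for one away fixture.
# Both emission factors are rough per-passenger-km approximations.
FLIGHT_KG_CO2E_PER_KM = 0.15  # long-haul economy flight
TRAIN_KG_CO2E_PER_KM = 0.04   # intercity rail

manchester_toronto_km = 5_700  # approximate one-way great-circle distance
leeds_london_km = 310          # a typical domestic away trip

flight = 2 * manchester_toronto_km * FLIGHT_KG_CO2E_PER_KM  # return trip
train = 2 * leeds_london_km * TRAIN_KG_CO2E_PER_KM

print(f"Transatlantic away game by air: ~{flight:.0f} kg CO2e per fan")
print(f"Domestic away game by train:    ~{train:.0f} kg CO2e per fan")
```

On numbers like these a single transatlantic away trip emits more than fifty domestic rail trips, however approximate the factors.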

This has of course been going on with international-level sports for a long time, but there the frequency is much lower. Major tournaments only occur every few years and visiting national fans usually attend a couple of games, allowing for an economy of scale. Even in this arena there’s been an increase in travel; one of the reasons for the formation of multinational touring sides like the Lions was to spread the cost of long-distance trips among national associations. Now every country travels independently. Still, although the audiences for international fixtures are huge, most fans watch at home on the TV. Club fans usually don’t have that option.

In the face of a climate crisis all sectors of society have a part to play. Unnecessarily increasing travel emissions by creating trans-national sporting contests among teams with very local fan-bases strikes me as an unwelcome new direction. Commercialisation of the game has had many benefits but this is not one of them. Despite flygskam (‘flight shame’), growth of air travel has continued unabated, and international sporting fixtures are a contributing factor.

The most enjoyable, passionate and intense games are those between local rivals. Watching Munster face up to Leinster, or Warrington against St Helens***, is to me the pinnacle of each sport at club level. Best of all, we can watch them practically on our doorsteps. Keeping our sporting fixtures local is one small way in which we can reduce our impact on the planet. Sustainable rugby? Why not.

 


 

* Anyone who plays rugby past the age of 35 is either brilliant, crazy or no longer interested in the finer tactical points of the game.

** The chant of disgruntled fans suspecting the referee of bias will have to be “Did you bring him on the plane?”.

*** Especially when we win. Oh look, we did it again.

Climate change and the Watchmen hypothesis

The climax of Alan Moore’s famous graphic novel (warning: spoilers*) plays out around a moral dilemma. In a world of conflict and discord, maybe the only thing that can bring humanity together is a shared enemy. If you accept that proposition, then could it ever be morally defensible to create such an enemy? And if you discovered that the enemy was a sham then would it be better to reveal the truth or join the conspiracy? Part of the reason the conclusion to the book is so chilling is that your heart wants to side with the uncompromising truth-seekers while your head makes rational calculations that lead to unpalatable conclusions.


Selected panels from p.19 of WATCHMEN 12 (1987) by Alan Moore and Dave Gibbons, published by DC Comics.

Why am I invoking an old comic? In climate change we face a common enemy which is undeniably real, whose approach can be measured, predicted and increasingly experienced in real time, and which has been created entirely by ourselves. It may not have the same dramatic impact as a surprise alien invasion; think of it more like spotting the aliens while their ships are still some distance from reaching Earth, but they’re already sending trash-talk transmissions in which they detail exactly how they’re going to render the planet uninhabitable to humans, frying a few interstellar objects just to prove the point. And we invited them, knowing exactly what would happen.

In this case there is no conspiracy. The stream of scientific studies demonstrating the perils of allowing global temperatures to increase further is so continuous and consistent as to be almost background**. We want to stick our heads in the sand and ignore climate change because we enjoy our short-haul flights for city breaks, private cars to drive to work and occasional steaks. The individual incentives are all aligned towards apathy, ongoing consumption and deferred responsibility. Whatever is the worst that could happen, many of us in the Global North will be dead of natural causes by the time it reaches our parts of the world.

In the face of such an obvious existential threat, about which we have been forewarned by the consistent voice of the overwhelming majority of scientists, how is humanity preparing? Are we coming together as one? Have we overcome our differences and channeled our collective intellects and resources into finding a solution for all?

Like hell. America withdraws from the only international agreement with a shred of common purpose; Australia continues to mine coal while the country burns; Poland hosts a gathering of climate scientists and uses it to defend the coal industry. I know people who have stopped attending UNFCCC meetings because the emotional toll is so great. To recognise how much needs to be done and to witness how little has been achieved is a terrible burden. This is not to say that the situation is hopeless; with concerted action we can still avert the worst outcomes, and doing so remains worthwhile.

With all this in mind, I’m forced to conclude that the evidence in support of the Watchmen hypothesis is lacking. Creating a common enemy will not be enough to bring the world together. We’ve been trying it for 30 years already***.

Where does this leave the likes of Extinction Rebellion? Over the last year I’ve been amazed by the scale of the protests in London, Berlin and cities around the world, which exceed every previous effort. It feels like a tipping point, and it ought to be, because one is long overdue. Whether it proves to be the moment the tide turns, time will tell. It has all the elements of a success story: popular support for the message, if not always the methods; an inspiring figurehead in Greta Thunberg who continues to exceed expectations; politicians scrambling to be seen on their side. Yet the background to this is the ongoing prosecution of many of the participants as states quietly assert their control. And the usual pattern of politics is for green issues to slip down the agenda as soon as an election looms****.

One of the side-effects of XR is that the disaster narrative has currently obscured other discourses and even subjected them to friendly fire. But this is not a battle which will be won on a single front. Many alternatives are available, including market mechanisms, commercial partnerships or a rebalancing of economic goals (there are good reasons why the Green New Deal isn’t quite it, but I admire its objectives). These are not exclusive of one another, nor likely to be sufficient on their own, but if we are to succeed in inspiring change then a mixture of approaches and messages will be essential.

I’m not saying that we should stop heralding the impending cataclysm*****. The uncompromising truth-speakers are right. We need to keep up the drumbeat of evidence, narratives, reporting and campaigning as the climate crisis unfolds. There are positive, individual steps that we can all take. But if we hold out for the moment when humanity suddenly unites to act as one then I fear it may never come.

 


* It’s been out since 1987 so you really have had plenty of time to read it, but if you haven’t then perhaps you should. And no, watching the film doesn’t count. Trigger warning: contains scenes of sexual violence.

** Even in such times, this paper stands out as particularly terrifying.

*** The first IPCC report was published in 1990. The fundamental message hasn’t altered since, even if the weight of evidence and urgency of action have increased. At least half of global carbon emissions have occurred since this report.

**** A few years ago I attended a talk by a statistician from one of the major political research agencies in the UK. He showed polling data with a consistent pattern: voters place a high priority on green issues between elections, but these fall down the ranking in advance of an election. Politicians know this, which is one reason why action in democratic countries is so slow.

***** If we’re going to stick with the comic book metaphors, this makes climate scientists the Silver Surfer to climate change’s Galactus. Too geeky?

Moving jobs as a mid-career academic


Why would anyone leave a permanent academic position at a research-intensive university?* After all, for many (if not most) PhD students, post-doc researchers and temporary lecturers, this is the ultimate dream. Openings for permanent posts don’t arise very often and competition for them is fierce. Once you’re ensconced in your own office with your name on the door then to most observers outside the ivory tower you’re living the dream.

And yet academics do move. Although it happens relatively infrequently in the career of a given individual, at least once they become permanent members of faculty, at any time all departments have a turnover of staff departing and (usually) being replaced. When this moves above a trickle it indicates problems, but there remains a background rate, even if on the surface everything is going well.

Having completed such a move just over a year ago, the rest of this post explains my thinking in doing so. I won’t mention who my former employer was, not that it’s hard to find out. That’s simply because I don’t want this post to even carry the suggestion of hard feelings or criticism of any individual or institution. But first: a story.

Two years ago I completed a job application on a flight home from the US. The flight was delayed and the deadline was the same day, which meant that on arrival at my parents’ house I rushed through the door and submitted online with minutes to spare. Recovering from the jetlag or even showering had to wait. A few weeks later I received notification that I had been shortlisted, then not long afterwards found myself back at the airport flying over to Ireland for an interview.

This had only been the second job application I had made that academic year and the first response.** That I only made a handful of applications was in part through being selective but also because mid-career positions don’t come up very often. There are often places at the bottom of the ladder for junior (tenure-track) lecturers, though nowhere near enough to meet demand, but by the time you’ve been in the business for over a decade, your skills and experience are so specialised that you either need to be lucky enough to find an opening for someone exactly like you or a call so broad that you can engineer your CV to fit. I also wasn’t going to risk moving for anything other than a permanent position.

Given all this, I went to the interview with the intention of treating it as practice and continued applying elsewhere. It’s always worth having several lines in the water, even if you don’t end up needing them. I wasn’t desperate for a job because I was in the fortunate position of already having that security. Maybe this relaxed, open-minded approach helped, because I got an offer.

There’s a slightly embarrassing element to the next part. When the phone call first came through to offer me the position I hung up. At that precise moment there was a tearful post-grad in my office who had come to see me for help. I will always put supporting a student in distress ahead of any phone call, however important. Luckily UCC weren’t offended by my rudeness and called back later.

To end the story, here I am. There are lots of great reasons for being in Ireland right now, and specifically at UCC. These include a growing focus on my field, national investment in forestry and agroforestry, and a booming higher education sector. The reasons for leaving UK Higher Education would surprise no-one.***

Why though did I leave a permanent academic position at a global top-100 university with international recognition? Several junior colleagues were aghast at what looked like folly. I had invested 13 years in the institution, built up a research group, developed teaching materials that were tried-and-tested, and no-one was trying to get rid of me. On the contrary, at the same time as I was trying to leave, they gave me an award and a performance bonus. I loved my colleagues in ecology and evolution; they’re a wonderful group and remain friends. The opening to replace me attracted a host of well-qualified applicants and they had no difficulty recruiting someone brilliant.

Why then did I leave? More generally, why would anyone disrupt their stable work and family life to move mid-career? These are my reasons, which may not translate to everyone’s circumstances, but perhaps might help clarify my thinking for anyone in a similar situation.

  1. I had gone as far as possible in the context of my existing position. After 13 years without a sabbatical the lack of respite from accumulated responsibilities left no space to reflect or develop. The backlogged manuscripts weren’t getting written; new projects were grounded; every year the expectations rolled over and incrementally increased. The thought of spending another year (or more) doing the same thing in the same place filled me with existential dread. Had I felt as though an alternative was within reach then I would have stayed. There’s no complaint implied here; the job had just become one that didn’t fit me any more.
  2. This was a quiet period with several major projects recently completed. Although I had four PhD students on the books (all with co-supervisors), the group was at a relatively low ebb, and nothing new was on the horizon. This was partly deliberate; having made the decision to go, I didn’t want to leave too many people in the lurch.
  3. It was time for a new challenge. When I returned to the UK from Malaysia in 2002 I had no intention of staying for long. That it took 16 years for me to leave again was simply because the opportunities lined up that way. Life had become comfortable but also a bit boring.
  4. I wanted to shake up my perspective. After over a decade working in the same place you know your colleagues well and if collaborations haven’t sparked then there’s little chance that they will. Working with new people is the best way to expose yourself to new ideas. This either means moving yourself or hoping that fresh recruits will restore energy in the place you’re already based. It had been a very long time since the latter had happened (after 13 years I was still the youngest permanent member of staff in the building) so I left instead.
  5. We were starting a family, which prompted reflection on my approach to work-life balance. Long hours, working evenings and weekends throughout the semester, were not compatible with the life I wanted or the parent I hoped to be. Nor was I going to be taking extended trips overseas to visit field sites and collaborators. The fieldwork had been one of the compensations of my old job; if that was being scaled back then I wanted the possibility of stronger research interests at home.

I can’t say just yet whether the move has been successful, and at any rate there’s no way to know for sure without a controlled comparison of some partial metric. But what I can say is that I’m enthusiastic about science again, enjoy coming into work every morning, and optimistic about getting some projects I care about off the ground. On that basis alone it’s been worth it. In fact, the department will be recruiting more people very soon — if you want to join us then keep your eyes open for forthcoming positions!

 


* For ‘permanent’ you can read ‘tenured’ if you like, but the truth is that tenure doesn’t mean quite the same thing outside North America. Universities generally can’t fire us for no reason but the level of protection isn’t equivalent. For ‘research-intensive’ you can read R1 in the USA, or Russell Group in the UK, or whatever your local class of prestige universities is.

** I’m not telling you how many failed applications had gone in over the preceding few years, but there were plenty. These had however been rather speculative; what changed was that I put serious effort into developing much stronger applications.

*** Brexit, HE funding issues, Brexit, low pay, Brexit, workload, Brexit, managerialism… did I mention Brexit?

Decolonise biogeography!


Arthur Sinclair in Peru. This image is taken from a fundraising page set up by his great-grandson, the writer Iain Sinclair, who is currently writing a book and producing a podcast series based on his attempts to retrace the original expedition.

Arthur Sinclair was a Scottish explorer and geographer whose most influential commission was his 1890 survey of half a million square miles of interior Peru, produced on behalf of the Peruvian Corporation of London. In his report* he expresses limited sympathy for the indigenous inhabitants of this vast wilderness:

Poor Chuncho! The time seems to be approaching when, in vulgar parlance, you must take a back seat; but it must be acknowledged you have had a long lease of those magnificent lands, and done very little with them… The world, indeed, has been made neither better nor richer by your existence, and now the space you occupy — or rather wander in — to so little purpose, is required, and the wealth of vegetation too long allowed to run waste, must be turned to some useful account.

Modern readers with our current sensibilities will gasp at the patronising imperialism embedded in these words. Anyone with an awareness of the subsequent consequences for the people who once lived in and depended on these forests, not to mention the damage to the forests themselves, will be appalled that devastating change was proposed with such casual insouciance.

Attitudes of this nature were widespread among the imperial powers. I don’t mean to pick on Sinclair as a particularly egregious example because he wasn’t. My choice of this passage is solely due to the coincidence of having come across it recently, but I could have chosen from many; some now notorious, others obscure but nonetheless consequential at the time they were written. For example, Sinclair’s explorations took place at the same time as Joseph Conrad was embarking on his own travels into the Congo basin**.


The Peruvian railway system was a triumph of colonial engineering which opened the interior of the country for trade and resource extraction. Photo by David Gubler. Source: Wikimedia Commons.

For all we might be appalled by the opinions of our predecessors, and strongly disavow them now, as biogeographers we must face up to the fact that our field arose first and foremost as an exercise in colonialism. Its raison d’être was to describe, delineate and evaluate the natural wealth of foreign lands for the benefit of colonial powers. We remain complicit while our institutions continue to memorialise and celebrate our forebears, often in buildings paid for by the proceeds of slavery, extraction of resources and appropriation of land. It’s not sufficient to say that we know better now when the advantages we accumulated through colonialism reinforce persistent inequalities.

In our new paper*** we draw attention to this ongoing problem. Ironically, human geographers are acutely aware of the need to engage with colonial legacies, while the physical geographers and biogeographers with whom they often share buildings typically assume that such concerns do not apply to them. This position cannot be defended.

One way this manifests is in the distribution of biogeographical researchers. We extracted the institutional addresses of over 7000 authors of papers published in the three leading journals of biogeography over a five-year period, and found that only 11% of them are based in the tropics. Over 5000 are in the northern hemisphere, mostly in what we classify as Global North countries.


Distribution of all authors contributing to papers in the three main international journals of biogeography over a five-year period (2014–2018). Figure 1 in Eichhorn et al. (2020).
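For anyone curious about the mechanics, the headline numbers reduce to a simple tally over a table of author affiliations. Below is a minimal sketch in Python of that kind of tally; the file name, column names and tropical threshold are all hypothetical illustrations, not the actual pipeline behind Figure 1.

# Minimal sketch of an affiliation tally (illustrative only, not the paper's pipeline).
# Assumes a hypothetical authors.csv with one row per author and columns
# 'country' and 'latitude' (the latitude of the author's institution, in degrees).
import pandas as pd

TROPIC = 23.44  # approximate latitude of the Tropics of Cancer and Capricorn

authors = pd.read_csv("authors.csv")
n = len(authors)

# Count authors whose institution lies between the tropics, and those north of the equator
in_tropics = (authors["latitude"].abs() <= TROPIC).sum()
in_north = (authors["latitude"] > 0).sum()

print(f"Total authors: {n}")
print(f"Tropical institutions: {in_tropics} ({100 * in_tropics / n:.1f}%)")
print(f"Northern-hemisphere institutions: {in_north}")

# The most heavily represented countries
print(authors["country"].value_counts().head(10))

Defining ‘tropical’ by institutional latitude alone is crude, but it is enough to reproduce summary percentages of this kind from any table of affiliations.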

There are many forces underlying this pattern, but they all act to reinforce the dominance of Global North institutions. I don’t doubt that the picture would be similar for many other academic fields; that, however, only demonstrates how pervasive the problem is.

Meanwhile, the predominant flow of information (as data and records) is from Global South countries towards these centres of influence. The main databases which compile global biogeographical records are based in Europe, North America or Australia, and are maintained by researchers in those countries. Is this harvesting of data any different in its dynamics from the historical extraction of colonial resources?

Another issue, one that operates mostly subconsciously, is that studies from Global North countries are described and framed differently from those published elsewhere. In a brilliant study published last year, Ergin & Alkan parse the signal of academic neo-colonialism in the language used by authors. Global North scholars write from a perspective of supposedly impartial generality, while southern scholars include geographical indicators that reinforce their position as producers of localised case studies or applications.

This is an insidious effect, and easily denied until pointed out. When one of the reviewers questioned this, it took me only a few moments to identify a number of papers from Wytham Woods (a study site in Oxfordshire) with titles that gave no hint of their origins and that purported to make broad ecological statements. Would a paper from the Rwenzoris in Uganda be written, accepted and published with the same level of detached abstraction? I very much doubt it.

What can we do? Having recognised the problem, our collective responsibility is to reverse the trend. This depends on the behaviours of individuals and research groups. Capacity building, proper recognition of collaborators and support for research agendas from beyond the Global North are all part of the process. Opening up biogeography also means enabling researchers from across the world not only to access repositories of data, but to develop and host their own. And when we write, we should learn from the humanities and recognise how our positionality inflects the way we view and describe the world around us.

Achieving these things is not merely an act of contrition for past injustices; opening up the field will increase the diversity of insights and validity of our findings, making biogeography into a truly global science. We need to decolonise biogeography.

 


* I obtained this excerpt from the travel diary of his great-grandson Iain Sinclair, who is no apologist for his great-grandfather. He also has a blog serialising his attempts to retrace the path of the Peru expedition.

** Please don’t send me comments along the lines of how Heart of Darkness is a classic of world literature. It is both a classic work and appallingly racist.

*** Thanks to my brilliant coauthors Kate Baker and Mark Griffiths who continue to teach, inspire and provoke me with their insights.

 

Books I haven’t read


The opening pages of Darwin’s classic text in its first edition as held by the library of St John’s College, Cambridge.

A number of years ago on a UK radio show there was a flurry of attention when Richard Dawkins, under pressure from a religious interviewer, was unable to recall the full title of Darwin’s most famous book*. This was perceived as a flaw in his authority as an evolutionary biologist. How could he claim to support evolution if he couldn’t even name the book which launched the theory?

There was a prompt backlash to this line of argument from scientists, who pointed out that we don’t have sacred texts in science. Unlike religions, which fixate upon a single original source**, we recognise those who made contributions to the development of our field but don’t treat their writings as inviolable truth. Darwin, like all scientists, got some things wrong, didn’t quite manage to figure out some other problems, and occasionally changed his mind. None of this undermines his brilliance; the overwhelming majority of his ideas have stood the test of time, and given the resources and knowledge available to him (remembering that it was another century before we understood the structure of DNA), his achievement was astonishing.

Confession time: I haven’t read On the Origin. Maybe I will one day, but right now it’s not on my very long reading list.

There are many good reasons for reading On the Origin, none of which I need to be told. By all accounts it’s a fascinating, well-written and detailed argument from first principles for the centrality of natural selection in evolution. As a historical document and inspiration for the entire field of biology its importance is unquestionable. I’m certain that Richard Dawkins has read it, even if he didn’t memorise the title.

None of this means that I have to read it. The fundamental insight has been affirmed, repeated and strengthened by over 150 years of scientific study and publication. Even though I used to be a creationist, it didn’t take reading Darwin to change my mind***. What we know now makes a modern account more convincing than a Victorian naturalist could ever have managed.

An even more embarrassing confession is that I haven’t read The Theory of Island Biogeography****. This admission is likely to provoke horror in anyone from that generation of ecologists (my own lecturers) who remember the seismic impact that MacArthur & Wilson’s 1967 book had on the field. It defined the direction of enquiry in many areas of ecology for decades afterwards and effectively founded the scientific discipline of conservation biology. Some of their ideas turned out to be flawed, but the majority of ecologists still view the central model as effectively proven.

I’m not one of them. Yes, I apparently fall among the minority of ecologists, albeit one led by some pretty influential voices, who view the model as so partial and incomplete as to lack predictive value in the real world (I’m not going to lay out my argument here; I’ve done that before). That I’ve reached this conclusion without reading the original book doesn’t perturb me in the slightest. In the same way as I’m confident that I can understand evolutionary theory without reading Darwin, I’ve read enough accounts of the Equilibrium Model of Island Biogeography (and taught it to undergraduates) that going back to the original source is unlikely to change my mind.

If this upsets you then consider whether you’re happy to agree with the majority of evolutionary biologists that Lamarck’s model of inheritance was wrong without bothering to read Lamarck (or his later advocate Lysenko). Lamarck made many great contributions to science; this wasn’t among them. For similar reasons I’m happy to make judgements on Haeckel’s embryological model of evolution (rejected), Wegener’s theory of continental drift (accepted), or Hubbell’s neutral theory (ambivalent), all without reading the original books.

What have I actually read then? Among the great classics of our field I’m pleased to have gone through a large number of Wallace’s original works, which were contemporaneous with Darwin’s, and Humboldt’s Essay on the Geography of Plants (1807). I can strongly recommend them. But they didn’t change my mind about anything. It was enjoyable to go back to the original sources, and by the end I was even more impressed by the authors’ achievements than before, but my understanding of the world remained unaltered. For that reason I wouldn’t ever claim that everyone should read them.

There are, however, a number of books which have changed my mind or radically reorganised my understanding of the world. These include Chase & Leibold’s 2003 book about niches and Whittaker & Fernandez-Palacios on islands. Without having read them I wouldn’t hold the opinions that I do today. I’m glad that I placed those higher on my reading list than On the Origin. But that certainly doesn’t make them essential reading for everyone.

We all follow our own intellectual journeys through science and there is no one true path. For this reason I’m always sceptical of attempts to set essential reading lists, such as the 100 papers every ecologist needs to read, which I and others disagreed with more on principle than content. So yes, if you like, you can think less of me for the reading that I haven’t done. But my guess is that many people who read this post will be feeling a quiet reassurance that it’s not just them, and that it’s nothing to be ashamed about.

 


* It is, of course, the barely memorable “On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life.”

** This in itself is baffling given that sacred texts have their own complex histories of assembly from multiple sources. Most modern Christians don’t dwell on the fact that the issue of which books to include in the Bible was so contentious, especially for the Old Testament, and some traditions persist with quite different Bibles. Why include Daniel but not Enoch? Then there’s deciding which version should be seen as definitive, and whose translation… it’s not as simple as picking the one true book.

*** Notably, a dominant theme in creationist critiques of evolution is to pick away at perceived errors or inconsistencies in Darwin’s writings, on the assumption that undermining the theory’s originator will unravel the whole enterprise of modern biology.

**** And this from a former book reviews editor of the journal Frontiers of Biogeography. They’ll be throwing me out of the Irritable Biogeography Society next.