If you can’t sketch a figure then you don’t have a hypothesis


Caterpillar feeding on Urena lobata leaf in Pakke Tiger Reserve, Arunachal Pradesh, India. What is the response by the plant? Photo by Rohit Naniwadekar (CC BY-SA).

This is one of my mantras that gets repeated time and again to students. Sometimes it feels like a catch-phrase, one of a small number of formulaic pronouncements that come out when you pull the string on my back. Why do I keep saying it, and why should anyone pay attention?

Let me give an example*. Imagine you’re setting up an experiment into how leaf damage by herbivores influences chemical defence in plant tissues. You might start by assuming that the more damage you inflict, the greater the response induced in the plant. This sounds perfectly sensible. So are you ready to get going and launch an experiment? No, absolutely not. First let’s turn your rather flimsy hypothesis into an actual expectation.

[Figure: the naive expectation, a straight-line increase in induced defence response with leaf damage.]

Why is this a straight line? Because you’ve not specified any other shape for the relationship. A straight line has to be the starting point, and if you collect the data and plug them directly into a GLM without thinking about it, this is exactly what you’re assuming. But is there any reason why it should be a straight line? All sorts of other relationships are possible. It could be a saturating curve, where the plant reaches some maximum response; this is more likely than a continuous increase. Alternatives include an accelerating response, or a step change at a certain level of damage.

[Figure: alternative shapes for the same relationship, including saturating, accelerating and step-change responses.]
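If it helps to make the candidates concrete, here is a minimal sketch in Python of those four shapes. Every functional form and constant below is an illustrative assumption, not a claim about any real plant:

```python
import numpy as np
import matplotlib.pyplot as plt

damage = np.linspace(0, 100, 200)  # hypothetical % leaf area removed

# Four candidate shapes for the induced response (all constants invented):
candidates = {
    "linear":       0.5 * damage,                       # the default assumption
    "saturating":   50 * damage / (20 + damage),        # Michaelis-Menten style asymptote
    "accelerating": 0.005 * damage**2,                  # response speeds up with damage
    "step change":  50 / (1 + np.exp(-(damage - 40))),  # logistic stand-in for a threshold
}

for label, y in candidates.items():
    plt.plot(damage, y, label=label)

plt.xlabel("Leaf damage (%)")
plt.ylabel("Induced defence response (arbitrary units)")
plt.legend()
plt.show()
```

Seeing them on one set of axes makes it obvious that your choice of damage levels determines which of these shapes your data could ever distinguish.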

Perhaps what you need to do is take a step back and think about that x-axis. What levels of damage are you measuring, and why? If you expect an asymptotic response then you’re going to need to sample quite a wide range of damage levels, but there’s no need to continue to extremes because after a certain point nothing more is going to happen. If it’s a step change then your whole design should concentrate on sampling intensively around the point where the shift occurs so as to identify that parameter accurately. And so on. Drawing this first sketch has already forced you to think more carefully about your experimental design.
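As a toy illustration of how that reasoning changes the design, here are two hypothetical ways of spending the same twelve damage levels. All of the numbers (the 80% ceiling, the threshold assumed to sit near 40%) are invented purely for illustration:

```python
import numpy as np

# Asymptote expected: cover a wide range, sampling more densely at low damage
# where the curve is still bending (geometric spacing does this naturally).
asymptotic_design = np.round(np.geomspace(1, 80, num=12), 1)

# Step change expected: anchor the extremes, then concentrate effort around
# the suspected threshold (assumed here, arbitrarily, to lie near 40% damage).
threshold_design = np.sort(np.concatenate([[0.0, 80.0],
                                           np.linspace(30, 50, num=10)]))

print("asymptotic design:", asymptotic_design)
print("threshold design: ", threshold_design)
```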

Let’s not stop there though. Look at the y-axis, which rather blandly promises to measure the induced defence response. This isn’t an easy thing to pinpoint. Presumably you don’t expect an immediate response; it will take time for the plant to metabolise and mobilise its chemical defences. How long will they take to reach their maximum point? After that it’s unlikely that the plant will maintain unnecessarily high levels of defences once the threat of damage recedes. Over time the response might therefore be humped. Will defences increase and decrease at the same rate?

[Figure: a humped time-course of defence response, rising after damage and later declining.]
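A simple way to sketch that expectation is a difference of two exponentials, with separate rate constants for induction and relaxation. Again this is a toy model with invented constants, useful only for making the prediction explicit:

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 30, 300)   # days since damage (hypothetical time scale)
k_up, k_down = 0.8, 0.15      # assumed induction and relaxation rate constants

# Fast rise minus slow decay gives a humped, asymmetric time-course;
# unequal rates are exactly the possibility raised above.
response = np.exp(-k_down * t) - np.exp(-k_up * t)
response /= response.max()    # scale the peak to 1 for comparison

plt.plot(t, response)
plt.xlabel("Days since damage")
plt.ylabel("Relative defence response")
plt.show()
```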

This then turns into a whole new set of questions for your experimental design. How many sample points do you need to characterise the response of a single plant? What is your actual response variable: the maximum level of defences measured? When does this occur? What if the maximum doesn’t vary between plants but instead the rate of response increases with damage?
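Writing the candidate response variables down as code makes the choice unavoidable. Here is a sketch reusing the toy time-course above, with one hypothetical set of sampling days:

```python
import numpy as np

def defence(t, k_up=0.8, k_down=0.15, scale=50.0):
    """Toy humped time-course of induced defences (arbitrary units)."""
    return scale * (np.exp(-k_down * t) - np.exp(-k_up * t))

sample_days = np.array([0, 1, 2, 4, 7, 14, 21, 28])  # one hypothetical design
obs = defence(sample_days)

peak_level = obs.max()                  # candidate 1: maximum defence observed
peak_day = sample_days[obs.argmax()]    # candidate 2: when that maximum occurs
initial_rate = (obs[1] - obs[0]) / (sample_days[1] - sample_days[0])
                                        # candidate 3: early rate of induction

print(f"peak {peak_level:.1f} units on day {peak_day}; "
      f"initial rate {initial_rate:.1f} units per day")
```

Each of these summaries answers a different scientific question, and only one of them can be the y-axis of your dream figure.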

This thought process can feel like a step backwards. You started with an idea and were all set to launch into an experiment, but I’ve stopped you and riddled the plan with doubt. That uncertainty was already embedded in the design though, in the form of unrecognised assumptions. In all likelihood these would come back to bite you at a later date. If you were lucky you might spot them while you were collecting data. More likely you would only realise once you came to analyse and write things up**.

This is why I advocate drawing your dream figure right at the outset. Not just before you start analysing the data, but before you even begin collecting it. The principle applies to experimental data, field sampling, even to computer simulations. If you can’t sketch out what you expect to find then you don’t know what you’re doing, and that needs to be resolved before going any further***.

If you’re in the position where you genuinely don’t know the answers to the types of questions above then there are three possible solutions:

  • Read the literature, looking for theoretical predictions that match your system and give you something to aim for. Even if the theory doesn’t end up being supported, you’ve still conducted a valid test.
  • Look at previous studies and see what they found. Note that this isn’t a substitute for a good theoretical prediction; “Author (Date) found this therefore I expect to find it too” is a really bad way to start an investigation. More important is to see why they found what they did and use that insight to inform your own study.
  • Invest some time in preliminary investigations. You still have to avoid the circularity of saying “I found this in preliminary studies and therefore expected to find it again”. If you genuinely don’t know what’s going to happen then try, find out, and think about a robust predictive theory that might account for your observations. Then test that theory in a full and properly designed experiment.

Scientists are all impatient. Sitting around dreaming about what we hope will happen can sound like an indulgence when there’s the real work of measurement to be done. But specifying exactly what you expect will greatly increase your chances of eventually finding it.

* This is not an entirely random example. I set up a very similar experiment to this in my PhD which consumed at least a month’s effort in the field. You’ll also find that none of the data are published, nor even feature in my thesis, because I found absolutely nothing. This post is in part an explanation of why.

** There are two types of research students. There are those who realise all-too-late that there were critical design flaws in some of their experiments. The rest are liars.

*** Someone will no doubt ask “what about if you don’t know at all what will happen”. In that case I would query why you’re doing it in the first place. So-called ‘blue skies research’ is never entirely blind but begins with a reasonable expectation of finding something. That might include several possible outcomes that can be predicted then tested against one another. I would argue that truly surprising discoveries arise through serendipity while looking for something else. If you really don’t know what might happen then stop, put the sharp things down and go and ask a grown-up for advice first.


Kratom: when ethnobotany goes wrong


Mitragyna speciosa (Korth.) Havil., otherwise known as kratom. Image credit: Uomo vitruviano

Efforts to control the trade and usage of recreational drugs* struggle against human ingenuity, driven by our boundless determination to get loaded. The search for new legal highs has led in two directions. One is the generation of new synthetic drugs which are sufficiently chemically distinct to avoid the regulations but which remain pharmacologically effective. These are often more dangerous than the illegal drugs they replace. The other is to mine the accumulated cultural and medical repositories of herbal lore from around the world to find psychoactive plants which haven’t yet been banned. Species are suddenly raised from obscurity to become the latest rush.

Over recent years a variety of plants have gone through a process of initial global popularity followed by a clamp-down, usually once a death has been associated with their use or abuse (even indirectly). A wave of media attention, hysteria and misinformation typically leads to regulatory action long before the formal medical evidence begins to emerge. One of the most recent episodes was over Salvia divinorum, which enjoyed brief popularity as the legal hallucinogen of choice among students, despite being inconsistent in its effects and often quite unpleasant. Wherever you’re reading this, it’s probably already been banned.

Most of these plants have a long history of safe and moderate intake by indigenous populations in the regions where they grow naturally, either for ritual or medical purposes. The same can be said of many of the more common drugs we all know of: opium, coca and cannabis have many traditional uses stretching back for thousands of years. The problems arise when they are taken out of their cultural contexts and used for purely recreational purposes. This is often combined with plant breeding to increase the content of their psychoactive ingredients or chemical treatments that enhance their potency or synthesise their active components (such as in the production of heroin). A relatively benign drug is transformed from its original form into something much more problematic.

The latest plant to emerge as a potential drug in Europe, though already banned in many places, is Mitragyna speciosa, more commonly known as kratom, a tree native to several countries in Southeast Asia. Here in Ireland its active ingredient has been designated a Schedule 1 drug** since 2017. Pre-emptive legislation in other countries is quickly catching up.

I will confess to not having heard of kratom before it became a Western health concern. This is probably true of most people outside Asia, but it is more surprising in my case given a long-standing interest in ethnobotany in Southeast Asia and having lived in Malaysia. I had previously subscribed to the view expressed in most textbooks that natural painkillers were absent from the regional flora, an opinion confirmed through my own discussions with orang asal shamans***. This may be because kratom grows in drier areas than those I’ve worked in; I’ve certainly never come across it in any surveys of traditional agricultural systems in Malaysia. Pain relief is one of the scarcest and most sought-after forms of medicine, so if kratom is effective then I’m now puzzled that it isn’t more widespread.

In its normal range kratom is used as a mild painkiller, stimulant and appetite suppressant in the same way as coca leaves have been used for thousands of years in South America. The dosage obtained from chewing leaves, or even from extracts prepared by traditional methods, is likely to be low. This is very different from its recent use in Western countries where higher dosages and combinations with other drugs (including alcohol and caffeine) are likely to both enhance its effects and increase its toxicity. It is also more commonly sold over the internet as a powder.

Nevertheless, a parallel increase in recreational use within its native range has also been reported, although as of 2016 no deaths had been associated with it. Recreational use also has a long history in Thailand, and habitual use in Malaysia has been documented since at least 1836. It is now thought to be the most popular illegal drug in south Thailand, though arguments continue over whether it ought to be regulated, despite the acknowledged risk of addiction. In this it mirrors the discussion over khat, a botanical stimulant widely used across Arabia and East Africa****. Where cultural associations are long-standing, a drug is seen as less threatening than ones from overseas.

Along with its long use as a drug in its own right, kratom has also been used as a substitute for opium (when unavailable) or treatment for addiction. It also has a folk usage in treating hypertension. This points towards the potential for beneficial medical uses which would be delayed or unrealised if knee-jerk regulation prevents the necessary research from being conducted. Compare the situation with cannabis: its medical use in China goes back thousands of years, but the stigma associated with its use in the West (which we can partly blame on Marco Polo) delayed its decriminalisation. Cannabis remains illegal in many countries despite steadily accumulating evidence of its medical value.

I’m a firm believer that we should recognise, respect and learn from the botanical knowledge of other cultures. Banning traditional medicines (and even socially important recreational drugs) in their home countries on the basis of their abuse by people in the Global North is morally wrong. Excessive regulation also deprives us of many of the potential benefits which might come from a better understanding of these plants.

All this is not to diminish the real physical and mental damage caused by addiction, and the need to protect people from abuse and the associated social costs. But the urge to get high is nothing new, and the cultures that have formed alongside psychoactive plant chemicals, from morphine to mescaline, usually incorporated ways of controlling their use and appreciating them in safe moderation. In Tibetan regions where I’ve worked cannabis is a frequent garden plant, and teenagers enjoy a quiet rebellious smoke with little stigma attached, while adults eventually grow out of the phase. Our own immature explorations of ethnobotany need to learn that there’s much more to a plant than the titillation provided by its active ingredients.

UPDATE: after posting, @liana_chua pointed out this fascinating blog post by @PaulThung about the market-driven boom in kratom production in West Kalimantan, where it is known as puri. It’s a great reminder that what’s seen as a growing problem in the Global North can simultaneously be a development opportunity for people in deprived regions.

* I dislike the term ‘war on drugs’ because I don’t like the tendency to militarise the vocabulary surrounding complex and nuanced issues (see also ‘terror’). Also, as others have pointed out, war is dangerous enough without being on drugs as well.

** Schedule 1 covers drugs which have no accepted medicinal or scientific value. The classification in Ireland is therefore based on uses rather than the potential for harm or addiction. This can become a bit circular. For example, nicotine isn’t on the list, but its only medical usage is in treating nicotine addiction. Cannabis, however, is on the list.

*** Of course I hadn’t considered before now that this is something they might not want to talk about, given that kratom (known in Malaysia as ketum) is regulated there, although attempts to ban it outright have not yet come to fruition. Still, I would have expected to spot a large shrub in the Rubiaceae if they were growing it deliberately.

**** I’ve often chewed khat in countries where its use is tolerated, and it’s a great way to boost your mood and energy level during fieldwork. It is also addictive, of course.

Keep your enemies close to save the planet


This year Toronto Wolfpack join rugby’s Super League. But at what cost to the climate? (Rick Madonik/Toronto Star via Getty Images)

I’m a rugby fan. In the past I played for several amateur teams as I moved between cities for work. In all honesty I was never a particularly good player, but being keen and available is usually enough to secure a place in the squad and a guaranteed game in the reserves. It’s been over ten years since I picked up a ball though. Rugby is definitely a game for the young*. These days I satisfy my craving for rugby vicariously, through supporting my local rugby union club and the rugby league side from where I grew up.

Over the last 30 years or so the game has changed immensely, and mostly for the better. Rugby union turned professional in 1995 while rugby league had begun to do so a century earlier (this was the original reason for the split between the two codes). Overall this has meant an increased profile, a more enjoyable experience for the fans, better protection for players and most of all an improved quality of the game.

Another trend has emerged in the last few years though: the club game has gone international. In 2006 rugby league’s Super League ceased to be a UK-only affair with the arrival of French side Catalans Dragons. In the 2020 season it has been joined by Toronto Wolfpack from Canada. A league previously confined to a narrow corridor of northern England has suddenly become transatlantic. Meanwhile in Ireland my local rugby union side Munster now play in the elite PRO14 league. Originally the Celtic League, composed of sides from Ireland, Scotland and Wales, these days it takes in two clubs each from Italy and South Africa. In the southern hemisphere Super Rugby unites teams from Argentina to Japan. Hardly next-door neighbours.

Why should this matter? Surely it’s great to see some of the world’s best teams pitted against one another?

Watching the start of the rugby league season made me a little anxious. Toronto, being unable to play in Canada at this time of year (it’s still too cold), began their season with a heavy loss against Yorkshire side Castleford Tigers. Many of their fans were Canadians based in the UK, delighted to have a team from back home to support. But there was also a hard core of travelling fans. And what happens when UK sides start playing games in Canada: how many air miles will be racked up? Few fans can afford to make such trips, and certainly not often, but before now there was no reason even to consider them.

This internationalisation of sport at the club level is driving a new trend in long-haul leisure travel, whether it’s American Football games in London or South African rugby teams playing in Ireland. Entire teams, their support staff and a dedicated fan base are now making regular trips around the world for one-off fixtures.** These will inevitably involve more flights than would otherwise have occurred. At least if you’re playing teams in your own country you can usually take the train.

This has of course been going on with international-level sports for a long time, but there the frequency is much lower. Major tournaments only occur every few years and visiting national fans usually attend a couple of games, allowing for an economy of scale. Even in this arena there’s been an increase in travel; one of the reasons for the formation of multinational touring sides like the Lions was to spread the cost of long-distance trips among national associations. Now every country travels independently. Still, although the audiences for international fixtures are huge, most fans watch at home on the TV. Club fans usually don’t have that option.

In the face of a climate crisis all sectors of society have a part to play. Unnecessarily increasing travel emissions by creating trans-national sporting contests among teams with very local fan-bases strikes me as an unwelcome new direction. Commercialisation of the game has had many benefits but this is not one of them. Despite flygskam (‘flight shame’), the growth of air travel has continued unabated, and international sporting fixtures are a contributing factor.

The most enjoyable, passionate and intense games are those between local rivals. Watching Munster face up to Leinster, or Warrington against St Helens***, is to me the pinnacle of each sport at club level. Best of all, we can watch them practically on our doorsteps. Keeping our sporting fixtures local is one small way in which we can reduce our impact on the planet. Sustainable rugby? Why not.

* Anyone who plays rugby past the age of 35 is either brilliant, crazy or no longer interested in the finer tactical points of the game.

** The chant of disgruntled fans suspecting the referee of bias will have to be “Did you bring him on the plane?”.

*** Especially when we win. Oh look, we did it again.

Climate change and the Watchmen hypothesis

The climax of Alan Moore’s famous graphic novel (warning: spoilers*) plays out around a moral dilemma. In a world of conflict and discord, maybe the only thing that can bring humanity together is a shared enemy. If you accept that proposition, then could it ever be morally defensible to create such an enemy? And if you discovered that the enemy was a sham, would it be better to reveal the truth or join the conspiracy? Part of the reason the conclusion to the book is so chilling is that your heart wants to side with the uncompromising truth-seekers while your head makes rational calculations that lead to unpalatable conclusions.


Selected panels from p.19 of WATCHMEN 12 (1987) by Alan Moore and Dave Gibbons, published by DC Comics.

Why am I invoking an old comic? In climate change we face a common enemy which is undeniably real, whose approach can be measured, predicted and increasingly experienced in real time, and which has been created entirely by ourselves. It may not have the same dramatic impact as a surprise alien invasion; think of it more like spotting the aliens while their ships are still some distance from reaching Earth, but they’re already sending trash-talk transmissions in which they detail exactly how they’re going to render the planet uninhabitable to humans and frying a few interstellar objects just to prove the point. And we invited them, knowing exactly what would happen.

In this case there is no conspiracy. The stream of scientific studies demonstrating the perils of allowing global temperatures to increase further is so continuous and consistent as to be almost background**. We want to stick our heads in the sand and ignore climate change because we enjoy our short-haul flights for city breaks, private cars to drive to work and occasional steaks. The individual incentives are all aligned towards apathy, ongoing consumption and deferred responsibility. Whatever is the worst that could happen, many of us in the Global North will be dead of natural causes by the time it reaches our parts of the world.

In the face of such an obvious existential threat, about which we have been forewarned by the consistent voice of the overwhelming majority of scientists, how is humanity preparing? Are we coming together as one? Have we overcome our differences and channeled our collective intellects and resources into finding a solution for all?

Like hell. America withdraws from the only international agreement with a shred of common purpose; Australia continues to mine coal while the country burns; Poland hosts a gathering of climate scientists and uses it to defend the coal industry. I know people who have stopped attending UNFCCC meetings because the emotional toll is so great. To recognise how much needs to be done and to witness how little has been achieved is a terrible burden. This is not to say that the situation is hopeless; with concerted action we can still avert the worst outcomes, and doing so remains worthwhile.

With all this in mind, I’m forced to conclude that the evidence in support of the Watchmen hypothesis is lacking. Creating a common enemy will not be enough to bring the world together. We’ve been trying it for 30 years already***.

Where does this leave the likes of Extinction Rebellion? Over the last year I’ve been amazed by the scale of the protests in London, Berlin and cities around the world, which exceed every previous effort. It feels like a tipping point, and it ought to be, because one is long overdue. Whether it proves to be the moment the tide turns, time will tell. It has all the elements of a success story: popular support for the message, if not always the methods; an inspiring figurehead in Greta Thunberg who continues to exceed expectations; politicians scrambling to be seen on their side. Yet the background to this is the ongoing prosecution of many of the participants as states quietly assert their control. And the usual pattern of politics is for green issues to slip down the agenda as soon as an election looms****.

One of the side-effects of XR is that the disaster narrative has currently obscured other discourses and even subjected them to friendly fire. But this is not a battle which will be won on a single front. Many alternatives are available, including market mechanisms, commercial partnerships or a rebalancing of economic goals (there are good reasons why the Green New Deal isn’t quite it, but I admire its objectives). These are not exclusive of one another, nor likely to be sufficient on their own, but if we are to succeed in inspiring change then a mixture of approaches and messages will be essential.

I’m not saying that we should stop heralding the impending cataclysm*****. The uncompromising truth-speakers are right. We need to keep up the drumbeat of evidence, narratives, reporting and campaigning as the climate crisis unfolds. There are positive, individual steps that we can all take. But if we hold out for the moment when humanity suddenly unites to act as one then I fear it may never come.

* It’s been out since 1987 so you really have had plenty of time to read it, but if you haven’t then perhaps you should. And no, watching the film doesn’t count. Trigger warning: contains scenes of sexual violence.

** Even in such times, this paper stands out as particularly terrifying.

*** The first IPCC report was published in 1990. The fundamental message hasn’t altered since, even if the weight of evidence and urgency of action have increased. At least half of global carbon emissions have occurred since this report.

**** A few years ago I attended a talk by a statistician from one of the major political research agencies in the UK. He showed polling data with a consistent pattern: voters place a high priority on green issues between elections, but these slip down the ranking in advance of an election. Politicians know this, which is one reason why action in democratic countries is so slow.

***** If we’re going to stick with the comic book metaphors, this makes climate scientists the Silver Surfer to climate change’s Galactus. Too geeky?

Moving jobs as a mid-career academic


Why would anyone leave a permanent academic position at a research-intensive university?* After all, for many (if not most) PhD students, post-doc researchers and temporary lecturers, this is the ultimate dream. Openings for permanent posts don’t arise very often and competition for them is fierce. Once you’re ensconced in your own office with your name on the door then to most observers outside the ivory tower you’re living the dream.

And yet academics do move. Any given individual moves relatively infrequently, at least once they become a permanent member of faculty, but at any time every department has some turnover of staff departing and (usually) being replaced. When this rises above a trickle it indicates problems, but a background rate remains even when on the surface everything is going well.

Having completed such a move just over a year ago, the rest of this post explains my thinking in doing so. I won’t mention who my former employer was, not that it’s hard to find out. That’s simply because I don’t want this post to even carry the suggestion of hard feelings or criticism of any individual or institution. But first: a story.

Two years ago I completed a job application on a flight home from the US. The flight was delayed and the deadline was the same day, which meant that on arrival at my parents’ house I rushed through the door and submitted online with minutes to spare. Recovering from the jetlag or even showering had to wait. A few weeks later I received notification that I had been shortlisted, then not long afterwards found myself back at the airport flying over to Ireland for an interview.

This had only been the second job application I had made that academic year and the first response.** That I only made a handful of applications was in part through being selective but also because mid-career positions don’t come up very often. There are often places at the bottom of the ladder for junior (tenure-track) lecturers, though nowhere near enough to meet demand, but by the time you’ve been in the business for over a decade, your skills and experience are so specialised that you either need to be lucky enough to find an opening for someone exactly like you or a call so broad that you can engineer your CV to fit. I also wasn’t going to risk moving for anything other than a permanent position.

Given all this, I went to the interview with the intention of treating it as practice and continued applying elsewhere. It’s always worth having several lines in the water, even if you don’t end up needing them. I wasn’t desperate for a job because I was in the fortunate position of already having that security. Maybe this relaxed, open-minded approach helped, because I got an offer.

There’s a slightly embarrassing element to the next part. When the phone call first came through to offer me the position I hung up. At that precise moment there was a tearful post-grad in my office who had come to see me for help. I will always put supporting a student in distress ahead of any phone call, however important. Luckily UCC weren’t offended by my rudeness and called back later.

To end the story, here I am. There are lots of great reasons for being in Ireland right now, and specifically at UCC. These include a growing focus on my field, national investment in forestry and agroforestry, and a booming higher education sector. The reasons for leaving UK Higher Education would surprise no-one.***

Why though did I leave a permanent academic position at a global top-100 university with international recognition? Several junior colleagues were aghast at what looked like folly. I had invested 13 years in the institution, built up a research group, developed teaching materials that were tried-and-tested, and no-one was trying to get rid of me. On the contrary, at the same time as I was trying to leave, they gave me an award and a performance bonus. I loved my colleagues in ecology and evolution; they’re a wonderful group and remain friends. The opening to replace me attracted a host of well-qualified applicants and they had no difficulty recruiting someone brilliant.

Why then did I leave? More generally, why would anyone disrupt their stable work and family life to move mid-career? These are my reasons, which may not translate to everyone’s circumstances, but perhaps might help clarify my thinking for anyone in a similar situation.

  1. I had gone as far as possible in the context of my existing position. After 13 years without a sabbatical the lack of respite from accumulated responsibilities left no space to reflect or develop. The backlogged manuscripts weren’t getting written; new projects were grounded; every year the expectations rolled over and incrementally increased. The thought of spending another year (or more) doing the same thing in the same place filled me with existential dread. Had I felt as though an alternative was within reach then I would have stayed. There’s no complaint implied here; the job had simply become one that didn’t fit me any more.
  2. This was a quiet period with several major projects recently completed. Although I had four PhD students on the books (all with co-supervisors), the group was actually at a relatively low ebb, and nothing new was on the horizon. This was partly deliberate; having made the decision to go, I didn’t want to leave too many people in the lurch.
  3. It was time for a new challenge. When I returned to the UK from Malaysia in 2002 I had no intention of staying for long. That it took 16 years for me to leave again was simply because the opportunities lined up that way. Life had become comfortable but also a bit boring.
  4. I wanted to shake up my perspective. After over a decade working in the same place you know your colleagues well and if collaborations haven’t sparked then there’s little chance that they will. Working with new people is the best way to expose yourself to new ideas. This either means moving yourself or hoping that fresh recruits will restore energy in the place you’re already based. It had been a very long time since the latter had happened (after 13 years I was still the youngest permanent member of staff in the building) so I left instead.
  5. We were starting a family, which prompted reflection on my approach to work-life balance. Long hours, working evenings and weekends throughout the semester, were not compatible with the life I wanted or the parent I hoped to be. Nor was I going to be taking extended trips overseas to visit field sites and collaborators. The fieldwork had been one of the compensations of my old job; if that was being scaled back then I wanted the possibility of stronger research interests at home.

I can’t say just yet whether the move has been successful, and at any rate there’s no way to know for sure without a controlled comparison of some partial metric. But what I can say is that I’m enthusiastic about science again, I enjoy coming into work every morning, and I’m optimistic about getting some projects I care about off the ground. On that basis alone it’s been worth it. In fact, the department will be recruiting more people very soon; if you want to join us then keep your eyes open for forthcoming positions!

* For ‘permanent’ you can read ‘tenured’ if you like, but the truth is that tenure doesn’t mean quite the same thing outside North America. Universities generally can’t fire us for no reason but the level of protection isn’t equivalent. For ‘research-intensive’ you can read R1 in the USA, or Russell Group in the UK, or whatever your local class of prestige universities is.

** I’m not telling you how many failed applications had gone in over the preceding few years, but there were plenty. These had however been rather speculative; what changed was that I put serious effort into developing much stronger applications.

*** Brexit, HE funding issues, Brexit, low pay, Brexit, workload, Brexit, managerialism… did I mention Brexit?

Decolonise biogeography!


Arthur Sinclair in Peru. This image is taken from a fundraising page set up by his great-grandson, the writer Iain Sinclair, who is currently writing a book and producing a podcast series based on his attempts to retrace the original expedition.

Arthur Sinclair was a Scottish explorer and geographer whose most influential commission was his 1890 survey of half a million square miles of interior Peru produced on behalf of the Peruvian Corporation of London. In his report* he expresses limited sympathy for the indigenous inhabitants of this vast wilderness:

Poor Chuncho! The time seems to be approaching when, in vulgar parlance, you must take a back seat; but it must be acknowledged you have had a long lease of those magnificent lands, and done very little with them… The world, indeed, has been made neither better nor richer by your existence, and now the space you occupy — or rather wander in — to so little purpose, is required, and the wealth of vegetation too long allowed to run waste, must be turned to some useful account.

Modern readers with our current sensibilities will gasp at the patronising imperialism embedded in these words. Anyone with an awareness of the subsequent consequences for the people who once lived in and depended on these forests, not to mention the damage to the forests themselves, will be appalled that devastating change was proposed with such casual insouciance.

Attitudes of this nature were widespread among the imperial powers. I don’t mean to pick on Sinclair as a particularly egregious example because he wasn’t. My choice of this passage is solely due to the coincidence of having come across it recently, but I could have chosen from many; some now notorious, others obscure but nonetheless consequential at the time they were written. For example, Sinclair’s explorations took place at the same time as Joseph Conrad was embarking on his own travels into the Congo basin**.


The Peruvian railway system was a triumph of colonial engineering which opened the interior of the country for trade and resource extraction. Photo by David Gubler. Source: Wikimedia Commons.

For all we might be appalled by the opinions of our predecessors, and strongly disavow them now, as biogeographers we must face up to the fact that our field arose first and foremost as an exercise in colonialism. Its raison d’être was to describe, delineate and evaluate the natural wealth of foreign lands for the benefit of colonial powers. We remain complicit while our institutions continue to memorialise and celebrate our forebears, often in buildings paid for by the proceeds of slavery, extraction of resources and appropriation of land. It’s not sufficient to say that we know better now when the advantages we accumulated through colonialism reinforce persistent inequalities.

In our new paper*** we draw attention to this ongoing problem. Ironically, human geographers are acutely aware of the need to engage with colonial legacies, while the physical geographers and biogeographers with whom they often share buildings typically assume that such concerns do not apply to them. This position cannot be defended.

One way in which this applies is in the distribution of biogeographical researchers. We extracted the institutional addresses of over 7000 authors of papers in the three leading journals of biogeography over a five year period, and show that only 11% of them are based in the tropics. Over 5000 of them are in the northern hemisphere, mostly in what we consider Global North countries.


Distribution of all authors contributing to papers in the three main international journals of biogeography over a five year period (2014–2018). Figure 1 in Eichhorn et al. (2020).
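For readers who like to see what such a tally reduces to, here is a deliberately simplified sketch in Python. The affiliations and coordinates below are invented for illustration, and the analysis in the paper involved far more careful geocoding and author handling:

```python
# Invented example affiliations with approximate latitudes (degrees north).
author_latitudes = {
    "University of Helsinki": 60.2,
    "Universidade de São Paulo": -23.6,
    "University College Cork": 51.9,
    "Universiti Malaya": 3.1,
    "University of Queensland": -27.5,
}

TROPIC = 23.44  # latitude of the Tropics of Cancer and Capricorn

tropical = sum(abs(lat) <= TROPIC for lat in author_latitudes.values())
print(f"{tropical} of {len(author_latitudes)} affiliations lie within the tropics")
```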

There are lots of forces underlying this pattern, but they all act to reinforce the dominance of Global North institutions. I don’t doubt that it would be similar for many other academic fields; this however only demonstrates how pervasive the problem is.

Meanwhile, the predominant flow of information (as data and records) is from Global South countries towards these centres of influence. The main databases which compile global biogeographical records are based in Europe, North America or Australia, and are maintained by researchers based in those countries. Is this harvesting of data any different in its dynamics to that of colonial resources?

Another issue is that, mostly subconsciously, the way that studies from Global North countries are described and framed differs from those published elsewhere. In a brilliant study published last year, Ergin & Alkan parse the signal of academic neo-colonialism in the language used by authors. Global North scholars write from a perspective of supposedly impartial generality, while southern scholars include geographical indicators that reinforce their position as producing localised case studies or applications.

This is an insidious effect, and easily deniable until pointed out. When one of the reviewers questioned this, it took me only a few moments to identify a number of papers from Wytham Woods (a study site in Oxfordshire) with titles that give no hint of their origins and purport to make broad ecological statements. Would a paper from the Rwenzori Mountains on the Uganda-DRC border be written, accepted and published with the same level of detached abstraction? I severely doubt it.

What can we do? On recognising the problem our collective responsibility is to reverse the trend. This depends on the behaviours of individuals and research groups. Capacity building, proper recognition of collaborators and support for research agendas from beyond the Global North are all part of the process. Opening up biogeography also means enabling researchers from across the world to not only have access to repositories of data, but to develop and host their own. And when we write, we should learn from the humanities and recognise how our positionality inflects the way we view and describe the world around us.

Achieving these things is not merely an act of contrition for past injustices; opening up the field will increase the diversity of insights and validity of our findings, making biogeography into a truly global science. We need to decolonise biogeography.

* I obtained this excerpt from the travel diary of his great-grandson Iain Sinclair, who is no apologist for his forebear. He also has a blog serialising his attempts to retrace the path of the Peru expedition.

** Please don’t send me comments along the lines of how Heart of Darkness is a classic of world literature. It is both a classic work and appallingly racist.

*** Thanks to my brilliant coauthors Kate Baker and Mark Griffiths who continue to teach, inspire and provoke me with their insights.

Books I haven’t read


The opening pages of Darwin’s classic text in its first edition as held by the library of St John’s College, Cambridge.

A number of years ago on a UK radio show there was a flurry of attention when Richard Dawkins, under pressure from a religious interviewer, was unable to recall the full title of Darwin’s most famous book*. This was perceived as a flaw in his authority as an evolutionary biologist. How could he claim to support evolution if he couldn’t even name the book which launched the theory?

There was a prompt backlash to this line of argument from scientists who pointed out that we don’t have sacred texts in science. Unlike religions which fixate upon a single original source**, we recognise those who made contributions to the development of our field but don’t treat them as inviolable truth. Darwin, like all scientists, got some things wrong, didn’t quite manage to figure out some other problems, and occasionally changed his mind. None of this undermines his brilliance; the overwhelming majority of his ideas have stood the test of time, and given the resources and knowledge he had available to him (remembering that it was another century until we understood the structure of DNA), his achievement was astonishing.

Confession time: I haven’t read On the Origin. Maybe I will one day, but right now it’s not on my very long reading list.

There are many good reasons for reading On the Origin, none of which I need to be told. By all accounts it’s a fascinating, well-written and detailed argument from first principles for the centrality of natural selection in evolution. As a historical document and inspiration for the entire field of biology its importance is unquestionable. I’m certain that Richard Dawkins has read it, even if he didn’t memorise the title.

None of this means that I have to read it. The fundamental insight has been affirmed, repeated and strengthened by over 150 years of scientific study and publication. Even though I used to be a creationist, it didn’t take reading Darwin to change my mind***. What we know now makes a modern account more convincing than a Victorian naturalist could ever have managed.

An even more embarrassing confession is that I haven’t read The Theory of Island Biogeography****. This admission is likely to provoke horror in anyone from that generation of ecologists (my own lecturers) who remember the seismic impact that MacArthur & Wilson’s 1967 book had on the field. It defined the direction of enquiry in many areas of ecology for decades afterwards and effectively founded the scientific discipline of conservation biology. Some of their ideas turned out to be flawed, but the majority of ecologists still view the central model as effectively proven.

I’m not one of them. Yes, I apparently fall among the minority of ecologists, albeit led by some pretty influential voices, who view the model as so partial and incomplete as to lack predictive value in the real world (I’m not going to lay out my argument here, I’ve done that before). That I’ve reached this decision without reading the original book doesn’t perturb me in the slightest. In the same way as I’m confident that I can understand evolutionary theory without reading Darwin, I’ve read enough accounts of the Equilibrium Model of Island Biogeography (and taught it to undergraduates) that it’s not as if going back to the original source will change my mind.

If this upsets you then consider whether you’re happy to agree with the majority of evolutionary biologists that Lamarck’s model of inheritance was wrong without bothering to read Lamarck (or his later advocate Lysenko). Lamarck made many great contributions to science; this wasn’t among them. For similar reasons I’m happy to make judgements on Haeckel’s embryological model of evolution (rejected), Wegener’s theory of continental drift (accepted), or Hubbell’s neutral theory (ambivalent), all without reading the original books.

What have I actually read then? Among the great classics of our field I’m pleased to have gone through a large number of Wallace’s original works, which were contemporaneous with Darwin’s, and Humboldt’s Essay on the Geography of Plants (1807). I can strongly recommend them. But they didn’t change my mind about anything. It was enjoyable to go back to the original sources, and by the end I was even more impressed by the authors’ achievements than before, but my understanding of the world remained unaltered. For that reason I wouldn’t ever claim that everyone should read them.

There are, however, a number of books which have changed my mind or radically reorganised my understanding of the world. These include Chase & Leibold’s 2003 book about niches or Whittaker & Fernandez-Palacios on islands. Without having read them I wouldn’t hold the opinions that I do today. I’m glad that I placed those higher on my reading list than On the Origin. But that certainly doesn’t make them essential reading for everyone.

We all follow our own intellectual journeys through science and there is no one true path. For this reason I’m always sceptical of attempts to set essential reading lists, such as the 100 papers every ecologist needs to read, which I and others disagreed with more on principle than content. So yes, if you like, you can think less of me for the reading that I haven’t done. But my guess is that many people who read this post will be feeling a quiet reassurance that it’s not just them, and that it’s nothing to be ashamed about.

* It is, of course, the barely memorable “On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life.”

** This in itself is baffling given that sacred texts have their own complex histories of assembly from multiple sources. Most modern Christians don’t dwell on the fact that the issue of which books to include in the Bible was so contentious, especially for the Old Testament, and some traditions persist with quite different Bibles. Why include Daniel but not Enoch? Then there’s deciding which version should be seen as definitive, and whose translation… it’s not as simple as picking the one true book.

*** Notably a dominant theme in creationist critiques of evolution is to pick away at perceived errors or inconsistencies in Darwin’s writings on the assumption that undermining its originator will unravel the whole enterprise of modern biology.

**** And this from a former book reviews editor of the journal Frontiers of Biogeography. They’ll be throwing me out of the Irritable Biogeography Society next.