Category Archives: Uncategorized

The bitter tree

Like many ecologists I have a fascination with the scientific names that attach themselves to species. Sometimes these celebrate the person who discovered or described the species*, or a benefactor, or are made as a tribute to a notable person. One that I recently stumbled across is the South American tree Quassia amara, a common understorey species of disturbed lowland forests. Until I encountered the backstory to its name while reading The Ethnobotany of Eden (which I strongly recommend) I had no idea where its name came from. The story is complex and revealing.

The tree is one of the relatively small number of tropical species that owe their names to Linnaeus, indicating that its significance was recognised early in the development of modern taxonomy. That a Swedish botanist came to hold a sample is due to its potential as a remedy for fevers, a serious concern of the European powers whose hold over tropical lands was still tenuous while their colonists struggled with malaria and other unfamiliar afflictions.

The tree’s Latin name celebrates the man who introduced the plant to Europeans as a medicine, the freed slave Graman Quassi (c.1690–1787), originally of the Akan people of West Africa from modern-day Ghana, hence his Kwa name Kwasimukámba which I will use in preference here. He arrived in Suriname as a child slave of the Dutch empire. So successful was Kwasimukámba that he not only lived an unusually long and ultimately comfortable life but was also celebrated internationally. He even travelled to The Hague and received an audience with Willem V, the Prince of Orange, who bestowed a number of extravagant gifts on him in recognition of his service to the empire.

If we ended the story here then it would be almost heart-warming. But let’s delve deeper. How did Kwasimukámba come to be a freed slave?

Many other slaves escaped from servitude in South America and formed independent communities, known as the maroons, often deep within colonised territories. Some of these became well-established enough to be tolerated, effectively becoming trade partners of the European powers. Others, such as the Saramaka, fought lengthy insurgencies before eventually winning this recognition. It was in this struggle that Kwasimukámba first demonstrated his worth to the Dutch, acting as a negotiator and tracker on behalf of the white colonists. Later he led a corps of African conscripts known as the Black Rangers, even losing one of his ears in the fighting against the rebels. For his efforts he was gifted a gold breastplate on which was inscribed ‘Quassie, faithful to the whites’. The Saramaka remember him as a traitor.

What then of the tree that bears his name? It was Kwasimukámba who introduced it to Europeans as a local remedy for fever in his other noted capacity as a herbalist and sorcerer**. It was soon overtaken as a cure for malaria by Cinchona, another South American tree and the source of quinine. Nevertheless, Quassia amara is still used as an effective treatment for intestinal parasites, an insecticide, and a bittering agent in foods and drinks. The second part of the species name, amara, comes from the Latin word for ‘bitter’. Even here Kwasimukámba is memorialised because the most bitter of the tree’s chemical constituents is now known as quassin, one of a family of chemicals called the quassinoids. These are amongst the most bitter-tasting chemicals in nature and form ingredients of Angostura bitters***.


Flowers of Quassia amara. All parts of the plant can be used for their extracts.

Without wishing to be unfair to Kwasimukámba, whose reputation as a healer cannot be entirely unfounded, it is highly unlikely that he personally discovered the medicinal benefits of Quassia. More likely is that he learnt of its efficacy through his interactions with the Saramaka or other maroons. They, in turn, are likely to have acquired the knowledge from the indigenous peoples they encountered in the forests. My scepticism about who deserves the credit is simply based on a matter of probability. Native healers had been using the tree for many generations before the African slaves and their European masters arrived, and continue to do so.

We will never know who first named the bitter tree and divined its useful medicinal properties. In a fair accounting of history they would receive the credit for Quassia, although Kwasimukámba deserves his recognition too. He can hardly be blamed for the accolade of being immortalised in science as the tree was named for, not by, him. Then again, in his later years he styled himself ‘Professor of Herbology’, so he was not averse to personal aggrandisement.

So who was Kwasimukámba: a manumitted slave who achieved fame in his lifetime? An imperial collaborator? A talented herbalist? Or a charlatan who took credit for the insights of others? The truth must be all of those things, and no single story is complete without the rest. We should beware of making moral judgements on our forebears, as I’ve argued before, because these were complex people making personal decisions in very different times. The name Quassia links a bitter-flavoured tree to a bitter history, one that invokes slavery, oppression, forgotten indigenous peoples and the legacies of colonialism. The struggles of the Saramaka for recognition of their rights continue to the present day. Perhaps we shouldn’t begrudge Kwasimukámba his place in the annals of science though. At least this once an oppressed slave managed to make a decent life for himself. I can raise a glass to that.

 


 

* Which usually means the first person from the Global North to place the species in the context of a largely imperial system of classification. That a species was long known to local people in its place of origin is usually overlooked, although taxonomists are getting better at this.

** Not all of his concoctions were as widely approved of.

*** Amusingly, Angostura bitters do not contain the bitter-tasting bark of Angostura trifoliata, but are instead named after the town in Venezuela from where the recipe originated. I should write another blog post about that.

Oops there goes another rubber tree


Rubber trees bend, and they also break. This is one of the CATAS plantations in China being battered by strong winds.

Frank Sinatra credited an ambitious ant with shifting a rubber tree, which was a rather implausible achievement. The presumptuous ant might have been taking undeserved credit for a much more normal occurrence, which is rubber trees blowing over in strong winds. That happens all the time with no help from ants.

Rubber tree plantations are found in the tropics, these days predominantly in Asia (despite the rubber tree Hevea brasiliensis being originally a South American species, as its name suggests). Many of the main regions of production are also subject to periodic hurricanes which can cause serious damage to the trees and therefore economic losses to the producers. We can’t prevent hurricanes, but could we manage plantations to make them more wind resistant?

This is a question which we approach in a new paper (led by my collaborator Yun Ting at Nanjing Forestry University*) using terrestrial laser scanning and some novel computational methods. The study site was a research station in southern China where a number of different clones have been planted, each of which has a distinctive branching architecture. We already know which of these is most vulnerable to blow-down from the evidence of past hurricanes. What management options are likely to help prevent damage to the trees?

At present simulations are the only way to approach this problem. It’s not just that it’s far too risky to carry out detailed environmental measurements in the middle of a hurricane (not least because the trees often fall over); any equipment is likely to blow away too. It’s hard enough to find equipment that’s capable of withstanding hurricane conditions anyway. And then there’s the issue of not knowing when or where a hurricane will strike. Our approach can potentially circumvent all these limitations.

The study involved a large team, all of whom had incredibly specialised skills, and I don’t pretend to fully understand all of them. The first step was using a terrestrial laser scanner to record the full three-dimensional structure of several rubber trees. This creates a high-resolution point cloud from which virtual trees can be reconstructed, aiming to capture every leaf and branch on the original. These were then used to create plantations which were subjected to hurricane-force winds in a simulation environment developed for testing the aerodynamic properties of aircraft and other vehicles. This allowed us to evaluate the implications of crown structure for the wind forces experienced by trees during a hurricane. No-one had to put themselves at any risk.**
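As a rough illustration of the reconstruction step, the sketch below derives some simple crown metrics (projected area, volume, and a density proxy) from a point cloud using convex hulls. This is only a toy: the cloud here is randomly generated in place of a real TLS scan, and the actual study used far more sophisticated tree reconstruction feeding into a CFD environment.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

# Stand-in for a TLS point cloud: (x, y, z) coordinates in metres.
# A real scan would contain millions of points per tree.
cloud = rng.normal(loc=[0.0, 0.0, 8.0], scale=[2.0, 2.0, 1.5], size=(5000, 3))

# Projected crown area: convex hull of the xy-projection.
# (For a 2-D hull, scipy stores the enclosed area in the `volume` attribute.)
crown_area = ConvexHull(cloud[:, :2]).volume

# Crown volume: convex hull of the full 3-D cloud.
crown_volume = ConvexHull(cloud).volume

# Crude proxy for crown density: scan points per cubic metre of hull.
point_density = len(cloud) / crown_volume

print(f"projected crown area: {crown_area:.1f} m^2")
print(f"crown volume: {crown_volume:.1f} m^3")
print(f"point density: {point_density:.1f} pts/m^3")
```

Metrics like these are the bridge between raw scan data and a simulation mesh: once each clone’s crown geometry is quantified, plantations of virtual copies can be assembled and exposed to simulated wind.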


Computer visualisations of two rubber trees reconstructed from terrestrial laser scanning data. Notice how the one on the left has broader, spreading branches, while the one on the right has a more upright form.

There are many caveats to this study; don’t bother pulling it apart because we already know a good few of them. Most of the assumptions and simplifications were enforced by the computational demands of such high-resolution simulations. For example, plots were relatively small in size (nine trees at a time).*** More problematic is that adding natural flexibility to the trees wasn’t an option. What we are measuring is therefore the wind resistance generated by something approximating a tree made of platinum. Of course we know that this isn’t realistic, but I’d suggest that you look at the qualitative outcomes rather than expecting the quantitative predictions to be precisely accurate. This is a first demonstration of the approach, not a complete realisation of every possible feature. It’s good enough to reveal some interesting differences.

What did we find? Well, clones with larger, denser crowns put up greater wind resistance and generate higher degrees of turbulence, making them more likely to be damaged when exposed to hurricane-force winds. Reassuringly, these are the same ones that blow over more commonly in the real world. So we did all this sophisticated work… and found out something that everyone already knew.
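The qualitative result is what the standard quadratic drag law would lead you to expect: wind load scales with frontal area and with the square of wind speed. A minimal sketch, with invented crown areas and a generic drag coefficient standing in for the simulation outputs:

```python
# Quadratic drag law: F = 0.5 * rho * Cd * A * v^2.
# All numbers are illustrative -- the paper's CFD resolves forces in far
# more detail, but the qualitative scaling is the same.

def drag_force(area_m2, wind_speed_ms, drag_coefficient=1.0, air_density=1.2):
    """Approximate wind load (in newtons) on a crown of given frontal area."""
    return 0.5 * air_density * drag_coefficient * area_m2 * wind_speed_ms ** 2

hurricane = 35.0  # m/s, roughly the threshold of hurricane force

compact_crown = drag_force(area_m2=20.0, wind_speed_ms=hurricane)
spreading_crown = drag_force(area_m2=45.0, wind_speed_ms=hurricane)

print(f"compact crown:   {compact_crown / 1000:.1f} kN")
print(f"spreading crown: {spreading_crown / 1000:.1f} kN")
```

A crown with over twice the frontal area carries over twice the load at every wind speed, which is why the large-crowned clones are the ones that blow over.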

That’s not really the point though. Having shown that the simulations generate reasonable outcomes that match with experience, we can now start to tweak the models and explore the impact of management strategies. Manipulating tree spacing or thinning tree crowns can be done virtually, more quickly and with less effort than establishing another plantation and waiting for the next hurricane to discover whether it was effective. What we have is a framework for trying out almost any combination of tree sizes, shapes and arrangements.

Rubber plantations are one of the simplest types of forest, which was a deliberate choice. We can imagine further applications of our method in more complex habitats, where a mixture of tree species could be put together to see how real-world forests cope in the face of one of nature’s greatest destructive forces. That’s our eventual aim, anyway. High hopes, as Sinatra would have put it.

 


 

* This was a massive collaborative study in which my own role was very minor. I haven’t even visited the field site.

** No trees were harmed in the preparation of this paper. It’s not even in a print journal.

*** When I suggested some minor amendments to a late draft I thought Yun Ting was about to cry. It’s hard to convey quite how much computational effort goes into generating just one of these figures. Simulations are not easy research!

 

Tree diseases harm people too


An olive grove in Surano, Italy, following infection by Xylella fastidiosa. Wikimedia Commons CC BY-SA 4.0 Sjor

In the midst of a global pandemic, expressing concern about trees can feel like an indulgence. That’s not to say that COVID-19 is foremost on everyone’s minds. For farmers in East Africa about to face another devastating plague of locusts, or those in Vanuatu whose homes were destroyed by Cyclone Harold, there are more pressing concerns. On our own doorstep there are many people going without food in the UK, one of the world’s richest countries. Even climate change is having to take a back seat for the time being.

And so you might find it hard to spare any thought for a disease which is quickly sweeping through Mediterranean olive trees. For those of us living in northern Europe the direct impact is likely to be felt as increased prices for olives and olive oil, and perhaps even some scarcity in the next few years. I’m sure your heart bleeds for the middle classes.

Many people don’t have the luxury of shrugging this calamity off though. And it truly is a calamity for producers in some of the poorest regions of Europe, particularly in rural parts of Italy, Greece and Spain where olives and other orchard crops have been the mainstay of livelihoods for generations. They are about to face not only a pandemic, but the prolonged recession that will be its aftermath, while at the same time their whole way of life disappears. A new paper estimates that the total economic impact of this disease could exceed €5B*. It isn’t just about the trees.

This was a disaster that could so easily have been avoided. The pathogen Xylella fastidiosa was first recorded in olive trees in southern Italy back in 2013. It likely arrived from Central America where similar pests are endemic, probably via the horticultural trade. National and international agencies rapidly responded, and a containment plan was put in place, agreed with both the Italian government and European Commission. This was beset with protests and legal challenges which delayed any action. Amazingly, several scientists and public officials were accused of deliberately spreading the disease and subjected to a police investigation**. All this wasted valuable time.

By 2016 the disease had already reached Spain and southern France. Now it’s also been detected in Portugal and Israel. Unfortunately it appears that trees can be asymptomatic for some time, which has made tracking and modelling its spread more problematic. It can also infect a range of other trees, including common orchard species like cherries, almonds and plums. These are often asymptomatic hosts though, which means that the disease can spread across landscapes undetected, carried by common xylem-sucking insects. Now that the disease is at large across Europe, local containment might be possible in some regions, but a full outbreak is almost inevitable.
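Why asymptomatic hosts are such a problem for surveillance can be shown with a toy growth model: if monitoring only picks up trees that were infected some months ago, the detected epidemic is always a fraction of the real one. The growth rate and lag below are invented for illustration, not fitted to Xylella data.

```python
# Toy model of an epidemic with a detection lag. Growth rate and lag are
# assumed round numbers, purely for illustration.

growth_rate = 0.3   # new infections per infected tree per month (assumed)
detection_lag = 6   # months a newly infected tree stays asymptomatic (assumed)
months = 24

infected = [1.0]    # true cumulative infections, starting from one tree
for _ in range(months):
    infected.append(infected[-1] * (1 + growth_rate))

# Surveillance only sees trees that were infected at least `detection_lag`
# months ago, so the detected curve is the true curve shifted in time.
detected = [0.0] * detection_lag + infected[:len(infected) - detection_lag]

print(f"month {months}: true infections ~{infected[-1]:.0f}, "
      f"detected ~{detected[-1]:.0f}")
```

On these assumptions the true epidemic is always (1 + growth rate) to the power of the lag ahead of the detected one, which is why containment zones drawn around symptomatic trees keep turning out to be too small.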

The new paper by Schneider et al. suggests that constant vigilance, which means routine testing and reporting of cases, should be combined with measures to reduce the rate of spread, such as felling of trees in buffer zones and vector control. This may buy enough time for resistant trees to be developed and planted, or for a transition to substitute crops. To do so requires trust and co-operation between government agencies and individual landowners, something that has thus far been in short supply. It also inevitably means that some farmers will have to sacrifice healthy orchards for the good of others.

Whatever the outcome, the landscape of large parts of the Mediterranean looks set to change dramatically in the coming years. And all because we failed to act even when the evidence was in place. Will we learn the lessons for next time?

 


 

* Roughly equivalent to $5.5B or £4.5B at time of writing, but have you seen how crazy FOREX markets are at the moment? Oddly some of the media coverage has talked about a figure of €20B, which wasn’t the headline figure of the original paper, but appears to be based on adding up all the largest estimates and assuming a worst-imaginable scenario.

** Italy has previous form when it comes to prosecuting scientists and officials whose advice is not approved of.

What trees would we plant to maximise carbon uptake?


Fangorn Forest as represented in The Lord of the Rings: The Two Towers. This is fantasy fiction, not the type of habitat that we should expect to find or create in the real world.

One of the reasons often put forward for growing more trees is that it’s a method to draw down carbon from the atmosphere and lock it up in wood. Afforestation is far more efficient and straightforward than any currently imagined ‘carbon capture technology’. Photosynthesis is the original carbon uptake mechanism, evolved and perfected over more than a billion years, and human ingenuity isn’t going to design anything better, at least not on the scales required by rapid climate change. Trees have done it all before.

Before going much further, and for the avoidance of doubt, planting trees (even a trillion of them) isn’t going to solve the climate crisis. It’s one potential tool but no substitute for massive reductions in emissions. At its very best, tree-planting would only remove the carbon which has been released from land-use change, not that from the burning of fossil fuels, which represent stores of carbon generated hundreds of millions of years ago. That’s even before we get to the contentious question of where we might plant the trees, which usually turns out to be someone else’s country.

The other problem with tree-planting is that it assumes the trees either stay in place or are continuously replenished. Forest fires, logging, land clearance, droughts, pest outbreaks… all the potential causes of tree mortality will eventually lead to this carbon returning to the atmosphere. A tree is at best a temporary carbon store, albeit one that can last a few centuries. We’re not laying down any new coal deposits; they just don’t make it any more. I’m therefore sceptical of any off-setting program which justifies current emissions on the basis of anticipated long-term carbon storage through tree planting. It’s not something we can rely on.

All these caveats accepted, we can begin to ask the question: if we really were planting forests with the primary objective of taking up carbon, and we planned to do it in the temperate countries which are overwhelmingly responsible for global change, what should we plant?

A recent newspaper article asked this question with the UK in mind*. By coincidence I had a Twitter exchange on the same subject shortly before the article came out. As a thought experiment it’s a reasonable discussion to have, and by doing so publicly it forces people to contemplate the implications of the arguments that are so often made for planting trees.

The right tree species should be (a) fast-growing under local conditions, (b) tall, preferably forming dense forests with as little space between the trees as possible, and (c) of high wood density, maximising the amount of carbon for a given volume of trunk. Ideally they should also be long-lived and relatively resistant to the many forms of disturbance that kill trees, including extreme weather and diseases. There’s no single species of tree that satisfies all these conditions, not least because high wood density leads to slower growth rates. Some compromise is necessary.
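Criterion (c) is really a statement about arithmetic: the carbon in a stem is roughly its volume times its wood density times a carbon fraction of about one half, convertible to CO2 by the molar mass ratio 44/12. A back-of-the-envelope sketch with invented round numbers shows why growth rate can trump density:

```python
# Back-of-the-envelope carbon arithmetic behind criteria (a)-(c). The
# volumes and densities below are invented round numbers, not measurements.

CARBON_FRACTION = 0.5     # roughly half of dry wood biomass is carbon
CO2_PER_C = 44.0 / 12.0   # molar mass ratio converting carbon to CO2

def co2_stored_kg(stem_volume_m3, wood_density_kg_m3):
    """Approximate CO2 locked up in a tree's stem wood."""
    dry_biomass_kg = stem_volume_m3 * wood_density_kg_m3
    return dry_biomass_kg * CARBON_FRACTION * CO2_PER_C

# A fast-growing, low-density conifer vs a slower, high-density broadleaf
# after the same number of years:
conifer = co2_stored_kg(stem_volume_m3=1.5, wood_density_kg_m3=400)
broadleaf = co2_stored_kg(stem_volume_m3=0.8, wood_density_kg_m3=650)

print(f"conifer:   ~{conifer:.0f} kg CO2")
print(f"broadleaf: ~{broadleaf:.0f} kg CO2")
```

On these invented numbers the faster-growing, lower-density tree still locks up more carbon over the same period, which is exactly the compromise the article ends up recommending.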

The conclusion of the article, taking into account the assumption that carbon uptake was the sine qua non, was that plantations of fast-growing non-native conifers were the best way forward.** The backlash to this suggestion was immediate, predictable and justified. Such tree species are not only hopeless for conservation (and therefore would lead to a net loss of biodiversity) but also aesthetically undesirable as they would transform familiar landscapes. Yes, say the public, we want more forests, but surely not like this.

I don’t disagree with any of the objections to such a scheme, but it does highlight the inherent problem with making so many claims for the benefits of tree planting that are logically incompatible. It is impossible to design a forest which maximises all the potential functions we want from them: promoting native species, boosting biodiversity, storing carbon, amenity value, aligning with our aesthetic preferences, and maybe also providing some economic benefit to the landowners who are being asked to turn over their productive estates to trees. If we pick just one of these factors to emphasise — in this case carbon — then inevitably we will have to lose out on the others.

Every response to climate change presents us with difficult choices. The trite maxim that we should plant more trees puts people in mind of a sylvan idyll of sun-dappled glades beneath the boughs of mighty broad-leaved giants. Such forests exist in Europe only in the imagination. If real trees are to be used to solve our problems then real forests will be necessary, and they might not be the ones that everyone expected. Be careful what you wish for.

 


 

* Full disclosure: the academic whose views the article reports, Prof. John Healey of Bangor University, is also a collaborator of mine (we co-supervise a PhD student). We haven’t shared our opinions on this topic though.

** My pick, if we’re playing tree Top Trumps, is the Nordmann fir, Abies nordmanniana, which is so much more than just a great Christmas tree. It’s also the tallest-growing native tree in Europe and as a montane species tolerates a wide range of challenging environments.

Stop the eco-triumphalism now


There is no excuse for this. Image taken from the Twitter feed of XR East (@xr_east)*. Note that Extinction Rebellion UK have disavowed both this group and the sentiments expressed in the post. Similar opinions have been circulating on social media for some time though.

I’m angry. I was reluctant to write anything on this blog about COVID-19, pandemics or epidemiology, given that none of them are among my specialisms**. But lately I’ve been appalled by a subset of ecologists and environmental campaigners — admittedly small — who have used this crisis to score points for their personal obsessions.

This is not to criticise those who seek positive messages to cling onto when there is so much to mourn or fear. Some of this desire is misdirected, including numerous stories of nature returning to areas which have seen dramatic reductions in human activity. Whether it’s drunk elephants in Chinese villages or dolphins in Venice, these have usually turned out to be false reports. Does it matter that many people share stories that are quickly disproven if they offer a brief cheering distraction? Perhaps not so much. There are more dangerous lies out there.

Yet linked to many of these stories is an agenda, sometimes implicit, other times actively advanced, which has at its core a desire to show that something good has come of the pandemic. Few would say so with quite that lack of subtlety, but there is no other way to interpret many of the articles on declines in air pollution, clear skies or the potential for advancing a green agenda during the anticipated recovery. I’m more sympathetic to those who point out that the international response shows that we have the ability to pull together for the collective good that could be harnessed again. There may be opportunities to be taken once we can see the light at the end of the tunnel, but that still seems a long way off.

Others are even less subtle, for example the assertion by the European Agroforestry Federation that all this wouldn’t have happened if we’d had more agroforestry. Even if that were true (and despite being a longstanding advocate of agroforestry, I’d struggle to make an evidence-based case for it***), now is surely not the time.

The most pernicious narrative, one which lurks on the fringes of the Deep Green movement and other more mainstream environmentalist groups, is that anything that reduces human populations and activities can only be a good thing for the environment. I have even seen it argued (and I refuse to link to it) that COVID-19 is an expression of Gaia immunity, the Earth fighting back against the infestation of humans that is taking it out of some mythical balance****. The refrain ‘we are the virus’ has been expressed repeatedly on social media. The sentiment at the top of this article is not an isolated incident. Nor is it a new idea; in 1988 Prince Philip told a German interviewer that he was ‘tempted to ask for reincarnation as a particularly deadly virus’ so as to aid in population control. Yes, he really did say that.

To believe that a catastrophe causing widespread mortality is somehow beneficial is simple eco-fascism. For privileged people in the Global North, living in splendid isolation, to delight (even indirectly) in the death, privation and suffering of others is not morally acceptable on any grounds. There is no positive side to a pandemic.

There are two strands to eco-fascism. One has a long history in the belief that restricted national resources compel populations to resist immigration in the name of supposed sustainability. This narrative has already inspired acts of terrorism, but has also been advanced in more mainstream settings by France’s National Rally (Rassemblement national) among others. Most reasonable, compassionate people would disavow this set of arguments.

The other narrative, however, holds that any reduction in human population can only help the natural world. This is sometimes expressed as the suggestion that not having (so many) children is the best way of combating climate change, an argument which is both logically and morally suspect. Any case for global population control always ends up being anti-female, racist and ultimately anti-human. Even if we might agree on a ‘sustainable’ human population level — and that’s not an easy figure to decide upon — the process of reaching it is unconscionable by any means, and certainly with the speed required for it to be an effective measure. As Jenny Turner put it, “How can a planet lose seven or eight billion humans… without events of indiscriminate devastation? When people start thinking about getting rid of other people, which sorts of people does history suggest are usually got rid of first?”

On this point, in which countries do you expect the death rate from COVID-19 to be the highest? It will almost certainly be in the Global South where access to medical care, nutrition and government support is most lacking. They are not the ones who bear the greatest responsibility for global change, and once again they will carry the heaviest burden. This pandemic is first and foremost a tragedy. It will not in itself be of any lasting benefit to the environment.

We are rightly offended when politicians use disasters to advance their own partisan agendas. As scientists, including ecologists, we need to step back from the campaigns that are usually at the forefront of our minds and accept that this situation is a distinct crisis of its own. Yes climate change remains a pressing issue; yes extinctions continue apace; yes the international wildlife trade is abhorrent. But these are separate problems which the world’s attention will return to again. For now simple compassion and decency requires us to stand back and accept that for once it isn’t about us.

 


 

* I suspect (and rather hope) that the post will be taken down, but here’s the evidence. There is a dispute going on about whether this represents a genuine faction of XR, or ‘infiltrators’, who are difficult for an autonomous movement with no central leadership to protect itself against. It wouldn’t be the first time that someone associated with XR has said something offensive though. For the record, I continue to be a strong supporter of XR and their core principles.


** Not that this has stopped quite a number of people.

*** The only relevant project I’ve been involved with, on mosquito-borne diseases in Northern Thailand, actually implicated orchard expansion as a likely cause of greater risk of infection. But this is a complicated subject and no single study can tell the whole story, particularly one which wasn’t designed to test that hypothesis.

**** I don’t have time here to explain in detail why Gaia theory is nonsense and has been rejected by the majority of ecologists. For more see pages 175–176 in my textbook and the papers it refers to. I’m also writing another book about it.

Junk pedagogical research


Can I teach? I think so. Should I aspire to publish papers telling other people how to teach? Probably not. This is me showing students how to estimate tree height.

In my last job I was employed on a teaching-track position*. For many years this worked reasonably well for me. I enjoy teaching, I think I’m quite good at it, and I didn’t mind a slightly higher load as the price of not needing to satisfy arbitrary targets for research grant income or publication in high-impact journals. That’s not to say I stopped doing research, because obviously that didn’t happen, but I accepted that there was a trade-off between the two and that I was closer to one end of the spectrum. It still left me three clear months every summer to get out into the field and collect data.

Many UK universities developed teaching-track positions in response to the national research assessment exercise (the REF**) which incentivised them to concentrate resources in the hands of a smaller number of staff whilst ensuring that someone else got on with the unimportant business of running the university and the distraction of educating undergraduates. Such is the true meaning of research-led teaching.

A problem began to arise when those staff who had been shuffled into teaching-track positions applied for promotion. The conventional signifiers of academic success weren’t relevant; you could hardly expect them to bring in large grants, publish in top-tier journals or deliver keynotes at major conferences if they weren’t being given the time or support to do so.

Some head-scratching took place and alternative means were sought out to decide who was performing well. It’s hard enough to determine what quality teaching looks like at an institutional level***, and assessing individuals is correspondingly even more difficult.

The first thing to turn to is student evaluations. These largely measure how good a lecturer is at entertaining and pleasing their students, or how much the students enjoy the subject. Evidence suggests that evaluations are inversely related to the amount that students actually learn, as well as being biased against women and minority groups. In short they’re not just the wrong measure, they’re actively regressive in their effects. Not that this stops many universities using them of course.

What else is there? Well, being academics, the natural form of output to aim for is publications. It’s the only currency some academics understand. Not scientific research papers, of course, because teaching staff aren’t supposed to be active researchers. So instead the expectation became that they would publish papers based on pedagogical research****. This sounds, on the face of it, quite sensible, which is why many universities went down that route. But there are three major problems.

1. Pedagogical research isn’t easy. There are whole fields of study, often based in departments of psychology, which have developed approaches and standards to ensure that work is of appropriate quality. Expecting an academic with a background in biochemistry or condensed matter physics to publish in a competitive journal of pedagogical research without the necessary training is unreasonable. Moreover, it’s an implicit insult to those colleagues for whom such work is their main focus. Demanding that all teachers should publish pedagogical research implies that anyone can do it. They can’t.

2. Very few academics follow pedagogical research. That’s not to say that they shouldn’t. Most academics teach and are genuinely interested in doing so as effectively as possible. But the simple truth is that it’s hard enough to keep track of the literature in our areas of research specialism. Not many can make time to add another, usually unrelated field to their reading list. I consider myself more engaged than most and even I encounter relevant studies only through social media or articles for a general readership.

3. A lot of pedagogical research is junk. Please don’t think I’m talking about the excellent, specialist work done by expert researchers into effective education practice. There is great work out there in internationally respected journals. I’m talking about the many unlisted, low-quality journals that have proliferated over recent years, and which give education research a bad name. Even if they operate some form of peer review, many are effectively pay-to-publish, and some are actively predatory. I won’t name any here because that’s just asking for abusive e-mails.

Why do these weak journals exist? Well, we have created an incentive structure in which a class of academics needs to publish something — anything — in order to gain recognition and progress in their careers. A practice which we would frown upon in ‘normal’ research is actively encouraged by many of the world’s top universities. Junk journals and even junk conferences proliferate as a way to satisfy universities’ own internal contradictions.

What’s the alternative? I have three suggestions:

1. Stop imposing an expectation based on research onto educators. If research and teaching are to be separated (a trend I disagree with anyway) then they can’t continue to be judged by the same metrics. Incentivising publications for their own sake helps no-one. Some educators will of course want to carry out their own independent studies, and this should be encouraged and respected, but it isn’t the right approach for everyone.

2. Put some effort into finding out whether teachers are good at their job. This means peer assessment of teaching, student performance and effective innovation. All this is difficult and time-consuming, but if we want to recognise good teachers then we need to take the time to do it properly. Proxy measures are no substitute. Being able to write a paper about teaching doesn’t mean that someone can teach.

3. Support serious pedagogical researchers. If you’re based in a large university then there’s almost certainly a group of specialist researchers already there. How much have you heard about their work? Have you collaborated with them? Universities have native expertise which could be used to improve teaching practice, usually much more efficiently than forcing non-specialists to jump through hoops. If the objective is genuinely to improve teaching standards then ask the people who know how to do it.

If there’s one thing that shows how evaluations of teaching aren’t working or being taken seriously, it’s that universities don’t make high-level appointments based on teaching. Prestige chairs exist to hire big-hitters in research on the basis of their international profile, grant income and publication record. When was the last time you heard of a university recruiting a senior professor because they were great at teaching? Tell me once you’ve stopped laughing.

 


 

* This is now relatively common among universities in Europe and North America. The basic principle is that some staff are given workloads that allow them to carry out research, whilst others are given heavier teaching and administrative loads but the expectations for their research income and outputs are correspondingly reduced.

** If you don’t know about the Research Excellence Framework and how it has poisoned academic life in the UK then don’t ask. Reactions from those involved may vary from gentle sobs to inchoate screaming.

*** Which gave rise to the Teaching Excellence Framework, or TEF, and yet more anguish for UK academics. Because the obvious way to deal with the distorting effect of one ranking system is to create another. Surely that’s enough assessment of universities based on flawed data? No, of course not, because there’s also the Knowledge Exchange Framework (KEF) coming up. I’m not even joking.

**** Oddly textbooks often don’t count. No, I can’t explain this. But I was told that publishing a textbook didn’t count as scholarship in education.

From tiny acorns

My father planted acorns.

This is one of those recollections that arrives many years after the fact and suddenly strikes me as having been unusual. As a child, however, it seemed perfectly normal that we should go out collecting acorns in the autumn. Compared to my father’s other eccentric habits and hobbies, of which there were many*, gathering acorns didn’t appear to be particularly strange or worthy of note.

In our village during mast years the acorns would rain down from boundary and hedgerow oak trees and sprout in dense carpets along the roadside. This brief flourishing was inevitably curtailed by the arrival of Frankie Ball, the local contractor responsible for mowing the verges. His indiscriminate treatment shredded a summer’s worth of growth and ensured that no seedlings could ever survive.


Sprouting acorn (Quercus robur L.) by Amphis.

Enlightened modern opinion would now declare that mowing roadside verges is ecologically damaging; it removes numerous late-flowering plants and destroys potential habitats for over-wintering insects. I’m not going to pass such judgement here though because it was a purely practical decision. Too much growth would result in blocked ditches, eventually flooding fields and properties. Frankie was just doing his job.

My father, however, couldn’t allow himself to see so many potential oak trees perish. His own grandfather had been a prominent forester back in the old country (one of the smaller European principalities that no longer exists), and with a family name like Eichhorn it’s hard not to feel somehow connected to little oak trees. He took it upon himself to save as many of them as he could.

And so it was that we found ourselves, trowels in hand, digging up sprouting acorns from the roadsides and transporting them by wheelbarrow to the wood on Jackson’s farm. Here they would be gently transplanted into locations that looked promising and revisited periodically to check on their progress. Over the years this involved at least hundreds of little acorns, perhaps thousands.

They all died. This isn’t too surprising: most offspring of most organisms die before they reach adulthood. Trees have a particularly low rate of conversion of seedlings to adults, probably less than one in a thousand. That’s just one of the fundamental facts of life and a driving force of evolution. Why though did my father’s experiment have such a low success rate? He’d apparently done everything right, even choosing to plant them somewhere other trees had succeeded before**. It’s only after becoming a forest ecologist myself that I can look back and see where he was going wrong.

First, oak trees are among a class of species that we refer to as long-lived pioneers. This group of species is unusual because most pioneers are short-lived. Pioneers typically arrive in open or disturbed habitats, grow quickly, then reproduce and die before more competitive species can drive them out. Weeds are the most obvious cases among plants, but if you’re looking at trees then something like a birch would be the closest comparison.

Oaks are a little different. Their seedlings require open areas with lots of light to grow, which means that they don’t survive well below a dark forest canopy. Having managed to achieve a reasonable stature, however, they stick around for many centuries and are hard to budge. In ecology we know this as the inhibition model of succession. Oaks are great at building forests but not so good at taking them over.

The next problem is that oak seedlings do particularly badly when in the vicinity of other adult oak trees. This is because the pests and diseases associated with large trees quickly transfer themselves to the juveniles. An adult tree might be able to tolerate losing some of its leaves to a herbivore but for a seedling with few resources this can be devastating. This set of forces led to the Janzen-Connell hypothesis which predicts that any single tree species will be prevented from filling a habitat because natural enemies ensure that surviving adults end up being spread apart. A similar pattern can arise because non-oak trees provide a refuge for oak seedlings. Whatever the specific causes, oak seedlings suffer when planted close to existing oaks.

This makes it seem a little peculiar that acorns usually fall so close to their parent trees. The reason acorns are such large nuts*** is that they want to attract animals which will try to move and store them over winter. This strategy works because no matter how many actually get eaten, a large proportion of cached acorns remain unused (either they’re forgotten or the animal that placed them dies) and so they are in prime position to grow the following spring. Being edible and a desirable commodity is actually in the interests of the tree.

jay

A Eurasian jay, Garrulus glandarius. Image credit: Luc Viatour.

Contrary to most expectations, squirrels turn out to be pretty poor dispersers of acorns. Although they move acorns around and bury them nicely, they don’t put them in places where they are likely to survive well. Jays are much better, moving acorns long distances and burying them singly in scrubby areas where the new seedlings will receive a reasonable amount of light along with some protection from browsing herbivores. My father’s plantings failed mainly because he wasn’t thinking like a jay.

My father’s efforts weren’t all in vain. The care shown to trees and an experimental approach to understanding where they could grow lodged themselves in my developing mind and no doubt formed part of the inspiration that led me to where I am today****. From tiny acorns, as they say.

 


 

* His lifelong passion is flying, which at various points included building his own plane in the garden shed and flying hang-gliders. It took me a while to realise that not everyone’s father was like this.

** One possible explanation we can rule out is browsing by deer, which often clear vegetation from the ground layer of woodlands. Occasional escaped dairy cows were more of a risk in this particular wood.

*** Yes, botanically speaking they are nuts, which means a hard indehiscent (non-splitting) shell containing a large edible seed. Lots of things that we call nuts aren’t actually nuts. This is one of those quirks of terminology that gives botanists a bad name.

**** Although I’m very sceptical of teleological narratives of how academics came to choose their areas of study.

If you can’t sketch a figure then you don’t have a hypothesis


Caterpillar feeding on Urena lobata leaf in Pakke Tiger Reserve, Arunachal Pradesh, India. What is the response by the plant? Photo by Rohit Naniwadekar (CC BY-SA).

This is one of my mantras that gets repeated time and again to students. Sometimes it feels like a catch-phrase, one of a small number of formulaic pronouncements that come out when you pull the string on my back. Why do I keep saying it, and why should anyone pay attention?

Let me give an example*. Imagine you’re setting up an experiment into how leaf damage by herbivores influences chemical defence in plant tissues. You might start by assuming that the more damage you inflict, the greater the response induced in the plant. This sounds perfectly sensible. So are you ready to get going and launch an experiment? No, absolutely not. First let’s turn your rather flimsy hypothesis into an actual expectation.

[Sketch: defence response increasing as a straight line with leaf damage]

Why is this a straight line? Because you’ve not specified any other shape for the relationship. A straight line has to be the starting point, and if you collect the data and plug them directly into a GLM without thinking about it, this is exactly what you’re assuming. But is there any reason why it should be a straight line? All sorts of other relationships are possible. It could be a saturating curve, where the plant reaches some maximum response; this is more likely than a continuous increase. Alternatives include an accelerating response, or a step change at a certain level of damage.

[Sketch: alternative relationships, including saturating, accelerating and step-change responses]
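For concreteness, each of these candidate shapes can be written down as a simple function before any data are collected. Here’s a minimal numpy sketch with invented parameter values, purely for illustration:

```python
import numpy as np

damage = np.linspace(0, 100, 201)  # hypothetical % leaf area removed

# Four candidate damage-response shapes (all parameters invented)
linear = 0.5 * damage                        # the default GLM assumption
saturating = 60 * damage / (20 + damage)     # approaches an asymptotic maximum
accelerating = 0.005 * damage ** 2           # response speeds up with damage
step = np.where(damage < 40, 5.0, 50.0)      # threshold at 40% damage

# The saturating curve stays below its asymptote of 60,
# while the linear and accelerating curves keep climbing.
print(saturating.max() < 60, linear[-1], accelerating[-1])
```

Writing the shapes out like this also reveals which parameters (slope, asymptote, threshold) your experiment would actually have to estimate.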

 

Perhaps what you need to do is take a step back and think about that x-axis. What levels of damage are you measuring, and why? If you expect an asymptotic response then you’re going to need to sample quite a wide range of damage levels, but there’s no need to continue to extremes because after a certain point nothing more is going to happen. If it’s a step change then your whole design should concentrate on sampling intensively around the point where the shift occurs so as to identify that parameter accurately. And so on. Drawing this first sketch has already forced you to think more carefully about your experimental design.

Let’s not stop there though. Look at the y-axis, which rather blandly promises to measure the induced defence response. This isn’t an easy thing to pinpoint though. Presumably you don’t expect an immediate response; it will take time for the plant to metabolise and mobilise its chemical defences. How long will they take to reach their maximum point? After that it’s unlikely that the plant will maintain unnecessarily high levels of defences once the threat of damage recedes. Over time the response might therefore be humped. Will defences increase and decrease at the same rate?

[Sketch: a humped defence response over time since damage]
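One simple way to formalise that humped shape is a difference-of-exponentials pulse, which lets the rise and the decline happen at different rates. The rates and timescale below are invented purely for illustration:

```python
import numpy as np

t = np.linspace(0, 30, 601)   # days since damage (hypothetical timescale)
k_up, k_down = 1.0, 0.15      # mobilisation faster than relaxation (invented rates)

# Rises at rate k_up, then decays back towards baseline at the slower rate k_down
response = np.exp(-k_down * t) - np.exp(-k_up * t)

t_peak = t[response.argmax()]  # when you would need to sample to catch the maximum
print(t_peak, response[-1] < response.max() / 2)
```

Even this toy model shows why sampling times matter: measure too late and you record the slow decline, not the induced maximum.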

This then turns into a whole new set of questions for your experimental design. How many sample points do you need to characterise the response of a single plant? What is your actual response variable: the maximum level of defences measured? When does this occur? What if the maximum doesn’t vary between plants but instead the rate of response increases with damage?

This thought process can feel like a step backwards. You started with an idea and were all set to launch into an experiment, but I’ve stopped you and riddled the plan with doubt. That uncertainty was already embedded in the design though, in the form of unrecognised assumptions. In all likelihood these would come back to bite you at a later date. If you were lucky you might spot them while you were collecting data. More likely you would only realise once you came to analyse and write things up**.

This is why I advocate drawing your dream figure right at the outset. Not just before you start analysing the data, but before you even begin collecting it. The principle applies to experimental data, field sampling, even to computer simulations. If you can’t sketch out what you expect to find then you don’t know what you’re doing, and that needs to be resolved before going any further***.

If you’re in the position where you genuinely don’t know the answers to the types of questions above then there are three possible solutions:

  • Read the literature, looking for theoretical predictions that match your system and give you something to aim for. Even if the theory doesn’t end up being supported, you’ve still conducted a valid test.
  • Look at previous studies and see what they found. Note that this isn’t a substitute for a good theoretical prediction; “Author (Date) found this therefore I expect to find it too” is a really bad way to start an investigation. More important is to see why they found what they did and use that insight to inform your own study.
  • Invest some time in preliminary investigations. You still have to avoid the circularity of saying “I found this in preliminary studies and therefore expected to find it again”. If you genuinely don’t know what’s going to happen then try, find out, and think about a robust predictive theory that might account for your observations. Then test that theory in a full and properly-designed experiment.

Scientists are all impatient. Sitting around dreaming about what we hope will happen can sound like an indulgence when there’s the real work of measurement to be done. But specifying exactly what you expect will greatly increase your chances of eventually finding it.

 


 

* This is not an entirely random example. I set up a very similar experiment to this in my PhD which consumed at least a month’s effort in the field. You’ll also find that none of the data are published, nor even feature in my thesis, because I found absolutely nothing. This post is in part an explanation of why.

** There are two types of research students. There are those who realise all-too-late that there were critical design flaws in some of their experiments. The rest are liars.

*** Someone will no doubt ask “what about if you don’t know at all what will happen”. In that case I would query why you’re doing it in the first place. So-called ‘blue skies research’ is never entirely blind but begins with a reasonable expectation of finding something. That might include several possible outcomes that can be predicted then tested against one another. I would argue that truly surprising discoveries arise through serendipity while looking for something else. If you really don’t know what might happen then stop, put the sharp things down and go and ask a grown-up for advice first.

 

Kratom: when ethnobotany goes wrong


Mitragyna speciosa (Korth.) Havil., otherwise known as kratom. Image credit: Uomo vitruviano

Efforts to control the trade and usage of recreational drugs* struggle against human ingenuity, driven by our boundless determination to get loaded. The search for new legal highs has led in two directions. One is the generation of new synthetic drugs which are sufficiently chemically distinct to avoid the regulations but which remain pharmacologically effective. These are often more dangerous than the illegal drugs they replace. The other is to mine the accumulated cultural and medical repositories of herbal lore from around the world to find psychoactive plants which haven’t yet been banned. Species are suddenly raised from obscurity to become the latest rush.

Over recent years a variety of plants have gone through a process of initial global popularity followed by a clamp-down, usually once a death has been associated with their use or abuse (even indirectly). A wave of media attention, hysteria and misinformation typically leads to regulatory action long before the formal medical evidence begins to emerge. One of the most recent episodes was over Salvia divinorum which enjoyed brief popularity as the legal hallucinogen of choice among students, despite being inconsistent in its effects and often quite unpleasant. Wherever you’re reading this, it’s probably already been banned.

Most of these plants have a long history of safe and moderate intake by indigenous populations in the regions where they grow naturally, either for ritual or medical purposes. The same can be said of many of the more common drugs we all know of: opium, coca and cannabis have many traditional uses stretching back for thousands of years. The problems arise when they are taken out of their cultural contexts and used for purely recreational purposes. This is often combined with plant breeding to increase the content of their psychoactive ingredients or chemical treatments that enhance their potency or synthesise their active components (such as in the production of heroin). A relatively benign drug is transformed from its original form into something much more problematic.

The latest plant to emerge as a potential drug in Europe, though already banned in many places, is Mitragyna speciosa, more commonly known as kratom, and native to several countries in Southeast Asia. Here in Ireland its active ingredient has been designated a Schedule 1 drug** since 2017. Pre-emptive legislation in other countries is quickly catching up.

I will confess to not having heard of kratom before it became a Western health concern. This is probably true of most people outside Asia, but more surprising in my case given a long-standing interest in the ethnobotany of Southeast Asia and having lived in Malaysia. I had previously subscribed to the view expressed in most textbooks that natural painkillers were absent from the regional flora, an opinion confirmed through my own discussions with orang asal shamans***. This may be because kratom grows in drier areas than I’ve worked in; I’ve certainly never come across it in any surveys of traditional agricultural systems in Malaysia. Pain relief is one of the scarcest and most sought-after forms of medicine, so if kratom is effective then I’m now puzzled that it isn’t more widespread.

In its normal range kratom is used as a mild painkiller, stimulant and appetite suppressant in the same way as coca leaves have been used for thousands of years in South America. The dosage obtained from chewing leaves, or even from extracts prepared by traditional methods, is likely to be low. This is very different from its recent use in Western countries where higher dosages and combinations with other drugs (including alcohol and caffeine) are likely to both enhance its effects and increase its toxicity. It is also more commonly sold over the internet as a powder.

Nevertheless, a parallel increase in recreational use within its native range has also been reported, although as of 2016 no deaths had been associated with it. Recreational use also has a long history in Thailand, and habitual use in Malaysia has been known of since at least 1836. It is now thought to be the most popular illegal drug in south Thailand, though arguments continue over whether it ought to be regulated, despite the acknowledged risk of addiction. In this it mirrors the discussion over khat, a botanical stimulant widely used across Arabia and East Africa****. Where cultural associations are long-standing, a drug is seen as less threatening than those from overseas.

Along with its long use as a drug in its own right, kratom has also been used as a substitute for opium (when unavailable) or treatment for addiction. It also has a folk usage in treating hypertension. This points towards the potential for beneficial medical uses which would be delayed or unrealised if knee-jerk regulation prevents the necessary research from being conducted. Compare the situation with cannabis: its medical use in China goes back thousands of years, but the stigma associated with its use in the West (which we can partly blame on Marco Polo) delayed its decriminalisation. Cannabis remains illegal in many countries despite steadily accumulating evidence of its medical value.

I’m a firm believer that we should recognise, respect and learn from the botanical knowledge of other cultures. Banning traditional medicines (and even socially important recreational drugs) in their home countries on the basis of their abuse by people in the Global North is morally wrong. Excessive regulation also deprives us of many of the potential benefits which might come from a better understanding of these plants.

All this is not to diminish the real physical and mental damage caused by addiction, and the need to protect people from abuse and the associated social costs. But the urge to get high is nothing new, and the cultures that have formed alongside psychoactive plant chemicals, from morphine to mescaline, usually incorporated ways of controlling their use and appreciating them in safe moderation. In Tibetan regions where I’ve worked cannabis is a frequent garden plant, and teenagers enjoy a quiet rebellious smoke with little stigma attached, while adults eventually grow out of the phase. Our own immature explorations of ethnobotany need to learn that there’s much more to a plant than the titillation provided by its active ingredients.

 


 

UPDATE: after posting, @liana_chua pointed out this fascinating blog post by @PaulThung about the market-driven boom in kratom production in West Kalimantan, where it is known as puri. It’s a great reminder that what’s seen as a growing problem in the Global North can simultaneously be a development opportunity for people in deprived regions.

 


 

* I dislike the term ‘war on drugs’ because I don’t like the tendency to militarise the vocabulary surrounding complex and nuanced issues (see also ‘terror’). Also, as others have pointed out, war is dangerous enough without being on drugs as well.

** Schedule 1 covers drugs which have no accepted medicinal or scientific value. The classification in Ireland is therefore based on uses rather than the potential for harm or addiction. This can become a bit circular. For example, nicotine isn’t on the list, but its only medical usage is in treating nicotine addiction. Cannabis, however, is on the list.

*** Of course I hadn’t considered before now that this is something they might not want to talk about, given that kratom is regulated in Malaysia, where it is known as ketum, although attempts to ban it outright have not yet come to fruition. Still I would have expected to spot a large shrub in the Rubiaceae if they were growing it deliberately.

**** I’ve often chewed khat in countries where its use is tolerated, and it’s a great way to boost your mood and energy level during fieldwork. It is also addictive, of course.

 

Keep your enemies close to save the planet


This year Toronto Wolfpack join rugby’s Super League. But at what cost to the climate? (Rick Madonik/Toronto Star via Getty Images)

I’m a rugby fan. In the past I played for several amateur teams as I moved between cities for work. In all honesty I was never a particularly good player, but being keen and available is usually enough to secure a place in the squad and a guaranteed game in the reserves. It’s been over ten years since I picked up a ball though. Rugby is definitely a game for the young*. These days I meet my craving for rugby vicariously through supporting my local rugby union club and the rugby league side from where I grew up.

Over the last 30 years or so the game has changed immensely, and mostly for the better. Rugby union turned professional in 1995 while rugby league had begun to do so a century earlier (this was the original reason for the split between the two codes). Overall this has meant an increased profile, a more enjoyable experience for the fans, better protection for players and most of all an improved quality of the game.

Another trend began in the last few years though, which is that the sports became international at club level. In 2006 rugby league’s Super League ceased to be a UK-only affair with the arrival of French side Catalans Dragons. In the 2020 season it has been joined by Toronto Wolfpack from Canada. A league previously confined to a narrow corridor of northern England has suddenly become transatlantic. Meanwhile in Ireland my local rugby union side Munster now play in the elite PRO14 league. Originally the Celtic League and composed of sides from Ireland, Scotland and Wales, these days it takes in two clubs each from Italy and South Africa. In the southern hemisphere Super Rugby unites teams from Argentina to Japan. Hardly next-door neighbours.

Why should this matter? Surely it’s great to see some of the world’s best teams pitted against one another?

Watching the start of the rugby league season made me a little anxious. Toronto, being unable to play in Canada at this time of year (it’s still too cold), began their season with a heavy loss against Yorkshire side Castleford Tigers. Many of their fans were Canadians based in the UK and delighted to have a team from back home to support. But there was also a hard core of travelling fans. And what happens when UK sides start playing games in Canada: how many air miles will be racked up? Few fans can afford to make the trip, and certainly not often, but before now there was no reason even to consider it.

This internationalisation of sport at the club level is driving a new trend in long-haul leisure travel, whether it’s American Football games in London or South African rugby teams playing in Ireland. Entire teams, their support staff and a dedicated fan base are now making regular trips around the world for one-off fixtures.** These will inevitably involve more flights than would otherwise have occurred. At least if you’re playing teams in your own country you can usually take the train.

This has of course been going on with international-level sports for a long time, but there the frequency is much lower. Major tournaments only occur every few years and visiting national fans usually attend a couple of games, allowing for an economy of scale. Even in this arena there’s been an increase in travel; one of the reasons for the formation of multinational touring sides like the Lions was to spread the cost of long-distance trips among national associations. Now every country travels independently. Still, although the audiences for international fixtures are huge, most fans watch at home on the TV. Club fans usually don’t have that option.

In the face of a climate crisis all sectors of society have a part to play. Unnecessarily increasing travel emissions by creating trans-national sporting contests among teams with very local fan-bases strikes me as an unwelcome new direction. Commercialisation of the game has had many benefits but this is not one of them. Despite flygskam, growth of air travel has continued unabated, and international sporting fixtures are a contributing factor.

The most enjoyable, passionate and intense games are those between local rivals. Watching Munster face up to Leinster, or Warrington against St Helens***, is to me the pinnacle of each sport at club level. Best of all, we can watch them practically on our doorsteps. Keeping our sporting fixtures local is one small way in which we can reduce our impact on the planet. Sustainable rugby? Why not.

 


 

* Anyone who plays rugby past the age of 35 is either brilliant, crazy or no longer interested in the finer tactical points of the game.

** The chant of disgruntled fans suspecting the referee of bias will have to be “Did you bring him on the plane?”.

*** Especially when we win. Oh look, we did it again.