How to write a PhD thesis. Part I: Saving Time

My aim in this series is to explain the things that are often in plain sight but not always visible to the person who most needs to know them. Image by Randal Phoenix.

Amongst PhD students it’s unsurprising that the thesis becomes the subject of obsessive attention. The thesis is, after all, the one absolutely essential output of the program, the final product, the main focus of assessment. Everything else is nice-to-have but not actually required for graduation1.

There’s another point of view though, which is that the thesis is actually the least-important output you’ll produce. As soon as you’ve passed the viva it doesn’t matter. The chances are that no-one will ever read it again. The fact that you have a PhD implies that a thesis exists somewhere but your potential as a researcher will be measured in actual publications2. This is why, as I’ve argued elsewhere, the thesis only needs to be good enough to pass. It doesn’t have to be perfect. It’s not the defining statement of your career. You need a thesis; you don’t need a great thesis. Don’t create something that’s a joy to read when it’s only being sent to an audience of two people3.

This is why I counsel against the danger of thesis-gazing, which is when an inordinate amount of time is spent trying to craft the perfect thesis for your own satisfaction rather than directing effort towards the parts that will actually count. Focus instead on the elements that will have an impact on your future career, your chances of an easier ride in the viva, and the actual progress of your field as a whole. Crafting a thesis that flows in a cohesive narrative arc might be satisfying but it’s not necessary. A student who slings together a bunch of loosely-connected studies and sticks a cover on it, but who also gets several papers out at the same time, is going to have a better chance of landing a post-doc position. Your future employers will not read your thesis but they will read your CV and publication list.

The main idea I’d like to get across in this post, and the ones that follow, is the importance of marginal value. You have limited time, no matter what stage of your degree you’re at. Invest in the things that will make a difference now and in the future, and cut corners wherever you can get away with it. This will reduce the pressure you place on yourself.

Here are the four things you can probably save time on4:

  • Final discussion. Quite frankly, no-one cares about this. In this concluding chapter you aspire to draw all the threads of your various studies together. The aim is to convince the reader that it did all actually make sense, that you found something worthwhile, and that it has wider importance and implications. The truth is that your examiners have already made up their minds by the time they reach the discussion. A great discussion will not rescue a thesis if the data chapters have been below par. Likewise a bad discussion isn’t going to change their minds, and if they don’t like it then they can ask for amendments. Most likely is that they will run out of time before the viva, give it a skim read, and not worry too much. Leave this until last and run it off in a couple of days (no more than a week).
  • Appendices. A thesis chapter is a manuscript that needs to go on a diet. Publishing it requires you to remove all the fat: supplementary analyses, digressions on individual observations, excessive literature citations and endless waffle. It’s already too long. Then there are the other things you did on top, which probably took a lot of time, but which don’t quite fit into the chapter. So you think: let’s have an appendix. This appears to be a cost-free solution. It will also have absolutely no impact on the quality of your thesis or the outcome of the viva. If there’s something you can’t bear to leave out then you could have an appendix. Or you could put it in a shoebox and bury it in a bog where it has a higher chance of being found. Time spent preparing and formatting the appendix is generally wasted.
  • Rants. You’ve spent a long time thinking about this one thing. It’s been your whole life for the last few years. In the course of this obsession you have developed Opinions. And someone needs to hear them. Whether it’s that a major paper in the field is wrong, or a method is flawed, you’re about to unleash. Unfortunately only two people will read it and the internal examiner won’t care. The external will either disagree, which creates a problem in the viva, or perhaps already agree. Maybe, in the best case scenario, you might actually change their mind. Fantastic! That’s a lot of effort to shift one person’s opinion, and probably not worth it. Write an angry blog post instead or start an argument on Bluesky.
  • General introduction. This one is likely to spark some disagreement from other academics because writing a general introduction is an essential step in the grad student journey. But if the point of the general introduction in the thesis is only to explain and justify your research questions then it doesn’t need to be highly polished. Put the real effort into the chapter introductions because there’s a good chance of those being turned into manuscripts. The counter argument is that the general introduction is the first thing examiners read and it’s worth giving them a good impression. I do agree with this. Make the introduction coherent and authoritative, but it’s not going to make or break your thesis. If you miss something out then your examiners will tell you and it’s an easy correction to make.

I’ve been direct supervisor for a dozen completed PhDs and acted as tutor or advisor to countless others. Every single one said that they wanted to publish their general introduction as a review paper. Maybe this is one of those post-grad myths because I’ve never seen it happen. Instead students can spend an inordinate amount of time crafting what amounts to a lengthy attempt to demonstrate that they’ve read (and cited) everything of potential relevance to their PhD. It’s good experience but look at it in terms of return on effort. I’ve examined theses with finely crafted introductions and made no comments on them at all, then given the students a hard time because their data chapters were sub-standard. I’ve also read theses with cursory introductions followed by a series of self-contained manuscripts. The latter had a much easier ride.

In my next post I’ll move on to the things that are often under-valued by many students, but where you can reinvest the time saved by decreasing emphasis on the above. Or just take the weekend off, you probably deserve it.


1 There are some countries where a publication, or at least a submitted manuscript, is required before the PhD can be awarded.

2 Or, depending on your field, software tools on GitHub, patents, new species, DNA sequences… all sorts of things that might be picked up and used by others. Unlike your thesis.

3 My experience is with theses in the UK and Ireland, and I know that there is variation between systems, but in all cases the likely audience for a thesis is in the low single figures.

4 I’m going to put one caveat in here, which is that working on something unimportant can be treated as productive procrastination, or help build up momentum for the important writing to come. This can be true but can also be used as a convenient excuse. Instead of bulking up the thesis, why not write something that someone might actually choose to read?

Why you should (almost) never write a negative book review

Negative book reviews are great fun to read. There’s a dark delight to be found in a comprehensive take-down of a book, especially if you side with the reviewer. The Schadenfreude is even more delicious if you happen to dislike the book’s author. It’s a guilty pleasure but I’m not going to deny that it’s still a pleasure. Notable examples include the demolition job on Boris Johnson’s life of Churchill, the fierce rebuttal of Michael Behe’s misleading account of the impossibility of evolution, or the Amazon reviews of the worst photo album ever. You finish reading each of these feeling not only happier but wiser(1).

With all that in mind, I strongly believe that you should avoid ever writing or publishing a negative book review. I was a book reviews editor at a journal for around 10 years and of the hundreds of reviews I handled, only a smattering were negative. I regret publishing all of them and openly apologise for my role in bringing them into the literature(2). Years ago I wrote a negative review myself, for a different journal, and the editor made the shrewd decision not to publish it, for which I remain grateful.

My current opinion on negative reviews is shaped by being a book author myself, and recognising the amount of time, sacrifice and personal investment that goes into bringing a book into publication. Even a bad book. Even, quite frankly, an utterly irredeemable and worthless book. If you haven’t written a book yourself then you should pause before criticising anyone who has. They at least had the personal drive to create something and place it before the world.

Writing a bad review carries costs for the author, and also costs for you. On the author’s side, you are damaging their reputation and (potentially) their income(3). The more people read your review, the fewer are likely to buy the book, and the author will become known as having produced a bad book. If this is an early-career researcher, or their first book, the results can be devastating for a career. There is a name on the front of the book, it’s a real person, and your comments are inevitably directed at them. The greater your reputation and status, the more harmful your critiques become.

On your own side, beyond the time and energy required to write a bad review, there are further costs to you as the reviewer. You become someone who writes mean reviews, and we can all make judgements about the type of person who criticises others in public. Even if no-one challenges you openly, there’s a good chance that you will lose friends or reputation in the process. Whether or not I agreed with them about a particular review, I would think twice about collaborating with someone who was willing to write a negative book review (this is a decision I have acted upon).

There are features of a book which make it bad, at least to you as a reader, but which don’t deserve a bad review. These include that the book was poorly written, you disagree with the conclusions, you have a personal dislike of the author (however well-founded), or that there are mistakes on points of detail.

In all these cases, there’s a simple thing you can do, which is to not write a review. No-one is forcing you to. If you’ve been invited to review a book then you can decline, return any commission fee, and just walk away. Consider the sage advice of every grandmother that if you have nothing nice to say then you should say nothing. Over the years a few reviewers came back to me with variants on this. If the book is simply bad then there’s no point in giving it the attention that comes from a review. Just let it fade away, unread and unrecognised, amongst the thousands of books that are published across the world every single day.

There are a few exceptional cases where a negative book review might be warranted, so it’s not an absolute never, but there are specific and stringent conditions. These include:

  • The author is rich, famous and established, and well able to stand up for themselves. For example, I would have no qualms about criticising someone like Richard Dawkins in print. This could however be a reason not to write a bad review because people with deep pockets and reputations to defend can fight back, so proceed with due caution.
  • The book is catastrophically, dangerously wrong. Here the facts and evidence have to be absolutely clear, independently verifiable and iron-clad beyond being subject to interpretation or opinion. A book disputing the public health benefits of vaccination, or denying the negative effects of climate change, needs to be stamped on in any available outlet. Mistakes or differences of opinion don’t count; the required level is that of putting lives at risk.
  • The book has implications or conclusions that are themselves dangerous. This is slightly different to the above, as it accepts that the book might have contents that are defensible or at least open to interpretation. For example, the notorious book about human intelligence The Bell Curve contained data on IQ that were not themselves incorrect(4). The real problems were the assumptions made about what IQ actually measures, ignorance of the enormous amount of prejudice embedded in the test, and the lessons that were drawn from an analysis of inherently biased data. The problem with eugenics is not so much that it gets the genetics wrong, it’s that genetic arguments are used to advance causes which are themselves morally reprehensible(5).

Even if one of these applies, ask yourself a few questions before going ahead. Are you the right person to be writing the review, and can your status or credentials protect you against any possible backlash? Do you have a vested interest, meaning your case could be undermined by accusations of bias? Might there be any legal consequences to the criticisms you are making if they could be construed as defamatory or damaging to the author’s livelihood? Are there other, more informal avenues for responding to this book which won’t attract the same repercussions? Most of all, are you completely sure that you are right and able to defend every word in your review? Even if all these things are true, the possible costs of being behind a negative review might outweigh any benefits that come from its publication.

If you do find yourself desperate to write a negative book review then my advice is to follow the same procedure as for angry letters. Take the time to write your best, most thorough and damning indictment of the book. Then file it away and move on with the rest of your life. It’s not worth it.


(1) There are some which verge on the spiteful, such as this recent takedown of Ocean Vuong, which are pure entertainment.

(2) A general apology is largely performative so if you’re reading this and feel like you deserve a personal one for something that I’ve published then I am very willing to do so and to make amends if possible.

(3) Although not as much as you might think. Most authors are lucky to receive 10% of the cover charge of a full-price book, and often only after various fees have been paid off. Most of us don’t write books for the money. Nor for the fame. Or the career benefits. In fact, writing books is one of the least well-remunerated things you can choose to do with your professional life.

(4) Please do not read this as in any sense a defence of The Bell Curve, a disgraceful and appalling tract used as a crutch by racial eugenicists. The issue isn’t whether you can measure and analyse differences in IQ, because obviously you can, it’s whether or not you should.

(5) This is not to say that eugenic theory didn’t misunderstand or misrepresent population genetics, so much as that its early proponents included enough elements of fact to appear reasonable and defensible. Hindsight is a hanging judge but in the 1950s the evidence was more mixed and the space for what was considered acceptable interpretation was broader. There’s no excuse for it now.

How to write a conference abstract

In the northern hemisphere the leaves have opened on the trees, the migrants have returned, and the thoughts of researchers turn to which conferences they plan to attend. This is therefore often when we begin submitting abstracts to apply to give talks at meetings(1). It’s usually a competitive process — there are many more conference delegates than speaking slots — so some form of selection has to take place.

Evidence that I also give talks occasionally, although a lot less frequently than I used to, mainly because I’m not the one directly doing the research any more.

When selecting a title and writing the abstract for your submission, it’s essential to keep the audience in mind, and the primary audience is not who you might first assume. The one that matters is the scientific committee of the conference who will make the decisions over who gets to speak. It’s not, paradoxically, the actual audience of people who will attend your talk if you eventually get to the conference. That’s a second-order problem; first you need to get through the selection process.

Broadly there are three situations you might find yourself in. If you’ve been invited to submit an abstract then you’re under less pressure, but you still need to make sure that your proposed talk aligns with what the organisers were expecting from you(2). More commonly you’re either presenting work that has already been published and using the conference to promote it, or else you’re presenting something that you’re still working on and either using the conference as a testing ground or as a self-imposed deadline for completing the work. It’s the last of these that’s the trickiest to navigate.

If you already have a published paper (or submitted manuscript) then your job is relatively easy because you might only need a few tweaks to meet the word limit or to match the conference theme. More often though you’re writing something from scratch, and this is where the advice becomes more useful. Why listen to me? I’ve sat on lots of academic committees for conferences, including one in the last month, so I have a fair idea of how decisions on who gets to speak are made.

  1. Align your talk with the theme, whether of the conference overall or the specific session you’re applying for. This can often be the deciding factor in abstract selection. For example, if you’re hoping to attend ATBC 2025 in Oaxaca, it’s necessary but not sufficient to be working in tropical biology and conservation. The theme of ‘Merging Diverse Actors, Approaches and Local Knowledge’ means that talks incorporating those features have a higher chance of success.
  2. Say something specific in your title. This is also a general rule for manuscripts, but for talks it indicates that you have a clear idea of your message. Instead of ‘Bird communities in Honduran forests’, go for ‘Habitat specialist birds decline over 20 years in lowland forests of Honduras’. Try to include a directional statement: things go up, or down, or move in a certain direction. Vague titles give reviewers the impression that you haven’t worked out your story yet and they’re usually not inclined to wait and find out.
  3. Don’t be too general. Unless you’re giving a plenary lecture, in which case you’re not reading this blog post, a conference talk is not the time to present your broad thoughts on the field of study. If you’re applying to SilviLaser 2025 then a talk entitled ‘Applications of remote sensing to forests’ is not going to be accepted because that’s the research focus of literally everyone at the meeting. Your 15-minute talk isn’t going to provide them with any fresh insight.
  4. Don’t be too specialised either though. This is a tricky balance to strike, but from the organisers’ perspective they are looking for a talk of sufficiently broad appeal that people will come to the session, or at least stay once they’re already there. Every entomologist at Ento2025 loves insects but a talk on the length of the stridulatory file of field crickets isn’t going to bring in the crowds unless you make it sing for them. Why should people care about your little corner of research? Every piece of scientific research is specific but placing it in context makes it relevant.
  5. Include quantitative information in your results. This might be the direction and magnitude of effects in an experimental study, the number of records in an observational study, or the counts in a meta-analysis. As with a specific title, this increases the confidence of the panel that you know what you’re going to say. These numbers can be amended before the conference takes place but they still have to sound reasonable.

All the above are ways to try to read the mind of the academic committee for the conference, not the attendees. Why doesn’t the actual audience matter as much for your abstract? Well, here’s the secret: no-one reads conference abstracts(3). The vast majority of conference delegates pick which session to go to based on the overall theme, or might select a few talks based on their titles, but almost no-one is reading all the conference abstracts to decide exactly where to direct their attendance.

Once your talk is accepted this also takes some of the pressure off. No-one is going to hold you to account on whether the content of your talk exactly matches the submitted abstract. Everyone accepts that things can change — new data comes in, the analysis doesn’t quite pan out as expected, or a completely new interpretation might occur to you in the interim. This is all fine and part of normal science; responding to new evidence is a strength, not a weakness or admission of failure.

One thing that matters much less than many applicants expect is the seniority of the speaker. If you meet the minimum level for the meeting then you have as much chance as anyone. I’ve seen famous senior researchers declined by panels because they submitted sloppy and complacent abstracts. Seniority can also count against you because we all know by now whether Big Name Scientist is a good speaker or not and we might have heard it all before. Given the choice I will always pick a promising ECR over a well-polished greatest hits catalogue.

By and large scientific committees are on your side. They’re looking to put together the most exciting and engaging conference for people who are passionate about the same things that you are. Make them believe in you and then in a few months you could find yourself in a pleasant destination getting to display your enthusiasm on the stage.


(1) In this post I focus on talks because it’s where competition for slots is most intense. It’s also usually necessary to submit an abstract to present a poster but there are seldom the same constraints on numbers, as can be seen from the common practice of offering rejected speakers the opportunity to present a poster instead. I’m not going to get into which is better; both have their merits and it depends to some extent on the person.

(2) This comes from experience: I’ve had an invited conference talk declined after submitting an abstract which in retrospect was what I wanted to talk about and not what they were looking for.

(3) Unless you’re in a field where conference abstracts are published and recognised, in which case no-one reads them until after the conference has already taken place. In ecology they’re not considered to be publications.

How to be an external examiner for a degree course

An arbitrarily selected university at which I was once an external examiner. Panoramic photo taken by mintchocicecream.

Acting as an external examiner for a degree course is one of those roles for which academics typically receive no guidance or training. This is quite striking given how much authority an external examiner has over the degrees awarded to students. The experience is very different from being the external examiner for a PhD, for which there is plenty of advice out there, along with some profoundly differing opinions. Here however I’m referring to the role of assessing a taught undergraduate or Masters-level course at another university. This post is very much a personal perspective because even over the course of a long career any one individual only gains a limited experience, and mine has been entirely within Europe, so please add or qualify in the comments.

How do you become an external examiner?

Opportunities for external examiners are seldom advertised externally, which means they are usually allocated through networks and personal connections rather than a transparent process. That doesn’t overly concern me because finding external examiners is difficult and I’d rather not place any more barriers in the way of course directors. I doubt that an open recruitment would attract many candidates, we would spend a long time on it, and I suspect in most cases we’d fall back on calling a friend anyway. It’s also a demanding job with little remuneration, as will become clear lower down, so really it’s a favour you do for friends and colleagues more than an opportunity. That being said, here are the main ways to get involved.

  • The most common method is being in the wrong place at the wrong time, which usually means at a conference bar at a moment of weakness. One moment you’re sharing a drink with a colleague, the next they’re casually mentioning that they have a vacancy for an external, and by the time they’ve bought the next round you’ve signed up for the next three years.
  • Related to this is to be recommended by someone else, usually by a more experienced academic who was approached at the conference bar and still had the wherewithal to say “No, but I know a great person you should try.”
  • Ask during your development or performance review. Letting your line manager know that you’re on the lookout for opportunities means they can pass any on to you, as well as signalling that you’re trying to broaden your external responsibilities.
  • Let the world know! I’ve never seen anyone put an open call out on social media but if you’re really keen then I doubt you’d be left waiting for long.

What does the role involve?

There doesn’t seem to be a standard list of expectations, or at least every institution for which I’ve acted as examiner has asked me to do something slightly different. Some combination of the following are usually involved though. Typically it involves a few days of preparation beforehand and one or two days at the university itself per year.

  • Checking exam papers is always an important element. Spotting mistakes is part of this, but more important is to ensure that the questions are clear and easily comprehensible. A module convenor who is deep into the material can sometimes need a little help to make sure that others will understand things the same way that they do. It’s also necessary to check that the expectations are reasonable for the level of the students and the time available.
  • These days vivas are no longer standard for all undergraduate students but in some places remain expected for taught post-graduates. Often a subset of the cohort is invited to a viva. These might be useful to make decisions about borderline candidates, find out more about a particular student who faced personal difficulties during their degree, or just to check that overall standards match up with your expectations of strong and weak students. I find them very useful for gathering some informal feedback about the content and delivery of the course, although this can also be achieved by holding open meetings or course surveys.
  • Review the exam scripts or coursework. There is always far more of this than you can cover in the time available to you so this is necessarily selective. Sometimes you might take a deep dive into a module with an unusual mark distribution, or track the performance of a particular student who needs special attention. Other times you might just pick a box of papers at random. Your role isn’t to check or amend the marks, it’s more to make sure that they align with the university’s own grade descriptions.
  • Attend the final exam board. At this your main responsibility is to make sure that the university applies their own rules consistently and correctly so that candidates get the degree classifications that they deserve. Sometimes the rules are bizarre and hard to justify but it’s not within your power to fix them or propose different ones. All you can do is make sure that the university is doing what they claim to.
  • Submit a final report at the end of the academic year. This is normally the trigger that allows you to get paid. In this you are asked to respond to a set of questions which is determined by the university, usually related to the overall standards and practices you’ve seen on the course. You can ask for reports from previous examiners to see whether there are long-standing issues and how they have been addressed in the past.
  • To clear up one common misconception, you’re not there to evaluate the accuracy of module content, even in areas of your expertise. You have to respect the academic independence of your colleagues to teach what they want in the manner that they choose. I’m not going to declare that r/K selection is out-of-date, or that CSR theory is an abomination against reason, even if I believe both to be the case1.

The term of an external examiner is usually three years, sometimes with an opportunity to extend for another year, but not much longer. The principle of changing externals to get different insights is an important one. In my experience you spend the first year just trying to understand the course and it takes until the second or third year before you can start making properly informed observations and recommendations.

What are the benefits?

  • You get paid! How much you receive varies widely between universities but even at best it in no way compensates you for the actual time spent. In other words, don’t do it for the money. A nominal fee does at least help you to feel that your expertise is appreciated.
  • You get to visit another university. This is a much bigger benefit than the pay. If they’re doing things properly and haven’t suffered from ruinous budget cuts (hello to my friends in UK universities) then you should be offered the costs of travel, a stay at a hotel and a nice meal out at their expense. Perhaps this can lead to new collaborations or connections that otherwise might not have developed.
  • It exposes you to lots of new ideas. Some of these might be approaches to teaching that you wouldn’t have otherwise considered. Alternatively you might learn from the mistakes made by others. Seeing how other people and institutions deal with the challenges of teaching and supporting students can help you to improve practices at your own institution.
  • There’s also a great deal of reassurance that comes from finding out that a different set of colleagues in another city or country are struggling with the same issues that you are. Whether it’s the use of LLMs in coursework, the difficulties involved in teaching to students with English as a second language, or dealing with ever-declining lecture attendance, you might not find the answers but you can come away with the feeling that at least it’s not just you.
  • People have to listen to you. There are very few moments in an academic career when you say something and your audience are obliged to hear you out, respect what you say, and then respond to it. Ultimately you submit a report with some recommendations and these need to be acted upon (or if not then provide a justification). Aside from boosting your ego it’s a rare opportunity to have your expertise lead directly to changes which are hopefully beneficial to students or the teaching of your discipline. This can be very satisfying.

What are the biggest challenges?

  • Top of the list for me are virtual learning environments (VLEs). Most modern universities use one of Moodle (relatively good), Canvas (ok once you understand the layout) or Blackboard (an offence to the dignity of all who encounter it). Every university adopts different policies for setting up modules, which are interpreted in wildly variable ways by individual academics. I often spend around 50% of my time as an examiner just trying to access pages and find the information I need.
  • IT security is simply not designed with external examiners in mind. Normally you need a new institutional email account, which grants you access to their systems and which, annoyingly, is the only address they ever use to contact you, including when setting the password needed to get onto those systems in the first place. Expect many headaches.
  • Moving on to actual pedagogical issues, the mixed approaches to assessment and evaluation you observe can be challenging to understand. Over time I've become more accepting of this, since any single assessment is unlikely to shape the outcome of a whole degree that aggregates across many modules. Nevertheless it continues to surprise me what some teachers consider appropriate ways to test or reward student achievement.

I hope this helps to clarify a little of what’s going on behind the scenes of a task that is sometimes quite mysterious. I’ve enjoyed all my external examiner positions2 and have gained a great deal from them professionally. If you get asked then you can feel flattered and I hope your experiences are as positive as mine.


1 Go on, try me.

2 Funnily enough I don’t have one at the moment. No, I’m not looking.

Feeding the world without breaking the planet

The Little Shepherd Boy by Carlo Dalgas (1840). Held at the National Museum of Denmark. Image through Wikimedia Commons.

One of the greatest threats to life on Earth is poetry. George Monbiot, in Regenesis (2022).

Our pastoral stereotypes conjure up a picture of a shepherd boy patiently tending to his flocks, probably sat cross-legged on a rock, playing a rough-hewn flute, while flower-dappled meadows stretch into the distance. This trope can be traced back to the Classical era, bolstered by later Christian iconography, but is a long way from the prosaic reality of livestock farming. Western culture has fixed upon pastoralism as the epitome of harmonious living with nature when it is anything but. As George Monbiot points out early in his latest book, Regenesis, the real population challenge faced by the world today is of livestock numbers, not humans.

The goal of this book is to find a path towards an agricultural system that is simultaneously more robust and sustainable, providing humanity with sufficient nutritious and affordable food without trashing the planet in the process. That he has named the problem and engaged with it so constructively is impressive, and I recommend that everyone reads this book. Personally I'm not wholly convinced by his solution, although it would be a lot to ask to cut this Gordian knot all at once. Despite my misgivings there is a lot to support and recommend.

His solution for the livestock crisis can be distilled into three parts. The first is that our diets need to rely far less heavily on animal products. Not everyone needs to follow Monbiot's own path and adopt veganism (I certainly haven't), though there's little doubt that it would be beneficial for the health of the planet and likely ourselves too. Excluding meat, eggs and dairy products is not essential, but we certainly require far less of them than we currently consume, and First World levels of demand cannot be extended to the whole human population without running out of land on which to produce them.

In one of his most strikingly counter-intuitive arguments, Monbiot provides convincing evidence that extensive, low-intensity or organic approaches to livestock production can only have a limited role in future food production. He is surely right here insofar as they inevitably require greater land areas, eroding natural habitats still further. If the outputs are only premier foods for elite consumers then they will do little to resolve the coming food crisis.

Rather than becoming an evangelical vegan, a position likely to be either mocked or ignored, Monbiot recognises the important cultural role played by protein-based foods and seeks out alternatives. He finds great hope in fungal fermentation. Meat substitutes are now becoming commercially available and increasingly accepted by a public who are at least curious and, if price becomes a factor, will likely embrace them willingly.

This is the first area where I have some scepticism. While it is certainly true that meat substitutes can effectively replace (and even be superior to) low-grade, mass-produced meat products such as burgers, mince or chicken nuggets, this is a low bar to reach. These foods are valued for their lack of inherent flavour and ability to absorb other ingredients; that they are derived from meat is almost incidental. The attachment to a real roast chicken, sirloin steak or mackerel is going to be harder to break. Still, replacing the large and growing fraction of the global food market made up effectively of interchangeable protein lumps would be a valuable service.

Then, however, comes the question of who produces it. Monbiot conjures an idealistic vision of small-scale, distributed technologies that place protein manufacture into the hands of communities throughout the world. Perhaps your protein lumps could be picked up on your local high street next door to the baker or costermonger. More likely, however, is that economies of scale will mean that such enterprises will be undercut by larger industrial manufacturers. This runs the risk of reducing the resilience of global food systems, reproducing one of the crises that Monbiot identifies in our current situation. Factories under the control of small numbers of corporations, investors or governments become sources of inequity and vulnerable infrastructure at times of conflict. I fear that we could simply end up transferring systemic risk from one limited set of power-brokers to another.

The book then turns to arable production. A staggering fraction of the world’s crops are used to feed livestock, and one of the immediate benefits of a reduction in animal consumption would be the freeing up of large amounts of land which could then be used to feed people, or turned over to nature. This is an easy win for conservation and humanity alike, and therefore hard to argue with.

Nevertheless, arable crops remain the major source of calories for most of humanity, resting upon high-intensity production of a few staple grains. We will need to continue to grow crops even in the absence of livestock. Once again Monbiot directs some unexpected friendly fire in the direction of those who advocate for various forms of low-intensity agriculture such as high-nature-value farming, organic production or no-till systems. These are all more beneficial to nature and more sustainable than conventional agriculture, but share the common weakness that they are not currently profitable (nor likely to be under any plausible model), and increasing their scale would only reproduce the problem of extensive agriculture. Less bad farming is still farming. I agree with him here. There is a niche market for the expensive products of agro-ecological farms but they won't feed the world.

Monbiot places his hope in another innovation: perennial crops. What if, instead of the wasteful and polluting cycle of planting, harvesting and ploughing, we could grow crops that last for multiple years? Intensive arable cultivation will always be necessary, but through natural weed suppression and reduced soil erosion it could come closer to being sustainable.

For me this is the part of the story that is least plausible. The most likely problem with perennial crops is perennial weeds. For sure they will suppress the annual weeds whose control requires the destructive application of herbicides and regular ploughing. But long-lived crops will offer opportunities for longer-lived weeds to invade, particularly woody plants, which are much harder for agricultural machinery to cope with. It only takes a small opening in a field of perennial crops, a wet hollow or some blow-down from a strong wind, and it will quickly be filled with a birch sapling. I hope that the trials and expanded production that Monbiot envisages will still take place but I can foresee many hurdles in the way of perennial crops as a panacea.

Alongside these practical considerations there is one notable silence that challenges Monbiot's remedy. He implicitly assumes, by necessity, that the human population is a fixed term in the equation. It is of course possible to feed all the people currently on the planet on less land than we currently use, and we could do so more fairly and effectively using the approaches he advocates. But can we assume that falling birth rates and the demographic transition will keep the problem within manageable parameters?

Moving into this area is fraught with moral peril and I don’t blame Monbiot for evading the issue. Personally I refuse to be drawn on the question of what a correct or sustainable human population might be. The answer depends on assumptions or ethical principles that are not widely agreed upon and all of which, once allowed to play out, lead to problematic outcomes. It’s not a question that science alone can answer. Nevertheless there is a simple truth which is that a growing population will, on average, lead to greater impacts, even if temporarily dampened by reduced inequality of consumption. A shrinking population raises further questions as to whose numbers are declining and why. The process is unlikely to be consensual and more plausibly involves mass suffering. Such are the grim repercussions of getting the equation wrong.

The danger of treating human population as a static term is well-illustrated by the example of the Green Revolution of the 1960s, when new crop strains and production techniques boosted yields worldwide. If the global population had remained at 1960 levels (around 3 billion people) then we would already be living in Monbiot’s world, one in which intensive agriculture could amply meet the needs of the majority while leaving land aside for nature. Understanding why this did not happen is the key to making sure the next agricultural revolution doesn’t amplify the results of the last. Would we merely defer the problem for another two generations?

Despite my concerns, I am happy to promote Monbiot’s solutions because they are undoubtedly better than the status quo and would buy us time to collectively agree upon what a sustainable future for the planet looks like. We can only begin to solve a problem once we dare to name it, and Monbiot has made every effort to consult not only the scientific evidence but the practical experience of the farmers who know what it takes to turn dirt into food. There is no silver bullet, only hard choices, and it is our responsibility to make them in the fairest, most equitable way we can.

The environmental impacts of music

Was there ever a book that you put off reading because you knew that it would change your worldview but you weren’t quite ready for it? Several years ago I bought Decomposed: The Political Ecology of Music (Kyle Devine), then guiltily abandoned it in my heap of unread books. This spring I finally overcame my apprehension and began.

Is this the sign of a misspent youth, or of misspent income? Either way I’ve been hauling these records around for decades now. This is only a fraction of the collection. I have to distribute them to spread the weight after causing floors to sag in our old house.

Why was I so concerned? I am professionally an ecologist, and outside work a music obsessive. As a former small-time DJ (both club and radio) I’ve accumulated a substantial and likely valuable record collection which started in earnest in the early 90s and has grown ever since. Many of my 90s dance 12″s were given away when I had no further use for them(1) but for sentimental reasons the LPs have been retained through multiple house moves. I didn’t stop buying vinyl records in the CD age (I own both) and continued to do so until now. The so-called vinyl revival, of which there have actually been several, had no impact on my buying habits. I estimate that I spent roughly £1000 on music every year for about 25 years. I still daydream of one day opening a record shop or starting a label.

In the back of my mind I knew, as most surely do, that vinyl is a petrochemical product. By way of self-reassurance I had assumed that it was a minor side-product, of limited environmental impact by itself. In the past I've been happy to cut ethically problematic products such as tiger prawns and vanilla out of my life, and to switch to an electric car, but vinyl records were too bound up in my identity. Buying records is what I do, it's who I am(2).

The first thing this book made me do was to critically reflect on how record-buying as a hobby is merely an extension of the central capitalist drive to sell us things we don’t need, particularly petroleum products. Audiophile claims about the superior sound quality are tenuous(3) and, outside a few minority subcultures, almost all new music can now be bought in alternative formats (including digital). This leaves my preference attached solely to the ritual of the needle drop, a habit which connects me to my analogue childhood, a youth spent in sweaty clubs, and a lifetime of crate-digging in basement shops with fellow vinyl junkies. We love the feel of vinyl because we’ve been conditioned by our culture to do so. Is that reason enough?

The next discovery, perhaps unsurprising, is the degree to which my hobby is anything other than harmless. One point that Devine makes forcefully is that the music industry constitutes much more than the commonly recognised axis of artists, agents, producers, distributors and retailers. As a fundamentally material product, music relies on industries with far-reaching impacts. This all adds up to a hefty carbon and environmental footprint, both directly and through the activities it supports.

Even if you overlook the environmental impacts of oil extraction and transport, the industrial production of petrochemicals is inevitably polluting with harmful consequences for both human health and the natural world. A large proportion of vinyl for the US market was formerly produced at Keysor-Century‘s factory just outside LA, a plant which became notorious for breaches of environmental standards, and has left a legacy of contamination. Now the vinyl used in records is more likely to be made elsewhere, particularly in Thailand, where oversight remains less stringent.

The advantage of vinyl is its durability and resistance to decay. The very features that make it a wonderful medium for the long-term storage of music are simultaneously its downsides once its useful life has come to an end. Some records go back into second-hand circulation or are recycled. Most end up as landfill. There they are likely to remain for millennia, leaching the products of their slow breakdown into soil and water.

Alas, there is no simple solution. Although digital music carries the promise of separation from petrocapitalism, Devine points out that it provides a classic example of Jevons' paradox, whereby increased efficiency tends instead to increase total consumption. We now buy an array of material electronic products to listen to our immaterial music files, while the physical infrastructure of data storage, processing and transmission is largely hidden from view. The music itself may be increasingly digital but the impacts have simply moved elsewhere and capitalism has found a way of selling us new stuff.

In a particularly telling set of figures towards the end of the book, Devine pieces together the fragmentary and uncertain quantitative evidence to examine the environmental impact of the global music industry at the peaks of shellac, vinyl and CD sales, then compares them to data from 2016 including digital music. There is half of a good news story: the mass of plastics involved in the production of music is falling. On the other hand, thanks to the popularity of streaming services, the energy costs, and therefore the carbon emissions, are rapidly rising.

The book builds a case for what it calls ‘the slow violence of music’. While its consequences are less immediately obvious than industries such as mining, or have lower aggregate impact than activities like transport, it is linked to both of these and causes its own separate pathologies. It is tragically ironic that an activity so closely connected to many protest and counter-cultural movements is itself so inextricably entwined with the same forces they seek to oppose. Pick up a physical copy of Joni Mitchell’s Big Yellow Taxi and consider whether pressing it on a 7″ plastic disk manufactured from petrochemicals detracts from its intended message.

What then is the most ethical and environmentally responsible way to buy music? The obvious answer, as with all commodities, is of course to buy less. Beyond that, and assuming that you still want to support musicians and hear new music, my own best guess is that downloading and hosting music locally is the least harmful. This has the advantage, if you shop directly through record labels or hosting sites such as Bandcamp, that the producer receives a decent fraction of the purchase price, whereas streaming services only generate paltry revenue for musicians(4). You get to own the files and use them as you see fit without creating hidden energy costs.

So here’s my pledge. I’m going to give up buying new vinyl records and switch entirely to digital downloads. I have no qualms about picking up second hand records(5) but I won’t add to the existing problem. This habit will be harder for me to break than giving up smoking, but my conscience can’t maintain this blind spot any longer. Yes, it’s only a small thing, and very much a First World Problem, but we all need to start cutting back and I can’t pretend that my lifelong petrochemical addiction is necessary.


(1) Some might be shocked that I casually gave away large piles of records. Rest assured, the majority of 90s dance music was absolutely dreadful, and no-one wants to hear it any more, even ironically. The records were worthless. Some things are best forgotten.

(2) I hesitate to call myself a collector because my purchases are not linked to any concern about the value (present or future) of the records, nor to any completist accumulation of a particular genre or artist. I only buy what I want to listen to.

(3) You genuinely can hear the difference. I'll happily prove it to anyone who has doubts. But achieving this requires spending a lot of money on equipment and a well-pressed record made with high-quality vinyl. For the overwhelming majority of casual listeners there are only downsides if sound quality is what you actually care about, while the audiophiles among us are just spending lots of money on yet more unnecessary kit.

(4) I’ve heard it said that total royalties from physical record sales exceed those from streaming services, even at a time when vinyl is supposedly a legacy format. No wonder artists still want to release them.

(5) You can take many things but please leave me the pleasure of discovering a forgotten Crispy Ambulance 7″ in a dusty box at the back of a second-hand record shop.

Why it’s good to fall flat on your face as a teacher

I was fortunate enough to be taught to play rugby by Tosh Askew, one of the great youth coaches of the English game. At the time he was reaching the end of his playing career with Liverpool St Helens, and later went on to coach a highly successful England U19 side, laying the foundations for a generation of internationals who became a leading force on the world stage. Long before that he was standing in the rain shouting at groups of disorganised and reluctant schoolchildren, one of whom was me.

A reconstruction of good tackle form from BBC Sport. In my mind this is how it happened, but I’m sure that reality was very different.

Tosh was a teacher who didn’t need to rely on discipline or coercion to get his charges in line, even while out in the mud on a cold winter afternoon*. His physical presence alone was terrifying enough. Throughout my later years playing amateur club rugby I could still hear his voice in the back of my mind booming “Run straight Eichhorn!” It’s there to this day, over 30 years later. That wasn’t why he was such a great teacher though.

One session sticks in my mind during which we were being drilled in attacking and defensive line play. I can only have been 12 or 13 years old and at that moment I was on the defending side. Tosh, in his attempts to impose some order on the attacking group, had picked up the ball and was directing their movements. So it was that I found myself, a scrawny and bookish young lad, facing the prospect of a large, muscular man heading in my general direction. I did what any self-respecting rugby player would do in such circumstances. I went for his knees.

Moments later, to my great surprise, I found myself on the ground clutching a pair of legs, with Tosh also in the dirt, having off-loaded the ball on the way down. Play immediately stopped as all the other boys paused to take in the scene. He turned and looked at me.

“What are you doing Eichhorn?”

“Tackling the ball carrier Sir”, I responded meekly.

“Very good. Play on!”

No more was said about it and the session resumed. To this day I have no idea whether he went down deliberately to salvage my pride, or tripped over me, or was just trying to make sure that I didn’t get hurt in the process. I had no business bringing down a man of his size and strength.

This incident provided me with an immediate, if poorly-deserved, confidence boost. In the eyes of my peers it gave me a certain cachet: I had taken down Tosh! It even featured on my school report later that term. Why am I still dwelling on this minor incident, three decades on? Only because I’ve learnt a different lesson from it, which is the value as a teacher of allowing yourself to take a very public fall in front of your students.

Sometimes as teachers a student will step up and tell us that we’re wrong. In such circumstances the instinct is often to push back. You might be adamant that you’re correct, or else feel that your authority in class depends on maintaining your superior status. I recommend trying something different: let them take you down. Clearly, deliberately, so that everyone can see it.

It doesn’t happen to me often, or indeed as often as I would like, but sometimes a student will correct me on the identification of a species, or provide a counter-example that conflicts with one of my points. In the early days of teaching I probably would have reacted defensively, reflecting my own insecurity. Later on I’d have thought about teachable moments, maybe inviting the rest of the class to respond and seeing if I could turn it into a discussion. The latter approach is great if it works, but can also end up being a means of pitting students against one another, and places your initially brave student in the firing line. They will all think twice about speaking up again.

What I’ve learnt from that tackle, or rather from its aftermath, is that sometimes as a teacher you should allow yourself to be taken down. Any loss to your authority will be more than offset by the gains for your student. They can walk away buoyed with the knowledge that they got one over on you, if only this once, and that however terrifying it might seem in the moment, they can actually do whatever it is that you’re trying to teach them, whether it’s a physical tackle or demonstrating some critical thinking. Their confidence is worth far more than your pride.


* I should probably make it clear at this point that although he was known as Tosh to the students, we would never have dared call him that to his face. It was ‘Sir’ or Mr Askew. If I ran into him again today then I’d still feel wary about using his first name.

Is ecology really more important than ever?

Mock poster based on a real review of Baxter State Park in Maine, created by @ambershares_ for her @subparparks series. If you like it then buy the book or a range of merchandise. Lest it need pointing out this is a humorous take, not the actual views of the artist.

I’ve grown weary of the repeated assertion, expressed in journal editorials, society newsletters and conference promotions, that ‘ecology has never been more important’. Is this really true? And even if it is, does it provide a strong case that everyone else should care about it? I think we should retire the phrase and instead seek to make more direct, positive statements about the value of ecology*.

My scepticism arises from the observation that this rhetorical trope is by no means restricted to ecology. The same justification appears repeatedly in other fields, both inside and outside academia, diluting its impact considerably. I suspect that every community of interest could make a case that its own subject, from radiology to real ale to Renaissance poetry, is now more important than ever. If you’ve seen the phrase used in another context and thought “So what?” then it’s very likely that non-ecologists are thinking the same thing when we use it, and fellow ecologists hardly need persuading.

If everything really is becoming more important then this might be caused by two broader trends. One is an ever-expanding, educated and increasingly connected global population, meaning that for any topic there are almost bound to be more people taking an interest than before. At the same time, political, social and technological change continues to intensify, threatening to eclipse or eradicate many of the things we care about, whether it’s butterflies or Brutalist architecture. Neither of these patterns makes a strong case for ecology in particular.

If everything is increasing in importance then the absolute increase in the value of ecology (perhaps greater than before) is less relevant than its relative value. Can we say that ecology is more important than public health, or economic inequality, or agricultural production? Phrasing it in this way makes the original statement appear even more nonsensical because we are ranking the incomparable**.

A useful rhetorical approach is to argue from the opposite position. Could it actually be the case that ecology is less important? Are our claims merely attempts to draw attention to something we wish people would care about as much as we do?

The truth is that for a large proportion of people, their direct dependence on natural systems is decreasing. On an individual level, ecology is surely most important of all for hunter-gatherers, whose entire survival depends on the vagaries and vicissitudes of natural forces. I have worked with shifting agriculturalists, and I know of no other people whose understanding of their environment, formed through careful and systematic observation, is as great as theirs. Even farmers in the developed world retain a close connection with nature. I view all of them as ecologists in one form or another.

Contrast our increasingly urbanised, detached species, and for the most part it is possible to live our lives without recognising our dependence on nature. When we do encounter the living world it is often through the managed conditions of parks and gardens, and we are as likely to be repelled by the intrusions of uncontrolled nature (wasps and weeds) as to be delighted by them. Even if ecological processes underpin many of the services we require, our direct needs are often met from systems that are heavily managed, sanitised and shifted a long way from any natural baseline.

Viewed from this perspective, the problem isn’t that ecology isn’t becoming more important, it’s that to a large proportion of the people on the planet (increasing in both absolute and relative terms) it is becoming less obviously relevant. We recognise this phenomenon in issues such as plant blindness. A natural world that is not encountered or interacted with is difficult to muster much enthusiasm for. It’s not as important to people, even if it’s important for people.

This lack of connection can remain true even while nature documentaries are among the most-watched broadcasts on television. This is because they often editorially eradicate humans through the use of careful camera angles and choice of filming locations. Nature is presented as something pure and detached from humans; when a human does appear it is often in order to foreground the emotional response of the presenter. This is nature as spectacle, not as lived reality. We are not truly immersed and connected with it.

But what about the bigger picture? It is surely the case that ecology is central to solving many of the grand challenges that face humanity: climate change, collapsing biodiversity, feeding a growing population. It is in facing these problems that we can make the strongest argument for why ecology truly matters. This also makes me uncomfortable though because it frames ecology as a crisis discipline, only worthy of attention because things are going so badly wrong. Surely we can all believe in a more positive vision.

This then is the crux of the problem: ecology is important to our species collectively, even while it is becoming less directly important to us individually. Many believe that there is a connection between the two, and that by providing individuals with opportunities to experience and relate to nature they will be more likely to act in the greater interest (I’ve often heard this said but am unaware of any compelling evidence from a direct study***). How should we as ecologists address this? Blanket statements of its importance are not going to cut through.

There’s a sense in which the ‘ecology has never been more important’ claim is an admission of insecurity; a cry for attention in the face of abundant evidence that economic and social systems are ignoring our scientific expertise. It’s also one that only needs making in an affluent, Global North context. There’s no point trying to tell a subsistence fisherman to care about ecology because they already do, even if they might not phrase it in quite the same terms.

A more productive approach then is to direct our energies into finding a form of words that will demonstrate the relevance of ecology to the audiences we are trying to reach. To some extent we are already attempting this with concepts like ‘nature-based solutions’, which can help policy-makers relate to our science****. We might resent the consequent dilution of our passion into someone else’s priorities but ultimately this is likely to be the most effective way to achieve the responses we are looking for. Rather than trying to turn everyone into ecologists (although more will always be welcome) we should show others how ecology impacts on the things they already care about. Make ecology important to them instead of asserting that ecology is important in its own right.


* I’m not going to get into an argument here about what ecology means, given that the word itself carries different implications for academic researchers, environmental campaigners or outside observers. For the purposes of this post assume it means something like the study of the natural world.

** I’m sympathetic to the argument that all three of those fields can be linked to ecology. On the other hand, someone outside our own subject area might argue that studying ecology is only important insofar as it advances public health or agriculture. I’m put in mind of the absurd claim by pathologist Rudolf Virchow that ‘Medicine is a social science and politics is only medicine on a large scale’. That everything links to your field doesn’t make it the centre.

*** In opposition to this view are observations such as the fact that conservation biologists have a relatively high carbon footprint. I’d be delighted to learn of any systematic study that has tested the assumption rigorously.

**** This is a generous reading because I’m aware that there are plenty of people who dislike the term, and indeed all buzzwords and phrases that create bandwagons around poorly-defined concepts. If they achieve the intended outcome then I’m inclined to be less critical.

Will herbal medicine provide a cure for COVID-19?


COVID Organics, the miracle ‘cure’ for COVID19. Original source of photo unknown.

As the pandemic spread around the world, the President of Madagascar, Andry Rajoelina, made a startling announcement. He launched a new drink, COVID-Organics, developed by Malagasy scientists, which was purported to cure the new disease. The evidence for its efficacy was slight, but its claimed basis drew on a history of local herbal lore and an existing treatment for malaria*, combined with an association with a modern-style research institute.

Several African states, including Tanzania, Guinea-Bissau and Congo-Brazzaville, have already invested scarce resources in importing the new treatment, and it is being rolled out across Madagascar, partly at national expense. It’s easy to understand why. Western medicines are often unaffordable at the best of times, and in the international scramble for resources they are now simply out of reach. COVID Organics is available on the doorstep. Moreover, there is a strong desire to support native expertise, even though most international scientists advise caution in the absence of any reliable evidence. Every society looks to its own authority figures for hope and guidance. We shouldn’t criticise desperate people for trying whatever remedy is actually available.

The claim of a wonder treatment was, however, met with scepticism from medical experts. President Rajoelina has hit out at critics from the Global North, accusing them of a condescending attitude towards African expertise. At the same time there is a reluctance by many to openly dismiss a treatment that has been promoted as an indigenous African solution drawn from a respected tradition. Even the WHO did so only obliquely.

I am on record as being strongly in favour of recognising and valuing alternative approaches to the development of knowledge beyond the frequently colonial attitudes we are responsible for perpetuating. In this case, however, I’m not inclined to mince my words. COVID-Organics on its own will probably do no harm, but there’s very little chance that it will do any good. By all means test it like any other potential drug, but its provenance doesn’t make it any more plausible as a candidate treatment. And if it takes the place of known, genuinely effective interventions (social distancing, hand washing etc.), or wastes money that could be spent on proven medical care, then it will become positively dangerous to health.

Why am I so sceptical? Those who advocate herbal medicine as an alternative to conventional treatment usually follow one of two lines of argument in support. The first is that it is an ancient practice, based on thousands of years of development, and that this long duration has ensured the transmission of only the most effective cures.

It’s easy to pick this apart. Firstly, the foundations of herbal medicine were derived from theoretical grounds which we now know to have been fundamentally flawed. In its Western form these include the Doctrine of Signatures, which states that God indicated the medical uses of plants through their physical characteristics, or treating symptoms as manifestations of the four humours. Such methods of identifying possible plants and matching them to conditions are little better than random. We shouldn’t expect paradigms that predate germ theory to stumble on insights into a novel threat.

Second, herbalists will often advocate their art by picking out those remedies which have gone on to be important medical drugs. It’s a classic case of the prosecutor’s fallacy: a number of effective treatments have come from plants, but it doesn’t follow that all medicinal plants are effective. One which is usually rolled out is the Madagascar periwinkle, which gave rise to a lucrative pharmaceutical used to treat a common form of childhood leukemia. This is however completely unrelated to its traditional usage as a largely ineffective treatment for diabetes. That it yielded such a valuable modern drug owes as much to serendipity as herbal medicine.

Finally, the legend that tropical forests contain a fabled pharmacopeia whose secrets are held by traditional healers has been comprehensively demolished by prolonged enquiry. The story remains persistent because of its connection to a number of beloved folk images rather than any basis in evidence. We have probably taken all the low-hanging medical fruit from the plant kingdom already. A forest-dwelling shaman won’t solve our new problem, not least because remote tribal people live at such low densities that they tend not to suffer from contagious viruses.

Should we instead be scouring the plant kingdom for potential COVID cures? To do so would almost certainly be a waste of time and resources, though no doubt some unscrupulous or naive researchers are putting in grant applications to do exactly that right now. Note that most major pharmaceutical companies gave up on this approach to drug discovery many years ago after wasting spectacular sums in the process. If it worked then Big Pharma would already be doing it for the diseases we already have.

OK, you might ask, but what if one of these herbal cures turned out to actually work? Medicinal plants contain a vast number of chemicals. Identifying, purifying and testing the active ingredients is a long process. Sometimes physiological effects rely on complex interactions with other constituents, which means that the individual chemicals don’t act quite the same in isolation. Controlled dosages of herbal medicines are almost impossible to achieve. And there is a high risk that one or other component will be allergenic or otherwise harmful. Demonstrating efficacy and comprehensive safety of a botanical treatment is therefore much harder than for any single-component drug.

To summarise, it is possible that herbal medicine might eventually lead to a cure for COVID-19, but it is much less likely to do so than conventional scientific approaches. Even if a cure does eventually arrive through the herbal route, it will take much longer, likely many years, and the lack of any precedents in the modern era is not encouraging. We haven’t found a herbal cure for any other virus yet, and not for want of trying. Maybe Madagascar really has stumbled on the solution to the world’s greatest current problem. Until we have some solid evidence, however, I wouldn’t bet on it. We are all desperate for a cure to appear but wasting time and scarce resources on dead ends will ultimately cost lives.


* Is it a coincidence that the presidents of both Madagascar and the United States have promoted the use of treatments for malaria, a fever caused by a parasitic infection, as supposed cures for an entirely unrelated virus?


The importance of luck in academic careers


As usual, there’s an XKCD comic for everything.

Not long after I received my first permanent academic contract I attended a conference and went out drinking after the sessions. By the end of the evening I found myself amongst a large group of people around my own age, mainly post-docs and PhD students. Conversation turned to careers and it so happened that I was the only one with a secure position, which prompted an immediate question. What was the secret? Everyone was feverishly after the same thing, and here was someone in the room who knew the trick.

My answer, ‘luck’, went down like a glass of cold sick. It was honest but unpopular. I now regret saying it, and realise that I should have added ‘privilege’. At the time I hadn’t appreciated the extent to which privilege played a part in the relatively smooth passage of my career*. It didn’t seem that way to me, but in retrospect I certainly had it easier than most. The truth, however, is that ‘luck’ is often the secret, insofar as it’s the element no-one wants to talk about. All the other things, the ones we can either control or are imposed upon us, are obvious. I had none of the magic bullets: no Nature paper or prestigious research fellowship**. Objectively there was nothing on my CV that set me apart from most other post-docs on the job circuit. It certainly felt a lot like luck to me.

Why don’t people like to hear this? Accepting the importance of luck downplays the extent to which anyone has agency in their professional lives. We like to hear that working hard and chasing our dreams brings success in the end. I think for the most part that it’s true, and almost everyone who manages to get into the ivory tower will tell you that backstory. But it’s akin to hearing an Olympic gold medalist tell you that their secret is dedication and never giving up. As if all the people that didn’t come first just weren’t dreaming hard enough.

Telling people to keep plugging away until they get their break also assumes that there are no costs to them doing so. It’s the mindset that leads to the eternal post-doc, a restless soul traveling from university to university, country to country, for the chance of another year or two of funding, then having to pack up their things and move on once again. While still young and relatively carefree, that can be fun for some. When you’re 40 and want to get married, buy a house, have children or care for your parents, it becomes impossible. I never had to go through that and I’m extremely grateful for it.

Declaring the importance of luck and privilege also somehow diminishes the achievement of those who have made it, and therefore provokes hostility from those who are already through the door. It’s not the story we like to tell ourselves, and it’s certainly not one we like other people to tell about us.

So let me lay my own cards on the table. I believe that I deserve to have a permanent academic job***. I worked hard to get here, and I’m pretty good at it. But I can also say without any equivocation that I’ve known people who worked much harder than me and were demonstrably smarter than me but who didn’t manage to capture one. I’m comfortable admitting that although I surpassed the minimum expectation, if it were a true meritocracy then the outcome would have been different.

My first job came about because I was in the right place at the right time. I had hung around long enough in a university department to pick up the necessary ticks on my CV, despite substantial periods of that being spent on unemployment benefit. Small bits of consultancy through personal connections and a peppercorn rent from a friend made a big difference. I had the privilege of being in a position to loiter long enough for a job to come up, and the good fortune to find one that fit. Lots of others wouldn’t have had the luxury of doing so, and had it taken another six months, I wonder whether I would have been hunting for another career as well.

Does any of this change the advice given to an eager young academic? No. You still need to publish papers, get some teaching experience, win some competitive grant income, take on some service roles, promote yourself and your work as widely as possible. The formula remains exactly the same. It’s not easy, and it’s got even harder over the last 15 years. Good luck. And if you don’t have luck, make sure that you have a Plan B in your back pocket.

Any change needs to come from the academy itself. It’s us who have the problem and it’s our responsibility to fix it****. There’s no way to entirely remove luck from the hiring equation (think of it as a stochastic model term) but we can influence the other parameters. I was wrong to think that privilege wasn’t part of the equation that got me here, but I can try to minimise its distorting effects in future.
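The stochastic framing above can be made concrete with a toy simulation. This is my own sketch, not anything from the post itself; the function name, candidate numbers and parameters are all invented for illustration. It models each candidate’s hiring outcome as merit plus a random luck term, then asks how often the most meritorious candidate actually lands the single available job.

```python
import random

random.seed(42)

def most_meritorious_wins(n_candidates=50, luck_weight=1.0, trials=10000):
    """Fraction of trials in which the highest-merit candidate gets the job."""
    wins = 0
    for _ in range(trials):
        # Draw a merit score for each candidate, then add a luck term.
        merit = [random.gauss(0, 1) for _ in range(n_candidates)]
        score = [m + luck_weight * random.gauss(0, 1) for m in merit]
        # The job goes to the candidate with the highest combined score.
        if score.index(max(score)) == merit.index(max(merit)):
            wins += 1
    return wins / trials

print(most_meritorious_wins())  # typically well below 1: luck often decides
```

Even with luck weighted no more heavily than merit, the best candidate wins only a minority of the time; the rest goes to people who were good enough and fortunate. Shrinking `luck_weight` is the only lever the model offers, which is the point about influencing the other parameters.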


* I have the ‘full house’ of privileges, being a white, male, heterosexual, tall, healthy, able-bodied, native English-speaking, middle class… did I miss anything out? Don’t bother talking to me about my struggle because I didn’t have one.

** The one thing jobs, Nature papers and grants all have in common is that your probability of getting one increases with the number of times you try. It’s not entirely a lottery but there is a cost to every attempt and some can afford more of them. And I still don’t have a Nature paper.

*** There are almost certainly people who will disagree with this, but let them.

**** As pointed out in this post, however, our survivorship bias can prevent us from recognising that those following us are facing very different challenges to the ones we went through.


Postscript: I already had this post lined up to publish when an excellent and complementary thread appeared on Twitter.

https://twitter.com/RIBernhard/status/1257385363703824396