Be humble, be courageous, but above all: be the whistle-blower

A view of Buda and Pest

From the 26th of June until the 2nd of July, the Central European University in Budapest hosted a summer school on the precautionary principle (henceforth PP). An assembly of (among others) lawyers, environmental scientists, policy-makers and medical experts attended, their roots stretching the length and breadth of the globe: from the Philippines to Ireland, and from Sweden to Uganda. Collectively, their objective was to explore the meaning of the PP in both theory and practice. It was with great pleasure and interest that I attended this week packed with learning, reflection and discussion, and a true privilege to work with this diverse group of people. In this blog I will focus on the message that David Gee, a science-policy interface expert and co-author and co-editor of the European Environment Agency’s ‘Late Lessons from Early Warnings’ (henceforth LLfEW), shared with us in his talks.

As humans we have proven not to be as ‘wise’ as our Latin name Homo sapiens might suggest. Gee would argue that Homo stupidus is a far more fitting and characterising term for humankind. The reason for this view is that we have repeatedly failed to learn from past cases and to take timely precautionary measures when those cases should have taught us it is wise to heed early warnings of harm. The two LLfEW reports, which explore the (mis)use of the PP, demonstrate precisely this, although there are some (exceptional) cases in which early warning signs were given due consideration and acted upon.

The problem with not heeding these warnings is that history repeats itself, with equally awful consequences. One of the key causes Gee ascribes this failure to is the excuse that there is ‘no full scientific certainty’ on a matter. On the one hand this could mean, on the ‘optimistic’ interpretation, that scientific results are uncertain (e.g. they cannot give us a precise probability that a risk will occur), or that the scientific community is divided on the gravity of the risk or on the causal links leading up to it. On the other hand, and as it turns out this argument is made frighteningly often, it could mean that no scientific data have been collected whatsoever, and that this absence serves as an excuse for inaction. We, politicians and society as a whole, should therefore urge that due care is taken in collecting relevant scientific data on possible risks, and that policymakers give these data due consideration. However, it should not be (morally) permissible to wait for scientific certainty, which can never be reached anyway, or for a detailed and refined cost-benefit analysis. As uncertainty resolves into virtual certainty, the window for action narrows, and it may become impossible to address the harm before it manifests in reality. It is ultimately an ethical choice what level of certainty we demand of the scientific community before we act on its findings.

Working group on the precautionary principle in practice

‘Humility’ is therefore one of the attitudes advocated by Gee and his colleagues who worked on the LLfEW reports: humility as to the limits of our possible (scientific) understanding, humility as to the phenomena we are, and possibly will forever remain, ignorant of, and humility as to our confidence that we have truly learned from our past mistakes. Another important attitude advocated by Gee is ‘courage’. This attitude is essential if we want to make the systemic changes needed to address systemic issues (like climate change or chemical exposure in our day-to-day lives). The actors involved in shaping the policies that will bring about these changes, namely scientists, policy-makers and businesses, but also (and perhaps most importantly) the public, should have the courage to rebel against the status quo and challenge the system. These actors should dare to speak out against the system that binds them and relay their message to their peers; be the whistle-blower, be it in the boardroom, at dinner parties or in laboratories.

Key questions that remain, and that I consider a challenge to address myself, are: what constitutes ‘proper’ humility and courage? How do human beings acquire these ‘virtues’? And, perhaps most importantly: are they a feasible solution to our recurrent and arguably inherently human failure to learn from past mistakes, or will aiming to instil these virtues in society prove a case of ‘too little, too late’?

By Leverhulme Doctoral Scholar Vera Van Gool