Luna 9


"The Only Certainty is that Nothing is Certain"


Tackling the issues of communicating scientific uncertainty

Scientific uncertainty is not a glamorous thing to explain. When presenting data in a (hopefully) beautiful infographic the temptation for the designer is always to treat the data as fact. The designer designs it that way; the reader digests it that way. But what about when the reliability of the data is only, say, 75%? What factors need to be considered when communicating that lack of certainty, and at what point does the responsibility of understanding factual accuracy shift from the designer to the reader?

Adding caveats such as ‘this data only accounts for X and Y’ or ‘some parts of the data are missing’ is commonly seen to undermine the finality of the information. Admitting a degree of uncertainty can make the information seem less credible. The result, however, is that people start to deal only in absolutes – things either ‘are’ or ‘are not’; there is no in-between. This kind of broad-stroke communication is often understandable when subjects need to be simplified or condensed, but when it comes to issues of healthcare, the economy, and immigration, it is vitally important that the level of certainty around research data is as integral to the communication as the data itself.

As a hypothetical example, a headline may read ‘Immigration up to 10% higher than this time last year’, while the underlying data may show that the figure could be as low as 3%. It is left to the reader to unearth the reality – a spread of seven percentage points between the lowest and highest estimates. That drastically changes the narrative.

The Cambridge University Winton Centre workshop

Back in the Autumn of last year Luna 9 was fortunate enough to be chosen to participate in a workshop tackling the challenges and potential solutions for communicating scientific uncertainty. Organised by the Winton Centre at Cambridge University, this meeting of minds was led by the fascinating Professor David Spiegelhalter. The 2-day event would bring together people working in the sectors of healthcare, climate change, economics, and immigration, to discuss the complexities of conveying ambiguous scientific results. By pooling the knowledge and experience of all these experts the chances looked pretty good that we, as a group, would all leave the event with a greater understanding about how to make progress when dealing with this challenging topic.


Luna 9 was selected to be one of a handful of ‘visualisers’ – essentially those whose responsibility it is to creatively articulate information – and I, as the representative of our fine agency, would be teamed up with people from different disciplines to form a discussion group. From start to finish it was a whirlwind of fascinating conversation, debate and experimentation with some of the most insightful people I’ve ever had the pleasure of meeting. There were Oxford and Cambridge academics, researchers and analysts from energy companies and environment agencies, medical researchers from across the globe, and then me. Fortunately, I was able to put my design skills to good use and let the Mensa team lead the conversation.

One of the things that struck me straight away was that, with sky-high IQs and half an alphabet of qualifications after their names, my colleagues on the day were more susceptible to the curse of knowledge than anyone. My previous blog explained the theory in more depth, but it was the gradual process of everyone stepping back from their position of knowledge and comprehension that allowed us to make big strides in the way we could best communicate hypothetical scenarios of scientific uncertainty.

The challenge

Our task was to design a series of question-based experiments that could be tested online with members of the public in the USA overnight. By adjusting the way the reliability of the data was communicated, we hoped to see how that affected people’s decision-making process. A busy afternoon turned into a late night of putting together a series of questions for our audience to respond to. I was particularly involved in the wee hours, as the design work quite rightly comes last in the process. We were testing how changing the language used to describe levels of reliability affects the reader – the theory being that the more complex and scientific the language, the less attention people pay to it. Conversely, we surmised that members of the public would relate more to informal language and follow its guidance.

I was keen to stress the considerations of the design throughout. Each decision is made for a reason – to draw the eye, give prominence to a certain bit of information, and so on. Changing the colours, typeface, sizes and hierarchy of the information has subtle but influential consequences which start to shape the way a viewer digests what is in front of them. I’m pleased to say that my contribution, aside from designing the collateral for testing, was to educate everyone in the group about the value of investing time in the design process.


What we learned

The morning after, we reconvened to analyse the results of our experiment. To our surprise, the more scientific wording had more people following its guidance. We were pleased to have been proved wrong: in this case at least, it suggested that dumbing down is not the way to get people to engage with the nuances of scientific uncertainty. We would be the first to admit that this is a long way from a reliable academic study. It is, however, a start.

My main takeaway from the workshop was that communicating uncertainty means transparency. At Luna 9 we always strive to communicate all of the information in the most engaging way possible, but never at the expense of clarity or context. If you are up front about the accuracy or reliability of your evidence, you allow – even entrust – the reader to make an informed judgement with all the information available. As information designers, it is our responsibility to make sure that what we produce is accurate and gives the audience the context they need to make sense of it. From there, we can only hope that they share it in the same way.


Michael Green
Co-founder and Creative
