We constantly make decisions about who to trust.
Much of the time we’re bombarded with massive amounts of information on all sorts of subjects, from science and health to social issues, economics and politics. But no matter how hard we try – or how brilliant we are – none of us can understand everything or correctly assess the risks associated with the issues affecting ourselves and our communities.
We have no choice but to defer to others, and the decisions we make about a person’s or organisation’s trustworthiness can play a huge part in our health and mental wellbeing. In some situations, such as whether to take a vaccine, it can be a matter of life or death.
During the pandemic, researchers conducted a series of large surveys investigating which factors were linked to vaccine hesitancy. One survey questioned more than 8,000 Americans across five states, another covered almost 7,000 individuals in 23 countries, and a third included over 120,000 respondents in 126 countries. All of them found that trust in science was a key factor in determining whether people intended to be vaccinated.
But what influenced this trust in science? Researchers studying “epistemic trust” – our trust in someone as a knowledgeable source of information – have identified three main factors we use to determine trustworthiness: how we perceive an expert’s level of expertise, integrity and benevolence (concern and care for society).
A recent study in Germany measured trust in science throughout the pandemic, and the factors affecting it. Analysing data from four surveys conducted at different points in time, with over 900 respondents, the researchers found that trust in science increased substantially after the pandemic began – mainly due to positive assumptions about scientists’ expertise in their field.
In contrast, the most pronounced reason for distrusting scientists was a perceived lack of benevolence, stemming from scientists’ frequent dependence on the funders of their research. The researchers therefore recommended that science communication emphasise the good intentions, values and independence of scientists.
In the UK, 72% of people reported a high level of trust in scientists during the pandemic, compared with 52% in the government. Although no studies specifically investigated perceptions of scientists’ expertise, integrity and benevolence, negative attitudes towards the vaccine were mainly driven by a lack of trust in the benefits of vaccination and concerns about unforeseen future side effects.
It’s okay to say “I don’t know”
Many of us, whatever our field of work, fear that showing uncertainty can damage our image – and we may compensate by expressing overconfidence in an attempt to win trust. This strategy has been seen from university press officers when writing about the findings of academic research – and also from some public health officials when communicating to the public during the pandemic.
But some studies show that while confident advisors are judged more favourably, people do not inherently dislike uncertain advice. In fact, when faced with an explicit choice, people were more likely to choose an advisor who gave uncertain advice (offering a range of outcomes or probabilities, or saying that one event is “more likely” than another) over one who expressed certainty with no doubts.
It seems that advisors benefit from expressing themselves with confidence, but not from communicating false certainty.
In many situations, people are willing to trust those who can admit they don’t have a definitive answer. The good news comes from recent experimental studies on physician–patient interactions, witness credibility and science communication, which found that communicating uncertainty, and even admitting our mistakes, is not detrimental to trustworthiness and can even enhance it.
So a shortfall in “expertise” can be compensated for by higher integrity and benevolence. When we communicate uncertainties transparently, we are perceived as less biased and as willing to tell the truth.
There’s a neurological basis
Trustworthiness can also be weakened by “guilt by association” (being judged by the company you keep) – or rather by moral contagion, the psychological mechanism behind that belief.
There’s a saying that a spoonful of tar can spoil a barrel of honey. And in fact, the food analogy makes some sense.
It is believed that throughout evolution, our disgust mechanisms – which originally evolved to assess contamination and avoid disease from rotten or soiled food – also came to be applied to people. Neurologically, our disgust at someone’s untrustworthy behaviour is the same as our disgust at food that has gone off.
In support of this hypothesis, both disgust in food and moral judgement activate the same areas of the brain and the same facial muscles.
Interestingly, our disgust sensitivity (how easily we are disgusted) does indeed show a positive association with our level of distrust in others. In other words, if we are inclined to worry about pathogens on food, we will also be inclined to have a lower level of social trust and to feel that most people should be avoided.
But it is still unclear how this psychological process of “moral contagion” affects our trust in the many organisations or individuals alleged to collaborate closely with each other – scientists, governments, pharmaceutical corporations, universities and international bodies during the pandemic, for example. In such a melting pot of organisations, it will depend on the groups we feel drawn to, and our personal sensitivities to misconduct such as lies, political scandals, conflicts of interest or nepotism.
In the current climate, any person or institution who genuinely wants to be trusted should work on communicating their expertise, honesty and benevolence – and encourage those they work with to do the same.
This article was originally published on The Conversation. It is republished under Creative Commons.