Experts are often fallible, so expert advice should be examined carefully

Evidence shows that experts are frequently fallible, say leading risk researchers, and policy makers should not act on expert advice without using rigorous methods that balance subjective distortions inherent in expert estimates.

The accuracy and reliability of expert advice is often compromised by “cognitive frailties,” and needs to be interrogated with the same tenacity as research data to avoid weak and ill-informed policy, warn two leading risk analysis and conservation researchers in the journal Nature.

While many governments aspire to evidence-based policy, the researchers say the evidence on experts themselves actually shows that they are highly susceptible to “subjective influences” — from individual values and mood, to whether they stand to gain or lose from a decision — and, while highly credible, experts often vastly overestimate their objectivity and the reliability of peers. 

U Cambridge reports that the researchers caution that conventional approaches to informing policy, such as seeking advice from well-regarded individuals or assembling expert panels, need to be balanced with methods that alleviate the effects of psychological and motivational bias.

They offer a straightforward framework for improving expert advice, and say that experts should provide and assess evidence on which decisions are made — but not advise decision makers directly, which can skew impartiality.

“We are not advocating replacing evidence with expert judgements, rather we suggest integrating and improving them,” write professors William Sutherland and Mark Burgman from the universities of Cambridge and Melbourne respectively.

“Policy makers use expert evidence as though it were data. So they should treat expert estimates with the same critical rigor that must be applied to data,” they write.

“Experts must be tested, their biases minimized, their accuracy improved, and their estimates validated with independent evidence. Put simply, experts should be held accountable for their opinions.”

Sutherland and Burgman point out that highly regarded experts are routinely shown to be no better than novices at making judgements.

However, several processes have been shown to improve performances across the spectrum, they say, such as “horizon scanning” — identifying all possible changes and threats — and “solution scanning” — listing all possible options, using both experts and evidence, to reduce the risk of overlooking valuable alternatives.

To get better answers from experts, they need better, more structured questions, say the authors.

“A seemingly straightforward question, ‘How many diseased animals are there in the area?’ for example, could be interpreted very differently by different people. Does it include those that are infectious and those that have recovered? What about those yet to be identified?” said Sutherland, from Cambridge’s Department of Zoology.

“Structured question formats that extract upper and lower boundaries, degrees of confidence and force consideration of alternative theories are important for shoring against slides into group-think, or individuals getting ascribed greater credibility based on appearance or background,” he said.
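The article does not prescribe a specific elicitation protocol, but the gist of such a structured format can be sketched in a few lines of Python. In the hypothetical example below, each expert supplies a lower bound, an upper bound, a best estimate and a self-rated confidence for the diseased-animal question; the intervals are rescaled to a common confidence level and the group answer is taken as the median, which blunts the pull of any single over-confident voice. The function name, the linear rescaling rule and the numbers are illustrative assumptions, not part of Sutherland and Burgman's framework.

```python
from statistics import median

def standardize_interval(lower, upper, best, confidence, target=0.90):
    """Widen (or narrow) an expert's interval from their stated confidence
    (e.g. 0.6 means '60% sure the truth lies between lower and upper')
    to a common target confidence so that intervals are comparable.
    The linear rescaling here is an illustrative assumption, not a method
    prescribed by the authors."""
    scale = target / confidence
    return (best - (best - lower) * scale,
            best + (upper - best) * scale,
            best)

# Hypothetical responses to "How many diseased animals are there in the area?"
# Each tuple is (lower, upper, best estimate, self-rated confidence).
experts = [
    (120, 300, 180, 0.6),
    (150, 250, 200, 0.8),
    (140, 400, 220, 0.5),
]

standardized = [standardize_interval(*e) for e in experts]

# Pool the group with medians, which damps the influence of any single
# over-confident or over-assertive individual.
group_lower = median(s[0] for s in standardized)
group_upper = median(s[1] for s in standardized)
group_best = median(s[2] for s in standardized)

print(f"Group estimate: {group_best:.0f} animals "
      f"(~90% interval {group_lower:.0f} to {group_upper:.0f})")
```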

When seeking expert advice, all parties must be clear about what they expect of each other, says Burgman, director of the Centre of Excellence for Biosecurity Risk Analysis. “Are policy makers expecting estimates of facts, predictions of the outcome of events, or advice on the best course of action?”

“Properly managed, experts can help with estimates and predictions, but providing advice assumes the expert shares the same values and objectives as the decision makers. Experts need to stick to helping provide and assess evidence on which such decisions are made,” he said.

U Cambridge notes that Sutherland and Burgman have created a framework of eight key ways to improve the advice of experts. These include using groups — not individuals — with diverse, carefully selected members well within their expertise areas.

They also caution against being bullied or “starstruck” by over-assertive or heavyweight experts.

“People who are less self-assured will seek information from a more diverse range of sources, and age, number of qualifications and years of experience do not explain an expert’s ability to predict future events — a finding that applies in studies from geopolitics to ecology,” said Sutherland.

Added Burgman: “Some experts are much better than others at estimation and prediction. However, the only way to tell a good expert from a poor one is to test them. Qualifications and experience don’t help to tell them apart.”
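Neither quote specifies how such testing should be scored. One common way to grade probabilistic yes/no forecasts is a proper scoring rule such as the Brier score; the sketch below uses it with invented track records for two hypothetical experts, purely as an illustration of what “testing” an expert against outcomes can look like. The scoring rule, the function name and the numbers are assumptions, not taken from the article.

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between predicted probabilities and what
    actually happened (1 = event occurred, 0 = it did not).
    Lower is better; always guessing 0.5 scores exactly 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Invented track records on the same six yes/no questions.
outcomes         = [1, 0, 0, 1, 1, 0]
assertive_expert = [0.95, 0.90, 0.85, 0.99, 0.90, 0.80]  # confident, poorly calibrated
cautious_expert  = [0.70, 0.30, 0.20, 0.75, 0.65, 0.35]  # hedged, better calibrated

print("Assertive expert Brier score:", round(brier_score(assertive_expert, outcomes), 3))
print("Cautious expert Brier score: ", round(brier_score(cautious_expert, outcomes), 3))
```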

“The cost of ignoring these techniques — of using experts inexpertly — is less accurate information and so more frequent, and more serious, policy failures,” write the researchers.

— Read more in William J. Sutherland and Mark Burgman, “Policy advice: Use experts wisely,” Nature (14 October 2015)

 
