Expert opinion is not always right

In the first blog of our new special series on Evidently Cochrane: “Oh, really?” 12 things to help you question health advice, Cochrane UK’s Director, Professor Martin Burton, takes us from experts to evidence. You can find all of the blogs here and on Twitter using the hashtag #OhReally.

A BBC news item reported recently that NHS Trusts use more than 100 different types of mesh in the repair of hernias, one of the most common operations.  That must mean at least 100 different surgeons – probably many more – are using their preferred type of mesh, each of them convinced that the one they are using is best.  I wonder how often they have each been asked the question, "why do you use that particular sort of mesh?"  I presume the answer would be: "Because I think this mesh is best for my patient".  "How do you know?"  "Because in my experience, having done 47 of these operations over the last…"  That is a typical exchange.  Believe me.  I'm an expert.

Actually, please don’t believe me simply “because I am an expert”.  One thing that people need to understand if they are to weigh up claims made about the effectiveness or otherwise of treatments is this: “Expert opinion is not always right”.

Experts don’t always agree

Many people – including the experts – often disagree about the benefits of different treatments and the harms they may cause.  You see this in many ways.  One sign is the wide variation across the country in the use of some surgical procedures for the treatment of common conditions.  If operation A is better than operation B for treating people with a particular disease, why is it that in some places everyone has operation B and not A?  It is often because the experts who do operation B think – and say – that it is better than A.  Perhaps there really is no difference and they are equally effective.  Or equally ineffective.  Or perhaps they have never really thought about the alternatives.  There could be several other reasons.  Be that as it may: experts are not necessarily right all the time.  Sometimes they are.  I’d like to say that for the best of them, they often are.  But beware of putting too much weight on a person’s so-called expertise.  Instead, consider challenging them to provide the evidence for their recommendations.

Challenging the expert, asking for evidence

But challenging experts may be easier said than done.  More years ago than I care to remember, when I was a junior doctor, I learned how to treat patients with different medical problems.  After a few months I would have a clear idea which tests to order for a patient, and what treatment to start.  All this under the close, expert eye of the consultant and her team.  Imagine my surprise (and some indignation and discomfort) on discovering, when I moved on to the next hospital, that my new consultant wanted his patients, with exactly the same condition, to be treated completely differently.  How could that be?  Surely they were both experts.  How could both their opinions be right?  These events occurred so long ago that “evidence-based medicine” was unheard of.  Asking “what’s the evidence?” was something you simply never did.

Times have changed.  Thankfully.  Nobody should be afraid to challenge the experts, not least a patient.  But here’s a ‘top tip’ – consider the framing of the question.  “Thank you for that recommendation.  I am really pleased to have your expert advice.  But I’d like to understand a little more about the sort of evidence that underpins it…”

Experts (and all of us) need evidence!

Early in my career I came across another ‘expert’ activity – expert ‘reviews’.  These were, I thought, comprehensive, up-to-date articles, published in various journals, bringing together experts’ views on various topics.  For example, describing the various treatment options for a particular condition and drawing conclusions about which one was most effective and safe.  I was never very sure what criteria led to someone being asked to write such a review.  Just being ‘an expert’ who was willing and able to write it, I imagined.  So I was surprised when, whilst still a trainee surgeon (albeit quite a senior one), I was asked to write one.  I threw myself into it.  I spent ages finding papers by famous and distinguished people, describing how they treated the condition.  In those days, getting hold of the papers was a real effort: traipsing to libraries, finding bound volumes of journals, waiting in line for a photocopier.  So it went on.  The upshot of all this labour was a strong desire to make sure none of it was wasted.  In deciding which papers to cite in the final article, there was a general feeling of “include them all, if they’re at all relevant, because after all that effort to get them and read them…”.

Looking back, the process was all rather random.  Locating the papers to include involved a certain amount of serendipity combined with respect for those with big names in the discipline.  Appraising the papers rested almost exclusively on the status of the authors and the journals they were published in; big names writing in big journals must be right.  And synthesising the results in the review article meant combining information in a way that followed no particular rules.  All in all, although the process was undertaken conscientiously and carefully, respectful of the expertise of my peers, it was very unscientific.

From ‘expert’ reviews to systematic reviews

I now know that this was far from ideal.  Far better to undertake the proper, rigorous process of “locating, appraising, and synthesising evidence from scientific studies according to a strict protocol” – in other words, to do a “systematic review”.  High-quality systematic reviews of randomised controlled trials are at the top of the “evidence hierarchy” and provide much more reliable evidence on the effectiveness or otherwise of a specific treatment.  Cochrane specialises in producing reviews like this, called Cochrane Reviews, and Cochrane’s standards in producing these are said to be the highest in the world.

Let’s return to the issue of asking an expert for the evidence underpinning their advice.  What might a really good answer look like?  How about this: “Well, there are several good systematic reviews of trials that allow us to conclude, with a high degree of certainty, that in 75% of patients like you, this treatment will cure the problem.  At the same time, there is a 5% chance that you may experience some minor side-effects of the treatment”.

Experts’ advice isn’t always right.  But it is more likely to be right when it is based on solid, rigorous, scientific evidence.

Take-home points

  • Expert advice isn’t always right or based on careful consideration of the best evidence.
  • Always feel able to ask “What is the evidence? How certain can we be?”
  • Systematic reviews in general, and Cochrane Reviews in particular, are a good source of evidence.

Join in the conversation on Twitter with @CochraneUK #OhReally or leave a comment on the blog.

References and further reading may be found here.

Visit the Teachers of Evidence-Based Health Care website, where you can find resources which explain and illustrate why expert opinion is not always right.

This series of blogs is inspired by a list of ‘Key Concepts’ developed by the Informed Health Choices project.


Martin Burton

About Martin Burton

Martin Burton is Director of Cochrane UK, the centre responsible for supporting Cochrane activities in the UK & Ireland. He is Professor of Otolaryngology, University of Oxford, Honorary Consultant Otolaryngologist, Oxford University Hospitals NHS Foundation Trust and Fellow in Clinical Medicine at Balliol College. He is joint co-ordinating editor of the Cochrane ENT Group.

4 Comments on this post

  1. John Kirwan

    HealthWatch, an independent UK charity that has been promoting science and integrity in healthcare since 1991, held a special symposium to consider the type of evidence required for surgical procedures such as mesh implantation and the use of implants and devices. We concluded:
    • Approval has been a technical rather than a medical process.
    • The ‘equivalence’ system using Notified Bodies has failed.
    • Using and recording device serial numbers would be a simple first step.
    • The IDEAL-D framework provides for evidence-based implant development.
    • Adequately funded registries are needed with compliance monitoring.
    • Political action will be required to influence the developing rules and to draw agencies together.
    • There are academic responsibilities: early reporting; development of evidential standards; guidelines for data reporting and appropriate data amalgamation procedures.
    • Putting the issues into simple statements will be a powerful aid to progress.

    When it comes to medical implants, HealthWatch says:
    • Implant approval should be graduated and supported by step-by-step evidence. This should replace the ‘equivalence’ system of approval using Notified Bodies, which has failed.
    • Those who implant a device must know (and be able to explain to their patient):
      ◦ What it is and what its constituents are
      ◦ How it is identified and tracked
      ◦ How the evidence shows that it works
      ◦ What risks are involved
      ◦ What to do if things go wrong
    • Regulators, academics and professional bodies should work together to achieve these aims.

    The full symposium report is available at: https://www.healthwatch-uk.org/projects/medical-devices.html

  2. Iain Chalmers

    Recognition of experts by their droppings (guano-based eminence-based medicine): Oxman AD, Chalmers I, Liberati A. A field guide to experts. BMJ 2004;329:1460-1463. https://www.bmj.com/content/329/7480/1460.full.pdf+html

  3. Martin Burton

    Thank you for your comment. In my experience – both as a clinician and a patient – when an expert sets out the evidence and the options clearly, the shared decision-making process can be relatively straightforward. But at times, an expert will admit to a significant degree of uncertainty. The best will then point patients to places and people to help them come to a decision that is right for them. At times this will be to another expert for a ‘second opinion’. Even in the NHS you are ‘entitled’ to a second opinion if you ask for one. And again, the best experts will be happy to facilitate this, often with the help of your GP. We all provide second opinions probably more often than you might imagine. But there seems to be a tacit assumption that this opportunity will not be ‘abused’; the NHS simply could not cope with everybody seeking a second opinion all the time. It is worth saying that if an expert is uncertain about the advice they are offering a patient, she or he will almost invariably discuss this with a colleague – a different sort of ‘second opinion’.

  4. Anya

    Curious to hear your views on the practical application of this advice by members of public in the UK. We all know that getting an appointment with a specialist on the NHS takes time and persistence. One can and should ask that expert for the evidence, but you do not have a choice of expert opinions and it’s between you and this one expert to agree on the way forward. So practically, within a state health system you follow the best advice of the expert you are referred to.

