Doctor reassuring patient

Toolkits that assess, describe and direct the targeted improvement of patient experiences are often designed to inform us about quality on an organisational or system level. But it is equally important to support an individual’s desire to strive for improvement in their own personal practice.

Picker’s Individual Clinician Feedback programme supports professionals to collect, interpret, publish and act on patient feedback that relates to them as an individual.

Professor Ben Bridgewater, Consultant Cardiac Surgeon at University Hospital of South Manchester, on how individual clinicians assess their own performance…

“When you ask, ‘Are you a good doctor?’, the answer is, more often than not, yes.

When you ask, ‘How do you know?’, the answers can involve anecdotes, cards from patients, exam success, a lack of complaints or a proven ability to jump through hoops.

But this information isn’t helpful to a clinician looking to learn why a patient had a high-quality experience, or how they might identify and target areas for improvement.”

“Constantly requesting and responding to patient feedback is what makes a good doctor. But this is not something we see happening widely in health and social care.

The Medical School experience is based on acquiring and assessing knowledge for future use. Once a clinician is qualified and working, distractions can prevent them from learning what it would really take to improve as a doctor: pressures, unrealistic targets and applying skills on a mass scale often divert attention from how care is experienced from the patient’s perspective.”

“Doctors who are already really good at what they do are given the opportunity to be even better!

My colleagues and I, partnering with Picker, conducted a pilot at the University Hospital of South Manchester. Every time we saw a patient or performed a consultation, a system automatically identified that patient by name and address, and sent them a questionnaire inviting their comments on their experience.

For us as doctors, the seemingly most pressing thing we have to do, day to day, is ensuring clinical effectiveness and safety. But having these feedback systems in place meant we owned the really relevant, specific information we needed to get better at our jobs. Every time I receive my patient experience feedback and measurement, I pick up a different area where I am perhaps not scoring as highly as I could be.

As a doctor you ask, ‘How can I be better? How can I deliver better care and improve that grade?’

Sometimes it’s by speaking more clearly or providing more emotional support, but whatever improvement is needed, this type of feedback and measurement gives you a specific direction for improvement.”


“Proactive feedback research should be carried out on an appropriate scale, and it should be carried out continually.

That way, you will be able to extract a summary of the data and draw best practice from it, and it will still feed into annual appraisal and five-yearly revalidation.

I think it’s good that annual appraisals are now essential. I think it’s good that professional revalidation is now essential every five years. I think it’s disappointing that current data processes for appraisal are not robust enough to allow for real, long-term improvement.

The motivation for measurement should be that you want the feedback so you can do a better job, not that, if you don’t do it every five years, you won’t get the revalidation you need to practise.”

“There seems to be a culture where the people dealing with potential problems in the quality of care have to believe beyond reasonable doubt that there is a need to act before any action can be taken.

This belief is totally out of sync with patient reality. We need to change the concept of clinical professionalism to be about individual doctors delivering an even better consultation to their next patient, and the patient after that, and the patient after that…”

Picker has already run Individual Clinician Feedback programmes with practitioners at the University Hospital of South Manchester (UHSM), Stockport NHS Foundation Trust and Lancashire Teaching Hospitals NHS Foundation Trust, as well as with a number of GP practices across the country.


Top Tips:

Here are some key tips for effectively understanding and measuring people’s experiences of individual clinicians:

  • Ensure the links between the patient and clinician are clear. Systems such as digital dictation recording, patient flow and appointment registers can all be used to establish and verify which patient saw which clinician and when (a minimal sketch of this kind of linkage follows the list).
  • Ensure a rounded perspective. Programmes should be designed to gather feedback directly from each care user group whenever possible, and must take into account that different users of care may require different approaches.
  • Be appropriate and engaging. The appropriateness of the tools and questions we use significantly affects their potential to collect high-quality experience data. If the tools are not appropriate for the audience, respondents may misinterpret questions, be unable to answer, or decline to give feedback.
  • Measure experience over satisfaction. As in our work with adults, we have found that research with children and young people must aim to examine experience rather than simply measure satisfaction.
  • Produce actionable results. Results must be presented in a format and at a level that is useful to staff. Tables, charts and targets all have their place, but the focus on appropriateness and engagement applies just as much when presenting results to staff as when choosing mechanisms for collecting feedback from patients and service users.
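For illustration only, the sketch below shows one way appointment records might be grouped by the clinician seen, so that questionnaire invitations (and the responses they generate) can be attributed to an individual. The data model and names used here (`Appointment`, `build_send_list`) are hypothetical and are not drawn from Picker’s programme.

```python
# Illustrative sketch only: a hypothetical data model for linking each
# appointment to the individual clinician seen, so that questionnaire
# invitations and responses can be attributed to the right clinician.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Appointment:
    patient_contact: str   # where the questionnaire invitation is sent
    clinician_id: str      # the individual clinician the patient saw
    seen_on: str           # date of the consultation (ISO format)


def build_send_list(appointments: list[Appointment]) -> dict[str, list[Appointment]]:
    """Group appointments by clinician so each invitation sent out carries
    the identifier of the clinician it relates to."""
    send_list: dict[str, list[Appointment]] = defaultdict(list)
    for appt in appointments:
        send_list[appt.clinician_id].append(appt)
    return dict(send_list)


# Example usage with made-up records
appointments = [
    Appointment("patient.a@example.org", "clinician_01", "2015-06-01"),
    Appointment("patient.b@example.org", "clinician_02", "2015-06-01"),
]
for clinician, appts in build_send_list(appointments).items():
    print(clinician, "->", len(appts), "questionnaire invitation(s)")
```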

Get in touch:

If you would like any additional information on the contents of this case study, or if you have an example of good practice you would like to share with us, please contact:

Communications Team

Marketing@PickerEurope.ac.uk

Tel: 01865 208100

www.pickereurope.org

