
Diagnosing (and responding to) the struggling learner

I am posting this on the heels of the last post because that one was rather grand and strategic in its approach, and I felt it needed to be followed by something more practical relating to day-to-day supervision of trainees. I almost called it “remedial diagnosis”, but there is a continuum – from feedback aimed at progressive improvement, to focussed interventions to help a learner get up to speed in a particular area, and onward to identified areas of major deficit requiring official “remediation”. We all need to remediate bits of our practice (depending on your definition).

Three weeks ago I went to the dentist and had root canal therapy. This felt like “deep excavation”, as in the (unreadable!) danger sign by the major beach renovation that I noted on my morning walk today. These problems are often revealed after large storms, just as a learner’s problems are often revealed after exams. Of course, some required renovations may be largely cosmetic.


If in-training formative assessment has suggested a risk of problems with the final exams (or indeed with performance as a GP at the end of training), the specific needs can only be targeted if the specific problems are identified. This will then suggest a more focussed approach for both learners and supervisors / educators.

What do we know about remediation?

Remediation implies intervention in response to performance against a standard. A recent review concluded (pessimistically, as systematic reviews often do) that most studies on remediation involve undergraduate students, focus on the next exam, rarely include long-term follow-up, and report improvements that were not sustained. The active components of the process could not be identified (Cleland J et al 2013, “The remediation challenge: theoretical and methodological insights from a systematic review”, Medical Education 47(3): 242–51). A paper appealingly entitled “Twelve tips for developing and maintaining a remediation program in medical education” (Kalet A et al 2016, Medical Teacher 38(8): 787–792) has a few interesting observations but is directed at institutions. It notes the common observation that educators spend 80% of their time with 20% of trainees; that many trainees will struggle at some point and may need more or fewer resources; and that there is nevertheless limited recognition of this, or investment in resources, at any level. The relevant chapter in “Understanding Medical Education” (Swanwick T) notes that performance is a function of ability plus other important factors. The quality of the learning and working environment is also important – sometimes the fault lies more with us. It observes that successful models of remediation are not well established and, like the Kalet article, it advises personalised support rather than a “standard prescription”.

So, we are left somewhat to our own devices in diagnosing and managing. Nevertheless, I think we have been fortunate in GP training in that, up to now, we have had a personalised training culture that emphasises, accepts (and, indeed, wants) feedback. Problems cluster into several areas.

Four common problems and ways to address them

  • Communication skills can sometimes be the most obvious limiting factor in performance. These can be subdivided into language skills (a large and well-addressed topic on its own) and more subtle skills within the consultation – use of words or phrases, jargon, clarity or conciseness, tone of voice, body language and so on. These are often picked up on observation (or less often, but notably, from patient feedback). The most useful way to draw these to the attention of the learner, and to begin addressing the issues, is video debrief.
  • The easiest diagnosis is lack of knowledge. This might be revealed in a workshop quiz, a teaching visit or a supervisor’s review of cases. Sometimes GP registrars (particularly if they have done previous training in a sub-specialty) underestimate the breadth of knowledge required for general practice. Sometimes this awareness does not dawn until the exam is failed and they admit “I didn’t take it seriously”. In GP training, considerable knowledge is required: it underpins both the AKT and the OSCE. Sometimes the issue is the type of knowledge required. They may have studied Harrison’s (say) and be able to argue the toss about various auto-immune diseases or the validity of vitamin D testing, and yet have insufficient knowledge of the up-to-date, evidence-based guidelines for common chronic diseases. They may have very specific gaps, such as women’s health or musculoskeletal medicine, because of personal interests or their practice case-load. In real life the GP needs to know where to go to fill in the gaps that are revealed on a daily basis but, for the exam, the registrar needs to have explored and filled in these gaps more thoroughly. The supervisor can stretch their knowledge in case discussions, monitor their case-load, direct them to relevant resources and touch base regarding study. Registrars can present at practice meetings (teaching enhances learning). Prior to exams it is useful to practise knowledge tests and follow up on feedback from wrong answers.
  • Consultation skills deficiencies are often about structure. They may be picked up because of difficulty with time management but, equally, there may be problems within the consultation. The registrar may not elicit the presenting problem adequately, convey a diagnosis, negotiate appropriately with the patient regarding management, utilise relevant community resources, introduce opportunistic prevention or implement adequate safety netting. All these skills, and others, are necessary in managing patients safely and competently in general practice. There are many “models” of the GP consultation which can be helpful to learners if discussed explicitly. It can also be useful to have registrars sitting in for short periods with different GPs in the practice in order to observe different consulting methods. However, this is less useful if it is just a passive process and the registrar does not get the chance to discuss and reflect on different approaches. The most useful coaching is direct feedback as a result of observation by supervisors and teaching visitors. This may require extra funding.
  • Inadequate clinical reasoning is the most challenging diagnosis. Good clinical reasoning is something you would hope a registrar had acquired through medical school and hospital experience, but this is not always the case. Even if knowledge content and procedural skills are adequate, poor clinical reasoning is an unsafe structure on which to build. This issue may come to light through failure in the KFP or through observation by the supervisor. It may be necessary to go back to basics. A useful method is to utilise and explore Random Case Analysis (RCA) in teaching sessions – particularly the use of “why?” and “what if?” questions when interrogating the registrar’s decisions. Sometimes clinical reasoning needs to be tweaked to be appropriate for general practice. A re-read of Murtagh on this topic is always useful, and practice KFPs can reveal poor clinical reasoning. Registrars can sometimes be observed to apparently leap towards the correct diagnosis, or to arrive circuitously at a correct and safe conclusion, without the clinical reasoning being obvious. In these circumstances it is useful to question the registrar about each stage of their thinking and decision making so that they practise articulating their clinical reasoning.

In summary

Remediation “diagnoses” can be made in the areas of communication, consultation skills, knowledge and clinical reasoning (and, no doubt, others). The “symptoms” often come to light during observation, workshop quizzes, in-training assessments, case discussions and practice or patient feedback. Management strategies include direction to appropriate resources, direct observation, video debriefing, case discussion, practice exam questions (with feedback and action) and random case analysis. Most organisations have templates for relevant “plans” which are useful for keeping all parties on track.

Funders and standard-setters are more likely to have “policies” on remediation than any helpful resources on how to do it. There is not much in the literature, and it is often difficult to develop expertise on a case-by-case basis (with much individual variation). Prior to the reorganisation of GP training in Australia, some Training Providers had developed educational remediation expertise which could be utilised in more extreme cases by other providers. As educators we need to develop our own skills, document our findings and share freely with others. Supervisors need to know what to do if they suspect a performance issue, i.e. communication channels with the training organisation should be open.