Competency: necessary but not sufficient

This site is primarily a forum for highlighting limitations of, and frustrations with, the technical aspects of the NHS ePortfolio. However, the discussion inevitably broadens to problems with the ethos of training, how it is perceived, and specific problems with the tools currently used to assess trainees. I was pointed in the direction of this paper by Dr AnneMarie Cunningham via Twitter. It highlights some of the limitations of competency-based assessment of complex professional abilities.

A few points that struck a chord were:

* Competence does not necessarily predict performance

* The sum of what professionals do is far greater than any parts that can be described in competence terms

* How do we assess “trustworthiness”, otherwise known as “the granny test”? The question “would you trust this doctor to care for your unwell grandmother?” may be a better measure of trainee progress than any competency test

* The idea of trust reflects a dimension of competence that reaches further than observed ability. It includes the real outcome of training—that is, the quality of care

* Innovation of postgraduate training should focus on expert appraisal of performance in practice

I hope that those responsible for postgraduate training engage trainees and ensure any future changes are appropriate, of benefit to trainees and trainers, evidence-based, and “real-world tested.” A common theme from trainees is that current training is tick-box, uninspiring, and lacks true mentor/mentee relationships. Olle ten Cate’s paper contributes to this debate by highlighting the concept of trust and the complexity of assessing professional activities.

Postgraduate training can be better. But it requires enthusiastic mentors who are given the time, support, and freedom to educate and inspire the next generation of doctors.


5 responses to “Competency: necessary but not sufficient”

  1. Some changes are being made: there are pilots of a different format and use of the current WPBAs, in large part due to feedback from trainees that has trickled back to the Colleges. Let’s hope the pilots are useful and changes occur quickly.

  2. Random passing doctor

    One of the major problems with eportfolio is that the WBAs are designed to measure competencies. This is not the same as measuring competence. I think the evolution of the eportfolio could be best summarised as “We need to demonstrate that doctors in training are competent. We don’t know how to. So we’ll find something we can measure, whether it is useful or not, and insist everyone does measure it. This will then let us tell everyone that we are assessing competence, and with luck they won’t realise that we aren’t”.

    Assessment should map onto the curriculum and the necessary outcomes. Instead, we are stuck with filling out grossly inadequate forms which make it difficult to extol the superb trainee, or adequately address the needs of a trainee in difficulty. This makes being a supervisor very frustrating, so I dread to think of the effect upon the poor sods still in training. We are striving to help our juniors to develop competence, and even excellence, despite the eportfolio, rather than with its assistance.

    • Thanks for your comments – it’s fantastic to get input from assessors as well as trainees. There are some moves to make the assessments more relevant – we will have to see how the SLEs work out. I completely agree that we should be striving for excellence, and at the moment the eportfolio does not actively support this.

  3. I find this website very interesting. 

    I have had major concerns about the direction of current medical assessment for a number of years now.

    I don’t believe the current WPAs assess whether a doctor is competent. If anything, it is possible for an incompetent doctor to accumulate the correct number of satisfactory WPAs, after which it is very difficult to terminate that doctor’s training, as they have “evidence of competencies”.

    A couple of years ago I looked into the evidence behind WPAs and, to my surprise, found it lacking. I wrote to Mr Lansley (the Health Secretary) pointing out my concerns about a very bureaucratic system that, in my opinion, takes time away from learning clinical medicine and provides no better assessment. It may in fact allow poorly performing doctors to progress. There is no evidence it improves patient safety; you could argue the opposite, especially if you hold WPAs to the same evidence base, i.e. none!

    I got a letter back from the Department of Health saying, in effect, “please feed back via your trainers; this is a new process we are developing”; in other words, “we don’t care”. I did try to feed back at ARCP, when I asked the panel to provide me with evidence that WPAs improve doctors’ training and assessment. I had gone prepared with evidence that they don’t (there was an article in the BMJ a couple of years back reviewing the evidence; I can’t remember it off the top of my head, sorry). I was told, “just do WPA or if you don’t like it get a different job”. Not the most constructive feedback from the deanery, especially as I had an evidence base for my argument.

    I know a colleague who is undertaking a PhD on portfolios and WPAs; his research has shown they are probably pointless for assessing competency. Hopefully this will be his conclusion. I look forward to reading the thesis.

    Good luck with the campaign. I hope you have some success, but I think it will be difficult to achieve change. We could submit freedom of information requests to all the deaneries asking how much money they are spending on this, and ask them to demonstrate the number of lives saved for the outgoings. I would guess millions of pounds spent and few to no lives saved. I’m not sure NICE would recommend a new drug or surgical intervention on such a poor outcome for that investment.

    • Some interesting points. There are people looking into the validity of various WPBAs, and future papers will make interesting reading. I think it’s important to acknowledge that assessment is needed, and that conceptually on-the-job assessment fits with the complex nature of the role of a professional. It may be that the current tools are not the right ones, but then we need to develop others. Continuing assessment is here to stay now that revalidation is established. We need to work with the GMC and the Colleges to develop systems and processes that work for all of us, from an ideological, practical, and technical standpoint. Lots of work to be done!
