Tag Archives: Training

Shape of Training – influence the next 30 years of medical training!

You have only days left to shape medical training for the next 30 years.

The Shape of Training review aims to plan how doctors should work and train in the next 30 years. This is your chance to directly tell decision-makers what you want postgraduate training to look like.


  • Should we have more generalists and fewer specialists?
  • Should there be a speciality of General Internal Medicine (distinct from Geris)?
  • Should all medical trainees CCT in GIM before starting speciality training?
  • Should more specialities dual-accredit and therefore contribute to the acute take (hello Rheumatology, Dermatology, Renal, Oncology……)?
  • Should F2 be abolished?
  • Is training flexible enough?
  • How can trainees be supported to learn from their experiences?  
  • Is the balance right in the current system between training and service provision?

I have strong views on many of these questions (the answer to the last one is NO!).

“This review takes place in a rapidly changing environment. Medical and scientific advances, evolving healthcare and population needs, changes to healthcare systems and professional roles, the push towards more care provided in the community, the information and communications technology (ICT) revolution, and changing patient and public expectations will all affect how doctors will practise in the future. We therefore need to consider what these changes mean for the way doctors are trained.”

The survey is long, but is so important that it’s worth the effort. I recommend you put aside some time, make an extra large cup of tea, and really dedicate some brainpower to your answers. This is the best chance you will ever have of influencing the shape of medical training. Don’t let it slip through your fingers.

In particular, I would consider mentioning your views on WPBAs and the ePortfolio in these questions:

  • 13: How do we make sure doctors in training get the right breadth and quality of learning experiences and time to reflect on these experiences?  (Better software in which reflection could be logged on the go, and reflections could be tagged and organised/visualised/shared more flexibly would help. Time spent with mentors instead of filling in paperwork would also be great)
  • 14: What needs to be done to improve the transitions as doctors move between the different stages of their training and then into independent practice? (Interoperability of ePortfolio systems would be a start)
  • 18: Are there other changes needed to the organisation of medical education and training to make sure it remains fit for purpose in 30 years time that we have not touched on so far in this written call for evidence? (yes…..)

Go on. Respond… 

News from the trenches

If you have not already seen the blog by Dr Fitz, “NHS ePortfolio WPBAs in CMT: are they educationally useful?”, I recommend you make yourself a cup of tea and have a read. He has taken a significant amount of time out of his day as a medical trainee to document his experiences of using Workplace Based Assessments (WPBAs) in the “real world.” He has done this not because he likes to rant, but because he is genuinely interested in how he can support his own learning, and ensure that the assessments he undertakes are valid and useful. He is like many of us – he wants to be a better doctor – but wonders whether the system currently helps or hinders this.


Dr Fitz seems representative of many Core Medical Trainees (CMTs). In 16 months he has undertaken 51 assessments (ACATs*, CbDs*, miniCEXs*, DOPS* and Teaching Assessments) and 2 rounds of MSFs* (360 degree assessments) with 35 responses from colleagues in all. This is a little more than the minimum requirement for his ARCP (annual appraisal) and is a good bank of data on which to reflect. Well, it should be. Dr Fitz looks in detail at the contents of his ePortfolio and wonders what it really tells him. Of particular concern is the documented feedback that is at the heart of these assessments:

“Unfortunately the stats don’t look good. Over the course of 9 ACATs, covering the management of 55 patients over 12 months, I received 127 words of feedback. That is 14.1 words per ACAT and 2.3 words per patient seen. About 6 tweets.”

He is quick to point out that this does not necessarily reflect the amount or quality of verbal feedback he received, but

“…the educational benefit of my on-calls and the instruction I received from my consultants was separate to, not part of, my ACATs.”

Sadly the story is very similar for his CbDs, miniCexs and DOPS. The quality of the feedback documented on his ePortfolio is poor, and is of no use to him when he reviews his ePortfolio to prepare for his ARCP and consider what should be on his Personal Development Plan (PDP).
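The arithmetic behind the figures Dr Fitz quotes is easy to check. A quick sketch (the numbers come from his quote above; the characters-per-word figure used to estimate “about 6 tweets” is my own rough assumption):

```python
# Figures reported by Dr Fitz for his ACATs
total_feedback_words = 127
acats = 9
patients_covered = 55

words_per_acat = total_feedback_words / acats            # ~14.1
words_per_patient = total_feedback_words / patients_covered  # ~2.3

# Rough tweet equivalence: assume ~6 characters per word (including spaces)
# against the then-140-character tweet limit
tweets = total_feedback_words * 6 / 140  # ~5.4, i.e. "about 6 tweets"

print(round(words_per_acat, 1), round(words_per_patient, 1))
```

Fourteen words of written feedback per on-call assessment is the headline number, and it survives scrutiny.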

His conclusion is balanced and reflects the feelings of many trainees who have commented here previously:

“Overall, my experiences of the NHS ePortfolio assessments for CMT is that whilst they may act as a record of learning, they fail to be a useful educational tool in themselves. This is mainly due to the discord between how they are supposed to be completed and how they are completed in practice. Teaching, supervision and education is happening, but it is in spite of WPBAs rather than because of them.”

He doesn’t end there and has several suggestions for improvements. They mirror comments already on this blog and range from urgent technical improvements (really, do I have to mention an app yet again?) to faculty development. This is just the kind of input Training Programme Directors and National Programme leads want and need. Trainees have reasonable and real concerns about their training, and are engaged and enthusiastic about improving it.

Harnessing this enthusiasm will be vital.

If you haven’t already contributed to the Shape of Training review please do so. This is a collaboration between several higher bodies including the General Medical Council, Medical Education England, the Academy of Medical Royal Colleges, the Medical Schools Council, NHS Scotland, NHS Wales and the Conference of Postgraduate Deans of the UK. The review is considering what changes are needed to postgraduate medical training to make sure it continues to meet the needs of patients and health services in the future. This includes options to support greater training and workforce flexibility, and how to address the tensions between obtaining training and providing a service. You have until February to make your voice heard.

* ACAT = acute care assessment tool, CbD = case-based discussion, miniCEX = mini clinical evaluation exercise, DOPS = directly observed procedural skills

A Comment Cloud

This is a wordle, made up of all the comments left on this site:

The prominent words are those that have featured more frequently, and include: ePortfolio, think, learning, NHS, work, trainees, system, need, training, good, open source, evidence, use, app and people. 

Interesting.
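The mechanics behind a cloud like this are simple: count word frequencies across the comments and size each word accordingly. A minimal sketch, using made-up stand-in comments rather than the site’s real ones:

```python
from collections import Counter
import re

# Hypothetical comments standing in for those actually left on the site
comments = [
    "The ePortfolio needs better software and an app",
    "Trainees need time to reflect, not more paperwork",
    "Open source the ePortfolio so trainees can improve it",
]

# Tokenise, lowercase, and drop common stop words before counting
stop_words = {"the", "and", "an", "to", "not", "more", "so", "can", "it"}
words = [
    w
    for comment in comments
    for w in re.findall(r"[a-z]+", comment.lower())
    if w not in stop_words
]
freq = Counter(words)

# The most frequent words are drawn largest in the cloud
print(freq.most_common(2))  # [('eportfolio', 2), ('trainees', 2)]
```

Unsurprisingly, even in this toy version the vocabulary of the debate floats straight to the top.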

What to do with all that data?

I encourage you all to read the blog of NHS ePortfolio developer @zingmatter, and look at the presentation he gave at AMEE 2012 (a recent medical education conference): “Assessing NHS ePortfolio behaviour: variations in the online activity of doctors as they progress through training.”

With thanks to http://www.acunu.com/ for the badge!

I was at the presentation at AMEE 2012 and, although the title may not sound gripping, I was fascinated to hear what could be learnt from the vast amounts of data ready and waiting to be analysed on NHS ePortfolio site use. The development team (including @zingmatter) had done a great job of drilling down into the data, using Google Analytics and internal tracking, in order to extract meaningful information from the thousands and thousands of logins and episodes over a year.

The Prezi can be seen on the “elastic elephant” blog.

However, my first thoughts on seeing the conclusions of the presentation were “they’re asking the wrong questions” and “if they wanted to know that they should have just asked the trainees.” Many of the peaks and troughs seen on the graphs were entirely predictable (ie pre-ARCP), and some of the conclusions drawn by the developers on “depth of use” were weak. I could explain away many of the findings, as I know how trainees use the site is a function of what hoops are put in front of them to jump through. I was also sceptical about the conclusion that trainees change their behaviour in relation to the ePortfolio over the course of their training. FYs and ST6s may interact differently with the site, but there are so many confounders that a snapshot comparison is not a valid way to assess this: a longitudinal study would be required.

Despite these reservations, reading @zingmatter’s blog gives me hope for the future, as the developers at NES are committed to engaging with the needs of users. In our often passionate discussions on social media (including this blog and twitter) we must remember that we come from very different perspectives, and have unique sets of knowledge and skills.

As @zingmatter points out:

“There is a balance between college needs and trainee needs in the design of an e-portfolio and possibly this type of data can help inform this debate.”

We also have to make sure we are not misdirecting our frustration at the wrong people, and potentially alienating them:

“while I’m happy to ask simple questions about user flow, user experience and so on, questions about the educational implications of this data have not been well addressed as it’s not really in my sphere of knowledge (or in my job description). I would see the research I presented at this conference as a ‘this is the kind of thing we can do’ exercise that should lead on to better designed questions that will allow us to understand how best to develop an e-portfolio that supports effective learning and development through the effective delivery of a training programme.”

I really hope we can work together to ask the right questions and use all the data we have to inform the process. All we need now is the Royal Colleges on board and we can really maximise the potential of the ePortfolio.

Just imagine a world in which trainees didn’t hate the NHS ePortfolio. It has the potential to be a useful tool to encourage self-directed learning, provide evidence of experience and achievements, act as a showcase for job applications and excellence awards, and strengthen the relationship between trainee and trainer. This world is far away, but perhaps we are starting to see the path forward…

Surgical spirit: what the surgeons think of their ePortfolio

An article published recently in the Journal of Surgical Education looks at the experience of surgical trainees and their ePortfolio. As a Medical Registrar I am in danger of being disowned by my colleagues for suggesting that we may be able to learn something from the surgeons! But in relation to ePortfolio use, many parallels can be drawn between the experience of surgical and physician trainees.

The surgical ePortfolio (ISCP) became mandatory for British surgical trainees 5 years ago, with a compulsory £125 annual fee. In 2008 widespread dissatisfaction was reported. This article (by Pereira and Dean) surveyed 359 users across all specialities and geographical areas. Although ratings improved between 2005 and 2008, trainees were underwhelmed overall. Unfortunately the article is not open access (it sits behind a paywall), so I have selected some quotes for discussion below.

My love don’t cost a thing (but my training does….):

“An evaluation by ASiT estimated conservatively the upward spiralling costs of surgical training to the trainee to be £130,000 even before the introduction of MMC, with ISCP and its mandatory annual fee amounting to an additional £1000 over 8 years of surgical training.”

No medic would claim to be poorly paid, but there must be honesty and transparency with regard to the significant financial burden placed on trainees. This is likely to become more pressing as graduates leave medical school with escalating debts. Value for money is high on the agenda.

The current cost of the physician ePortfolio is only £18 per trainee per year, but perhaps this needs review, especially in the context of calls for investment to improve functionality. Trainees have a poor understanding of the costs of training and there is a disconnect between payment of JRCPTB fees and any visible outcomes in terms of education and training. Surely a lesson for all Colleges and higher bodies is that greater engagement and consultation with trainees could help prevent widespread and growing resentment.

A teacher affects eternity; he can never tell where his influence stops 

Sir William Osler, a great clinical teacher

“..incentive for trainer and assessor engagement remains lacking. It is important that trainers are properly recognized and rewarded for the time that they spend assessing and supervising trainees if obliged to use increasingly time-consuming methods, and we would welcome any system that encourages them.”

We must spare a thought for the Consultants who are striving to support us in our professional development. Demands on their time come from all directions and, unfortunately, postgraduate education and training is often the thing that loses out and gets pushed to the bottom of the mounting to-do pile. The system needs to reward and encourage senior clinicians so that they make time to give high-quality feedback to trainees during WPBA completion. But this is a long-term aim that feels intangible and unattainable. In the short term, reducing the time it takes to complete WPBA paperwork will make everyone happier. An app seems the quickest way to achieve this.

A call for EBT: Evidence Based Training

“Recently the JCST has specified a minimum of 40 WPBAs per year to be completed as a ‘quality indicator’ for surgical training and career progression…Regional training programs have set directives for mandatory WBAs per annum, ranging from a minimum JCST dictat of 40 to the 80 required in London. These present a great challenge upon time available to any practicing surgeon.”

“…a recent systematic review that includes our first survey suggests that there is no evidence that they [WPBAs] improve physician performance. It goes on to conclude that multisource feedback may be helpful, but that individual factors, context of feedback, and presence of facilitation (ie mentoring) may improve trainee responses.”

These sentiments will sound familiar to physicians, many of whom also feel frustrated at the widespread adoption of WPBAs, for which there is limited evidence of value for trainees in the real world. Valid concerns have been raised about the difficulty of applying theoretically helpful frameworks and tools to the realities of clinical life, and it is unclear where the numbers set by training boards have come from.

“ISCP has improved its interface, but it and other electronic portfolios deliver an increasingly overwhelming bureaucratic burden of WBAs and domains of evidence to include in a portfolio. These have rapidly become entrenched in postgraduate physician training in the UK, spreading a plague of box-ticking exercises that continue to increase year on year….It is of particular concern that so many trainees (80%) felt that ISCP did not improve their training after a modal average of over three years using it.”

Again these feelings will be familiar to many of those who have commented on this site and engaged with the debate on twitter. Time is precious. Many feel that the current demands on trainees, coupled with inadequate technology, steal it away from busy trainees and trainers.

Perhaps it is time to ask the question: who is the ePortfolio for? Is it a learning tool for trainees? Is it an evidence vault for Royal Colleges to check off competencies of registered members? It is unclear to me what the aims of the NHS physician ePortfolio were at its inception. Have they been reassessed as it has expanded and evolved? There is great potential to improve the ePortfolio so that it better serves the needs of trainees, trainers, assessors and higher bodies. We have an opportunity to seek clarification and contribute to making the aims and expectations explicit. Let’s not let it pass us by.

The authors of the paper conclude:

“The performance of ISCP has improved in the 4 years since its inception with proportionately less negative feedback. British surgeons remain dissatisfied with several of its tools, in particular its workplace-based assessments. Half a decade on, these assessments remain without appropriate evidence of validity despite increasing demands upon trainees to complete quotas of them. With reduced permitted training hours, the growing online bureaucratic burden continues to demoralize busy surgical trainers and trainees.”

These conclusions should ring alarm bells not only for the Royal Colleges, but for the wider community of healthcare leaders. The NHS faces many challenges, and a demoralized workforce will struggle to face them. Physician and surgical trainees feel overburdened and undervalued. The system needs to change. Who will lead this change? And where will the ePortfolio fit in? Answers on a postcard…..

E.A. Pereira, B.J. Dean. British surgeons’ experiences of a mandatory online workplace-based assessment portfolio resurveyed three years on. J Surg Educ (2012). doi: 10.1016/j.jsurg.2012.06.019

A. Miller, J. Archer. Impact of workplace based assessment on doctors’ education and performance: a systematic review. BMJ, 341 (2010), c5064

E.A. Pereira, B.J. Dean. British surgeons’ experiences of mandatory online workplace-based assessment. J R Soc Med, 102 (2009), pp. 287–293

S.A. Welchman. Educating the surgeons of the future: the successes, pitfalls and principles of the ISCP. Bull R Coll Surg Engl, 94 (2012), online

W.C. Leung. Competency based medical training: review. BMJ, 325 (2002), pp. 693–696

Can we make ePortfolio open source? a guest post from Karen Beggs

My first question is WHY?

Here are the main issues I hear about:

  • A lot of trainees aren’t happy with workplace based assessments
  • Internet speed is an issue in some NHS locations
  • Some people don’t like using an ePortfolio
  • Some people want to have more input into ePortfolio design
  • Some trainees want their seniors to be more engaged with their learning
  • There is a common misunderstanding that College membership fees are used solely to pay for the ePortfolio

So what are we already doing about these issues?

  • We are eliciting feedback directly from the wider ‘user’ community through social media to find out what usability improvements we can make…and get them done.

We’ve started this already: following a conversation last month with a trainee who was frustrated by the curriculum linking process, our architect made a simple change that was deployed a few days later (see demo here), reducing the number of clicks needed to make multiple links. We have also introduced a twitter feed, visible on the www.nhseportfolios.org home page.

We are moving to a more elastic hosting environment so that as the system gets busier it can engage more resources to deal with the increased load. We aim to have this fully implemented by autumn 2012.

The NHS ePortfolio team do not make decisions about assessment processes, training requirements or the use of specific workplace based assessments. Expertise in these areas usually lies with the Colleges, Postgraduate Deans and the Academy of Medical Royal Colleges. However, we can:

  • Help connect trainees with ideas (or complaints) with the relevant people, whether College, user group, developers or others. In July 2012 I attended the AoMRC Trainee Doctors Committee, and as a result will recommend to the Chair of the specialty ePortfolio User Group that a representative from the committee sits on that group. Some trainees are not aware of their own College decision making processes, and we will pass on contact details as required.

Would open source address any of these issues?

As far as I can see, NO!

But why don’t we just hand over the code to the large community of willing, enthusiastic OSS developers?

  • It’s not as easy as just handing out ‘the code’. Open source software must comply with a number of criteria (see www.opensource.org), many of which would contradict the current NHS ePortfolio license terms.
  • Who would fund re-writing and re-negotiating software licenses for the existing 25 or so organisations using the NHS ePortfolio? What if one of these organisations objects? It’s an integrated application with many shared features, so to separate out one ‘Customer’ would require a large-scale re-write. That seems to defeat the purpose.
  • The ePortfolio is integrated with a number of external (usually College run) systems and moving to an OSS model would have implications for each of these systems. Would Colleges want to pay to conduct a thorough risk assessment before signing up? And would they then want to pay for any adjustments needed to maintain the integrity of their own systems?
  • There would still have to be stringent controls over the quality of the code submitted. This would require a quality control team – possibly a larger one than we have at the moment. Who would pay for this?

I’m not sure I quite follow the argument: get rid of our current team of developers (some of whom have been with us for over 4 years), keep our fingers crossed that some OSS developers can meet our commitments, and beef up our QA team so they can check the code of the unknown OSS developers. It seems we would increase our risks (of not meeting SLAs), decrease predictability (how can we hold anyone to a delivery date if we don’t employ them?), and end up with a QA team but lose our development expertise (the current team wouldn’t hang around for long – why would they?). I can’t see a sustainable business model in here unless we were to maintain a large core team – and if we do that, where are the assumed cost savings of OSS?

I have heard arguments that OSS is cheaper overall, but I don’t really see that cost is the problem (see My first question is WHY? above). It seems to me that the per capita charges for the ePortfolio are pretty reasonable. There is currently no charge made for any supervisor (educational or clinical), programme director, administrator, ARCP panel member or assessor using the ePortfolio. Per capita charges are based only on trainees at present. Would OSS have any impact on this? I can’t see that it would.

Final thoughts

If we were starting from scratch we would look at OSS as one of the options. We would probably look at an off-the-shelf ePortfolio too. We would be foolish not to. But we are not starting from scratch. We have an established, bespoke ePortfolio that is used across the professions (we have versions for Dentists, Nurses & Midwives, Pharmacists, Doctors and Undergraduates), is integrated with a number of external systems, and captures over a million forms submitted by ‘assessors’ every year. Each version has a custom set of features, making it adaptable and cost effective (sharing an underlying code base and database).

Many of the problems we hear about relate to complaints about the educational processes, and changes are already underway to address these (eg move to Supervised Learning Events in Foundation from August 2012). We contribute to these discussions when appropriate.

We have developed good relationships with our broad range of Customers, and continue to work with them to improve our change control and development processes. We work within the constraints of the NHS, which impacts our management of finance, procurement, stakeholder management, technology and decision making, as well as our governance arrangements.

We have an established application and an experienced team, whose expertise and commitment cannot be overstated. Our development costs are at the lower end of the market, and maintenance charges are extremely good value. We can bring in additional specific expertise as and when we need to.

I can’t help but think the suggestion to move NHS ePortfolio to OSS is a solution to the wrong problem.

An app-ortunity

Quite reasonably I have been asked how an NHS ePortfolio app would benefit doctors, and what it would have to do to be worth any investment. In my opinion the need for an app is driven by the need to make WPBAs more relevant. An app would put control back in the hands of trainees, and make life significantly easier for trainers/assessors. This would reduce resentment towards WPBAs and would save an unimaginably huge amount of time for a stressed, squeezed, overworked profession.

The current situation:

I am a doctor in training (this covers everyone who is not yet a Consultant/GP partner). I am required to complete a certain number of WPBAs to progress. One day I am at work, on call, admitting new patients to hospital. I think I’ve made a pretty thorough assessment of a patient with a condition I’ve not encountered before and ask my Consultant if, after presentation of the case on the post-take ward round, they can fill in a mini-CEX. They say yes.

I present my case during the round and the Consultant provides some useful immediate feedback on my assessment, including a recommendation to read a recent review on the subject in an academic medical journal. However, the Consultant has another 7 patients to review after this and can’t stop to find a computer, log in, wait for it to load, access the NHS ePortfolio website, log in again and complete the assessment. “Send me a ticket,” they say, with a genuine intent to complete it as soon as possible. My shift gets busier and after 13 hours at work I go straight to bed when I get home. The next day I am very busy and forget to send the ticket via email. I remember when I get home but realise I don’t know the Consultant’s email address. It’s a weekend so I’m not in for another 2 days. I set a reminder with an alarm on my phone, and on Monday the alarm prompts me to retrieve the email address from the hospital system and send the ticket from the NHS ePortfolio site.

A week later the assessment has not been completed and I send a reminder. Three days after this I bump into the Consultant in the lunch queue and gently remind them about the mini-CEX. They make excuses, feel bad, and promise to do it ASAP.

A week after this the Consultant finally has some time for admin and discovers my reminder email in their inbox. They log in and struggle to remember anything about the patient or the feedback they gave me. They have an overall impression of whether I’m any good or not and complete the assessment mainly based on this overall view, rather than the specifics of the case we discussed. I get an email to say that the assessment has been completed. At a later date I log in and read the comments, which are brief, and get no educational benefit from the record of the episode. I do, however, feel less stressed, as that’s one less assessment to get ticked off. I can’t remember the author of the review recommended by the Consultant and never quite get round to searching for it.

A possible future situation:

One day I am at work and am on call admitting new patients to hospital. I think I’ve made a pretty thorough assessment of a patient with a condition I’ve not encountered before and ask my Consultant if, after presentation of the case on the post-take ward round they can fill in a mini-CEX. They say yes.

I present my case during the round and the Consultant provides some useful immediate feedback on my assessment. I get out my smartphone and login to the NHS ePortfolio app. I bring up the mini-CEX form and we complete it together adding comments based on the feedback the Consultant has just given, including the recommendation to read a recent review by author X in journal Y. There is a prompt to enter the Consultant’s email address so that they can validate the mini-CEX as an accurate representation of the assessment, and I input this as the Consultant dictates it. I save the form. The Consultant continues with the post-take ward round. I continue to admit new patients.

When I get home my phone picks up my wifi signal, and the ePortfolio app automatically syncs with my account so that the mini-CEX is uploaded. An email is sent to my Consultant and me to inform us of this new entry on my ePortfolio. I don’t have to waste time chasing up multiple assessments like this, so I actually get round to looking up the review recommended by the Consultant, and learn something that will benefit my future patients.

It is essential that an NHS ePortfolio app:

  • is cross-platform (iPhone, Android etc.)
  • can perform most functions offline, syncing later with the main site. Most NHS hospitals have no wifi and poor phone signal coverage; if an app required a connection it would be of no use to many, many users
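The offline-first behaviour described above boils down to a local queue: completed forms are saved on the device and uploaded whenever a connection appears. A minimal sketch of that idea (all names here – the queue file, the `upload` callback – are illustrative assumptions, not the real NHS ePortfolio API):

```python
import json
import time
from pathlib import Path

# Hypothetical local store for completed-but-unsynced WPBA forms
QUEUE = Path("wpba_queue.json")


def save_form(form: dict) -> None:
    """Append a completed assessment to the local queue (works offline)."""
    queue = json.loads(QUEUE.read_text()) if QUEUE.exists() else []
    queue.append({**form, "saved_at": time.time()})
    QUEUE.write_text(json.dumps(queue))


def sync(upload) -> int:
    """Try to upload queued forms; keep any that fail for the next attempt.

    `upload` is a callback returning True on success – in a real app it
    would POST the form to the ePortfolio server. Returns the number of
    forms successfully uploaded.
    """
    queue = json.loads(QUEUE.read_text()) if QUEUE.exists() else []
    remaining = [form for form in queue if not upload(form)]
    QUEUE.write_text(json.dumps(remaining))
    return len(queue) - len(remaining)
```

The key design point is that nothing is lost if the ward has no signal: the mini-CEX completed at the bedside simply waits in the queue until the phone finds wifi at home.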

Another possible function would be to record reflection-in-action – essentially quick notes about events that are particularly challenging, satisfying and so on. There would then be scope to comment on these in the portfolio later (reflection-on-action). Professionals must be reflective to learn and develop, but there is debate around the value of writing these reflections down. An app would at least make the process easier for those who wished to do so.

Oh, and of course ideally it would be free. But I’d pay £0.69 to make my life easier, wouldn’t you?