The article below was presented in June 2013 as a poster at the 17th Annual Meeting of the International Association of Medical Science Educators (IAMSE) in St Andrews:
The Foundation Programme is a two-year generic training programme which forms the bridge between medical school and specialist/general practice. In August 2012 a change in the approach to trainee assessment was introduced in the form of Supervised Learning Events (SLEs) for three key tools: the Case-based Discussion (CbD), Mini-Clinical Evaluation Exercise (MiniCEX) and Direct Observation of Procedure (DOPs). These tools directly replaced the equivalent work-based assessment (WBA) forms that were used in previous years and included Likert scales. SLEs are not individually summative; the intention has been to encourage trainees to use these tools to support ongoing learning, reflection and discussion with assessors and supervisors.
The NHS ePortfolio is a web-based tool used by 15,000 (93%) of UK trainees as part of the Foundation Programme that follows graduation from medical school. The purpose of this study was to compare the level of trainee engagement with the new SLE tools (session 2012 – 2013) with the equivalent assessment tools during the previous year (session 2011 – 2012).
To measure engagement levels we analysed trainee behaviour relating to the submission of these forms to their ePortfolio accounts over a fixed, equivalent period of time, including the time-dependent pattern of form submission. We compared the number of form submissions made by Foundation Year 1 (FY1) and Foundation Year 2 (FY2) trainees. Data were generated from the NHS ePortfolio database by counting the number of forms submitted each day, from day 1 of the Foundation training year through to 6 weeks after the start of the trainees' third post in May, 43 weeks after the start of the training year. The total number of form submissions and the rate of form submission were compared between WBA and SLE forms.
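The daily counting and weekly aggregation described above can be sketched in Python. The sketch below is purely illustrative: the form types and dates are invented sample values, not the study data, and the real analysis was run against the NHS ePortfolio database rather than an in-memory list.

```python
from collections import Counter
from datetime import date

# Hypothetical submission records: (form_type, submission_date).
# In the study these came from the NHS ePortfolio database.
TRAINING_YEAR_START = date(2012, 8, 1)

def weekly_counts(submissions, start=TRAINING_YEAR_START, n_weeks=43):
    """Bucket submission dates into per-form-type weekly totals,
    covering the 43-week study window from the training year start."""
    counts = Counter()
    for form_type, d in submissions:
        day = (d - start).days
        if 0 <= day < n_weeks * 7:
            counts[(form_type, day // 7)] += 1
    return counts

# Toy example: three MiniCEX submissions, two in week 0 and one in week 5
sample = [
    ("MiniCEX", date(2012, 8, 2)),
    ("MiniCEX", date(2012, 8, 6)),
    ("MiniCEX", date(2012, 9, 6)),
]
wc = weekly_counts(sample)
print(wc[("MiniCEX", 0)])  # 2
print(wc[("MiniCEX", 5)])  # 1
```

Running the same aggregation over the 2011 – 2012 (WBA) and 2012 – 2013 (SLE) windows yields directly comparable weekly totals and submission rates.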
For brevity and to reduce data “noise” we analysed data taken from 3 representative Foundation Deaneries, consisting of 1,600, 750 and 900 trainees respectively.
A total of 119,019 form submissions were analysed. Comparison was made between WBA form types and their equivalent SLE forms for the same time periods during the training year. No appreciable differences were observed between WBA and SLE form submissions with respect to either the number of forms submitted or the rate of submission. Minor variations in the number of DOPs versus SLE-DOPS forms were observed, though this may be attributed to changes in training requirements and the use of “Signed Procedure” forms.
During 2011 – 2012 SLE versions of MiniCEX, CbD and DOPs forms were not available. On 1st August 2012 the SLE forms were introduced, replacing their WBA equivalents.
The accompanying figures show the total weekly form submissions during the training year beginning on 1st August 2011, compared with those for the training year beginning on 1st August 2012.
DISCUSSION AND CONCLUSIONS
The results presented here show that the pattern of creation of SLE forms does not significantly differ from that of the work-based assessment (WBA) forms they replaced. The total number created is largely governed by the curriculum requirements of the Foundation Programme; in the sample deaneries used in this study 69,000 WBA forms were completed compared to 50,000 SLE forms.
Anecdotal criticism of WBAs has been that assessment (and associated scoring) of procedures may not best support a learning process throughout training, as trainees might wait until nearer the end of placements to be assessed, thereby gaining better scores. It has previously been observed that the end of each placement is characterised by a significant rush by trainees to submit the required number of WBA forms, commonly within days of the placement end date. With the introduction of Supervised Learning Event (SLE) based forms it might be expected that these tools would result in more measured and timely submission of SLE forms during a trainee’s placement. However, no significant shift in trainee behaviour was observed in either the number or timing of form submissions. Indeed, the pattern of SLE form submission was remarkably similar to that of WBAs.
Of interest was the fact that previous experience of WBAs did not result in different FY2 trainee behaviour when switched to SLEs, as compared to FY1 trainees who were entirely new to SLEs.
To conclude, the lack of any behavioural change in form-filling patterns might be attributed to either (i) a lack of training/understanding by trainees (and supervisors) of how to use the SLE tools, or (ii) the fact that form-filling during training in a work environment is by its nature likely to result in trainees leaving completion of WBA or SLE forms until late in a particular placement.
Following discussions during poster sessions, the issue of trainees being able to get SLE forms completed by assessors appears problematic: the process of trainees having to request assessment/SLE completion using a ticketing system can mean the “moment has passed”. The development of a mobile app that allows offline completion of tickets and forms may have an impact on this, so repeating this data analysis in 2014 may be telling.
This is also available as a Prezi slide presentation.
NES would like to thank the Academy of Medical Royal Colleges, the UK Foundation Programme Office and the representatives of the Postgraduate Deaneries for their continued support.