Cross-department Benchmarking Data

Introduction

Rotamap provides twice-yearly benchmarks for our client hospital organisations. We review these at our autumn and spring events, where we also discuss interesting questions and themes that have emerged from operational data. The benchmarks allow departments to see how their service is made up of different types of work, such as extra usage. A department can not only compare its average against those of other departments but also, more importantly, examine its variation around this average. Rotamap also provides per-department service reports containing data and charts that can help to identify trends within a department and assess its health. P-charts are discussed in this article; other features of the service reports were covered in more detail at our Autumn 2018 event, which you can read more about here.

Our spring 2019 event featured Dr Mark Cox of Chelsea and Westminster Hospitals, who presented his findings on using data from CLWRota to analyse the delivery of anaesthetic training sessions for UK junior doctors. You can read more about his study here.

In this article: key benchmarking figures for CLWRota and Medirota departments, a look at the relationship between leave and cancellations, and some speculative analysis of how past data could be used to predict future capacity.

Key benchmarking figures

For our spring event, held at the Royal Society of Arts in central London on 22 March 2019, Rachel Christie and Tom Bermejo presented the latest edition of our benchmarking data for departments using CLWRota and Medirota. The data spans the period from 1 January to 30 December 2018 and covers over two million sessions of work.

Anaesthetics benchmarks

The table below summarises the mean and median across CLWRota departments for each of the benchmarking metrics. In total, 133 anaesthetics departments were included in the analysis, with three of these excluded from the Demand vs Actual metric. Our benchmarking packs include boxplots showing each of these metrics broken down by department (shown in full at the end of this article). Below is the Demand vs Actual boxplot, which measures the actual work achieved by a department as a percentage of its templated work. All other metrics are measured as percentages of all sessions across a department.

Benchmarking metric        Mean (%)    Median (%)
Demand vs Actual           95.2        95
Solos                      4.2         3.2
Junior Solos               2.6         1.8
Senior Solos               3.1         2.3
Extras                     8.8         7.3
Study leave                5.3         4.5

Figure 1. Boxplot showing anaesthetic departments' actual sessions as a percentage of demand.
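
As a rough illustration of how figures like these could be derived, the sketch below computes a Demand vs Actual percentage and a solo-session percentage per department from simulated session-level data, then summarises them with a mean and median. The column names, data layout and values are assumptions made for the example, not the CLWRota data model.

    import pandas as pd

    # Hypothetical session-level extract: one row per session, with flags for
    # whether the session was on the template, actually delivered, and solo.
    # The layout and values are illustrative, not the CLWRota schema.
    sessions = pd.DataFrame({
        "department": ["A", "A", "A", "A", "B", "B", "B", "B"],
        "templated":  [True, True, True, True, True, True, True, False],
        "delivered":  [True, True, True, False, True, True, True, True],
        "solo":       [False, True, False, False, False, False, True, False],
    })

    grouped = sessions.groupby("department").agg(
        templated=("templated", "sum"),
        delivered=("delivered", "sum"),
        solos=("solo", "sum"),
        total=("solo", "size"),
    )

    # Demand vs Actual: delivered sessions as a percentage of templated sessions.
    grouped["demand_vs_actual_pct"] = 100 * grouped["delivered"] / grouped["templated"]
    # Solos: solo sessions as a percentage of all sessions in the department.
    grouped["solos_pct"] = 100 * grouped["solos"] / grouped["total"]

    metrics = grouped[["demand_vs_actual_pct", "solos_pct"]]
    print(metrics)
    print(metrics.agg(["mean", "median"]))  # summary figures of the kind reported above

Each of the other benchmarking metrics could be expressed in the same way, as a percentage of all sessions across a department.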

Medirota benchmarks

In the table below are the key figures for the Medirota benchmarks. In total, 81 departments from surgical specialities were included in the overall benchmarking analysis, with 11 departments excluded from the Demand vs Actual boxplot. The boxplots for extras and cancellations are constructed from 13 points of data for each department, taken at 4-weekly intervals; all other boxplots represent 52 points of data taken at weekly intervals. Departments with exclusively zero values for extras and cancellations were not included in those boxplots.

Benchmarking metric         Mean (%)    Median (%)
Demand vs Actual            95.2        95
Extras in Clinics           2.9         0.8
Extras in Theatres          3.6         0.9
Cancellations in Clinics    12.3        6.2
Cancellations in Theatres   9.9         6.6
Study leave                 2.8         2.1

Figure 2. Boxplot showing surgical departments' actual sessions as a percentage of demand.
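
As noted above, the extras and cancellations boxplots are built from 13 four-weekly data points rather than 52 weekly ones. The sketch below shows one way that kind of aggregation could be done; the weekly series is simulated and the grouping rule is an assumption for illustration, not the exact Medirota calculation.

    import numpy as np
    import pandas as pd

    # Simulated weekly cancellation counts for one department over 2018.
    rng = np.random.default_rng(0)
    weeks = pd.date_range("2018-01-01", periods=52, freq="W-MON")
    weekly = pd.Series(rng.poisson(4, size=52), index=weeks, name="cancellations")

    # Roll the 52 weekly values up into 13 four-weekly data points, matching
    # the 13-point extras and cancellations boxplots described above.
    four_weekly = weekly.groupby(np.arange(len(weekly)) // 4).sum()

    print(len(four_weekly), "four-weekly points")
    print(four_weekly.describe())  # quartiles that would drive this department's boxplot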

Leave versus cancellations

Cancellation data in Medirota and CLWRota can be particularly variable, with large peaks often coinciding with school holidays. These peaks can be identified using process control charts, which are provided in the department service reports. Figure 3 is a p-chart showing weekly cancellations over the year for a small paediatrics department, highlighting two weeks of the year during which the department's cancellation count was abnormally high. Both of these weeks were school holiday weeks, suggesting a correlation between cancellations and leave.
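
For reference, a p-chart tracks a proportion over time (here, the share of a week's sessions that were cancelled) against control limits set three standard errors either side of the overall proportion, so that weeks falling outside the limits are flagged as unusual. The sketch below illustrates that calculation on simulated weekly figures; the calculation used in the service reports may differ in its details.

    import numpy as np
    import pandas as pd

    # Simulated weekly figures for a small department: sessions scheduled each
    # week and how many of those were cancelled. All numbers are illustrative.
    rng = np.random.default_rng(1)
    scheduled = pd.Series(rng.integers(35, 45, size=52), name="scheduled")
    cancelled = pd.Series(rng.binomial(scheduled, 0.08), name="cancelled")

    p = cancelled / scheduled                  # weekly cancellation proportion
    p_bar = cancelled.sum() / scheduled.sum()  # centre line: overall proportion

    # p-chart control limits: three standard errors either side of the centre
    # line, with the width varying week by week with the number of sessions.
    sigma = np.sqrt(p_bar * (1 - p_bar) / scheduled)
    upper = p_bar + 3 * sigma
    lower = (p_bar - 3 * sigma).clip(lower=0)

    flagged = p[(p > upper) | (p < lower)]
    print(f"centre line: {p_bar:.3f}")
    print("weeks flagged as unusual:", list(flagged.index))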

Figure 4 explores this link in more detail, showing the leave taken (counted in missed sessions) and the cancelled sessions per week within the department. Whilst there is a clear link between the two variables here, this is not seen in all departments; Figure 5 shows the cancellations and leave taken in a contrasting anaesthetics department.

Figure 3. P-chart for cancellations in a paediatrics department.

Figure 4. Line graph of cancelled sessions versus leave taken for a paediatrics department.

Figure 5. Line graph of cancelled sessions versus leave taken for an anaesthetics department.

The lack of correlation can best be displayed using a scatter plot comparing leave taken to cancellations, with a point representing each week of the year (Figure 6).

Figure 6. Scatter plot of cancelled sessions versus leave taken (counted in missed sessions) in an anaesthetics department.

Further analysis compared cancellations with different types of leave in the anaesthetics department (Figure 7). Despite the lack of correlation for total leave, opposing trends were found for the different leave types. Planned leave shows a slight positive correlation with the number of cancellations, whilst study leave and unplanned leave show a negative correlation; that is, when more unplanned or study leave is taken, fewer cancellations occur. A possible explanation for the negative correlation with study leave is that training courses generally take place outside of school and public holiday periods.

Figure 7. Cancelled sessions versus leave taken, by leave type, in an anaesthetics department.
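
A minimal sketch of how correlations like these could be computed is shown below, using the Pearson correlation between weekly cancellations and each leave type. The weekly counts and column names are simulated for illustration and are not taken from the departments discussed above.

    import numpy as np
    import pandas as pd

    # Simulated weekly counts for one department; the column names are
    # illustrative rather than taken from the department discussed above.
    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "cancellations":   rng.poisson(5, 52),
        "planned_leave":   rng.poisson(10, 52),
        "study_leave":     rng.poisson(3, 52),
        "unplanned_leave": rng.poisson(2, 52),
    }, index=pd.RangeIndex(52, name="week"))

    # Pearson correlation of cancellations against each leave type, mirroring
    # the per-leave-type comparison of Figure 7.
    per_type = df.drop(columns="cancellations").corrwith(df["cancellations"])
    print(per_type.round(2))

    # Total leave versus cancellations, as in the scatter plot of Figure 6.
    total_leave = df.drop(columns="cancellations").sum(axis=1)
    print("total leave vs cancellations: r =", round(total_leave.corr(df["cancellations"]), 2))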

Using data from a large university trust, we compared the number of cancellations in the anaesthetics department to the leave taken (counted in missed sessions) in the trust's other departments. A positive correlation (r = 0.63) was found between the two, and Figure 8 shows how the cancellations in the anaesthetics department and the leave across the rest of the trust follow the same trend over the year. This suggests that further analysis of trust-wide data could be useful for investigating how departments interconnect and affect each other.

Figure 8. Line graph of the number of cancellations in the anaesthetics department and the leave taken (counted in missed sessions) in other departments within the trust.

Actionable data

During Rotamap's September 2018 event, Corrina Davies from the Royal Bournemouth and Poole anaesthetics department raised the possibility of using data from previous years to help make predictions for the upcoming year. With this in mind, Tom and Rachel presented some speculative analysis in March 2019, attempting to use leave data from one anaesthetics department to predict that department's capacity for the following year (Figure 9). The graphs below show when the department's capacity is predicted to be above or below its expected demand, based on the templates in the system and the leave figures for the previous year.

Figure 9. Line graph for a department's predicted capacity for 2019 using the leave taken in 2018.

Departments could use this predicted capacity to estimate how many extras they would need at specific times of the year. Figure 10 looks at another department's data, comparing the predicted extra usage plus cancelled sessions for 2018 (calculated from 2017 data) to the real figures for 2018. The close fit between the lines suggests this model could be a good starting point for data prediction in CLWRota and Medirota.

Figure 10. Line graph for a department's predicted extra usage for 2018 compared to the actual usage of extras.
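
The sketch below illustrates one simple way a prediction of this kind could work: assume next year's leave follows last year's weekly pattern, subtract it from the sessions the department could cover at full strength, and compare the result with the templated demand. This is an illustrative assumption about the approach rather than the exact model presented at the event, and all the numbers are simulated.

    import numpy as np
    import pandas as pd

    # Simulated weekly figures for one department; every number is illustrative.
    rng = np.random.default_rng(3)
    weeks = pd.RangeIndex(52, name="week")

    templated_demand = pd.Series(120, index=weeks)                 # sessions required by the templates
    full_strength = pd.Series(130, index=weeks)                    # sessions staff could cover with no leave
    leave_last_year = pd.Series(rng.poisson(18, 52), index=weeks)  # sessions missed to leave last year

    # Assume next year's leave follows last year's weekly pattern.
    predicted_capacity = full_strength - leave_last_year

    # Weeks where predicted capacity falls below templated demand are where
    # extras (or cancellations) would be expected, as in Figures 9 and 10.
    predicted_extras_needed = (templated_demand - predicted_capacity).clip(lower=0)

    print("weeks predicted below demand:", int((predicted_capacity < templated_demand).sum()))
    print(predicted_extras_needed[predicted_extras_needed > 0].head())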

Benchmarks

The benchmarking data for this season are below.

Questions

If you have any questions about the above ideas, or would like to know more about how to get reports from your system, please contact the Rotamap support team at support@rotamap.net or on 020 7631 1555.

Boxplots for each benchmarking metric, broken down by department.