Introduction
With Bud Benchmarking, you can gain a clear understanding of how your business is performing compared to other Bud clients. This data-driven approach allows you to make informed decisions on key areas to focus on, helping you drive growth and improve outcomes.
Our comprehensive reports provide a month-by-month analysis of your performance, giving you a clear picture of how you're trending over time. This allows you to easily track the impact of any changes you make and adjust your strategy accordingly.
Your data is completely confidential and obfuscated, meaning that no other provider will see it. You can feel secure in the knowledge that your business information is protected while still reaping the benefits of benchmarking.
Benchmarking is a chargeable bolt-on product; please speak to your customer success manager for more details.
What is Benchmarking?
Benchmarking is the process of comparing your Training Provider's performance metrics with those of other Bud users. The Benchmarking reports are presented at a very high level, meaning no personally identifiable data is present. If you would like to investigate any specific issues highlighted, the Operations Dashboard or Standard reports can support you further.
Why Benchmark?
Benchmarking allows you to:
- Gain Operational Insights: Review key measures to identify areas for improvement or maintenance.
- Enhance Decision Making: Make informed decisions based on data-driven insights.
- Set a Strategy for Improvement: Develop a clear plan to enhance your business performance.
- Conduct Competitive Analysis: Compare your performance with similar training providers to gain a competitive edge.
- Measure Progress and Performance Alignment: Track the progress of your strategic initiatives and ensure that your performance aligns with your established goals. By comparing your current performance against the benchmarks, you can evaluate whether your strategy is on track and whether adjustments are necessary to stay aligned with your objectives.
How to: Use Benchmarking Reports
Data is presented as an average per calendar month, and the trend arrows indicate improvement or decline compared with the previous month. Slicers are available to filter the report to certain Subject Sectors or Standards. Subject Sectors are defined by the ESFA and are based on the Standard you deliver.
Furthermore, you can choose to compare yourself with only those Training Providers of a similar size to yourselves. Training Provider sizes are based on the size of the book of learners on Bud. They are banded into the following:
- 21-499
- 500-999
- 1000+ learners
Please note: The cohort is based on the number of in-progress learners in the given month; as your book of learners changes, you may cross into different bands.
Where applicable, a green arrow indicates an improvement over the previous month, whereas a red arrow indicates a downward trend in performance.
Advisory Messages
You may see one or more of the following messages, depending on your selection from the slicers:
- Small Cohort: Your selection has reduced the sample size to 20 or fewer learners, so the results may not be statistically significant.
- Current Month: An advisory that you are comparing an incomplete month with a full month's data. Measures such as Login % or Engagement % will be particularly low early in the month but will tend to track upward as the month progresses.
- Insufficient Data: At least three Training Providers are required to benchmark against; the current selection doesn't meet this minimum.
- (Blank): You may occasionally see a measure represented by '(Blank)', which means there is no data available for this measure with the current selection.
If you receive any of these messages, try broadening your selection.
Detail of Calculations
Attributes are calculated and stored on a nightly basis, so we retain a view of the data as it was presented on that day. It is important to note that we don't rewrite historical activity, for example in the case of late notification of breaks, withdrawals or ALS provision.
Calculations are performed at a monthly aggregation. The denominator is the number of learners who have been In Progress at any point in the month concerned. In Benchmarking we use the broader definition of being in progress, which can include any of the following Learning Plan statuses:
- In Progress
- Break Approved
- Break Requested
- On Break
- Withdrawal Approved
- Withdrawal Requested
- In End Point Assessment
(The number of withdrawals, breaks, etc. is divided by the total number of in-progress learners in the month to yield a percentage.)
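To make that calculation pattern concrete, here is a minimal Python sketch using hypothetical learner records; the field names are illustrative only and not Bud's actual schema:

```python
# Hypothetical learner records; field names are illustrative only.
learners = [
    {"id": 1, "in_progress_months": {"2024-05"}, "withdrawal_processed_month": "2024-05"},
    {"id": 2, "in_progress_months": {"2024-05"}, "withdrawal_processed_month": None},
    {"id": 3, "in_progress_months": {"2024-05"}, "withdrawal_processed_month": None},
    {"id": 4, "in_progress_months": {"2024-04"}, "withdrawal_processed_month": None},
]

def monthly_percentage(records, month, event_happened):
    """Numerator: learners with the event in the month.
    Denominator: learners in progress at any point in that month."""
    in_progress = [r for r in records if month in r["in_progress_months"]]
    if not in_progress:
        return None  # surfaces as '(Blank)' when the selection has no data
    hits = sum(1 for r in in_progress if event_happened(r, month))
    return 100 * hits / len(in_progress)

# Example: Withdrawals % for May 2024 is 1 withdrawal out of 3 in-progress
# learners, i.e. roughly 33.3%.
print(monthly_percentage(learners, "2024-05",
                         lambda r, m: r["withdrawal_processed_month"] == m))
```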
In most cases, your measure is compared against the highest-performing Training Provider within your size band. Sometimes we show an average across all the Training Providers, for example the percentage of learners receiving ALS, or how far through their programme learners were at withdrawal.
Rankings
For certain measures, you are given an indicator of your rank among your peers. Dense ranking is utilised, so where two or more Training Providers are tied, they are assigned the same ranking. For example, Rank 2 / 9 means you are ranked second of nine rankings (not necessarily nine Training Providers).
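As a minimal illustration of dense ranking (the provider names and scores below are invented), tied providers share a rank and the denominator counts distinct rankings rather than providers:

```python
def dense_rank(values, higher_is_better=True):
    """Assign dense ranks: tied values share a rank and the next distinct
    value takes the next consecutive rank (1, 2, 2, 3 rather than 1, 2, 2, 4)."""
    ordered = sorted(set(values), reverse=higher_is_better)
    return {v: i + 1 for i, v in enumerate(ordered)}

# Five providers, two of them tied on 82: four distinct rankings in total.
scores = {"You": 82, "Provider A": 90, "Provider B": 82, "Provider C": 75, "Provider D": 60}
ranks = dense_rank(scores.values())
print({name: f"{ranks[score]} / {len(ranks)}" for name, score in scores.items()})
# {'You': '2 / 4', 'Provider A': '1 / 4', 'Provider B': '2 / 4', 'Provider C': '3 / 4', 'Provider D': '4 / 4'}
```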
Trend lines
Additionally, you can see the trends over the last 13 months, showing your measure, the highest-performing or average value among you and your peers, and an orange line indicating how you ranked over time. Note that a change in ranking over time doesn't necessarily mean you're doing better or worse than before; it could be that the number of Training Providers you're comparing against in that particular segmentation has changed over time.
Please note that Benchmarking data has been accumulated from 1st March 2023 onwards.
Report Definitions
Engagement is defined as a submission or message from the learner, a submission by the trainer, in-month Off the Job hours being approved, or a progress review being completed. Remember that a Progress Review must be signed by all participants to count as complete.
Logins are Learner logins only. It is the percentage of in-progress learners who have logged in to Bud in that month.
Activity Engagement is the percentage of learners who have engaged in an activity within their learning plan. Activity Engagement is defined as submission by the Learner or their trainer or an externally completed activity where the results are captured in Bud (e.g. a SCORM activity).
EPA Duration is the average total number of days spent in End Point Assessment. Days are accumulated for every period of EPA entered, and the measure includes learners currently in EPA.
Past Planned End Date is the percentage of learners in the month who have passed their Planned End of Practical Period and are still in progress.
Length Past End Date is as above, but the percentage of learners past their Planned End of Practical Period by more than 28 days.
Marking Time is the average time between a submission being made and being marked. This includes activities marked as partially or fully complete. Time in hours is counted Monday to Friday 9am to 5pm, excluding English bank holidays. It is the number of hour boundaries (o'clocks), between the Submission Date and Time and the Marking Date and Time. For example:
- Work submitted at 09:15 and marked at 09:45 = 0 hours
- Work submitted at 09:45 and marked at 10:15 = 1 hour
- Work submitted at 16:15 on Tuesday and marked at 09:00 on Wednesday = 1 hour
- Work submitted at 15:30 on the Friday before a bank holiday weekend and marked at 15:45 on Tuesday = 8 hours
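The exact counting convention isn't spelled out beyond the examples above, but the following Python sketch reproduces them under the assumption that an o'clock is counted when the working hour it closes falls on a working day; the bank holiday date is purely illustrative:

```python
from datetime import datetime, timedelta

# Illustrative bank-holiday date; the real measure uses the published English
# bank holidays (6 May 2024 was the Early May bank holiday).
BANK_HOLIDAYS = {datetime(2024, 5, 6).date()}

def marking_time_hours(submitted: datetime, marked: datetime) -> int:
    """Count the o'clock boundaries between submission and marking that fall
    within working hours (Monday-Friday, 9am-5pm, excluding bank holidays)."""
    hours = 0
    # Start at the first whole hour strictly after the submission time.
    boundary = submitted.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)
    while boundary <= marked:
        working_day = boundary.weekday() < 5 and boundary.date() not in BANK_HOLIDAYS
        # Count the boundary if the hour it closes (e.g. 16:00-17:00) lies
        # within the 9am-5pm working window.
        if working_day and 10 <= boundary.hour <= 17:
            hours += 1
        boundary += timedelta(hours=1)
    return hours

# The worked examples above (3 May 2024 is a Friday, 6 May a bank holiday):
assert marking_time_hours(datetime(2024, 5, 7, 9, 15), datetime(2024, 5, 7, 9, 45)) == 0
assert marking_time_hours(datetime(2024, 5, 7, 9, 45), datetime(2024, 5, 7, 10, 15)) == 1
assert marking_time_hours(datetime(2024, 5, 7, 16, 15), datetime(2024, 5, 8, 9, 0)) == 1
assert marking_time_hours(datetime(2024, 5, 3, 15, 30), datetime(2024, 5, 7, 15, 45)) == 8
```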
Learners On Break is the percentage of learners who have had a break in the month selected. This includes learners who were on a break for only part of the month, and who may have started or completed their break in that month.
Length of Breaks is the average length of breaks that were completed in the month concerned, irrespective of whether learners returned to learning or withdrew.
Break Return Rates show, of the learners who finished their break in the month concerned, the percentage who returned to learning as opposed to withdrawing.
Withdrawals are the percentage of learners who withdrew during the month selected. This excludes learners who withdraw in the qualifying period (zero-day leavers). The Withdrawal date used is the Withdrawal Processed date, as this is immutable in case of late notification of a withdrawal.
Time of Withdrawals is the average percentage of time through a programme when learners withdrew, for those who withdrew in the month concerned. Again, the withdrawal date used is the Withdrawal Processed date, as above.
Zero-Day Leavers is the percentage of leavers who withdrew within the qualifying period, in the month selected. The withdrawal date used is the Withdrawal Processed date as above, however, to determine whether they were a zero-day leaver we calculate the days between their Start Date and their Last Activity Date. If this is 42 days or less then they are classed as a zero-day leaver.
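As a minimal sketch of that rule (the dates below are invented for illustration):

```python
from datetime import date

QUALIFYING_PERIOD_DAYS = 42

def is_zero_day_leaver(start_date: date, last_activity_date: date) -> bool:
    """A withdrawn learner is a zero-day leaver if their last activity fell
    within 42 days of their start date."""
    return (last_activity_date - start_date).days <= QUALIFYING_PERIOD_DAYS

print(is_zero_day_leaver(date(2024, 3, 1), date(2024, 4, 10)))  # True  (40 days)
print(is_zero_day_leaver(date(2024, 3, 1), date(2024, 4, 15)))  # False (45 days)
```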
ALS Plan shows the percentage of learners who had an active Additional Learning Support Plan in place during that month.
ALS Provision is the percentage of learners who received ALS provision at some point during the month concerned. Late notification of provision is accommodated if logged before the end of the month in which the provision was made. Note: this measure will be available in a future release.
Funding Claimed is the percentage of learners for whom funding is being claimed, out of those who have an active ALS Plan in that month. In case of excess claims, this could be more than 100%.
Learner Growth Rate looks at the percentage change in Learner Numbers from the previous month, modelled over the last 13 months, and allows you to see how you compare with growth rates in the industry and in providers of a similar size to you.
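A minimal sketch of the month-on-month growth calculation, using invented learner counts rather than real Bud data:

```python
# Illustrative learner counts per month (not real Bud data).
learner_counts = {"2024-03": 480, "2024-04": 504, "2024-05": 529}

def growth_rate(counts, month, previous_month):
    """Percentage change in learner numbers from the previous month."""
    return 100 * (counts[month] - counts[previous_month]) / counts[previous_month]

print(round(growth_rate(learner_counts, "2024-04", "2024-03"), 1))  # 5.0
print(round(growth_rate(learner_counts, "2024-05", "2024-04"), 1))  # 5.0 (4.96 rounded)
```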
OTJH Quotient utilises our usual reporting structure for benchmarking and introduces a box plot diagram so you can see the range of averages across all training providers.