DRAFT RESPONSE

Consultation: Outcome Based Success Measures – The Next Stage

INTRO

SUMMARY

Q1: Do you agree with the proposed new progression measure?

This seems, in principle, a reasonable measure to introduce.

It will count any learning at a higher level than that completed in the previous 12 months. There are, however, some potential anomalies that would need to be considered. For example, if a learner attained a Level 2 certificate and then went on to a Level 2 Apprenticeship for which that certificate was a partial requirement, this would not be counted as progression, whereas we feel that government policy in general would certainly view it that way. Similarly, achievement of a Functional Skill at Level 2 prior to entry to an Apprenticeship would not register as progression either.

There is a small danger of undesirable behaviours being encouraged, should providers move learners on to unsuitable higher-level courses merely to bolster their figures on this measure. We do acknowledge, however, that if this measure is not used as part of the Minimum Standards framework then this risk is mitigated.

In general, we believe that our responses to this document underline our long-standing view that proper consideration needs to be given to how work-based learning is measured as distinct from “mainstream” FE: a “one size fits all” approach will not necessarily suffice.

Q2: Do you agree with the principles and features underpinning the extended Minimum Standards framework?

In themselves, the principles outlined seem reasonable. However, we are unsure about a “sustained” learning measure and what it actually tells anyone, as it implies that the time a learner spends learning is at least as important as, if not more important than, what they are actually studying. We disagree with this view.

The consultation also does not acknowledge the difference in how outcome measures are used in the intervention policies applying to independent training providers (ITPs) and to Colleges. While it refers, for example, to the possibility of a referral to the FE Commissioner and to the use of the statistics by Ofsted, it does not acknowledge the default position that ITPs receiving a Grade 4 will have their contracts terminated. Our position is that this is inequitable, and that ITPs requiring intervention as a result of outcome measurements (or indeed any other criteria) should, as is the case with Colleges, be given a chance to improve before termination is invoked.

Q3: Do you agree with the proposals for how the new Minimum Standards framework would be used?

We have a concern that there could be inherent tensions between the drivers to maximise qualification-based outcome measures, and local outcome agreements that may not include them.

The consultation appears slightly to favour using destination measures from the previous summer as reference points rather than “the most recent” set of figures, which has some logic to it. However, there is a danger that this approach may not keep pace with mergers, demergers and other institutional or organisational changes, which may call into question the robustness of the figures as comparators and as a basis for intervention.

Overall, we believe that the risks connected with new measurement systems become greater once they start being used for minimum standards and performance management purposes. We would strongly suggest that a suitable piloting period is required to ensure that data collection and matching work as they should, and that the results are based on statistically robust samples.

We believe the current thresholds are reasonable and should be maintained, with a review on an annual basis.

Q4: Is the proposal for treating learning for the unemployed as a separate type of learning for the purpose of Minimum Standards a fair way of accounting for those learners?

Yes

Q5: What is your view on whether we need to make any special allowance for learners with difficulties and disabilities in the destination measures Minimum Standards framework?

AELP would concur with the general approach that LLDD learning destination measures need not be differentiated, because learning support and planning should have taken place to put the learner on an equal footing at the start of their programme.

However, we do not believe this can apply to employment destination measures, where wider societal, cultural and other attitudes are at play that it is not within the provider’s gift to substantially influence.

Q6: Do you agree that the outcome measures should form a core set of measures for local outcome agreements?

We refer to our earlier answer to Q3, regarding our concern that there could be inherent tensions between the drivers to maximise qualification-based outcome measures and local outcome agreements that may not include them.

Q7: In order to inform local outcome agreements, what other information is needed alongside the outcome measures data?

We do not feel this question can be properly answered without knowing what each local area wants, and who will be responsible for driving outcome plans.

Q8: Do you support the idea of a widget sitting on providers’ own websites with a consistent set and presentation of data?

Yes

Q9: Do you support the idea of an FE performance table focused on apprenticeships and higher levels of learning?

Such a table may have some merit under current arrangements for Apprenticeships, although care should be taken to differentiate between sectors, where very different influences affect completion rates: the more transient hospitality workforce, for example, may not compare directly with accounting. We have the same reservations about comparisons across age bands and levels, the ranges of which would need to be carefully determined to ensure that fair and reasonable comparisons are being made. Similarly, local labour market conditions need to be taken into account. At the very least, therefore, a good deal of contextual information would need to surround performance tables of this type to ensure they are meaningful.

However, given that any changes taking place now must incorporate the Apprenticeship reform agenda, we do not think comparisons can be made across Apprenticeships under the trailblazer arrangements, as there is no way of knowing exactly what part of an Apprenticeship is being offered to which employer by any specific provider. It will therefore never be clear whether like is being compared with like.

We would not have any significant concerns regarding performance tables for higher levels of learning outside of Apprenticeships.

Q10: Do you agree that individual scorecards will provide a useful tool for both providers and the key local stakeholders with whom they are working?

We look forward, as we are sure do other representative bodies in the sector, to being fully consulted and engaged on what such scorecards may comprise.