18th November 2014
Dear colleagues,
We are very committed to a fair evaluation of Frontline and would not have submitted a proposal to DfE on any other basis. It is to ensure fairness that we have been transparent in publishing the protocol and have proposed a Delphi process to agree the assessment tool for practice quality. The project advisory group discussed the joint APSW-JUC SWEC statement at its meeting on 4th November and in the light of that discussion, we make the following response.
The scope of the evaluation
A detailed comparison of all the various routes to qualification, including in-depth research on service user experience, would indeed be very helpful, but it would also require a very large budget. The evaluation commissioned is a pragmatic design within the budget available and is framed around the effectiveness of Frontline as a training route. Service user views will be sought as part of the Frontline unit case studies, but a comparison of service user experience of students on mainstream programmes would not be feasible within the budget available and would be best left to a longitudinal study. The assessment of practice quality will, however, include a focus on an empathetic and collaborative approach to service users.
The evaluation hypothesis
The null hypothesis for the simulated practice assessment is that there will be no difference between the three groups. The alternative hypothesis is that there will be a difference. The hypothesis is two-tailed, so it does not specify the expected direction of the difference. While Frontline might aspire to produce practitioners of higher quality than conventional programmes, it is equally possible that more well-established programmes will produce higher-quality practitioners than Frontline, which is in its first year of operation.
Practice assessment tool
The practice assessment tool will be developed from Marian Bogo’s OSCE measures. It makes sense to start with these rating scales because they have already been through a process of validation and they are generic social work practice assessment tools which map on to elements of the Professional Capabilities Framework and the Knowledge and Skills Statement. The final tools to be used in the evaluation will be developed through a Delphi process in order to build consensus. This will involve equally weighted groups of university social work academics, practice educators, practitioners and service users. We were surprised that the APSW/JUC SWEC statement did not mention this aspect of the protocol.
As well as two observers’ ratings of practice skills in a simulated interview, there will be a rating of a written task, which will ask questions about assessment and likely help. This will test social scientific understanding, although we are not asking for a full-blown academic essay, as this would place an undue burden on the participants.
The financial support for Frontline
The Department for Education has indicated to us that it is considering funding an additional economic element, as originally proposed by the research team, to compare the costs of Frontline with the costs of delivering mainstream postgraduate programmes and to aid the assessment of value for money. The analysis of the simulated practice assessment will also include separate analyses of covariance (ANCOVA) to control for the effect of important factors such as students’ caring responsibilities, paid work commitments and caseloads. This was always the intention but was not spelled out in the protocol.
Longitudinal element
The invitation to tender did not include a longitudinal element but it has always been the DfE’s intention to commission this in future, following up Frontline graduates a few years post-qualifying, as is currently happening for Step-Up. This will be subject to Ministerial approval and funding. DfE is currently unable to commit research funding that far ahead.
The advisory group
This includes Professors Imogen Taylor and John Carpenter, who, as well as having research expertise on social work education, have a long history of delivering mainstream social work education. Advisory group members associated with Frontline are primarily there to ensure the feasibility of data collection from Frontline units. This largely qualitative element is a core aspect of the evaluation.
The DfE is keen to ensure that the evaluation is both independent and methodologically robust and has offered to meet APSW and JUC SWEC representatives to discuss the points raised in the statement. Members of the research team will be available at the January meetings of JUC SWEC and APSW to answer further questions.
The protocol will be updated in line with the clarifications we have given above.
Jonathan Scourfield and the evaluation team, Cardiff University