Employees' guide to the evaluation process for posts evaluated using the NJC scheme
Introduction
NYCC has used the NJC (National Joint Council) scheme for the evaluation of posts up to and including SO2, in accordance with the local agreement with Unison. Posts above SO2 have been evaluated using the Hay methodology under license from the HayGroup. A pay modelling process took place when the initial evaluations were completed at the end of 2006, to link the results of the evaluations under both systems of job evaluation to a common pay structure based on existing spinal column points.
The NJC job evaluation scheme was developed at national level to give local authorities a tested, bias-free method of evaluating posts, and to ensure that a freely available scheme existed to enable equal pay/single status issues to be addressed.
Training for NYCC evaluators was carried out during 2005 by trainers from the Yorkshire & Humberside Employers' Organisation. This was reinforced by further evaluation and moderation training provided by a consultancy engaged by the consortium of West Yorkshire and North Yorkshire local authorities to begin the work of using the NJC scheme across these authorities. NYCC and each of the other authorities then continued with their own work and timescales on the introduction of the NJC scheme. Ongoing training continues in-house for maintenance purposes.
Process
The NJC scheme examines the content of jobs across thirteen factors. For each factor, a set of factor level descriptions enables the evaluation panel to judge the level/depth of use of that factor within each post, as described in the job record document (JRD) or job description.
Evaluation is by a trained panel. Panels comprise one HR representative, a member of the job evaluation (JE) team and a Unison representative. Unison representatives are either members of the project team or drawn from trained volunteers across all directorates, who have made a valuable contribution to ensuring that the work was completed on time.
From July 2005 to the end of 2006, many employees were asked to complete a job record document, which was then evaluated by a trained panel. Directorates were asked for a list of benchmark posts and, for those posts with large numbers of employees, a selection of named postholders from across the county, to ensure that comprehensive information on those posts was obtained through completion of the full JRD. It was intended that all other posts (the non-benchmark posts) would also be evaluated, and for those posts a shorter, short-form JRD was used. Through directorates, employee newsletters were distributed to inform employees about the process and to publish the list of benchmark posts within each directorate, so as to ensure that all who wished to be involved could do so. This included the facility for other staff in benchmark posts to complete the short-form JRD for evaluation if they wished, for example if a postholder felt that their own post differed in content from the benchmark post into which they had been grouped.
Panels are issued with the JRDs and job descriptions in advance of the meeting. At the end of a day's evaluations the panel reviews its work to 'sore thumb' the results, ensuring that they look logical against benchmark results and the structure of results for other posts.
The results of evaluation panels' work are quality assured within the JE team. Results are then grouped together for vertical moderation (a review of the scores within an organization structure) by a different panel of two Unison representatives and two trained senior managers. Periodically, horizontal moderation panels meet to review like-type posts across the organization to check consistency of results. These moderation panels may query the results of the original evaluation and request review and re-moderation before scores are confirmed with directorate senior management and HR.
Use of the NJC job evaluation scheme in NYCC
Employees whose posts have been evaluated using the NJC scheme will have received an evaluation descriptor which summarizes the thirteen factors of the scheme and the factor level definitions which apply to the evaluation score for their own post.
The attached summary provides further detail about the factor levels which have been applied to NYCC posts up to and including SO2. Although the scheme can be used for all levels of job, within NYCC it has been used only up to SO2, and therefore some levels available in the scheme are not used; the summary details all levels which have been used.
The NJC evaluation process measures jobs by assessing their content against the descriptions for each of the thirteen factors, with the panel agreeing the most relevant factor level description for the post under each factor. The level scores are then recorded and reviewed by a moderation panel. After moderation, the scores are converted into points from the NJC chart, and it is these points which determine the resulting payscales.