NAPLAN Analysis: Teacher Module

PURPOSE

This module is designed for classroom teachers to undertake an analysis of NAPLAN data. It is intended to help teachers:

  • understand the school’s results

  • confidently analyse data on the cohort, students and skills development

  • develop strategies to integrate this information with classroom assessment and curriculum delivery

  • identify issues for further exploration, and

  • aid in the development of the school’s Literacy and Numeracy Plans.

CONTEXT

ANALYSING THE DATA

For classroom teachers, the focus is on the students’ results, identification of issues from the data, and strategies to analyse and use the information. This involves three types of analyses: point-in-time analysis (to identify concerns for this testing period), trends over time (to identify longer term issues), and value added growth of cohorts and individual students (from one testing period to the next).

For classroom teachers, analysis of the NAPLAN results should be undertaken in groups based on Stage, Band of Development or Faculty. It is here that valuable discussion emerges based on the students, curriculum and pedagogy of the classroom.

Little value is gained by one or two people doing the analysis, then presenting this to the staff.

In the following analysis:

  • the text in RED indicates the questions to ask of the data
  • the text in BLUE is the actual worksheet to record your answers

SMART Application

Landing Screen

When you open the application, the landing screen shows the main features of the SMART2 application. The key elements are:

1. Scope of Analysis (Region, School or Student) and available Reports

2. Test, year, cohort group and test Domain

3. Subgroups (by gender, school/system-created groups, etc.)

4. Data analysis tools and displays

5. Summary of the key features from the NAPLAN data

6. Key data in relation to the National Minimum Standards (NMS):
   ‘% below NMS’ is the bottom band
   ‘% at or below NMS’ is the bottom two bands
   ‘% at proficiency’ is the top two bands

7. Groups of schools (at least 5 schools containing a minimum of 30 students) and students within schools can be created for closer comparison

8. eLearning Modules: self-paced tutorials for navigating SMART2 and analysing the data.
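The NMS groupings in point 6 above amount to simple sums over the band distribution. A minimal Python sketch, using invented band percentages (lowest displayed band first), shows how the three indicators relate:

```python
# Hypothetical data: percentage of students in each of the six displayed
# NAPLAN bands for one cohort, ordered from lowest band to highest.
bands = [4.0, 11.0, 22.0, 30.0, 21.0, 12.0]

pct_below_nms = bands[0]              # '% below NMS': the bottom band only
pct_at_or_below_nms = sum(bands[:2])  # '% at or below NMS': bottom two bands
pct_at_proficiency = sum(bands[-2:])  # '% at proficiency': top two bands

print(pct_below_nms, pct_at_or_below_nms, pct_at_proficiency)
```

This makes clear that ‘% at or below NMS’ always includes ‘% below NMS’, so the two figures should never be read as separate groups of students.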

At the School level, the analysis can be undertaken for the cohort, test items and skills, and for individual students. Different ‘school groups’ of students can be formed for closer identification and analysis (e.g. students who have participated in a Reading program, etc).

Many of the analyses in the application are also located in the ‘Reports’ tab and can be downloaded as PDF files.

COHORT ANALYSIS

Creating Student Groups

The ability to create groups of students at the school is a powerful way to analyse NAPLAN data in a targeted way. This can be used to analyse results in a variety of ways, e.g. class by class, for those students who have participated in a reading program, or to exclude students from the analysis who have left the school since the tests were done.

1. Select ‘Manage Groups: Students’ from the menu. The following screen opens and displays the data to be entered.

2. Click the ‘Create New Group’ button.

3. Enter the ‘Group Name’ (e.g. Year 2/3 Reading Program).

4. Select students to be members of the group by ticking the appropriate box beside each student’s name.

5. Click the ‘Add’ button to add the selected students.

6. Save the group and close the screen.

Growth in student scores for Reading, Writing and Numeracy can be viewed by comparing test scores this year with those of the same students two years ago (e.g. Year 5 students this year compared with their Year 3 results).
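The growth calculation described above is a matched comparison: each student’s current scaled score minus that same student’s score from two years earlier. A short Python sketch with invented scores and hypothetical student IDs:

```python
# Hypothetical data: scaled Reading scores for the same students,
# matched on student ID across two testing periods.
year3_scores = {"S01": 412, "S02": 386, "S03": 455}  # two years ago
year5_scores = {"S01": 498, "S02": 421, "S03": 470}  # this year

# Growth is computed only for students present in both tests.
growth = {sid: year5_scores[sid] - year3_scores[sid]
          for sid in year5_scores if sid in year3_scores}
print(growth)
```

Students who sat only one of the two tests drop out of the matched comparison, which is why growth figures can cover fewer students than the cohort totals.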

For further information, click on the eLearning button and select the ‘Student Growth’ module. Here, the progress of each student in the cohort is displayed as an arrow.

For each Year Range, Domain and Cohort, indicate if there are differences between:

  • Top end / bottom end for the Cohort
  • Top end / bottom end for boys and girls

STUDENT GROWTH – Distribution between Subgroups

                       2015 - 2017                2015 - 2017
                       Cohort / Boys / Girls      Cohort / Boys / Girls
Reading    Y3 → Y5
           Y5 → Y7
           Y7 → Y9
Writing    Y3 → Y5
           Y5 → Y7
           Y7 → Y9
Numeracy   Y3 → Y5
           Y5 → Y7
           Y7 → Y9

Identifying Students

Each arrow represents the growth of a student since the last test (two years ago). For those arrows showing either large growth (orange) or low/negative growth (blue), identify the students and answer the following questions:

Poor Growth / Excellent Growth

  1. Which students have had either Poor or Excellent Growth?
  2. From these, identify those students whose growth is a surprise.
  3. Indicate some possible explanations for these results.

ITEM ANALYSIS

Item analysis is undertaken to identify particular skill areas in which the school is doing well or poorly.

This gives an indication of the differences between the school’s results and those for the state/territory on each item and literacy/numeracy skill.

Each column can be sorted. Filter values can also be added to show only the data of interest, e.g. entering <-10 shows all items where the school’s % correct is more than 10 percentage points below the state % correct.
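The <-10 filter described above can be expressed as a simple comparison on the difference column. A Python sketch with made-up item percentages (item labels are hypothetical):

```python
# Hypothetical data: % correct for the school and the state on each item.
items = [
    {"item": "Q5",  "school": 62.0, "state": 75.5},
    {"item": "Q12", "school": 81.0, "state": 79.0},
    {"item": "Q21", "school": 44.0, "state": 58.0},
]

# Keep items where the school is more than 10 percentage points
# below the state, mirroring the SMART filter value <-10.
flagged = [i["item"] for i in items if (i["school"] - i["state"]) < -10]
print(flagged)
```

In this invented example only Q5 and Q21 would remain on screen after filtering.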

To show the greatest difference between the school and state/territory:

Click on the ‘Difference’ column to show the negative values at the top. This shows where the school’s results are below those of the state/territory.

  • Note the skills where the students have performed poorly.
To identify results by skill area:
  • Either sort by ‘Description’ or enter a filter value (e.g. “3D”)

This report shows the items and skill areas where the school’s results are 10% above or below the state/territory results. This is the first place to identify those skills or groups of skills where students have performed well or poorly.

By looking at the ‘Item Analysis’ on the screen and the Report ‘Analysis by Question Options’, in which skill types or curriculum outcomes has the school under-performed by 10% or more compared with the state/territory?

UNDER-PERFORMANCE IN SKILL AREAS – 10% or more

Year 3 / Year 5 / Year 7 / Year 9
Reading
Writing
Spelling
Grammar & Punctuation
Numeracy

To analyse these questions in more detail, select one and double-click. The question’s statistics will appear.

  1. From the multiple-choice information, see which incorrect options were selected by the students. Are there option/s that stand out?
  2. Now select ‘Scan’. The actual test item will appear. Are there plausible reasons why students chose the incorrect option/s?
  3. Re-order the ‘Response’ Column to find students who selected the incorrect options. Are there any surprises here?

1. Select the item for analysis.

2. The ‘Question Details’ section reveals the proportion of students answering each option, with the difference between the school and state/territory.

3. When hovering the mouse over the lightning icon, the ‘Distractor Analysis’ indicates possible reasons for the wrong answer.

4. Students’ answers are indicated with the option chosen. These columns can be sorted (click on the column title).

5. To go to all answers from a particular student, double-click on their name. The ‘Student Report’ for that student can be obtained from here.

6. The actual test item can also be accessed from here.

  • In the following table, note the skill areas covered by the incorrect option/s. These may indicate areas of student confusion arising from the school’s teaching program, scope and sequence, etc.

SKILL AREAS FOR INCORRECT OPTIONS

Year 3 / Year 5 / Year 7 / Year 9
Reading
Writing
Spelling
Grammar & Punctuation
Numeracy

Another way to indicate poor performance on individual items is by referring to the ‘Student Response Analysis’ Report.

Numbers in the columns indicate the WRONG options selected for each item. For example, in Item 21, Option 3 was the most common incorrect answer selected by students.
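Finding the most common incorrect option for an item, as in the ‘Student Response Analysis’ report, is a simple frequency count. A Python sketch with invented responses (the correct option and answer list are assumptions for illustration):

```python
from collections import Counter

# Hypothetical data: the option each student chose for one item,
# where option 2 is assumed to be the correct answer.
correct_option = 2
responses = [2, 3, 3, 1, 3, 2, 4, 3, 2, 3]

# Count only the incorrect responses and find the most common one.
wrong = [r for r in responses if r != correct_option]
most_common_wrong, count = Counter(wrong).most_common(1)[0]
print(most_common_wrong, count)
```

Here option 3 would stand out as the dominant distractor, which is the signal the report uses to flag an item for closer inspection.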

TEACHING STRATEGIES

This is one of the most powerful modules in the SMART application. It provides specific strategies for integrating the NAPLAN skills into teaching and learning across the curriculum in the classroom. It is directly mapped to the Australian Curriculum and National Statements of Learning.

In the screenshot above, the Teaching Strategies can be accessed directly (see the red oval). When selected, specific strategies for each literacy/numeracy skill area and Stage can be chosen. These are formatted for A4 printing for use directly in the classroom.

Hint: Print the Teaching Strategies and make them centrally available for staff to use in the classroom as part of normal curriculum delivery.

The Teaching Strategies can also be accessed directly from the internet without going through the SMART Data application.

STUDENT ANALYSIS

For classroom teachers, student analysis can best be undertaken by creating ‘Student Groups’ for further analysis – see the ‘Creating Student Groups’ section (above). This is where more detailed analysis of NAPLAN results centred on the student can occur as a basis for the diagnosis of student learning.

This analysis is important for individual students and student groups created in the application. Gender differences can also be ascertained for further analysis.

Look at the results for students at or below the NMS or particular groups (including gender).

  • What do the results show about students?
  • Are there any surprises with student placement in the bands (top and bottom)?
  • What learning support is indicated for individual students from these results?

PUTTING THE INFORMATION TOGETHER

The analysis of NAPLAN results for the Classroom Teacher raises the following questions for further investigation:

Value Added Student Growth

  1. Is there any discernible pattern in the distribution of student growth between 2013-2015 and 2014-2016 in relation to two sets of subgroups?
     • Top and Bottom students
     • Boys and Girls
  2. Across the testing Domains, which students are indicated as needing support or intervention?

Item Analysis

  1. In the testing Domains and/or year cohorts, is there a pattern showing deficiency in skill achievement?

Synthesis

  1. What implications can be drawn about the distribution of student growth in the school?
  2. What response is indicated by the pattern of skill deficiency for the whole school/Stage/Band/Faculty?
  3. How can SMART’s Teaching Strategies be employed in the classroom to support curriculum delivery and assessment?
  4. What would be an effective way to respond to these results across the whole school or within a Stage, Band or Faculty?

Linking Data with Pedagogy (SMART eLearning Module)

This model is a feature of the SMART application as a strategy for linking the data from NAPLAN testing with whole-school planning and classroom-based pedagogy, assessment and reporting. It is suggested as a structure for, and approach to, leading learning throughout the school based on information from NAPLAN testing. Each element of the model is explained in the eLearning Module of the SMART application.

Using Data to Inform Learning

Information on student achievement helps to inform decisions about learning and teaching. Used wisely, data is diagnostic; it suggests questions to be asked and informs decisions that ultimately affect student learning and teacher pedagogy. Data forms the basis for information about assessment for learning, assessment of learning and assessment for teaching.

The following principles should be applied to use data wisely for the cohort, the class and the student.


  1. Engage with the data

If learning is everyone’s business, so is data analysis. Staff engagement with data on student achievement is crucial for developing a whole-school understanding of, and response to, student performance. To promote this engagement with the data and with the process of analysis, it is essential to involve a critical mass of teachers in analysing data on student achievement, and then share this with all teachers. The conversations are as important as the analysis!

  2. Ask questions of the data

Questions are starting points for future action; judgements are stopping points. Use the data to ask ‘How are we going?’ ‘How do we know?’ ‘What are we going to do about it?’ The third question forms the plan for action: if we collect and analyse data on achievement, we are obliged to use this information to improve student outcomes. Any judgements should be as a result of questions arising from the data analysis.

  3. Investigate the important issues

Any analysis of data on student achievement should be a planned process, not the result of ‘busy work’. Be strategic in what to analyse and how to analyse it.

With NAPLAN the key areas are:

  • Status: achievement at a point in time (means, percentage in bands, item analysis).
  • Gain: the difference for the student between this test and the previous score on the same skill. This measures value added of matched students, and can raise questions about classroom pedagogy. Also, determine if student gain is different for different subgroups (top/bottom; boys/girls, etc).
  • Improvement: how the school is going on any indicator compared with previous years. How does this year’s Year 5 compare with last year’s Year 5 (different cohort)? Is the pattern of results similar? If so, it could indicate issues with sequencing of learning or pedagogy across particular skill areas.
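The distinction above between gain (matched students across two tests) and improvement (this year’s cohort versus last year’s different cohort) can be made concrete with a short Python sketch; all numbers are invented:

```python
# Hypothetical data: cohort mean scaled scores.
this_year_y5_mean = 494.0       # this year's Year 5 cohort
same_students_y3_mean = 421.0   # the same students, tested in Year 3
last_year_y5_mean = 487.5       # last year's Year 5 (a different cohort)

# Gain: value added for matched students between two testing periods.
gain = this_year_y5_mean - same_students_y3_mean

# Improvement: same indicator, same year level, compared across cohorts.
improvement = this_year_y5_mean - last_year_y5_mean

print(gain, improvement)
```

Keeping the two calculations separate matters: gain speaks to the progress of known students, while improvement speaks to how the school’s program performs for successive cohorts.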

With semester reporting the key issues involve analysing any marked differences in grade distributions between:

  • Semester 1 and Semester 2
  • different Learning Areas within the same cohort
  • different cohorts in the same Learning Area
  • Maths and Science compared with Religious Education, English and HASS

  4. Drill down

Start by looking at an overview of the cohort. Then look at the class, then the student. How are the student results distributed? What are the patterns? What are the questions arising from the analysis? Is student performance consistent across multiple data sources?

  5. Identify students

Put a face on the data. Having a clear idea of individual students and their results is the aim. Are there patterns across different students or student groups that produce good questions for analysis? Are there any surprises? How do these results compare with other sources of student achievement data for the same students?

  6. Use questions AND evidence to inform decisions

This is the point of the exercise. Analysing and reporting student achievement is meaningless unless there is a diagnostic purpose for doing so. This involves asking ‘What are our students’ learning needs?’ and, therefore, ‘What are our learning needs?’. Until we get relevant answers to these questions, we do not have a way forward. When we do, we frame a course of action that plans for improvement in student learning, by improving our learning.


NAPLAN Analysis: Teacher Module, September 2012