Using Student reports

The Student report allows you to analyse a student's performance on their test by examining their responses to each item. You can generate a Student report from anywhere that students appear within the ACER Data Explorer.

The following topics are covered in this article:

Understanding the Student report
What is an item?
Item difficulty
Student achievement
Identifying patterns to guide teaching
Exporting your data

Understanding the Student report

The Student report provides a detailed diagnostic picture of a single student's test. In addition to the student's scale score and achievement band, the report presents the items shown to the student during their test and allows you to analyse their responses.

Correct responses are represented by green icons and check marks, and incorrect responses by red icons and crosses. Items that the student saw but did not respond to, and items the student skipped altogether, are shown as grey icons with dashes and empty circles respectively. Although both types of missing response are scored as incorrect when estimating the student's overall scale score, it is important to distinguish between them when looking diagnostically at the student's results.

What is an item?

We use the term 'item' because it is more accurate than the colloquial 'question'. Test items are not always questions; they may instead be statements or visual prompts to respond to, or instructions to follow (for example, 'complete this sentence' or 'simplify the equation'). The type and format of each item is determined by the skill and knowledge being assessed.

[Screenshot: the Student report]

Items are displayed in the chart according to their difficulty on the achievement scale for the learning area: more difficult items appear higher in the chart, while easier items appear lower.

Item difficulty

Item difficulty is a measure of the extent of the skills and knowledge required to be successful on the item. This makes it possible to allocate each test item a score on the same scale used to measure student achievement. An item with a high scale score is more difficult for students to answer correctly than an item with a low scale score. A student can generally be expected to respond successfully to more of the items located below their scale score than to those located above it.

Item difficulties are estimated from the performance of individuals with a range of abilities who respond to the item, first at the item trial stage and later verified against live test results. The concept being assessed is one aspect of an item's difficulty, but other factors may combine to make an item more or less complex: the level of abstraction, the number of steps required, whether the item involves problem-solving or computation, the context in which it is set, the required precision of the response, the cognitive load, and so on. An item assessing a concept introduced earlier in the curriculum may still be quite complex; conversely, an item assessing a concept introduced later may be simpler.

By referencing the difficulty of an item, or a group of items, and the proportion of correct responses by a student or within a group, it may be possible to identify particular items, or types of items, that have challenged students.

Student achievement

The student's overall estimated achievement – their scale score – is represented in the chart as a dashed horizontal line and can be used as an important reference point when analysing item responses. A shaded area on either side of the scale score represents the confidence band, or margin of measurement error, around the student's estimated scale score. This band can be thought of as the student's zone of proximal development.

We would generally expect that if a student's estimated overall achievement – their scale score – is above the difficulty of a given item, the student is capable of responding to it correctly. Conversely, items whose difficulty is well above the student's scale score are likely to challenge the student.

Items located within the confidence band are likely to elicit a relatively even mix of correct and incorrect responses, as they sit at the current limit of the student's abilities in the learning area.

Identifying patterns to guide teaching

By default, the Student report displays the test items from left to right in the order they appeared in the student's test, but you can sort the report chart and table by clicking each column header to rearrange the items in more useful ways.

For example, sorting the report by Difficulty immediately makes it easier to see any unexpected responses – difficult items responded to correctly, or easier items with incorrect responses. These anomalies might be good candidates for further investigation. Your own professional judgement and knowledge of the student will be required to determine the significance or otherwise of these cases.

[Screenshot: the Student report sorted by Difficulty]

Grouping the student's responses by Result (correct vs incorrect) or by category (for example, Strand) can also help you find meaningful patterns that might suggest areas of need for the student.

Each test is intended to assess students' overall achievement across the learning area. Students will respond to items across a variety of categories (strands, processes, skills, etc.), and this information may support you in better understanding your students' strengths and areas of need.

However, within a single test there may be only a small number of items assessing a particular strand or skill, so this information cannot be taken as a definitive judgement of a student's abilities in these areas. Instead, it may guide further investigation, warrant closer assessment, or be used in conjunction with other sources of information to provide a more complete picture of each student's abilities.

[Screenshot: the Student report grouped by Result]

[Screenshot: the Student report grouped by Strand]

Exporting your data

Use the Export button at the top of the page to download the student's results data in Excel (.xlsx) format, or to download a PDF version of the report.
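
If you prefer to work with the exported data outside Excel, the following is a minimal sketch of how you might summarise it in Python with pandas. The file name and column names (Item, Strand, Difficulty, Result) are assumptions for illustration only, not the actual export format, so adjust them to match your downloaded file.

# A minimal sketch of analysing an exported Student report with pandas.
# The file name and column names below are assumptions; adjust to your export.
import pandas as pd

# Load the exported results (reading .xlsx files requires the openpyxl package).
df = pd.read_excel("StudentReport.xlsx")

# Flag correct responses, then summarise by strand: items attempted,
# proportion correct, and the average difficulty of those items.
df["Correct"] = df["Result"].str.lower().eq("correct")
summary = df.groupby("Strand").agg(
    items=("Item", "count"),
    proportion_correct=("Correct", "mean"),
    mean_difficulty=("Difficulty", "mean"),
)
print(summary.round(2))

# Sort by difficulty to help spot unexpected responses, such as difficult
# items answered correctly or easier items answered incorrectly.
print(df.sort_values("Difficulty", ascending=False)[["Item", "Difficulty", "Result"]])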
