Using the Item Performance report

The Item Performance report allows you to analyse your students’ performance on each test item, or question. This diagnostic report is designed to allow you to identify common areas of need within a group of students.

The following topics are covered in this article:

  • What is an item?
  • Understanding the Item Performance report
  • Setting your report parameters
  • Reporting on all items seen by your students
  • Reporting on items by test (the traditional Group report)
  • Exporting your data

What is an item?

We use the term ‘item’ because it is more accurate than the colloquial ‘question’. Test items are not always questions; they may instead be statements or visual prompts to respond to, or instructions to follow (for example, ‘complete this sentence’ or ‘simplify the equation’). The type and format of each item is determined by the skill and knowledge being assessed.

Understanding the Item Performance report

Each icon in the Item Performance report chart represents a test item that was seen by at least one student in your selected cohort.

The chart displays items in one category at a time, for example, items in the Number and Algebra strand. You can view items in different categories by using the options in the panel at the left of the page.

[Image: ItemPerformanceReport.png]

Setting your report parameters

Cohort selection

By default, students in all year levels and tagged groups that you have permission to view will be included in the report. You can focus on the results of a smaller group of students by adding year level and tag filters.

These filter options are based on students' current details, while the details displayed in the report chart and table reflect students' details at the time they completed their tests.

When you first load the report, only the most recent result for each student within the reporting window will be included, which means some results will not be visible.

Reporting window

Change the reporting window to find the results you need. Quick selections allow you to focus on one school year or semester at a time, or you can select a custom date range.

Student achievement band

By filtering the item performance data by students' overall achievement band, you can directly compare what you know about those students’ abilities to the difficulties of the items they saw, which will make any meaningful patterns or anomalies more apparent.

Tests

In some cases, it's useful to filter results by the test your students completed, especially when using non-adaptive, linear tests. Selecting a single test also allows you to generate the Test items chart, which is equivalent to the traditional Group report.

Item category (strand, proficiency, type, etc.)

Examine your students' responses to different types of items. Depending on the learning area, items are categorised in different ways to reflect the content, skills or processes that they assess.

You may find different patterns in your students' responses to items in different categories that indicate areas of strength or targets for further development.

Chart type

The All items chart presents every test item seen by your students, regardless of test. This allows you to analyse results across multiple tests, including adaptive tests.

The Test items chart is equivalent to the traditional Group report, which displays students' responses to items in a single linear test. You must select a test from the Tests drop-down menu before you can generate this chart.

Reporting on all items seen by your students

One of the primary aims when analysing students' assessment results is to understand where and how to target teaching most effectively. This often requires identifying those areas where students have performed unexpectedly, for example, a particular type of item with a low correct response rate.

Use the Percentage correct check box and slider to highlight items according to the proportion of correct responses among those students who saw them in their tests.

For example, if the slider value is set to 50%, items that have a correct response rate below 50% will be represented by a red icon with a downward arrow and items with a correct response rate above 50% will be represented by a green icon with an upward arrow.

[Image: ItemPerformanceReport_PercentageCorrect.png]

Remember that each item may have been seen by different numbers and groups of students, depending on which test they completed.

The significance of a 25% correct response rate, for example, may therefore change depending on whether it represents 1 correct response from 4 students or 7 from 28. You can exclude items seen by only a handful of students by adjusting the Seen by count, which may help you to focus on the most meaningful data for your student group as a whole.
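
As a rough illustration of how these two controls interact, here is a minimal Python sketch; the item data, field names and default values are invented for the example and do not come from the report itself.

    from dataclasses import dataclass

    @dataclass
    class Item:
        name: str
        correct: int   # number of correct responses to the item
        seen_by: int   # number of students who saw the item

    def classify(items, threshold=0.50, min_seen_by=5):
        # Mirrors the Percentage correct slider and the Seen by count:
        # items seen by too few students are excluded, and the rest are
        # flagged as above (green, up arrow) or below (red, down arrow)
        # the chosen correct-response-rate threshold.
        for item in items:
            if item.seen_by < min_seen_by:
                continue  # excluded by the Seen by count
            rate = item.correct / item.seen_by
            marker = "green, up arrow" if rate >= threshold else "red, down arrow"
            print(f"{item.name}: {rate:.0%} correct from {item.seen_by} students ({marker})")

    classify([Item("Item A", 1, 4), Item("Item B", 7, 28), Item("Item C", 21, 28)])

With these example values, the 25% rate from 28 students is kept and flagged, while the same rate from only 4 students is excluded by the Seen by count.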

Item difficulty

Item difficulty is a measure of the skills and knowledge required to respond to an item successfully. This makes it possible to allocate each test item a score on the same scale used to measure student achievement. An item with a high scale score is more difficult for students to answer correctly than an item with a low scale score. In general, a student can be expected to respond successfully to more of the items located below their scale score than above it.
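
To make this relationship concrete, here is a minimal sketch assuming a Rasch-style (logistic) model, one common way of placing students and items on a shared scale; the formula, scale constant and example scores are illustrative assumptions, not the published PAT scaling model.

    import math

    def p_correct(ability, difficulty, scale=5.0):
        # Illustrative logistic (Rasch-style) curve: the probability of a
        # correct response rises as the student's scale score moves above
        # the item's difficulty. The functional form and scale constant
        # are assumptions for this sketch only.
        return 1 / (1 + math.exp(-(ability - difficulty) / scale))

    student_score = 130.0  # a student's scale score (illustrative units)
    for item_difficulty in (120.0, 130.0, 140.0):
        print(f"Item difficulty {item_difficulty:.0f}: "
              f"P(correct) = {p_correct(student_score, item_difficulty):.2f}")

The shape of the curve captures the expectation above: the further an item sits below the student's scale score, the more likely a correct response becomes.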

Item difficulties are estimated from the performance of individuals with a range of abilities who respond to the item, first at the item trial stage and later verified against real test results. The concept being assessed is only one aspect of an item's difficulty; other factors can combine to make an item more or less complex, such as the level of abstraction, the number of steps required, whether the item involves problem-solving or computation, the context, the required precision of the response, and the cognitive load. An item assessing a concept introduced early in the curriculum may still be quite complex and, conversely, an item assessing a concept introduced later may be simpler.

By comparing the difficulty of an item, or a group of items, with the proportion of correct responses from a student or group, you may be able to identify particular items, or types of items, that have challenged students.

For example, if the rate of correct responses to Probability items is relatively low, even for less difficult items, it may suggest students require some assistance in that area.

Student achievement

One of the most powerful ways to use the Item Performance report is to focus on items seen by students in the same achievement band.

If your report includes students of a wide range of abilities, it may not be surprising that some higher achieving students responded correctly to a particular item, while lower achieving students responded incorrectly to the same item.

But if you narrow your results by student achievement band, you reduce the 'noise' that large variations in student ability may be contributing to your data. When you know that, overall, the students in the report are achieving at a similar level – and use the achievement band descriptions to understand the skills and knowledge they can typically demonstrate – any anomalies or outliers become more significant.

Examining items

View more details about an item and the students who saw it by clicking its icon within the chart. Here, you will be able to directly contrast students’ achievement and the item’s difficulty, view the item itself, and open individual student reports for even deeper analysis.

We would generally expect that if a student's estimated overall achievement – their scale score – is above the difficulty of a given item, then the student is capable of responding correctly. Conversely, items whose difficulty is well above the student's own scale score are likely to challenge the student.

[Image: ItemReport.png]

Reporting on items by test (the traditional Group report)

While the nature of computer adaptive testing means that it is not possible to represent students' item performance in a simple table, you can generate an equivalent to the Group report within the Data Explorer for all non-adaptive, linear tests:

  • PAT Maths 4th Edition
  • PAT Reading 5th Edition
  • PAT Science 2nd Edition
  • PAT Spelling Skills

Within the Item Performance report:

  1. Select a single test from the Tests drop-down menu in the panel at the left of the page
  2. Select the Test items chart type

Limiting the report to students who saw a common set of items (in other words, who completed the same test) allows you to directly examine and compare students' responses to the same items.

[Image: ItemPerformanceReport_TestItems.png]

Sort and filter the table using the columns at the left or the rows at the top to identify patterns and anomalies that are worthy of further investigation. For example, by comparing the difficulty of each item with the percentage of correct responses amongst your students, you may be able to find unexpected strengths (difficult items with many correct responses) or areas in need of further support (less difficult items with few correct responses).
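
As a sketch of that comparison, the following Python snippet flags both kinds of anomaly; the data and cut-offs are invented, since what counts as 'difficult' or 'unexpected' depends on your scale and your cohort.

    # Invented data and arbitrary cut-offs, for illustration only.
    items = [
        {"item": "Item 4",  "difficulty": 142.0, "pct_correct": 0.81},
        {"item": "Item 9",  "difficulty": 118.0, "pct_correct": 0.34},
        {"item": "Item 15", "difficulty": 130.0, "pct_correct": 0.55},
    ]

    HARD = 138.0  # assumed scale score above which an item is 'difficult'
    EASY = 122.0  # assumed scale score below which an item is 'less difficult'

    for row in items:
        if row["difficulty"] >= HARD and row["pct_correct"] >= 0.70:
            note = "unexpected strength (difficult item, many correct responses)"
        elif row["difficulty"] <= EASY and row["pct_correct"] <= 0.40:
            note = "possible target for support (less difficult item, few correct responses)"
        else:
            note = "roughly as expected"
        print(f"{row['item']}: {note}")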

Exporting your data

Use the Export button at the top of the page to download your raw data in Excel (.xlsx) format.
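
If you want to continue your analysis outside the Data Explorer, the exported workbook can be read with standard tools. Below is a minimal sketch using pandas; the file name is a placeholder and the column names in the final example are guesses, so check them against your own export first.

    import pandas as pd

    # Read the exported workbook (reading .xlsx requires the openpyxl package).
    # The file name is a placeholder for your own download.
    df = pd.read_excel("item_performance_export.xlsx")

    # Check the actual column names before going further; the export's
    # layout is not documented here, so the names below are guesses.
    print(df.columns.tolist())

    # For example, if the export includes 'Strand' and 'Correct' columns:
    # print(df.groupby("Strand")["Correct"].mean())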
