
Selecting Interim Assessments


Beginning in late 2022, ISTE’s Benchmark project set out to explore how edtech buyers choose the “right” digital assessment products. In addition to understanding more about assessment selection and how to identify well-built products, Benchmark also aimed to develop a set of resources to help edtech buyers select digital assessments, and to explore what “assessment culture” is, how to define it, and how it can impact teaching and learning.

 

To improve product discovery and evaluation, edtech buyers need to know what to look for in products – in other words, the quality indicators that make some edtech products better than others. Recent resources from ISTE that support edtech buyers in purchasing decisions include:

 

     ● Advancing Edtech Evaluation and Selection page – One of ISTE’s key initiatives, this webpage includes information and resource links related to edtech product evaluation and selection.

     ● EdTech Index – The Index (formerly the EdSurge Product Index) contains over 1,500 edtech product profiles in a searchable database, alongside resources for edtech buyers to find, compare, decide, and validate edtech products.

     ● Teacher Ready Evaluation Tool – The Teacher Ready Evaluation Tool provides edtech buyers with a standardized way to evaluate an edtech product’s classroom usability. Grounded in research on the learning sciences and user experience design, the tool’s criteria guide users in completing valid, reliable evaluations of edtech products.

     ● ISTE Seal – A product certification that showcases products aligned with the ISTE Standards and research-based product usability indicators (including user interface, learning design, digital pedagogy, and inclusivity).

 

To help edtech buyers understand the digital assessment landscape, one goal of Benchmark was to expand coverage of formative assessment products with the ISTE Seal. Additionally, ISTE explored the prospect of extending the Seal to interim assessment products by working with experts in both product development and psychometrics to pressure test the applicability of the Seal framework to aimswebPlus, Pearson’s flagship interim assessment platform. As described by Pearson:

 

          aimswebPlus is a tool for teachers and educational teams in Multi-Tiered System of Supports (MTSS), Response to Intervention (RTI), and special education contexts. aimswebPlus offers nationally-normed, skills-based benchmark assessments and progress monitoring tools integrated into one application across reading and math domains, with additional add-on measures across dyslexia and behavior/social-emotional skills. aimswebPlus informs daily instruction and provides growth results to caregivers and district/state audiences in reading and math achievement using curriculum-based assessment and standards-aligned content for students in PreK through Grade 12.

 

Can the Seal apply to interim products? After several months of pressure testing, the ISTE and Pearson teams identified several factors unique to interim assessments that make formal product evaluation challenging under the current Seal framework:

 

Interim assessment products are typically adaptive

Commercial interim products are often computer adaptive, which is typically not the case with formative products. The mechanisms that drive adaptivity require deep technical expertise to understand and are difficult to make visible to a product reviewer.

 

The psychometric requirements of the products are rigorous

Unlike the other product types currently covered by the Seal (curriculum, formative assessment, platform, and professional development), good interim assessment products must meet rigorous psychometric requirements. Training product reviewers to understand those requirements and spot them in product design would be difficult, as existing Seal training protocols are already demanding.

 

Product designs are often proprietary

For typical Seal applications, product design features are easy to spot via a slides-based presentation and a “sandbox” account. In contrast, interim assessment products feature a substantial amount of “under the hood” functionality (including psychometrics and adaptivity) that is difficult to make visible and is often proprietary (e.g., scoring algorithms). This makes a Seal application practically difficult.

 

Although interim assessment products don’t easily lend themselves to the current Seal framework and product evaluation process, school and district leaders still need guidance on selecting the right fit. To that end, ISTE R&D worked with Pearson to offer the following suggestions:

 

Remember that interim assessment products are tied to curricular content, measurement standards, and student growth/outcomes. This means that for an interim assessment product to be considered usable, it must be aligned with curricular content standards (e.g., Common Core, state standards) and measure student performance in a valid, reliable manner over time. These additional considerations increase the complexity of evaluating teacher usability, making product certification more difficult. Helpful questions to ask might include, “How does the product guide instructional decisions?” and “What does the product measure and what should teachers do with those measures?” 

 

Evaluating the usability of interim assessment products requires a special skill set. Schools and districts should include multiple stakeholders in the procurement process; however, for evaluation purposes, it’s necessary to have relevant expertise in assessment and in measuring student learning outcomes. While not every school district can retain a psychometrician on staff, those who are evaluating interim assessments should be knowledgeable about topics including assessment design, item creation, curricular alignment, and data-informed instruction (i.e., how teachers use test data to inform instruction and better meet students’ needs).

 

Interim assessment products are best deployed within a multi-year strategic effort. Particularly with interim assessment products, it’s crucial for schools and districts to adopt a long-term perspective. Unlike formative assessment tools, which can often be implemented more rapidly, interim products demand a multi-year, strategic effort to properly train staff, integrate with existing curricula, and collect meaningful longitudinal data on student learning outcomes. By viewing interim assessment products as a long-term investment rather than a quick fix, schools and districts can more effectively leverage these tools to inform instruction and impact student learning outcomes. 

 

When evaluating interim assessment products to find “the right” assessment combination for a school or district, decision makers must recognize the complexities involved in the process, including adaptive testing, psychometric rigor, and alignment with curricular standards. To maximize the benefits of interim assessment products, schools and districts should involve stakeholders with assessment expertise in the selection process and commit to a multi-year implementation strategy that moves the district toward its desired assessment culture. By adopting a long-term, strategic approach, educators and education leaders can more effectively leverage interim assessment products to inform instruction and impact student learning outcomes.

 

The Benchmark research project was supported by funding from the Chan Zuckerberg Initiative and the Walton Family Foundation.
