When educators explore educational technology options, they're often bombarded with claims about "research-backed" tools, "evidence-based" programs, and products sporting various certification badges. But what do these labels actually mean, and how can busy educators cut through the marketing language to find genuinely useful evidence?
A recent study from LXD Research sheds light on how educators actually approach evidence evaluation. The study revealed that educators place high value on research evidence, but are most compelled by research that demonstrates relevance to their specific teaching contexts. When studies show success in similar schools, with comparable student populations and teaching conditions, educators view this research as highly credible. The challenge isn't educator skepticism toward research itself, but rather the frequent disconnect between research settings and the practical realities of diverse classroom environments.
With a variety of products to sift through, understanding how to interpret different types of evidence becomes essential. While the research landscape includes various certification programs, validation badges, and research claims that may seem overwhelming, these tools can actually serve as valuable guides when you know what to look for. Learning to decode these different forms of validation can help you quickly identify tools that demonstrate genuine impact on student learning in settings you'll recognize as similar to your own.
The Current State of EdTech Evidence
If you've used the EdTech Index before to search for products, you've probably noticed the filters just below the search bar. These filters can help you narrow down the options and show only the most relevant products. The Validations filter sits right next to the Audience filter, and its components cover top priorities: safety, interoperability, inclusivity, evidence, and usability. This article focuses on the Evidence-Based Design section of this filter. Each of the organizations represented evaluates EdTech products, but here's the thing: they're not all measuring the same thing. Some focus on research design, others on how product development happens, and still others on effectiveness.
Understanding these differences is crucial because when you filter for "evidence-based" products, you might still see 90+ options. The key is knowing what type of evidence each validation actually represents, so you can find programs that fit your specific context and needs.
Decoding the Validation Organizations
EdTech Validation Organizations & Their Certifications
ISTE (Non-profit)
● ISTE Seal: Evaluates products against educational principles and ISTE standards, covering privacy, assessment, student-centered learning, and accessibility.
Digital Promise (Non-profit)
● Research-Based Design Certification (ESSA 4): Validates that a product's design and logic are grounded in learning sciences research.
● ESSA Tier 3+ Certification: Recognizes products with rigorous research showing statistically significant positive impacts on student outcomes.
LeanLab (Non-profit)
● ESSA Tier IV - Building Evidence Certification: Recognizes companies with research-based logic models and plans to study tool effectiveness.
● ESSA Tier III - Promising Evidence Certification: Provided for well-designed correlational studies showing statistically significant positive effects.
EduEvidence (Global Non-profit)
● Efficacy and Effectiveness Certifications: Offers Gold, Silver, and Bronze certifications aligned with ESSA Tiers I-IV standards. All LXD Research studies are reviewed and validated by EduEvidence.
Instructure (For-profit)
● ESSA Tier I-IV Badges: Connected to an online platform where teachers can comment on products and districts can assess usage and effectiveness.
Real Examples: What This Looks Like in Practice
Let's look at how this plays out with a variety of products, some you might recognize and others that may be new to you:
Creative Tool: Padlet offers visual collaboration tools for creative work and education. Padlet earned an ISTE Seal, indicating it was designed with solid educational principles and learning science research in mind. Teachers can feel confident that the tool was built with pedagogy in mind, though this doesn't tell us about specific learning outcome impacts. By partnering with LXD Research, Padlet documented its research-based design and had the documentation validated by EduEvidence, earning an Efficacy Bronze badge.
Digital Tier 1 Curriculum: Writing A-Z is a complete K-5 writing curriculum built on the latest research and evidence-based practices. Writing A-Z partnered with LXD Research to update its logic model, validate it for ESSA compliance, and submit it to EduEvidence for international certification.
Virtual Foundational Reading Curriculum: Strongmind is an online foundational reading program for virtual schools. LXD Research conducted a cross-cohort comparison, evaluating their new scope and sequence against their older one, earning an Effectiveness Silver certification with EduEvidence. This study was built on their Digital Promise Research-Based Design ESSA Tier 4 certification as well as their EduEvidence Efficacy Bronze, demonstrating evidence that their ‘recipe for success’ outlined in their program logic model was effective in a classroom setting.
Multi-modal Tier 1 Curriculum: Zaner-Bloser Handwriting is a handwriting program for K-5 that includes explicit, systematic instruction in letter formation, handwriting fluency, and written communication. Zaner-Bloser Handwriting documented and validated its research-based design in partnership with LXD Research and was then certified by EduEvidence, earning an Efficacy Bronze badge. Zaner-Bloser Handwriting also conducted a correlational efficacy study on program adoption rates, which showed that schools with high adoption rates had significantly more students meeting handwriting proficiency. This finding was validated by EduEvidence, earning the program an Effectiveness Silver badge.
Supplemental Math Program: Frax is an instructional tool designed to help students build a conceptual understanding of fractions. Frax has multiple validations, including ESSA Tier 2 evidence from a study conducted by the developer company’s internal research team that LXD Research reviewed and validated with EduEvidence. In this research study, students using Frax with moderate or high usage showed greater improvement than non-users, with proper controls and statistical analysis in place.
Supplemental Instructional and Assessment Tool: Edpuzzle is an online platform that helps teachers transform videos into interactive lessons by allowing them to embed questions, voiceovers, and notes into videos. Edpuzzle began by collaborating with LXD Research to complete their research documentation, earning the Digital Promise Research-Based Design ESSA Tier 4 badge. Additionally, they conducted a research study that earned the Digital Promise Evidence-Based EdTech ESSA Tier 3 badge, as well as the EduEvidence Effectiveness Silver badge.
Formative Assessment: Classtime is a summative and formative assessment solution for teachers that complements in-class and distance teaching with immediate feedback on every student's level of understanding. Classtime has progressed through multiple EduEvidence levels by partnering with LXD Research on increasingly rigorous research studies, which were also presented at the ISTE+ASCD Live Conference in June 2025. Their EdTech Index profile now shows this research journey, from initial usability studies to controlled effectiveness research. Classtime now holds Efficacy Bronze and Silver certifications, as well as Digital Promise certifications.
Practical Navigation Tips
When you're evaluating EdTech products, consider these strategies:
1. Start with your priorities: If you need to justify spending federal Title I dollars, look for ESSA Tier 3+ validations. If you're more concerned about usability and teacher adoption, design-focused validations might be more relevant.
2. Don't stop at the badge: Click through to see the actual study details and email the company for a copy if the study isn’t online. A product with evidence-based validation might have a study of 50 students or 500 students—the sample size and methodology matter.
3. Consider the context: Research conducted in urban Texas elementary schools might not directly apply to your rural Wisconsin high school, but it's still valuable information. The actual study details can help you decide how the information might apply to your context.
4. Use filters strategically: When the EdTech Index shows 148 math products, start with validation filters to narrow down to maybe 92 products, then use other criteria like grade level and specific features to continue to narrow the search.
Why This Matters More Than Ever
The validation landscape has become both more complex and more essential for educators. Our research reveals a clear pattern: while educators consistently value research evidence, they're most influenced by studies that demonstrate relevance to their specific teaching contexts. When research shows success in similar schools with comparable student populations and conditions, educators view it as highly credible and actionable.
This creates both an opportunity and a challenge. The opportunity lies in the growing number of products with genuine research backing—companies are investing in proper validation studies, and the evidence base is stronger than ever. The challenge is that with more badges and certifications available, the noise level has increased alongside the signal.
Here's what we're seeing:
● Educators want to try products themselves, but they need research evidence to justify adoption decisions.
● Leaders are looking for products that are both engaging and safe, with evidence that they will help students make progress toward meeting and exceeding grade-level goals.
● State procurement processes require higher levels of evidence, making these distinctions more important.
The landscape is evolving rapidly. States that have implemented science of reading requirements are now turning their attention to math, science, and other subject areas. Evidence requirements have become standard on RFPs for core and supplemental products. Understanding these validation systems now will serve you well as evidence requirements continue to expand.
Moving Forward
What's encouraging is the shift toward transparency in EdTech research. More companies are moving beyond marketing claims to conduct rigorous studies, and validation organizations are working to standardize how evidence gets communicated to educators. This means you're increasingly likely to find meaningful research backing for the tools you're considering.
The key is knowing how to interpret what you find. When you see multiple validation badges on a product, you now understand that some tell you about thoughtful design processes while others demonstrate actual learning outcomes. When you filter for "evidence-based" tools and still see dozens of options, you can dig deeper into what type of evidence each represents.
The validation landscape will continue evolving, but the fundamental principle remains constant: understanding what different types of evidence actually tell you empowers you to make decisions that serve your students best. With this framework in mind, you can navigate the research claims with confidence, knowing exactly what questions to ask and where to find the answers that matter most for your classroom.
______________________________________________________________________________________
LXD Research specializes in designing and conducting research studies on EdTech programs, with a focus on providing meaningful evidence that helps educators know how programs work and under what contexts. To learn more about our work or to access our research on educator decision-making, visit lxdresearch.com.