When you're browsing the EdTech Index and see a product with multiple validation badges, you might wonder: what's the difference between all these certifications? In a previous post, we gave an overview of the validation landscape. Now, let's dive deeper into one specific type of validation that often gets overlooked but provides crucial information: design badges.
Design badges answer a fundamental question that comes before "does it work?"—they answer "was it built to work?" These certifications tell you whether a product was thoughtfully designed using sound educational principles and learning science research. Think of them as examining the blueprint before the building goes up.
What "Research-Based Design" Actually Means
When a product earns a design-focused validation, it means the company has documented how they translated learning science into actual product features. This isn't just about having a nice interface or checking boxes on a features list. It's about showing that every major component of the tool connects back to established research about how students learn.
For example, if a program claims to build math fact fluency, a research-based design certification means they've documented:
● Which specific research supports their approach
● How that research informed specific features in their tool
● What their "theory of change" is—the logical path from using the tool to achieving learning outcomes
● What conditions need to be in place for the tool to work as intended
This documentation often takes the form of a logic model, which is essentially a roadmap showing how the tool is supposed to work and why. It's the company saying, "Here's how we intend this tool to help students learn, and here's the research backing up each step of that process."
The Major Design-Focused Certifications
Let's break down the key design badges you may encounter:
Digital Promise Research-Based Design (ESSA Tier IV)
Digital Promise's Research-Based Design certification validates that a product's design and underlying logic are grounded in learning sciences research. This certification ensures that research about how people learn is core to the theoretical framework driving product design and is evidenced throughout the product.
To earn this certification, companies must provide substantial documentation:
1. A Well-Defined Logic Model: The product must have a detailed logic model based on empirical research that provides a clear rationale for its intended impact.
2. Annotated Bibliography: Companies must cite well-designed, empirical research to describe the basis for at least five significant and distinct design decisions, with images or videos showing how each feature works.
3. Qualitative Research Study: At least one well-designed qualitative research study must be conducted to understand learners' and/or educators' experiences with the product and its potential benefits. This evidence indicates the product is designed for a positive learning experience and has had input from at least a handful of potential users.
4. Public Accessibility: The research basis must be easily accessible to education practitioners, learners, and communities—ensuring transparency about why the product is designed the way it is.
5. Data Privacy Commitment: A signed FERPA letter ensures data privacy protections are in place.
Products like Edpuzzle started their research journey here. Collaborating with LXD Research, Edpuzzle documented how their video-based learning approach connects to research on active learning, formative assessment, and student engagement. This certification represents a serious investment in research-based design—a signal that the company is committed to transparency and continuous improvement based on the latest research findings.
ISTE Seal
The ISTE Seal takes a comprehensive approach to validating educational technology, examining not just pedagogical design but also practical usability and implementation readiness. What distinguishes the ISTE Seal from many other validations is its hands-on evaluation process—products are actually tested to verify they function as intended in educational settings. This certification examines:
● Pedagogical foundations (is the learning approach sound?)
● Student-centered learning features
● Usability for educators and learners
● Assessment capabilities
● Privacy and data security
● Accessibility and inclusivity
When you see the ISTE Seal on a product, it verifies several key qualities that matter for classroom implementation. The certification confirms the tool has been tested and proven to align with ISTE Standards, ensuring it supports the competencies students need for future-ready learning. It indicates the product has a strong user interface and user experience built on best practices for digital-age learning—meaning both teachers and students can navigate it effectively. The seal verifies the tool is grounded in digital pedagogy and the learning sciences, not just featuring technology for technology's sake. Importantly, it confirms the product incorporates inclusivity and accessibility considerations for diverse learners and educators, and that it genuinely supports teaching with technology while promoting critical technology skills students will need beyond the classroom.
Toddle worked with LXD Research on their ESSA Tier IV documentation, which set them up for completing the ISTE Seal and Digital Promise Tier IV applications. Earning the ISTE Seal signaled that Toddle was designed with solid educational principles in mind, functioning as a learning management system that crosses boundaries between curriculum, assessment, and creativity tools in ways that align with how educators know students learn best. Toddle will be featured in the exhibit hall at the ISTE 2026 Annual Conference as a Seal holder and may be invited to participate in co-design workshops with educators and attend exclusive ISTE events.
LeanLab Education ESSA Tier IV - Building Evidence
LeanLab Education's ESSA Tier IV certification recognizes companies that have completed a logic model or theory of change based on rigorous research and have a concrete plan—or research already underway—to study the effects of their EdTech tool.
This badge tells you the company has done two important things: they've thought through their educational approach systematically and documented it in a research-based logic model, and they've committed to finding out if it actually works. It's essentially saying, "We've built this on solid research foundations, and we're planning to prove it delivers results."
This certification represents an important starting point in a product's evidence journey. Companies holding this badge have demonstrated they're serious about building an evidence base, even if they haven't yet completed effectiveness studies. For educators evaluating newer or innovative tools, this badge indicates the company has laid the groundwork for future research and isn't just making unsupported claims about their product's impact.
Amplify Desmos Math, a K-12 problem-based mathematics curriculum, earned the LeanLab ESSA Tier IV certification by documenting its theory of change and research plan. The certification confirms that their approach—which combines problem-based lessons, intervention, personalized practice, and assessments—is grounded in research at its foundation, with a plan to study its effectiveness in classrooms next.
EduEvidence Efficacy Bronze
EduEvidence is a global non-profit evidence review organization that sets an international standard for research documentation quality, so its certifications are recognized beyond U.S. education contexts. This can be particularly valuable for educators in international schools or districts considering tools that may be used across different educational systems. Their Bronze certification (aligned with ESSA Tier IV and the ISTE Seal) validates that a product has a documented research-based design that has been independently reviewed by qualified researchers.
To earn the Efficacy Bronze certification, a product must be reviewed by qualified researchers who establish its connection to published studies, and the company must have a clear research plan. This independent review ensures that when companies claim their design is "research-based," those claims have been verified by experts who can assess whether the connections to learning science research are legitimate and well-documented.
All LXD Research studies are reviewed and validated by EduEvidence, providing products with internationally recognized certification. When 95 Percent Group partnered with LXD Research to create and validate their logic model, they earned this certification—confirming that their K-5 curriculum design is built on current research and evidence-based practices, as verified by independent researchers.
What You Can Learn from a Logic Model
Across design badges, the central element is the product's logic model. Reviewing a logic model is like looking under the hood: it shows you the company's thinking about how their tool produces learning. A well-developed logic model typically includes:
Inputs: What resources, research, and expertise went into building the tool?
Activities: What do teachers and students actually do with the tool?
Outputs: What immediate results happen (engagement, time on task, completion rates)?
Outcomes: What learning gains are expected, and on what timeline?
Assumptions: What needs to be true for this to work? (Teacher training? Certain amount of usage time? Specific student readiness levels?)
Let's look at a concrete example. When Just Right Reader documented the research-based design of their Take Everywhere Literacy Packs with LXD Research, their logic model showed:
● Research foundation: Studies on foundational reading skills, retrieval practice, family engagement, and authentic storytelling
● Design features: Repeated reading practice, integrated assessment to personalize book sets, and videos for families with mini-phonics lessons
● Theory of change: Tailored phonics practice → increased decoding skills and reading confidence → improved fluency and comprehension → stronger foundational literacy skills and confidence in reading
● Key assumptions: Books aligned to students’ current skill, family involvement and support for home reading practice, consistent use across settings
This documentation earned them an EduEvidence Efficacy Bronze badge and the Digital Promise Research-Based Design certification. But more importantly, it gave educators insight into why the program is designed the way it is.
The Critical Limitation: This Isn't Evidence of Effectiveness
Here's what design badges don't tell you: whether the tool actually improves student learning outcomes.
A product can have impeccable research-based design and still not work in practice. Maybe the implementation requirements are unrealistic. Maybe the underlying research doesn't transfer to the specific population the tool targets. Maybe competing priorities mean teachers can't use it with the necessary frequency or fidelity.
Think of it this way: An architect might design a beautiful, structurally sound building based on the best engineering principles. But until it's built and tested in real weather conditions with real occupants, you don't know if it truly functions as intended. Design validation is the blueprint review; effectiveness research is the building inspection after construction.
This is why products like Strongmind's foundational online reading curriculum have both types of certification. They started with Digital Promise Research-Based Design (ESSA Tier IV) certification and EduEvidence Efficacy Bronze—documenting their blueprint. Then LXD Research conducted a cross-cohort comparison study, earning them an EduEvidence Effectiveness Silver badge—showing that their blueprint translated to real results in the classroom.
Real-World Application: Making Decisions with Design Badges
So, how should you actually use this information when evaluating EdTech tools?
Scenario 1: You're exploring supplemental tools for a new initiative
You're implementing a new literacy block structure and need digital tools to support independent practice. You find three products that all look engaging, but only one has a Digital Promise Research-Based Design certification.
What this tells you: That product can show you exactly how its design connects to reading research. You can review their logic model to see if their approach aligns with your literacy framework. Even without effectiveness data yet, you know their design choices are intentional and research-grounded.
Scenario 2: You're comparing similar products
Two math practice tools have similar features and pricing. One has an ISTE Seal; the other has no validations.
What this tells you: The ISTE Seal indicates the first product was designed with student-centered learning principles, proper assessment practices, and accessibility in mind. While both might work, you have more confidence that the validated product was built with educational best practices from the start.
Scenario 3: A sales rep is pitching "research-based" features
Every EdTech sales pitch mentions research. But when you ask for their logic model or research-based design documentation, one company immediately provides it while another seems confused by the question.
What this tells you: The company with documentation has done the hard work of connecting its claims to actual research. The other might have good intentions but hasn't systematically thought through its educational approach.
The Starting Line, Not the Finish Line
Design-focused validations represent an important starting point in a product's research journey. They tell you a company is serious about evidence, transparent about their approach, and grounded in learning science.
But they're at the beginning of the story, not the end. The most valuable products combine thoughtful, research-based design with rigorous effectiveness studies that prove the design works in real classrooms.
In the next blog of this series, we'll explore effectiveness badges—the validations that tell you whether a tool's carefully designed blueprint actually translates into student learning gains. We'll look at what makes program evaluation research rigorous, how to interpret different study designs, and when you absolutely need this type of evidence to make informed decisions.
Until then, when you see design badges on the EdTech Index, you'll know they're showing you something valuable: a company that has done its homework on the front end, building a tool based on how students actually learn. That's not everything you need to know—but it's a solid foundation to build on.
________________________________________________________________________________
Have questions about evaluating EdTech research? Want to understand what validation badges mean for your specific context? Visit lxdresearch.com to learn more about our work helping educators navigate the evidence landscape.
The EdTech Index occasionally publishes guest posts from partners in the education research community. LXD Research is an education research consultancy that helps EdTech companies document evidence and earn validation badges. The views and examples in this post reflect LXD Research's client work and expertise. Explore validation badges and quality indicators mentioned in this article on the EdTech Index at edtechindex.org.