Blueprint in Action

The design of the Blueprint was informed by a user-driven pilot process spanning different types of stakeholders, intended to surface potential gaps in the design and to test whether the Blueprint could be adopted in diverse contextual settings. This field-driven design process was critical for incorporating stakeholder voices and pressure-testing the Blueprint for variability and generalizability.

The sections below show the Blueprint in action with our pilot partners.

Mathematica partners with grantees of the Bill and Melinda Gates Foundation in many different programmatic areas. This pilot focused on the evaluation of automated essay scoring tools, developed by a variety of grantees, that are designed to create more opportunities for students to practice writing. The pilot tested the Blueprint against a set of research questions from the current cohort of grantees using the essay scoring tools. Through a collaborative engagement, an exhaustive list of elements was identified from the grantees' current research plans. The pilot demonstrated 88% coverage across the current evaluation plans, highlighting the Blueprint's ability to act as a translational layer between researchers and practitioners and substantiating the researcher use case.

This pilot mapped the full Blueprint against Infinite Campus's entire student information system, with special emphasis on how the system mapped against the Blueprint's behavior and student engagement submodules. The Blueprint covered more than 80% of all elements, with the highest element coverage in the population, outcomes, and family and community areas, and demonstrated 83% coverage in the behavior and student engagement submodules.

LearnPlatform maintains a collection of historical Rapid Cycle Evaluations (RCEs) on different products and interventions to understand their impact. RCE results drive the creation of universal evidence reports (UERs), an open-source form for documenting education evaluation information. The pilot design focused on using the collection of RCEs and its summary variables to assess the overlap between those variables and Blueprint elements. The process assessed the mapping and coverage of modules such as population, assessment, and education technology characteristics. The pilot demonstrated more than 90% coverage and an 88% understandability score across the identified elements. It demonstrated the use case for education technology organizations in the sector and emphasized how alignment with the Blueprint can improve EdTech tool data models to serve the practical data and research needs of both practitioners and researchers.

Transcend Education, in collaboration with Van Ness Elementary School, implemented the Whole Child Model, which focuses on providing a set of school-wide practices that create a safe, connected environment for children and adults. At the intermediary level, the research team cross-mapped the existing evaluation framework against the Blueprint to assess coverage and applicability. The team used the Blueprint to validate a well-defined research hypothesis around school culture and mindset and to identify the elements needed to answer the research question and conduct the analysis. The pilot demonstrates an effective practitioner use case and the Blueprint's value as a translational layer between practitioners, researchers, and solution providers.

The evaluation process entailed two distinct approaches:

  1. Assess the applicability and coverage of the Blueprint by cross-mapping its modules and submodules against each pilot partner's established evaluation framework.

  2. Instrument a research question with the partners using the Population, Intervention, Comparison, and Outcome (PICO) framework and use the Blueprint to identify and classify the elements needed for evaluation in the specific context of the research question (see the sketch after this list).
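To make the second approach concrete, the minimal sketch below shows how a PICO-framed research question might be instrumented and mapped to Blueprint elements. The `PICOQuestion` structure and the element names are illustrative assumptions, not the Blueprint's actual taxonomy or tooling.

```python
from dataclasses import dataclass, field

@dataclass
class PICOQuestion:
    """A research question framed by its Population, Intervention,
    Comparison, and Outcome (PICO) components."""
    population: str
    intervention: str
    comparison: str
    outcome: str
    # Blueprint elements identified as needed to answer the question,
    # keyed by PICO component; element names here are hypothetical.
    blueprint_elements: dict[str, list[str]] = field(default_factory=dict)

# Hypothetical instrumentation of a question like the Transcend pilot's:
question = PICOQuestion(
    population="K-5 students at a partner elementary school",
    intervention="school-wide Whole Child practices",
    comparison="pre-implementation baseline",
    outcome="school culture and student mindset measures",
    blueprint_elements={
        "population": ["grade_level", "enrollment_status"],
        "intervention": ["program_name", "implementation_dosage"],
        "outcome": ["survey_instrument", "assessment_score"],
    },
)

# List the Blueprint elements required for each PICO component.
for component, elements in question.blueprint_elements.items():
    print(f"{component}: {', '.join(elements)}")
```

Classifying the needed elements by PICO component in this way makes the gap analysis explicit: any component with no mapped Blueprint elements signals data the evaluation still lacks.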

The success of the pilots was measured by three major criteria:

  1. Coverage: Coverage is defined as the number of elements mapped, expressed as a percentage of the total number of elements identified to represent the education data needs in the pilot partners' evaluation frameworks.

  2. Understandability: Understandability is defined as an assessment of the clarity of element definitions and of how well other stakeholders understood the elements. Precision, organization, and ambiguity were assessed using qualitative and quantitative data.

  3. Generalizability: Generalizability is defined as the extent to which the elements, modules, and submodules align with real-world applications and experiences. It was calculated as the median of the coverage scores across all pilots, aggregating them to determine the overall generalizability of the Blueprint (a worked sketch follows this list).
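As a worked illustration of the coverage and generalizability calculations, the sketch below computes coverage as mapped elements over identified elements and generalizability as the median coverage across pilots. The per-pilot element counts are invented for illustration; only the resulting percentages echo the figures reported above.

```python
from statistics import median

def coverage(mapped: int, identified: int) -> float:
    """Coverage: elements mapped as a percentage of elements identified."""
    return 100 * mapped / identified

# Hypothetical element counts chosen to echo the reported percentages;
# the actual per-pilot counts are not published in this section.
pilot_coverage = {
    "Mathematica": coverage(44, 50),      # 88% reported
    "Infinite Campus": coverage(41, 50),  # >80% reported
    "LearnPlatform": coverage(46, 50),    # >90% reported
    "Transcend Education": coverage(43, 50),
}

# Generalizability: the median of coverage across all pilots.
generalizability = median(pilot_coverage.values())
print(f"Overall generalizability: {generalizability:.0f}%")
```

Using the median rather than the mean keeps the aggregate score robust to a single pilot with unusually high or low coverage.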