Implementing the Tier 1 Data Review Protocol

Unlock the potential of your team’s collaborative planning with the Tier 1 Data Review Protocol—a step-by-step process designed to transform student data into actionable teaching strategies. The protocol isn’t just a form to fill out—it’s a tool for sparking deeper thinking. It encourages teachers to consider key questions as they plan future lessons that build on student understanding, focusing on acceleration rather than reteaching or slowing down.

What makes this tool powerful is its flexibility. Whether you’re using it to guide your own planning, facilitate a coaching conversation, or integrate it into a broader team discussion, the protocol’s detailed questions are designed to work no matter when you start—whether it’s your first data review or a mid-year adjustment. Even planning one lesson with this mindset can make an immediate difference in your instructional impact. And as a coaching tool, it empowers coaches or data analysts to guide teachers through prompts and questions that prioritize growth over remediation.

Explore the protocol to see how it can shift your data conversations from simply identifying problems to collaboratively planning solutions that accelerate learning and sustain student engagement.


Task Overview

The Task Overview section serves as the foundation for effective data analysis and instructional planning. This section is critical because it grounds the entire review process in the context of the task and provides a starting point for deep analysis. Without this clarity, discussions can easily drift or focus on surface-level observations. Instead, a strong setup aligns everyone’s focus on actionable trends and forward-thinking instructional strategies that accelerate student learning.

 

Understand Standards and Task

This section is where teams define what proficiency looks like and anticipate how students might approach the task. By establishing clear criteria, educators ensure they have a shared understanding of success and can analyze student work with consistency and focus.

This step is essential for ensuring the team focuses on specific evidence in student work. It shifts the conversation from generic observations about who got the answer correct to a targeted analysis of the skills and knowledge students need to demonstrate to perform well on the task, helping educators plan responsive instruction that meets students where they are while keeping them on track for grade-level success.

From Fixed to Focused Thinking:

One of the biggest challenges in analyzing student work and planning next steps is overcoming fixed thinking—those quick assumptions or surface-level conclusions we sometimes make. The Tier 1 Data Review Protocol is intentionally designed to shift teams from a fixed mindset to a focused mindset by grounding every step of the process in evidence from student work.

This mindset shift isn’t about pointing out what’s wrong; it’s about rethinking how we interpret what we see and turning assumptions into actionable insights. In every section of the protocol—from identifying proficiency criteria to reviewing trends and creating action plans—you’ll find opportunities to move:

  • From vague observations (“They didn’t try”) to focused analysis (“Their work shows partial steps, but they stopped at listing smaller factors.”)

  • From assumptions (“We just need to reteach everything”) to targeted next steps (“A scaffold, like visualizing equal groups, can address this misconception without reteaching the entire lesson.”)

By adopting a focused mindset, educators transform their data conversations. Instead of being stuck in reactive cycles—blaming the task, the students, or past instruction—teams develop forward-looking plans that accelerate learning. The result? More targeted instruction, better use of time, and higher expectations for what all students can achieve.

This mindset shift isn’t a one-time adjustment; it’s woven into every step of the protocol, ensuring that conversations remain evidence-driven, productive, and actionable. It’s about moving beyond assumptions and keeping the focus where it belongs: on student growth and success.

  • Fixed: "The task is about writing."
    To Focused: "Can we be more specific? Let’s look at the standard this task aligns to and clarify what specific skills or knowledge it assesses. For example, are students expected to write persuasively, or is it more about structure and organization?"

  • Fixed: "This is a math problem about fractions."
    To Focused: "Let’s dig into the standard this problem addresses. Is it about performing operations with fractions, conceptual understanding, or application in a real-world context? This distinction helps us focus on what to evaluate."

 

Set Criteria for Looking at the Work

This section focuses on establishing clear expectations for proficiency while preparing to analyze student work thoughtfully. The anticipated strategies and misconceptions are a starting point, not a verdict—the evidence in actual student work is what ultimately reveals patterns of success and misconception.

Here’s why this step matters:

  • Defining Proficiency: By determining the key qualities of a successful response and predicting the strategies—correct or incorrect—that students might use, teams build a common understanding of what success looks like and can focus their analysis on both strengths and errors.

  • Anticipating Misconceptions: Educators can brainstorm possible misconceptions (like those listed) as a starting point. These are helpful guesses but do not need to match perfectly what will be found during analysis.

  • Consistency: Defining key qualities and strategies ahead of time gives every teacher a common lens, creating shared understanding and alignment in how work is evaluated.

By prioritizing evidence-based analysis, this step keeps the focus on what students produced, not just what teachers observed during instruction. Misconceptions like "dividing by 13" or "ignoring equal distribution" may surface as trends, but they are only valid if grounded in the patterns revealed by student work. This shift leads to deeper insights and better planning for subsequent lessons.


Let’s face it—when we’re analyzing student work, it’s easy to fall into vague or surface-level observations. But this part of the protocol is all about digging deeper and focusing on the evidence. Below are a few common fixed-mindset statements you might hear and ways to redirect the conversation to keep it productive and focused.

When we hear these examples, it’s a signal to pause and refocus on what really matters: evidence from student work. By asking these questions and redirecting the conversation, we can move from surface-level observations to insights that help us better understand our students—and plan next steps that actually work.

  • FIXED: "Students need to show their work."
    To Focused: "Okay, but let’s get specific. What do we actually want to see? Should they list all the factors, write out their calculations, or explain why their answer works? ‘Show their work’ can mean a lot of things, so let’s outline exactly what we’re looking for to demonstrate understanding."

  • FIXED: "The answer should be right or wrong."
    To Focused: "The final answer does matter, but we can’t stop there. Let’s look at how they got to their answer. Did they list some factors correctly but miss the largest one? Or maybe they got the right answer but can’t explain it? The process matters just as much as the result—it helps us see what they actually know."

  • FIXED: "They didn’t understand the problem."
    To Focused: "That’s a good start, but let’s dig deeper—what didn’t they understand? Did they think it was a division problem instead of a factor problem? Did they start listing factors and stop too early? Let’s look for clues in the work so we can figure out exactly where they got stuck."

  • FIXED: "This group just didn’t try."
    To Focused: "I hear you, but let’s take a closer look. Did they write anything down, even if it’s just a partial list of numbers or a first attempt? Sometimes what looks like a lack of effort is actually confusion, and small attempts can tell us where we need to step in and support."

  • FIXED: "Students should just guess and check."
    To Focused: "Guessing might seem like an easy approach, but it doesn’t help them understand the concept. What steps do we actually want to see? Maybe we want them to systematically list factors or check their work by multiplying back. Let’s focus on teaching strategies that build understanding instead of relying on guesses."

 

Sort and Analyze Student Work

This step is not about confirming assumptions—it’s about ensuring that the analysis stays grounded in what students actually produced, not just what teachers observed in class.

Here’s why this part matters:

  • Focus on Evidence, Not Guesswork:
    It’s tempting to assume we know where students struggled based on classroom observations, but those ideas must be verified through actual student work. This step ensures that misconceptions are identified based on patterns in the work, not assumptions.

  • Brainstorming Misconceptions as a Starting Point:
    The misconceptions brainstormed earlier (like "dividing by 13" or "ignoring equal distribution") start as educated guesses—during sorting, the team confirms, refines, or discards them based on what actually appears in the work.

  • Preparing for Productive Conversations:
    Setting criteria upfront helps focus discussions on specific evidence:

    • Did students calculate all factors?

    • Did they stop too early?

    • Did they mistake the task for division instead of factoring?

This focus keeps the conversation productive and actionable.

  • FIXED: "Students made a lot of mistakes."
    To Focused: "What kinds of mistakes are we seeing? Let’s group the errors into categories—are they calculation errors, conceptual misunderstandings, or misinterpretations of the task?"

  • FIXED: "Some students did well, and others didn’t."
    To Focused: "Let’s unpack this. What did the students who ‘did well’ do that was different from those who didn’t? Identifying patterns will help us plan next steps."

  • FIXED: "They just guessed."
    To Focused: "That’s a possibility, but let’s look closer. Did they start listing numbers or attempt any calculations? Even a partial effort can tell us something—maybe they didn’t finish because they got stuck."

  • FIXED: "Some students just didn’t try."
    To Focused: "Let’s not assume that yet. Even if their work looks incomplete, there might be clues about where they’re struggling. Did they list a few factors? Did they write anything down that could show partial understanding?"

  • FIXED: "They didn’t show their work."
    To Focused: "Let’s clarify—what specific steps do we expect to see? Should they list all the factors, show division, or verify their solution? If we’re clear on that, we can check for evidence of their reasoning."

 

Review Categories for Trends

This step is about looking at the big picture—identifying patterns in student understanding and misconceptions. By analyzing these trends, teams can make meaningful connections between what students are doing and what might be holding them back.

Here’s why this part matters:

  • Finding the Root Causes: When misconceptions show up, it’s important to look at their origins. Did students struggle because a prerequisite skill, like writing remainders as fractions and decimals, wasn’t secure? This reflection helps teams pinpoint the root cause, not just the symptom.

  • Connecting to Instruction: Understanding where misconceptions came from (e.g., previous lessons on long division) helps teachers adjust future instruction to address those gaps while keeping the learning moving forward.

  • Seeing the Impact: By identifying how misconceptions affect student thinking—such as seeing cookies as “breakable” instead of whole or ignoring remainders—teachers can plan targeted strategies that clarify concepts without reteaching everything.

  • Spotting Trends Across Groups: Once patterns are identified, compare them to trends from other tasks or standards. Are the same misunderstandings showing up repeatedly? Recognizing these patterns allows for proactive adjustments that benefit the entire class.

  • FIXED: "Students just didn’t get it."
    To focused: "That’s a start, but let’s look at why they didn’t get it. Did they misunderstand how groups should be equal? Or maybe they focused too much on remainders? Let’s use their work to identify where their thinking broke down."

  • FIXED: "This is because we didn’t teach it well enough."
    To focused: "Let’s pause and focus on the evidence. What in their work shows us where they got confused? Did they apply long division correctly but misinterpret the remainder? Understanding the misconception helps us refine instruction, not blame it."

  • FIXED: "They just forgot what we learned before."
    To focused: "That might be true, but let’s dig deeper. How do we know they forgot? Is their work missing key steps, or did they try something different? Let’s look for patterns to confirm whether prior learning is the root cause."

  • FIXED: "It’s obvious—this is just too hard for them."
    To focused: "Let’s not jump to conclusions. What specific part of the task tripped them up? Did they list some factors but stop too early? Did they try to divide without considering equal groups? Let’s focus on what the work is telling us."

By analyzing misconception origins and impacts, teams can move beyond surface-level observations and develop targeted strategies. This step keeps the conversation focused on evidence-based patterns and helps ensure instructional adjustments are meaningful, intentional, and forward-looking.

This step also focuses on identifying patterns of student engagement and learning needs to inform targeted next steps for different groups of learners. By categorizing students into skill, will, and thrill deficits, teachers can address academic gaps, respond to engagement issues, and find opportunities to challenge students at higher levels.

Bringing It to Life: Evidence-Based Responses

When categorizing students into skill, will, or thrill deficits, teams must base their analysis on evidence in student work, not assumptions. For example:

  • Skill Deficit: Evidence might show incomplete steps or errors in converting remainders.

  • Will Deficit: Work might appear rushed, missing contextual reasoning despite correct calculations.

  • Thrill Deficit: Early finishers may complete the task correctly but display off-task behaviors because the learning lacks challenge.

  • FIXED: "These students are just lazy."
    To focused: "Let’s look at their work. Are they rushing through because they’re overconfident? Or do they stop short because they don’t see the value in the task? Let’s focus on how we can re-engage them with strategies that connect to the context."

  • FIXED: "They just don’t get it."
    To focused: "Okay, but what don’t they get? Let’s pinpoint the specific skill—are they struggling to convert remainders or understand equal groups? Targeting the right skill will help us support their learning more effectively."

  • FIXED: "They’re fine—they finished early."
    To focused: "Finishing early is great, but are they being challenged? Let’s think about how we can extend their learning with additional representations or problems that deepen their understanding."

  • FIXED: "We just need to reteach everything."
    To focused: "Let’s pause. Do we need to reteach the entire lesson, or can we target specific skills or strategies? For example, could students create visual models to clarify where they got stuck or rushed?"

By focusing on evidence-driven analysis and categorizing students based on skill, will, and thrill deficits, teachers can:

  1. Design supports for students who are struggling.

  2. Reignite engagement for students who are disconnected.

  3. Extend learning for students who are ready to go further.

This step ensures that every group of learners is met where they are and given the tools to grow—without slowing down or watering down instruction.

 

Action Plan

The Action Plan step brings everything together—trends, misconceptions, and instructional next steps. This is where teams decide what to do next based on their analysis of student work, ensuring the plan is specific, actionable, and forward-thinking.

Why This Step Matters

  • Focused Adjustments: Instead of reteaching everything, this step helps teams design strategies that directly address identified gaps while maintaining rigor. For example, using scaffolds like square paper and manipulatives targets the specific issue of visualizing groups without slowing down instruction.

  • Increasing Engagement: This step focuses on strategies that actively involve students, like drawing representations, correcting errors, or using prompts to encourage ownership of learning. The goal is to keep students engaged in grade-level work.

  • Clarifying Next Steps: Reflecting on trends allows teams to identify adjustments to instructional delivery, such as introducing models, using formative questions, or presenting incorrect responses as a learning opportunity.

  • Monitoring Progress: By using formative assessment strategies (e.g., targeted questioning), teachers can gauge whether the action plan is working and adjust instruction in real time.

  • FIXED: "Let’s just reteach the whole lesson."
    To focused: "Reteaching everything isn’t always necessary. Let’s focus on the specific misconception we identified. For example, how can we help students make sense of equal groups? Would a scaffold like square paper or visual models address that more effectively?"

  • FIXED: "We’ll tell students to pay more attention."
    To focused: "Instead of just telling them to focus, how can we help students engage more deeply? Could we use prompts or questions that encourage them to reflect on their answers, like ‘What number of groups do you have? How do you know?’"

  • FIXED: "They need to stop guessing and just try harder."
    To focused: "Let’s think about why they’re guessing. Do they need a clearer strategy for solving the problem? For example, presenting an incorrect response first could challenge them to think critically and engage more intentionally."

  • FIXED: "We’ll just keep practicing until they get it."
    What to Say/Do: "Practice is important, but let’s make it purposeful. What scaffold or strategy can we add to ensure students understand what they’re practicing? Could they use manipulatives or create representations to check their thinking?"

Moving Forward: Intentional Planning

The Action Plan step ensures that instructional adjustments are purposeful and evidence-based, not reactive. Whether it’s introducing visual models, using targeted prompts, or encouraging ownership by correcting errors, the strategies outlined here are designed to move learning forward—not backward.

By focusing on specific next steps and actively engaging students, teams can close gaps, build confidence, and ensure all learners are supported in reaching grade-level expectations.

 

Reflect

The Reflect step brings the process full circle. It’s an opportunity for teams to pause, celebrate what worked, and identify tweaks to make the process even more effective next time. Reflection is short and intentional—its goal is to ensure the process remains a tool for growth, not a checklist.

Why This Step Matters

  • Celebrate Successes: Taking time to identify what “felt right” helps build team confidence and reaffirms the value of the process. Did the team identify actionable trends? Did the discussion lead to specific next steps that move learning forward? Recognizing wins is critical.

  • Refine the Process: Reflection isn’t about redoing everything—it’s about small, meaningful changes. For example, incorporating a next-day task, as suggested here, ensures the work connects seamlessly to ongoing instruction.

  • Continuous Improvement: This step encourages teams to see the protocol as flexible and adaptable. What worked for this group of students or task might shift slightly next time—and that’s okay.

  • FIXED: "This process took too long."
    To focused: "I hear that. Let’s think about what took the most time. Were there parts we could streamline, like clarifying misconceptions earlier or focusing on just one key trend?"

  • FIXED: "It was fine—no changes needed."
    To focused: "That’s great to hear! What felt most useful? Is there a small tweak we could try next time, like bringing in an example of a task students will work on tomorrow?"

  • FIXED: "We didn’t find anything new."
    To focused: "Even if we didn’t uncover something surprising, did this process help confirm what we already knew? That’s still valuable because it ensures our next steps are focused and intentional."

  • FIXED: "Students just need more practice."
    To focused: "Let’s dig a bit deeper. What specifically do they need to practice? Could we incorporate a scaffold or strategy—like visual representations or targeted questions—to help students practice more effectively?"

Moving Forward: Making It Work for You

Reflection doesn’t need to be perfect—it just needs to be honest and forward-looking. By identifying what worked and small adjustments for next time, teams can ensure the protocol evolves to meet their needs and, most importantly, supports student growth in a meaningful way.

Whether it’s celebrating a win or adding a next-day example to deepen the connection to instruction, this step keeps the process relevant, actionable, and effective.


The Tier 1 Data Review Protocol isn’t just about analyzing student work—it’s about creating a culture of reflection, collaboration, and forward-focused action. By grounding discussions in evidence, identifying specific trends, and planning purposeful next steps, this protocol empowers teachers to address misconceptions, engage all learners, and accelerate progress without slowing down instruction. Whether you’re refining one lesson or rethinking an entire unit, this process helps teams stay focused on what matters most: ensuring every student has the opportunity to succeed at grade level. Start small, reflect often, and watch how this approach transforms the way you use data to drive meaningful outcomes in your classroom.
