
Maximizing Tier 2 & Tier 3 Data: Progress Monitoring, Intervention Fidelity, and Intervention Effectiveness

October 17th, 2019

As educators, we're typically really good at determining when a student has a need—largely because of our experience and because of our interest in the well-being of our students. And as we increase our data literacy and invest in excellent data analysis tools, we become better and faster at identifying the student's specific need and aligning an intervention to match it.

The part of the intervention process where most of us struggle comes after we’ve implemented an intervention: coming back to monitor progress and evaluate whether the intervention we selected is indeed making a positive difference.

There can be confusion around what questions to ask, what data to use, and how to organize our work around those steps. So let’s do a deep dive into how we can use data for evaluating intervention effectiveness, monitoring student progress, and tracking intervention fidelity.

Forming a Team to Measure Intervention Effectiveness

The work of reviewing Tier 2 and Tier 3 data at the student level is usually done by an Intervention Data Analysis Team (DAT). This team typically meets every 6 weeks for academic interventions and every 3 weeks for social-emotional behavior (SEB) meetings to review individual students’ progress. The Intervention DAT includes the person doing the intervention and someone who has training in data analysis.

Based on the data, they evaluate the effectiveness of the intervention, determine whether the intervention should be continued, adjusted, or exited, and specify any action items. Input is typically provided by the parents and the classroom teacher, and the decisions the team makes are shared with other stakeholders.

(As a note, your Program DAT looks at these same data at the grade, school, and district level to answer system-level questions about intervention program effectiveness and ways to strengthen Tier 1 efforts. That will be the topic of a future blog post.)

What Data Do We Need to Determine Intervention Effectiveness?

Primarily, we're looking at progress monitoring data and intervention data. We'll look at both in this blog.

It's important to note that sometimes in these conversations, we can get derailed talking about other factors that influence the student on a daily basis, like his or her situation at home. That type of information is extremely important and certainly part of the whole child data picture. However, the goal is to talk about the intervention: is it working, and is it working fast enough?

Let's start by looking at progress monitoring.

What is Progress Monitoring? How is Progress Monitored or Measured?

Progress monitoring is a standardized process of evaluating progress toward a performance target based on rates of improvement measured by frequent (usually weekly) assessment of a specific skill.

Progress regarding academics should be measured by a valid and reliable progress monitoring assessment (e.g., FastBridge). Progress monitoring assessments are very sensitive to growth and able to carefully measure whether a student’s skills are increasing week by week.

There are two types of progress monitoring assessments: General Outcome Measures (GOMs) and Skill-Based Measures (SBMs).

GOMs measure foundational skills that are related to general outcomes, such as reading or math competence. Essentially, GOMs track whether a student is “generally” on track for grade level.

SBMs measure a student’s relative expertise or mastery of a specific skill. In other words, an SBM measures progress on the specific skill being targeted by the intervention. GOMs and SBMs should be alternated every other week.

Progress regarding SEB is typically measured by Direct Behavior Ratings (DBRs). A DBR is a brief rating of a specific behavior within a specified observation period. It often takes the form of a teacher reflecting at the end of a class period on a specific student and the skill or behavior being targeted by the intervention, then estimating and recording the percentage of time the student exhibited that behavior.

What Questions Should We Ask About Progress Monitoring Data?

For Academic Progress Monitoring Measures:

  • What's the rate of improvement (ROI)? How does it compare to the ROI goal?
  • What's the trend? How does it compare to the trendline goal? To the 25th percentile?
  • Are 80% of the data points within 20% of each other? If not, there are likely outliers indicating something is wrong with the data: the assessment isn’t being administered correctly, the student isn’t engaged, or the assessment isn’t high-quality (reliable and valid). (A quick way to run these checks is sketched after this list.)
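
To make these checks concrete, here is a minimal sketch that assumes weekly probe scores have already been collected as a simple list of numbers. The function names, the example goal values, and the reading of “within 20% of each other” as “within 20% of the median score” are illustrative assumptions, not eduCLIMBER or FastBridge functionality.

```python
# A minimal sketch (not eduCLIMBER or FastBridge code) of the academic
# progress monitoring checks above. All names and numbers are hypothetical.

def rate_of_improvement(scores):
    """Average gain per week across the monitoring window."""
    weeks = len(scores) - 1
    return (scores[-1] - scores[0]) / weeks if weeks > 0 else 0.0

def meets_roi_goal(scores, baseline, goal, total_weeks):
    """Compare the student's ROI to the ROI implied by the goal line."""
    goal_roi = (goal - baseline) / total_weeks
    return rate_of_improvement(scores) >= goal_roi

def points_are_consistent(scores, share=0.80, spread=0.20):
    """Check whether at least 80% of data points fall within 20% of the median
    score (one way to read the rule above). A failure suggests outliers from
    administration errors, low engagement, or a weak measure."""
    median = sorted(scores)[len(scores) // 2]
    close = [s for s in scores if abs(s - median) <= spread * median]
    return len(close) / len(scores) >= share

# Example: eight weeks of oral reading fluency scores toward a 12-week goal of 60.
weekly_scores = [32, 35, 34, 38, 41, 40, 44, 47]
print(round(rate_of_improvement(weekly_scores), 2))                          # ~2.14 per week
print(meets_roi_goal(weekly_scores, baseline=32, goal=60, total_weeks=12))   # False
print(points_are_consistent(weekly_scores))                                  # True
```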

For Behavior/Social Emotional Progress Monitoring Data:

  • Did the student sustain an 80% behavior rating for 4 weeks? What percentage of points did the student earn on a daily basis? (See the sketch after this list.)
  • Is there a particular day of the week, setting, period, or time of the day in which the student increased or decreased the behavior?
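
As a rough illustration of these questions, here is a minimal sketch that assumes daily point-sheet or DBR data recorded as (date, weekday, points earned, points possible) records. The data values, field layout, and helper names are hypothetical, not a vendor data format.

```python
# A minimal sketch (hypothetical data layout, not a vendor API) for the
# SEB questions above: daily rating percentages, a sustained-80% check,
# and a day-of-week breakdown to spot patterns.
from collections import defaultdict
from statistics import mean

# One week of (date, weekday, points earned, points possible) records.
daily_ratings = [
    ("2019-09-30", "Mon", 14, 20),
    ("2019-10-01", "Tue", 17, 20),
    ("2019-10-02", "Wed", 18, 20),
    ("2019-10-03", "Thu", 16, 20),
    ("2019-10-04", "Fri", 12, 20),
    # ...additional weeks would follow
]

def daily_percentages(ratings):
    """Percent of points earned each day."""
    return [earned / possible for _, _, earned, possible in ratings]

def sustained(weekly_averages, target=0.80, weeks=4):
    """True only if the most recent `weeks` weekly averages all meet the target."""
    recent = weekly_averages[-weeks:]
    return len(recent) == weeks and all(avg >= target for avg in recent)

def by_weekday(ratings):
    """Average rating per day of the week (e.g., to spot a Friday dip)."""
    buckets = defaultdict(list)
    for _, weekday, earned, possible in ratings:
        buckets[weekday].append(earned / possible)
    return {day: round(mean(vals), 2) for day, vals in buckets.items()}

print(daily_percentages(daily_ratings))
print(by_weekday(daily_ratings))
print(sustained([0.82, 0.85, 0.79, 0.88]))  # False: week 3 dipped below 80%
```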

Now that we've looked at progress monitoring data, let's dig into intervention data.

What is Intervention Fidelity?

Intervention fidelity refers to whether we implemented the intervention the way that we said we would, per the student’s intervention plan.

When it comes to making decisions about intervention effectiveness, these data are just as important as progress monitoring data. If we didn’t implement the intervention exactly the way we said we would, we still don’t know the outcomes or impact of the actual intervention plan we initially outlined.

How Can We Measure or Track Fidelity of Intervention? 

In order to reflect back on intervention fidelity, we need to collect data about the intervention as it’s being implemented. Here are some important metrics to record:

  • What's the participation percent for the intervention (i.e., how often did the student actually receive the intervention)? (See the sketch after this list.)
  • If the participation or attendance in the intervention was high, was the student engaged in the intervention?
  • At what frequency and for what duration did the student receive the intervention?
  • Who implemented the intervention?
  • In what setting was it implemented (location, group vs. individual, pull-out vs. in class)?
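
One lightweight way to capture these metrics is a simple session log. The sketch below is an illustrative assumption about what such a log could look like; the Session fields and the 80% participation threshold are hypothetical, not part of any specific tool.

```python
# A minimal sketch of an intervention session log covering the metrics above.
# The field names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Session:
    delivered: bool          # did the session actually happen?
    student_attended: bool   # was the student present?
    minutes: int             # duration actually delivered
    interventionist: str     # who implemented it
    setting: str             # location / group vs. individual / pull-out vs. in class

def participation_percent(sessions):
    """Share of planned sessions the student actually received."""
    received = sum(1 for s in sessions if s.delivered and s.student_attended)
    return received / len(sessions) if sessions else 0.0

def implemented_as_planned(sessions, planned_minutes, threshold=0.80):
    """Rough fidelity flag: attendance above the threshold and full dosage delivered."""
    dosage_ok = all(s.minutes >= planned_minutes for s in sessions if s.delivered)
    return participation_percent(sessions) >= threshold and dosage_ok

log = [
    Session(True, True, 30, "Ms. Rivera", "pull-out small group"),
    Session(True, False, 30, "Ms. Rivera", "pull-out small group"),
    Session(False, False, 0, "Ms. Rivera", "pull-out small group"),
    Session(True, True, 20, "Ms. Rivera", "pull-out small group"),
]
print(participation_percent(log))                       # 0.5
print(implemented_as_planned(log, planned_minutes=30))  # False
```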

How Do We Determine Intervention Effectiveness?

I recommend starting with the progress monitoring data. Using the data, answer our two main questions: is the intervention working, and is it working fast enough?

If the answer is yes, your team will likely choose to keep the student on the intervention until he/she is on track. At that point, the intervention can be faded out, to ensure a smooth transition.

If the answer is no to either question, look at your implementation fidelity. Was the intervention implemented as prescribed? If not, where were the implementation issues? Here are some example questions to ask (a simple sketch of the overall decision flow follows this list):

  • Is it an attendance issue? If so, why? Is the student chronically absent? Does the student have athletics or something else scheduled at this time? Does the intervention take place on days the student is often absent?
  • Is the interventionist getting called into other meetings and not holding interventions? If so, why?
  • Is it a professional development issue? Has the interventionist been trained in the specific intervention being implemented?
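
Putting the two data sources together, the decision flow described above can be summarized in a few lines. This is only an illustrative sketch of the logic; the labels are placeholders for your team's own decision rules, not prescribed next steps.

```python
# A minimal sketch of the decision flow described above: look at progress first,
# then fidelity. The decision labels are illustrative only.

def next_step(working: bool, fast_enough: bool, implemented_with_fidelity: bool) -> str:
    if working and fast_enough:
        return "Continue the intervention, then fade it out once the student is on track."
    if not implemented_with_fidelity:
        return "Fix implementation first (attendance, scheduling, training) and re-evaluate."
    return "The plan was delivered as written but isn't working fast enough: adjust or change it."

print(next_step(working=True, fast_enough=False, implemented_with_fidelity=False))
```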

Illuminate eduCLIMBER supports districts nationwide with valid and reliable progress monitoring assessments, intervention fidelity tracking tools, and visualizations to help measure intervention effectiveness. Reach out to learn more.

 

Download meeting agendas, guided data questions, student plan templates, and more in our free MTSS Toolkit Template Pack:

*****

Illuminate Education partners with K-12 educators to equip them with data to serve the whole child and reach new levels of student performance. Our solution brings together holistic data and collaborative tools and puts them in the hands of educators. Illuminate supports over 17 million students and 5200 districts/schools.

Ready to discover your one-stop shop for your district’s educational needs? Let’s talk.

