Sensitive Data and Lethargic Data
The first quarter is over. What do we look at to determine results? Do you look at the percentage of students who scored above a 70 on a single test? This is lethargic data. It doesn't tell you what you really need to know.
Lethargic Data
A passing rate (the percentage of students who scored above a 70) is lethargic because it doesn't show growth well. It doesn't respond quickly to changes in student performance, and it isn't all that reliable.
I'd really love to go into the research statistics, but then you'd probably stop reading. So let's just talk through the logic.
Lethargic Data in Action
Imagine 30% of your students are slightly below level and you give an on-level test at the end of the quarter. What can you expect? 30% of your students are likely not going to pass.
That doesn't mean those students didn't learn. It doesn't mean the curriculum and instruction were ineffective. How much growth occurred? Can you tell whether instruction was effective?
That's the problem. Lethargic data doesn't give you the information you need to make effective improvement decisions and adjustments.
Sensitive Data
Student data such as pretest-posttest comparisons or 3-week quick probes are more sensitive measures of growth. They are the data needed to answer the real questions you have. Questions such as:
- Did the curriculum and instruction create growth?
- How much learning actually occurred?
- How effective was the instruction?
Sensitive data will move up and down based on specific teacher and curricular inputs. It's like the needle on a tachometer. You push the pedal, the RPMs go up. You let go of the pedal, the RPMs go down. This is sensitive data. This is informative!
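The difference between the two measures is easy to see with numbers. Here's a minimal sketch using made-up scores for illustration only (the student data, cutoff, and function names are all assumptions, not real assessment results): every student grows from pretest to posttest, yet the passing rate doesn't budge.

```python
# Hypothetical pretest and posttest scores for 8 students (illustrative only).
pretest = [40, 45, 50, 55, 60, 75, 80, 85]
posttest = [50, 55, 58, 63, 67, 80, 85, 90]

PASSING_CUTOFF = 70  # assumed "above a 70" passing bar

def passing_rate(scores, cutoff=PASSING_CUTOFF):
    """Percent of students at or above the cutoff -- the lethargic measure."""
    return 100 * sum(s >= cutoff for s in scores) / len(scores)

def average_growth(pre, post):
    """Mean pretest-to-posttest gain per student -- the sensitive measure."""
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

print(passing_rate(pretest))                # 37.5
print(passing_rate(posttest))               # 37.5 -- passing rate is flat...
print(average_growth(pretest, posttest))    # 7.25 -- ...yet every student grew
```

The passing rate reports no change, while the growth measure shows a real gain for every student. That's the needle moving.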
Both kinds of data have their place in our toolbox and data dashboards, but which is right at this time? Which is right for your purpose?
In the next post, we'll discuss some examples of sensitive data for school leaders.
Here's a previous post, Data is NOT the Solution for Teachers