Why yes, I can do longitudinal data analysis

Written by Christopher Dawson, Contributor

Because I'm not really all that busy (sarcasm drips here), it's fallen to me to figure out what is happening to our MCAS scores. For those of you outside the Commonwealth of Massachusetts, the MCAS is our battery of standardized tests in English, math, science, and history (the list goes on, and more tests are being piloted as we speak).

For some of our schools, things are looking quite bright. The high school, for example, has some of the most improved scores in the state. Other schools just got hammered last year. Still others managed to hold their own, but have some socioeconomic subgroups that are sending up red flags.

Obviously, this is of concern, especially since the state continues to make the graduation requirements tied to these tests more rigorous and increasingly holds schools accountable for making "adequate yearly progress" in improving our scores.

The real questions we want to answer are twofold:

  1. Can we identify and remediate areas of particular deficiency that have shown up over time for groups of students?
  2. Are there cohorts of kids that are struggling more than others over time (meaning, for example, that the gains observed at the high school are more the result of a high-achieving group of kids than of the particular efforts of the teachers)?

Both of these questions include the term "over time," which leads to something really fun called longitudinal data analysis. Basically, we need to analyze trends in the MCAS data for several groups of kids (defined by their year of graduation) to modify curriculum, address remediation needs correctly going forward, and predict where our major areas of difficulty are going to occur in the coming years.
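
To make that concrete, here is a minimal sketch of the kind of cohort rollup I have in mind, in Python with pandas. The file name (mcas_scores.csv) and the column names (student_id, grad_year, test_year, subject, scaled_score) are placeholders for illustration only; the real layout will depend on how the state delivers the results.

    # Minimal sketch of a cohort-level longitudinal view of MCAS results.
    # Assumes a hypothetical CSV with one row per student per test
    # administration: student_id, grad_year, test_year, subject, scaled_score.
    import pandas as pd

    scores = pd.read_csv("mcas_scores.csv")

    # Average scaled score for each graduating class (cohort), by test year
    # and subject: the longitudinal view, following the same kids over time.
    cohort_trends = (
        scores
        .groupby(["grad_year", "test_year", "subject"], as_index=False)
        ["scaled_score"]
        .mean()
        .rename(columns={"scaled_score": "mean_scaled_score"})
    )

    # Pivot so each row is a cohort/subject and each column is a test year,
    # which makes it easy to see whether a cohort is gaining or slipping.
    trend_table = cohort_trends.pivot_table(
        index=["grad_year", "subject"],
        columns="test_year",
        values="mean_scaled_score",
    )
    print(trend_table)

Read across a row and you see one graduating class over time; read down a column and you get the familiar single-year snapshot.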

This, of course, was the idea behind "data-driven instruction." Yet many schools haven't gone beyond looking at yearly snapshots of students and tweaking curriculum for the next year. For example, if this year's eighth graders struggled with ratios, then next year's eighth graders will get extra work on ratios.

Fortunately, in a former life I was a statistical programmer analyzing clinical trial data, most of which is longitudinal in nature, so I'm actually looking forward to this project. I just hope no one shoots the messenger as the analyses come together; administrators might lie, teachers might spin results, but the data do no such thing.
