I promised I’d return to the question of usefulness and usability in Instructional Improvement Systems, as it relates to the Wireless Generation report I discussed in an earlier post. Here are their conclusions, interspersed with my comments:
“First, data must be fresh: between a day and a week old . . . One large district discovered, soon after the launch of its teacher-facing system, that teachers started to call the help desk to complain as soon as the data are even three days stale.
This teacher behavior happily dovetails with research (discussed previously) showing that short-term data analysis correlates most convincingly with improved instruction.
“Second, data must be rich, providing multiple sources so that educators can ‘triangulate’—home in on a particular problem with the confidence that different measures agree. Many standardized assessments (including those sold as ‘formative’) are tuned for the middle of the curve, not for below-proficient students; they may be able to pick out at-risk students but do a poor job diagnosing what is causing at-riskness.
“Third, data must be fine-grained enough to be instructionally actionable . . . for instance, if standards do not differentiate two-digit multiplication items that are cast as computation versus word problems, teachers may not uncover the students who need extra support in approaching word problems.
The two points above outline the classic problem of moving from a data system that provides general descriptive information (about the whole dataset) to one that provides actionable information about particular challenges.
“Fourth, if users are truly to explore data, access tools must be Google-fast and Apple-simple, with response times of, at most, a few seconds.
Who can argue?
“Finally, data needs to be clean and accurate. Happily, the best way to establish accurate education data for a student is to show it to that student’s teacher—or, of course, the student—and provide him or her with a way to address errors, for instance, by calling a help line or clicking a ‘report a problem’ link.”
Understanding variations in user behavior is an often-neglected aspect of information systems design and is, as I see it, a defining pursuit of the field of knowledge management. Concrete, human details such as these increase my confidence in the expertise of the analyst.
So far, so good.
Next time we take on this issue, we’ll look at some of the peculiar dynamics that arise when initiating systems change in a non-competitive environment, of the kind that characterizes good chunks of the human services world.