The question of when to aggregate, and when to distribute, innovative energy in the development of data systems is an omnipresent one. I’ve discussed before how user-friendliness, especially important in the human services field, depends on the robustness of the underlying information system. A recent report from Wireless Generation (an organization covered in an earlier post) (a) discusses the history of information system development in state school systems and (b) offers suggestions for making such systems accessible, engaging, and useful.
First, a quote from the report (it is a bit long; feel free to focus on the parts I italicized, if time is a concern):
The first generation [of school data system development] was accountability and verification systems for [No Child Left Behind-NCLB] data. States created a pipeline of data from local education agencies (LEAs) that includes building-level enrollment (i.e., which students are enrolled at which school), demographics, and test data. . . .
In the second generation, largely funded by [the Institute for Education Sciences-IES] State Longitudinal Data System (SLDS) grants since 2005, states created data warehouses to store multiple years of data at the student level and provided tools to policymakers and researchers to study the data. Most states load their SLDS by capturing data from the first generation NCLB pipeline as it flows by. So the warehouse has the same data sets: building-level student enrollment (with a statewide unique student identifier), demographics, and state test results.
Spurred on by subsequent doses of IES funding and by the exhortations of the America COMPETES Act and the Data Quality Campaign’s “10 Essential Elements,” some states have begun to enhance their SLDS warehouses. A popular addition is post-secondary data, so that k-12 systems can track the success of their graduates in college or the workplace . . .
The other major enhancement that states are making to second-generation systems is driven by the teacher effectiveness Zeitgeist. To be able to calculate value-add for teachers and principals, teacher data must be linked to student data over multiple years. This is not easy. Many districts, let alone states, struggle to accurately match students and teachers. And exceptions abound: team teaching and pull-out interventions for example (who gets the credit for student growth?). . . .
By funding a single, state-wide system, the state can use its aggregated purchasing power to dramatically reduce per-student costs and steer the savings to ensuring excellent applications—such as ‘early warning’ systems that identify students at risk of dropping out early enough to prevent it—and properly investing in professional development (often skimped on by LEAs), for instance, in analyzing data and planning instructional changes based on it.
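The student–teacher linkage problem the report describes can be made concrete with a small sketch. Everything below is hypothetical: the table layout, identifiers, and scores are invented for illustration, and the per-teacher “growth” figure is a deliberately naive stand-in for the far more elaborate statistical models real value-added systems use. The point is only that the per-year student–teacher roster join, keyed on a statewide unique student identifier, is the fragile piece the report warns about.

```python
# Hypothetical sketch of linking student test data to teachers across years.
# Table names, IDs, and scores are invented for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The statewide unique student identifier is what lets multiple years of
# records be joined, as the SLDS warehouses described above are meant to do.
cur.executescript("""
CREATE TABLE scores (student_id TEXT, year INTEGER, score REAL);
CREATE TABLE roster (student_id TEXT, teacher_id TEXT, year INTEGER);
""")

cur.executemany("INSERT INTO scores VALUES (?, ?, ?)", [
    ("S1", 2009, 210.0), ("S1", 2010, 230.0),
    ("S2", 2009, 200.0), ("S2", 2010, 205.0),
])
cur.executemany("INSERT INTO roster VALUES (?, ?, ?)", [
    ("S1", "T1", 2010), ("S2", "T1", 2010),
])

# Naive "growth" per teacher: average of (this year's score minus last
# year's score) over the students linked to that teacher for that year.
cur.execute("""
SELECT r.teacher_id, AVG(s_now.score - s_prev.score) AS mean_growth
FROM roster r
JOIN scores s_now  ON s_now.student_id = r.student_id
                 AND s_now.year = r.year
JOIN scores s_prev ON s_prev.student_id = r.student_id
                  AND s_prev.year = r.year - 1
GROUP BY r.teacher_id
""")
rows = cur.fetchall()
print(rows)  # → [('T1', 12.5)]
```

Note where the exceptions the report mentions would surface: a student in a team-taught class or a pull-out intervention would appear in several roster rows for the same year, and the join would count that student’s growth toward every linked teacher, which is exactly the credit-assignment ambiguity the authors flag.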
It is certainly true that state-wide purchasing power can drive down the costs of the next phase of information systems development in our nation’s schools. Why not, then, extend the concept and initiate such database integration at the national level? Clearly, political custom is a central part of the answer. But so, too, are personnel resource distribution and local implementation costs: the federal government faces some of the same challenges in promoting national data systems as multinational corporations do in developing global operations. The custom of developing curriculum at the state level creates a “localization cost” for any federal initiative in the area that does not exist for the states.
The point here is not, specifically, to advocate for a national curriculum, but to point out that organizational culture, custom, and power distribution have real implications for the economics and operations of data systems.