School Solutions


Program Evaluation Made Easy




 Ben Sayeski

In an environment of increased scrutiny and accountability, questions about the instructional quality of districts, schools and teachers come to the forefront of strategic planning. Each year, school districts spend an enormous amount of time and resources collecting and warehousing data in an attempt to answer questions of quality. These data include student characteristics, state tests, formative assessments, intervention programs and reform initiatives. While the data are necessary, the sheer volume of these warehouses can become overwhelming.

For example, the Los Angeles Unified School District, where we are working, has about 664,000 students for whom detailed demographic and academic data are collected. Over a six-year period, LAUSD tracked the results and conditions of close to 8 million test administrations in addition to attendance, discipline and Title I statistics. LAUSD is left with one key question: How do we make sense of all these data?

Even for districts of average size, the question remains daunting, with complex interactions among student characteristics, program data and test outcomes to consider. Even the most sophisticated data analysis systems and software cannot replace professional judgment and experience. The sheer magnitude of the data collected each year, however, demands that professional judgment and data analysis be used in tandem as complementary skills.

A Matrix for Meaning
My firm, Education Strategy Consulting, believes school districts should not have to choose among statistical rigor, clear communication of results, and the professional judgment necessary to act on data.

Our tool, the ESC matrix, integrates absolute and value-added measures into an interactive, visual framework that allows a district to put all the pieces together and make sense of its data. The visualization helps to answer questions such as the following: Which programs are producing the fastest academic growth among students? What is the return on investment when we overlay the cost of intervention programs? How can we set up a constructive conversation about which programs we should sunset?

With the increased availability of disaggregated measurements, decision makers are asking tougher, more strategic questions about data than ever before. Yet most administrators don’t have the tools available to answer these questions effectively. The matrix provides visual and numeric results related to any number of strategic questions. The basic output uses the following questions to frame a conversation about quality:

Which schools are outperformers, scoring high on both absolute and value-added measures?

Which schools present opportunities, scoring high on value-added measures but not yet reaching absolute standards?

Which schools raise red flags, scoring well on absolute measures but not demonstrating value added?

Which schools are underperformers, scoring poorly on both absolute and value-added measures?
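The four questions above amount to a simple two-by-two classification of schools by their absolute and value-added scores. As a rough sketch of that logic (the thresholds, score scales and function names here are illustrative assumptions, not ESC's actual methodology):

```python
# Hypothetical sketch: sort schools into the four quadrants described above.
# Scores are assumed to be centered so that 0 marks the district standard;
# real analyses would use properly scaled absolute and value-added measures.

def classify_school(absolute: float, value_added: float,
                    abs_threshold: float = 0.0,
                    va_threshold: float = 0.0) -> str:
    """Return the quadrant label for one school."""
    high_abs = absolute >= abs_threshold
    high_va = value_added >= va_threshold
    if high_abs and high_va:
        return "outperformer"    # high on both measures
    if high_va:
        return "opportunity"     # growing fast, not yet at standard
    if high_abs:
        return "red flag"        # at standard, but little value added
    return "underperformer"      # low on both measures

# Example with made-up scores for four fictional schools:
schools = {
    "Adams": (0.8, 0.5),
    "Baker": (-0.3, 0.6),
    "Carver": (0.4, -0.2),
    "Dewey": (-0.5, -0.4),
}
for name, (abs_score, va_score) in schools.items():
    print(name, "->", classify_school(abs_score, va_score))
# Adams -> outperformer, Baker -> opportunity,
# Carver -> red flag, Dewey -> underperformer
```

Plotting the two scores on the axes of a scatter chart, with the thresholds as gridlines, produces the familiar four-quadrant view that frames the conversation about quality.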

Easy Visualization
Our firm provides high-quality, value-added analysis for use by educators to improve the achievement of all students. Our visualization software, the ESC matrix, provides an interactive format that allows a range of users to search, explore and find answers to their most pressing issues.

As a member of the AASA School Solutions Center, we can take the data you have and make program evaluation what it should be — statistically rigorous, easily communicated, professionally informed … and easy.

Ben Sayeski is managing partner of Education Strategy Consulting in Charlottesville, Va. E-mail: b.sayeski@escmatrix.com.

