Data in Your Hands

A district uncovers hidden patterns in learning and site operations using the Quality School Portfolio

By Raymond Yeagley

Several sessions on decision support systems and how they benefit school districts captured my attention at last year's management information systems conference, sponsored by the National Center for Education Statistics and the Arizona Department of Education. The highlight was a demonstration of a locally developed data warehouse used by the 60,000-student Tucson, Ariz., Unified Schools.

I was impressed with the work this district had done, but it seemed to me that a decision support system would be impractical for my much smaller system in Rochester, N.H., even with the high-end technology available in our schools. Implementation would require additional technical staff beyond the financial capacity of our 4,500-student, low-wealth district.

I was wrong. Within a few weeks, our district was invited to participate in an AASA pilot project using the Quality School Portfolio, developed by the National Center for Research on Evaluation, Standards and Student Testing (CRESST) at UCLA under a U.S. Department of Education grant.

QSP is one of several tools available for integrating and analyzing school data (see additional resources). It is an easy-to-use database that will load records from any source that exports to a text file. Analysis tools in QSP permit multiple levels of disaggregation, provide automatic tracking of goal progress based on the indicators matched to those goals, support cross-sectional and longitudinal analysis and include reporting functions with graphs, tables and "dashboard" displays that make the information clear and easy to understand.
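To see why "exports to a text file" is such a low barrier, consider how little code it takes to read a delimited export. The sketch below is illustrative only; the file layout, column names, and records are hypothetical, not QSP's actual schema, and any student information system that exports comma- or tab-delimited text could be read the same way.

```python
import csv
import io

# Hypothetical comma-delimited export from a student information system.
# In practice this would come from a file the package writes to disk.
export = io.StringIO(
    "student_id,grade,reading_score,feeder_school\n"
    "1001,4,212,Maple\n"
    "1002,4,198,Maple\n"
    "1003,4,225,Chestnut\n"
)

# csv.DictReader turns each line into a dictionary keyed by the header row,
# so records from any source that exports to text load uniformly.
students = list(csv.DictReader(export))

print(len(students))                    # number of records loaded
print(students[0]["feeder_school"])     # fields accessible by name
```

Once records are in this uniform shape, they can be matched against test files, attendance files, or survey files on a shared key such as the student ID.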

Getting Started

Collecting and organizing data are neither the most important aspects of data use nor the most difficult tasks. Much of the information needed by schools already is available in electronic format.

Administrative software packages used by most schools store student profiles, grades, attendance and discipline records. Testing companies can provide electronic versions of their scoring reports that include student-specific information. Additionally, scanners permit districts to automate tabulation of surveys and other local data collections. Availability and compatibility of data are no longer a barrier.

A greater challenge than collecting data is creating a process to transform the data into easily accessible, useful information that staff members will employ for school improvement. Building on a goal-setting and accountability process already in place in our district, we identified four principles to guide our efforts in Rochester:

* Instructional change is the first priority. Data will be used to identify district, school and classroom strengths and weaknesses, then find ways to reinforce the strengths and address the weaknesses to improve student learning.

* Staff training is essential for effective data use. Staff members must understand not only how to interpret the information accurately, but also how to identify, adapt and apply more effective instructional strategies based on their analysis.

* Communicating results is a vital component. Communicating a complete, accurate and understandable picture of our district's performance to all of our constituents will improve community support and encourage school effectiveness.

* Inviting feedback closes the loop. Obtaining feedback from constituents is as important in assessing district and school progress as measuring student achievement. In addition to the traditional performance indicators, the district can benefit from obtaining and analyzing data on community satisfaction and all other aspects of operation.

Asking Good Questions

One of the most difficult challenges for a district is to conduct a critical self-assessment. Critics and supporters alike often find it easier to start with their conclusions, usually characterized as "logical" or "common sense," then search for evidence to support them. Likewise, observers frequently limit their examination to data that already are available.

Both of these approaches reduce the value and utility of the inquiry. A more productive approach is to start with probing questions that will get at the heart of the district's performance, then find the data and new ways to assess the data that will answer those questions.

Our school district has suffered from these limitations for years. We have looked at item analyses on the state and national assessments to see what questions were missed most frequently by our students. We have studied individual student results to identify missing skills and have provided extra help in those areas, as time and resources permitted. We also have looked at the districtwide averages to see whether the trends were headed in the right direction. While this approach has helped some students, it has been of little value for systemic instructional improvement.

A database like the Quality School Portfolio encourages the broader inquiry by integrating information from a variety of sources to identify previously hidden patterns in student learning and other aspects of school operations. This allows educators to address needs systematically, instead of on a student-by-student basis.

For example, the district may look at common characteristics of students having difficulty with a specific skill or set of skills. Did they come from the same feeder school? Are there common demographic characteristics such as limited English proficiency? Was there a particular reading series or instructional approach used in their early grades? Are the problems gender specific? Is their annual progress in line with students who may have started at a higher achievement level? The integrated data collection can simultaneously focus on individual indicators and characteristics, while expanding the view to include a multitude of contributing factors.
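Questions like these are, at bottom, group-by queries: partition the students on some characteristic, then compare an indicator across the groups. A minimal sketch, using hypothetical records and field names rather than QSP's actual data model, shows the mechanics of disaggregating a mean score by feeder school:

```python
from collections import defaultdict

# Hypothetical student records; the field names are illustrative only.
records = [
    {"feeder": "Maple",    "lep": False, "score": 212},
    {"feeder": "Maple",    "lep": True,  "score": 190},
    {"feeder": "Chestnut", "lep": False, "score": 230},
    {"feeder": "Chestnut", "lep": False, "score": 222},
]

# Group scores by feeder school, then compute the mean for each group.
groups = defaultdict(list)
for r in records:
    groups[r["feeder"]].append(r["score"])

means = {school: sum(s) / len(s) for school, s in groups.items()}
print(means)
```

Swapping the grouping key (for example, `lep` instead of `feeder`) asks a different question of the same data, which is exactly the flexibility an integrated database provides.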

A second level of questioning cannot be answered from the database. Once the patterns of need have been identified, it is crucial to ask what can be done to improve the situation. What will have to change in the classroom? How can the teacher find programs, learn skills and access resources that will make a difference? How can the school ensure that those changes will be made? What support can be provided to address non-instructional needs? Analyzing data will be of little value if it doesn't lead to instructional change.

Staff Training

It isn't necessary for your whole staff to become psychometricians or statisticians for effective data analysis. However, an understanding of some basic statistical principles is essential to avoid the most common traps and misuse of the information.

You don't want your staff, for instance, to average percentile ranks, to infer causation from correlation, or to declare that the first good round of test scores is proof positive that a particular technique will be successful for all students. Conversely, your efforts to use data will fall flat if staff members look at the information and simply wonder what it means or what to do with it.
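The percentile-rank trap is easy to demonstrate. Percentile ranks are an ordinal scale, so averaging them directly distorts the result; the standard remedy is to convert them to an equal-interval scale (z-scores or normal curve equivalents), average there, and convert back. A sketch using Python's standard library, with two hypothetical students at the 50th and 99th percentiles:

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal, the basis of norm-referenced scales

ranks = [50, 99]   # two students' percentile ranks (hypothetical)

# Wrong: averaging percentile ranks directly.
naive = sum(ranks) / len(ranks)          # 74.5

# Better: convert to z-scores (equal-interval), average, convert back.
zs = [nd.inv_cdf(p / 100) for p in ranks]
mean_z = sum(zs) / len(zs)
corrected = nd.cdf(mean_z) * 100         # roughly 87.8

print(naive, round(corrected, 1))
```

The two answers differ by more than a dozen percentile points, which is why a little statistical grounding belongs in every staff training plan.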

Toward this end, Rochester has developed and continues to refine a matrix of data-related competencies for staff members. The general areas in the matrix are: understanding and using data and data sources; matching indicators to classroom instruction and school goals; improving test design and interpretation skills; understanding basic statistics; relating data to instructional practices; and using data to communicate student and school performance to constituents.

Within each of these areas are several subcategories, with three proficiency levels for each. We begin by having staff conduct a self-assessment, then design professional development offerings around identified needs. We then measure staff progress in these competencies, just as we measure student academic performance and include this as part of our district's assessment.

Constituent Communications

One of the greatest challenges faced by a data-using district is communicating the information to the public. Most parents and community members, and even some school system employees, have little understanding of data and statistics. It is essential that the conclusions from data analysis be clear, concise, accurate and truthful.

QSP uses tables as well as an array of graphical reports, including dashboard displays that can be shown on a district's Web site. As easy as these may be to read and understand, they are still subject to misinterpretation. Therefore, it is advisable that an explanation be available for every conclusion.

We have seen that the potential for misuse of data is staggering, both from intentional manipulation and through lack of understanding. The old adage, "figures don't lie, but liars figure," is particularly important to remember in communicating information on school performance, as any slight exaggeration, speculation or error is likely to be touted by opponents as a self-serving lie.

However, not all of the data need to be integrated into a single source to be useful. Rochester is using a number of databases to monitor different aspects of our operations and is looking beyond student-related data to assess school district performance.

For example, we periodically will use stand-alone surveys of students, parents, businesses and other constituents to measure community satisfaction and determine whether the curriculum matches the needs of our graduates. These may be analyzed in QSP, although separately from the main student files, or in other databases and spreadsheets.

Similarly, we will collect data on the effectiveness of our professional development programs, technology use, recruiting and staffing efforts and a number of other topics. In short, we will gather data about every aspect of our schools' operations and performance.

Raymond Yeagley is superintendent of the Rochester School Department, 150 Wakefield St., Suite 8, Rochester, N.H. 03867. E-mail: yeagley@rochesterschools.com