Creating School Accountability Reports

A researcher offers guidance on delivering what parents most want to know about their schools by RICHARD S. BROWN

School accountability report cards can serve a variety of purposes. They can inform the public. They allow district leadership to monitor progress toward goals and identify poorly performing schools for targeted intervention. They also give individual schools a chance to highlight accomplishments that might otherwise go unnoticed.

According to Education Week's 1999 report "Quality Counts," 36 states today produce school-level report cards. Many school districts in the other 14 states have created their own accountability reports, though these do not allow for statewide comparisons.

Twenty-six of the states that require school-level report cards make them available on the Web for easy access, but only 13 states require the report cards to be sent home to parents. Even so, relatively few people actually see these report cards, according to the Education Week report, which surveyed educators, taxpayers and parents.

Only 51 percent of educators had even seen a school report card for their area. About 40 percent of parents indicated they had seen a school-level report card, and for taxpayers the figure was only about 25 percent. Thus, a majority of people are not seeing the report cards even in areas where they are being produced. Clearly, the message is not getting out to a broad audience.

What's Presented?
Primarily, the school accountability report cards focus on student achievement, usually in the form of standardized test scores. All 36 states that require school report cards publish test scores as a part of their presentation. The second most common element is dropout rates, followed by graduation rates, post-graduation plans, advanced placement coursework and other course-taking activity.

But is this what people most want to see? Not really, according to a study of school accountability report cards by Belden Russonello & Stewart and Research/Strategy/Management, two public opinion research firms.

Their survey of parents, taxpayers and educators (counselors, principals and teachers) indicated that the single most important element they wanted to see addressed on a school-level report card is school safety. Right behind was teacher qualifications. These interests persisted across all three groups of respondents.

Those issues were followed, in order, by average class size, graduation rates and dropout rates. Only then did the survey respondents express interest in student performance data. The conclusion to draw from such findings is that test score information is important and the public expects it to be there, but it is not the most important need in the eyes of the user.

Another interesting finding was that student demographic data, such as ethnic percentages and numbers of students with limited English proficiency, was considered by all groups to be the least important of 21 pieces of information. This suggests that student demographics should be provided as context, not as an explanation, and shouldn't be played up in the report card presentation.

Comparison Data
School report cards inevitably lead to comparison. Parents and educators want to see how a particular school rates in relation to national averages, state averages, district averages or similar schools. The district's leadership also may be interested in providing a comparison of performance against individual school goals.

In addition, the school's performance over previous years is desirable and relevant. Questions such as "How did the school do on this measure last year?" are common. When presenting multiple years of data, it may be possible to indicate a trend on particular measures.

The data should be presented with context in which to interpret it.

The issue to wrestle with here is deciding which comparisons are relevant and appropriate. At present, most school accountability report cards publish information from previous years and allow comparisons to be made between the school's performance and the state averages. District and national averages are less common, primarily because no national tests are given to all students in all states.

Summary Scores
As you present these assorted pieces of information, you must decide at the district level whether to combine all of the data into a summary score or to issue a grade for each school.

How to combine disparate data, particularly for schools that have different types of data elements and different scores from different tests, is a tricky technical issue; it should be attempted only when the summary measure can be established with sound measurement qualities. Equally important is the question of how to weight the different pieces of information to arrive at a single summary score.
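To make the weighting question concrete, here is a minimal sketch, not any state's actual formula, of one common approach: standardize each indicator across schools so that scores from different tests share a scale, then combine them with explicit weights that sum to one. The indicator names and weights below are illustrative assumptions.

```python
# Hypothetical example: combining disparate school indicators into one
# summary score via z-score standardization and explicit weights.
from statistics import mean, stdev

def standardize(values):
    """Convert raw scores across schools to z-scores (mean 0, sd 1)."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def summary_scores(indicators, weights):
    """indicators: {name: [score per school]}; weights: {name: weight}."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    z = {name: standardize(vals) for name, vals in indicators.items()}
    n = len(next(iter(indicators.values())))
    return [sum(weights[name] * z[name][i] for name in indicators)
            for i in range(n)]

# Three schools, two indicators on entirely different scales.
indicators = {
    "reading_pct": [45.0, 60.0, 75.0],   # percentile ranks
    "grad_rate":   [0.80, 0.90, 0.85],   # proportions
}
weights = {"reading_pct": 0.6, "grad_rate": 0.4}
print(summary_scores(indicators, weights))  # approximately [-1.0, 0.4, 0.6]
```

Note that the choice of weights is a policy decision, not a statistical one; the arithmetic merely makes that decision explicit and reproducible.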

For the 1997-98 school year, a dozen states ranked all schools, usually into categorical assignments. That number will increase: education officials in California are currently devising a plan to create an Academic Performance Index score for each of the state's more than 8,000 schools.

Desirable Indicators
When developing an accountability report card, focus on those measures that are under the school's control. No single indicator, including SAT9 test scores or ITBS test scores, is sufficient to measure all things or serve all purposes. Provide demographic data as a context only, not to be used as an excuse for poor performance on those measures that are under the school's control.

At the Center for Research on Evaluation, Standards and Student Testing, we've looked at hundreds of different report card representations and have seen a variety of features. Some are as limited and primitive as a typewritten page or two of data with some explanatory language thrown in. Others, showing evidence of professional typesetting and design, offer 16 to 18 pages of data for a given school.

However, most parents and members of the public won't read a school report of a dozen or more pages. People tend to want something brief but informative, with access to additional information as needed.

Typically, school report card data is presented in table format with some accompanying explanatory text and perhaps some comparison information, such as district averages or the previous year's performance. The few graphical representations consist of bar charts or trend lines.

When images are used, they typically relate to a single indicator, failing to present graphically the multivariate nature of the school. That is, there is generally an image for one indicator, another image for other indicators, and so forth. The collection of information for a school is not presented in a holistic way.

In our examination of accountability report cards, we found that presenting the information in a more holistic way (see related article) could convey a great deal of information about a school in less space without losing clarity.

An appealing aspect of Connecticut's Strategic School Profile is that it allows school officials to write a fairly lengthy narrative for the school. Unfortunately, of the nearly 80 profiles we sampled covering the 1996-97 school year, only four contained any narrative; the rest were empty. The opportunity was there to share information about the school that readers otherwise wouldn't know, but school leaders did not take advantage of it.

Key Considerations
When you decide to craft an accountability report card for your schools or district, there are several important considerations:


  • Define the audience.


    For whom are the report cards intended? Educators and the public do not always agree on what information is important or how it should be represented. What is the purpose of the report card? This helps to determine what you should include and how you are going to represent it. Although a school report may serve multiple purposes, no single report card can serve all purposes or constituents. It is best to avoid trying to do so.


  • Be clear about the purpose.


    Is the report card intended for accountability or accounting? There is a difference. If its purpose is accountability, it should be more action focused. You want to say, "Here is where this school is now and this is the action we plan to take to get to where we want to be." This differs from accounting, where you're just stating the numbers.


  • Carefully select indicators.


    Determining whom the report is intended to reach helps you decide which indicators to select. If the audience is parents and taxpayers, keep in mind they tend to be less interested in matters like professional development at the school. If it is designed to communicate with educators, information on how the school is meeting its goals for professional development opportunities would be relevant.


  • Consider the format and presentation.


    How easy is the report to interpret? Is the information easily digested yet accurately presented? The format will be governed by the method of delivery and how you choose to make the reports available once they are completed. As we've seen in our study, although several states are creating reports, people aren't looking at them, partly because the reports are not made easily accessible to parents and the general public.


  • Consider the availability and quality of existing data.


    In our efforts to create school reports for Los Angeles-area schools, we pulled together data from seven different sources. We found the data is not always maintained at the same level of analysis in each of the data sources. In addition, some data are not updated regularly or maintained in a way that ensures currency.

    Some measures that you might like to see across the district or state may not be available, so you may want to consider setting up your data structures first and collecting your data over a period of time before starting the reporting process. You cannot have a good accountability system or school report card without good data.


  • Consider the credibility of the messenger.


    Interestingly, the Education Week report asked educators, taxpayers and parents who was the most credible source of information for school report cards. Non-profit organizations were considered the most credible by parents and taxpayers, and the least credible source for all three groups was the local media. But educators and non-educators didn't agree on which groups were most credible: educators rated districts and principals fairly high, while parents and taxpayers rated them somewhat lower.
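The earlier point about data sources maintained at different levels of analysis deserves one concrete illustration. Merging such sources generally means rolling everything up to a common unit, the school, before reporting. Below is a minimal sketch with hypothetical field names and made-up data, assuming one student-level source and one school-level source sharing a school identifier.

```python
# Hypothetical example: aligning two data sources to a common
# school-level unit of analysis before building a report card.
from collections import defaultdict

students = [  # source 1: student-level test records
    {"school": "Adams", "score": 62},
    {"school": "Adams", "score": 70},
    {"school": "Baker", "score": 55},
]
staffing = [  # source 2: already at the school level
    {"school": "Adams", "credentialed_pct": 92},
    {"school": "Baker", "credentialed_pct": 88},
]

# Aggregate the student records up to the school level.
totals = defaultdict(lambda: {"sum": 0, "n": 0})
for rec in students:
    totals[rec["school"]]["sum"] += rec["score"]
    totals[rec["school"]]["n"] += 1

# Merge on the shared key so every indicator uses one unit of analysis.
report = {}
for row in staffing:
    s = row["school"]
    report[s] = {
        "mean_score": totals[s]["sum"] / totals[s]["n"],
        "credentialed_pct": row["credentialed_pct"],
    }
print(report)
```

In practice the aggregation step is where data-quality problems surface: missing school identifiers, stale records, or sources that cannot be rolled up to the same unit at all, which is why building the data structures before the reporting process matters.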

Richard Brown is a project director at the Center for Research on Evaluation, Standards and Student Testing at UCLA, Box 951522, Los Angeles, CA 90095. E-mail: rbrown@ucla.edu