Features

Data Analysis in Administrators' Hands: An Oxymoron?

Statistical strategies, not gut feelings, form the hallmark of good instructional decisions

By Theodore B. Creighton

For too long, many school leaders have made decisions about instructional leadership on intuition and by "shooting from the hip." All too often, the decision-making process fails to include data collection and data analysis.

During my many years as a principal and district superintendent, I often questioned the enormous amount of time we spent collecting numbers. Each morning, 30 to 40 minutes were spent gathering and reporting attendance. The annual state-mandated testing procedure began in early October and seemed to continue in one form or another for the entire year. We collected our test scores, filed them in the office and seldom used them again.

School districts everywhere collect and maintain many forms of student data. Standardized test scores, average daily attendance figures and transcript data are required by states for funding purposes. However, most schools collect these data to satisfy administrative requirements rather than to assess and evaluate school or student improvement. Standardized test scores generally are reviewed only briefly before the local newspaper calls. Average daily attendance is reported to state education agencies, then filed away.

Educators rarely examine these data in a systematic way to assess the quality of teaching and learning at their school. Of course, this presupposes that superintendents, central-office administrators and principals have an understanding of data analysis and ways to use this analysis to improve teaching and learning.

Misunderstood Notions

Few things are more feared than the thought of statistical analysis. To most educators, statistics means endless calculations and memorization of formulas. Statistics is seen by most as a formal domain of advanced mathematics, represented by a course or two taught by graduate school professors trying to make a student's life as painful as possible. Courses in statistical methods are usually taught with formal proofs of mathematical theorems and the derivation of statistical formulas as a main focus.

The educator's fear of statistics likely stems from a variety of factors, but principal and teacher preparation programs must accept the fact that the presentation of statistics in education probably lacks four important components.

First, instruction on statistics does not emphasize the relevance of data to the day-to-day lives of principals and teachers. Second, it does not fully integrate current technology into the teaching and learning of statistics. Third, few (if any) statistics courses are designed for students enrolled in educational leadership or teacher education programs.

Finally, many statistics courses taught in colleges of education focus a major part of time on inferential statistics as a tool in conducting research projects and dissertations. Far less time is spent on statistical strategies that might help the principal or superintendent improve his or her skills in problem analysis, program and student evaluation, data-based decision making and report preparation.

Trouble Spots

* Lack of Relevance.

Traditional courses in statistics elicit the frequent student query: "When will I ever use this stuff?" And rightfully so, as our research indicates most classes at the college level are taught as hard-core mathematics courses devoid of powerful and practical applications relevant to school administration and student learning.

We seem to realize the importance of relevance in courses on the principalship and instructional supervision, but we have been slow to add the same practical connections to our statistics and research courses. Unless this change is made, we will continue to face high levels of anxiety in our principal preparation programs.

* Integration of Recent Technology.

The advance of technology and the large selection of user-friendly computer software can assist us as we move toward a more practical and relevant presentation of statistics for educators. Several good statistical packages exist. These include GB STAT and the Statistical Package for the Social Sciences, better known as SPSS. Better yet, we can use Microsoft Excel to perform our data analysis. All are easy-to-use, menu-driven statistical programs applicable for analyzing student standardized test scores, attendance and dropout data, college entrance requirements, etc.

These common software programs can tabulate the number of males and females in a school, calculate average grades of students, compare test scores by gender, determine if there is a statistically significant difference between achievement of athletes and non-athletes, compare computer-assisted instruction with other methods of delivery, and test the effectiveness of whole language versus phonics instruction.
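For administrators (or their data clerks) who prefer a scriptable tool, a minimal sketch of a few of these tasks might look like the following, using Python with the pandas and SciPy libraries in place of a menu-driven package. The file name and column names are hypothetical stand-ins for a school's own records.

```python
# A rough sketch, not a finished tool: the file and column names below
# (students.csv, gender, gpa, test_score, athlete) are hypothetical.
import pandas as pd
from scipy import stats

students = pd.read_csv("students.csv")

# Tabulate the number of males and females in the school.
print(students["gender"].value_counts())

# Calculate average grades, overall and broken out by gender.
print(students["gpa"].mean())
print(students.groupby("gender")["gpa"].mean())

# Is there a statistically significant difference between the achievement of
# athletes and non-athletes? A two-sample t-test gives a quick answer.
athletes = students.loc[students["athlete"] == 1, "test_score"]
non_athletes = students.loc[students["athlete"] == 0, "test_score"]
t_stat, p_value = stats.ttest_ind(athletes, non_athletes, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```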

* Statistical Analysis Designed for Educators.

Again, many courses concentrate on psychology, sociology and other social sciences with little mention of ways that statistical analysis can assist school leaders in their day-to-day decisions. We need to work with data collected from real classrooms, focusing on student instruction and assessment, attendance and dropout rates, college entrance tests and instructional program evaluations.

* Descriptive and Inferential Statistics.

While inferential statistics are more likely to be used in research studies and dissertations, descriptive statistics are more likely to be used in the schools. Descriptive statistics help us summarize, organize and simplify data (percentile ranks, means, medians, modes, ranges, standard deviations), while inferential statistics use sample data to estimate parameters and test hypotheses.

In most cases, educators encounter data in the schools that are related to populations rather than samples. In other words, data are collected from entire classes or grade levels, entire building populations and entire district populations. Administrators are not interested in generalizing their school data findings to other schools or in estimating parameters and testing hypotheses.
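To make the distinction concrete, here is a small sketch of the descriptive statistics named above, computed for a made-up set of class scores with Python's standard library. Because the class is treated as an entire population rather than a sample, the population standard deviation is the appropriate choice.

```python
# Made-up scores for a single class, treated as a whole population.
import statistics

scores = [62, 71, 71, 78, 84, 88, 90, 93, 95, 98]

print("mean:", statistics.mean(scores))
print("median:", statistics.median(scores))
print("mode:", statistics.mode(scores))
print("range:", max(scores) - min(scores))
# Population standard deviation, since every student in the class is included.
print("standard deviation:", statistics.pstdev(scores))
```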

Defining Statistics

Statistics is not advanced mathematics. The majority of statistical analyses useful to administrators can be completed with a basic understanding of mathematics and are more conceptual than computational. Statistics is a set of tools designed to help describe the sample or population from which the data were gathered and to explain the possible relationships between variables.

A superintendent might wonder if the mathematics instruction in his district's schools is being delivered in a manner that treats boys and girls equally. In other words, is math being presented in an equitable manner in his schools? A simple statistical procedure called the Pearson correlation can help identify a relationship between math scores and gender (a few strokes of the computer keys with the help of Excel, GB STAT or SPSS). If the results of the analysis point to a pattern of boys receiving higher scores in mathematics on standardized tests, the principal may want to look more closely at classroom instruction to determine if instructional strategies can be altered to address the equity issue.
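A sketch of that correlation in Python, under the assumption of a hypothetical data file with a numeric math score and a gender code, might look like this. With only two gender categories, the Pearson correlation on the coded data is the same as the point-biserial correlation.

```python
# A sketch only: math_scores.csv, math_score and gender_code are hypothetical.
# Gender is coded numerically (for example, 0 = girl, 1 = boy).
import pandas as pd
from scipy import stats

scores = pd.read_csv("math_scores.csv")
r, p_value = stats.pearsonr(scores["gender_code"], scores["math_score"])
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
# A positive r would mean the group coded 1 tends to score higher; the size of
# r indicates how strong that pattern is across the district.
```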

A director of secondary education might be interested to know if a relationship exists between students' performance on the district's writing assessment and their socioeconomic level. In other words, do students who come from lower socioeconomic backgrounds really perform at lower levels as we are led to believe? Or are other variables responsible for the variance in writing performance? Again, a simple correlation analysis will help describe the students' performance and help explain the relationship between the issues of performance and socioeconomic level.

Data analysis does not have to involve complex statistics. Data analysis in schools involves the collection of data and the use of available data to improve teaching and learning. Interestingly enough, administrators have it pretty easy. In most cases, the collection of data has already been done. Schools regularly collect attendance data, transcript records, discipline referrals, quarterly or semester grades, norm- and criterion-referenced test scores and other useful data. Rather than complex statistical formulas and tests, it is generally simple counts, averages, percentages and rates that we are interested in.

Worthless Instruction

There are several reasons why data are little used in our schools and why it is so difficult to engage school administrators in data analysis. Most of us have graduate degrees that required at least one course (if not more) in tests and measurements or statistics.

Can you recall any in-depth discussion in those classes about what to do with assessment information in planning how to help students do better? These classes generally are taught by researchers and focus on hard-to-understand formulas and too few examples related to the daily lives of school administrators.

Gerald Bracey, internationally recognized as an expert in the understanding of educational statistics, states that "many of the university professors who create and use statistics are more comfortable using them than they are teaching other human beings what they mean."

It is not with pride that I suggest the real solution to the problem must originate in our principal and superintendent preparation programs. Though there are a few bright spots at some universities, for the most part there is no attempt to increase aspiring administrators' understanding of data analysis or the use of that analysis to improve teaching and learning.

Let me also suggest the situation is not likely to change until we hear loudly from practicing school administrators. As you send your aspiring principals and superintendents to university preparation programs, insist that the preparation include more relevant and practical instruction in data analysis and the use of data to improve decision making in the workplace.

Analysis by Superintendents

As administrators, we are all familiar with collecting average daily attendance figures. These numbers feed the formula used to determine our funding from the state and federal governments. In most cases, once we report the attendance to our county office or state department, we put the data away in a file someplace. Rarely do we use these data to make decisions about curriculum and instruction.

Plenty of opportunities exist for school system leaders to make more informed decisions based on data. Consider the average daily attendance rate over recent years at Westside High School: 94 percent in 1996, 92 percent in 1997 and 94 percent in 1998.

At first glance, things look impressive. On average, over a three-year period, approximately 93 percent of the students are in school every day. We reason that 93 percent is pretty good and deserving of a grade of "A." So we report the figures to the appropriate agencies and move on to other matters.

But take a closer look. If 93 percent of our students are in attendance on a typical day, we must conclude that 7 percent are absent. So in fact, on average, our high school students miss more than two weeks of school per year. We calculate this by taking 7 percent of the 180 school days (180 x .07 = 12.6). Wow! Now that's a different story. Do we not agree that nearly 13 missed school days (on average) have ramifications for instruction and learning? Are there ways of adjusting curriculum, scheduling and delivery of instruction that might help us reduce the number of absences at Westside High School?

Let's disaggregate, or break down, our data a bit further. By looking at daily attendance figures for each day of the week, we discover that our average daily attendance fluctuates from 95 percent on Mondays to 97 percent on Wednesdays to 89 percent on Fridays. Now we see a different picture.

To no great surprise, we discover an up-and-down attendance pattern during the week. We notice that the highest attendance rate is on Wednesday, the day we hold the football rally. The lowest attendance rate shows up on Friday, the day most of our testing and assessment takes place.

Such findings suggest the administration consider moving the football rally to Friday and encourage teachers to do more of their testing on other days.
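For those who want to reproduce this kind of analysis outside a spreadsheet, a minimal sketch follows: the missed-days arithmetic plus a day-of-week breakdown, assuming a hypothetical export of daily attendance rates.

```python
# Hypothetical file: one row per school day with a date and an attendance rate.
import pandas as pd

# The missed-days arithmetic from above.
ada_rate = 0.93          # three-year average daily attendance
school_days = 180
print("days missed per student:", school_days * (1 - ada_rate))  # about 12.6

# Disaggregate average daily attendance by day of the week.
attendance = pd.read_csv("daily_attendance.csv", parse_dates=["date"])
attendance["weekday"] = attendance["date"].dt.day_name()
print(attendance.groupby("weekday")["attendance_rate"].mean().sort_values())
```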

This example illustrates how easy it is to use existing data to help us with day-to-day school operations. Hopefully, you sense the importance of data analysis and how we link data analysis to what we spend our lives with—curriculum, instruction, assessment and student achievement.

Central Office Use

Karla, director of elementary education in a small rural school district in southeast Idaho, is interested in finding out if a mathematics textbook series adopted by the district five years ago is effective for all ability levels of students. She suspects that the district math program overemphasizes computation and repetition but lacks the components of in-depth investigations and problem-solving experiences.

To test her hypothesis, she collected four years of her students' Iowa Test of Basic Skills (ITBS) percentile rank scores. She followed these procedures:

* Step 1: Ranking the students into categories of high, medium and low.

Karla based her grouping on the percentile rank scores from the first-year baseline data. She categorized students scoring at or above the 60th percentile as the high group, students scoring from the 40th to below the 60th percentile as the medium group, and students scoring below the 40th percentile as the low group.

* Step 2: Creating a data file.

Using Microsoft Excel, Karla entered the students' ITBS percentile ranks into a spreadsheet.

* Step 3: Analyzing the data.

With a few simple mouse clicks on her computer, she discovered that a majority (85 percent) of her students who were rated "below average" showed no change or an increase in their scores over the four-year period. Meanwhile, an unusually high number (75 percent) of her students who were rated "above average" showed no change or a slight decline in their scores over the same period.
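A rough sketch of Karla's three steps, written in Python with pandas rather than in the Excel spreadsheet she actually used, might look like this; the file and column names are hypothetical.

```python
# Hypothetical file: one row per student, with ITBS percentile ranks for four years.
import pandas as pd

itbs = pd.read_csv("itbs_percentiles.csv")  # columns: year1, year2, year3, year4

# Step 1: group students by their first-year (baseline) percentile rank.
def group(rank):
    if rank >= 60:
        return "high"
    if rank >= 40:
        return "medium"
    return "low"

itbs["group"] = itbs["year1"].apply(group)

# Steps 2 and 3: with the data file in hand, look at the change in each
# student's rank from the baseline year to the fourth year.
itbs["change"] = itbs["year4"] - itbs["year1"]

# Share of each group whose scores held steady or rose, and held steady or fell.
print(itbs.groupby("group")["change"].apply(lambda c: (c >= 0).mean()))
print(itbs.groupby("group")["change"].apply(lambda c: (c <= 0).mean()))
```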

After running a few more statistical tests with her student data, Karla presented her findings to the superintendent. She was invited to share her data analysis with the board of education. Realizing that her analysis did not necessarily prove anything, she felt the pattern at least identified a need for re-evaluating the math program and especially teaching methods in the classroom.

Karla's conclusion to the superintendent and board: The district math program seemed to be challenging the lower-ability students by reinforcing basic skills, but the higher-ability students, who had already mastered the fundamentals, needed an additional instructional forum in which to apply and experiment with the numbers and mathematical concepts they already knew.

In the end, Karla and her colleagues implemented a math enrichment program that encouraged the higher-level students to think mathematically, to apply this thinking to complex and multidimensional math problems and to communicate this thinking clearly.

Beyond Intuition

Collecting data without purpose is meaningless. All too often, school leaders fail to base decisions on data. The effective use of data must play a major role in the development of school improvement plans. It helps us identify students who are improving and those who are not, and it helps to identify the reasons.

Meaningful information can be gained only from a proper analysis of data rather than from intuition and gut feelings. The administrator can serve as instructional leader, for data-driven decision making and instructional leadership go hand in hand.

Theodore Creighton, a former superintendent, is associate professor of educational leadership at Sam Houston State University, Campus Box 2119, Huntsville, Texas 77341. E-mail: creitheo@shsu.edu. He will become executive director of the National Council of Professors of Educational Administration in June. He is the author of Schools and Data: The Educator's Guide for Using Data to Improve Decision Making.