Taking Measure





By JANE HANNAWAY & SHANNON MCKAY


Fall 2001 / Vol. 1, No. 3


A recent Council of the Great City Schools report hailed Houston for “beating the odds” by generating sizable gains in student achievement. Much of this is no doubt due to its accountability system. Even though accountability is increasingly recognized as the linchpin of education reform, only a few states have made real progress in establishing accountability systems. The rest have much to learn from the experiences of Texas and the Houston Independent School District (HISD).
The Texas educational accountability system has been in place since 1993. It is based mainly on student performance on the Texas Assessment of Academic Skills (TAAS), which is administered to students in grades 3 through 8 and in grade 10. The test is aligned with the state’s standards and measures performance in reading and math; writing in the 4th, 8th, and 10th grades; and science and social studies, but only in the 8th grade. In the past four years, the Houston school system has also been giving the Stanford 9 test to students in order to benchmark the achievement of its students against students nationwide.

Districts report the percentage of students who “pass” the TAAS at each school and for the district as a whole. The state then classifies schools and districts into four performance categories on the basis of test scores, dropout rates, and attendance rates. The categories are: “exemplary,” “recognized,” “acceptable,” and “low performing.” Over time the state has moved the bar for schools and districts steadily higher. For example, last year at least 50 percent of a school’s or district’s students needed to pass the TAAS for it to be rated “acceptable,” up from 30 percent a few years earlier. The bar is expected to continue rising by 5 percentage points a year until the passing standard reaches 70 percent.

The Texas system involves both financial rewards and relief from regulations for high-performing schools and assistance and sanctions for low-performing schools. Districts and schools receiving the lowest accountability rating are visited by a peer-review team and must develop an improvement plan. If the low rating continues for two years or longer, the state can intervene more directly, for example by taking over the school. Parents may also transfer their children from a low-performing school to a higher-performing public school. Houston is unique in that it provides much more targeted assistance to low-performing schools than other districts.

The test scores of all eligible students who are registered in a district in the October listing of the Public Education Information and Management System (PEIMS) are included in the state accountability system’s performance measures. While this means that some of the students whose test scores are included in a school’s performance measure may have been in that school for only a relatively short time, it avoids the problems associated with excluding high-mobility students (typically the lowest-performing students) from the district’s overall accountability measure. HISD goes one step further and includes all eligible students in a school at the time of testing, regardless of where they were in October. The intent, as one district official noted, “is to make the school feel responsible for every student.”

Before 1999, special-education students were excluded from state accountability measures, but since then special-education students who are receiving instruction on grade level are also included. Houston went even further by including all special-education students, even those not on grade level, in its testing program, except those classified as multiply impaired, mentally retarded, emotionally disturbed, autistic, hearing impaired, or having a traumatic brain injury.
One of the special features of the Texas plan is that performance statistics have to be reported for different student subgroups: African-Americans, Hispanics, whites, and economically disadvantaged students. These subgroup ratings weigh heavily in the overall performance rating for a school or district because the rating given by the state is based on the lowest performance on any single criterion (TAAS, dropout rate, attendance rate) for any subpopulation. Thus, even if the majority of the students in a school were performing well, if its economically disadvantaged students were performing poorly in math, it would receive an “unacceptable” rating overall.
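The logic of this rating rule can be sketched in a few lines of code. The sketch below is purely illustrative: the data are hypothetical, the function name is ours, and the cutoffs for the “exemplary” and “recognized” tiers are assumptions (the article reports only the 50 percent “acceptable” bar); it shows only how a single weak subgroup result drives the overall rating.

```python
# Sketch of the "lowest subgroup, lowest criterion" rating rule described
# above. Data and upper-tier thresholds are illustrative assumptions, not
# the state's actual cutoffs for every year.

def school_rating(pass_rates, acceptable_cutoff=50.0):
    """Rate a school on its single weakest subgroup/criterion result.

    pass_rates maps (subgroup, criterion) -> percentage passing. The
    rating is driven by the minimum value, so one struggling subgroup
    lowers the rating for the whole school.
    """
    weakest = min(pass_rates.values())
    if weakest >= 90:          # assumed cutoff for "exemplary"
        return "exemplary"
    if weakest >= 80:          # assumed cutoff for "recognized"
        return "recognized"
    if weakest >= acceptable_cutoff:
        return "acceptable"
    return "low performing"

# A school whose majority does well but whose economically disadvantaged
# students struggle in math is rated by that weakest result.
rates = {
    ("all students", "TAAS math"): 85.0,
    ("all students", "TAAS reading"): 88.0,
    ("econ. disadvantaged", "TAAS math"): 42.0,
}
print(school_rating(rates))  # -> low performing
```

The design point is the `min`: averaging across subgroups would let a large, high-scoring majority mask a failing subgroup, which is exactly what the Texas rule is built to prevent.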

Houston rates schools not only on their level of performance but also on their progress. Progress is judged against the amount of improvement a school is expected to make, with lower-performing schools expected to make more progress. For instance, schools with a passing rate between 60 percent and 75 percent are expected to improve by 4 percentage points; schools whose passing rate falls between 45 percent and 60 percent are expected to improve by 6 percentage points. This enables the district to recognize schools that are making good progress even if they have not moved into a higher performance level.
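Houston’s progress standard amounts to a tiered lookup. The sketch below uses only the two tiers the article reports (the district’s full schedule presumably covers other ranges as well, and how it treats the exact boundary values is our assumption), with hypothetical function names.

```python
# Sketch of Houston's tiered progress expectations, using only the two
# tiers reported in the article; boundary handling is an assumption.

def expected_gain(passing_rate):
    """Percentage-point improvement expected, given last year's passing
    rate. Lower-performing schools are expected to improve more."""
    if 60 <= passing_rate < 75:
        return 4
    if 45 <= passing_rate < 60:
        return 6
    return None  # other tiers not reported in the article

def made_expected_progress(last_year, this_year):
    """True if a school's year-over-year gain met its expected gain."""
    target = expected_gain(last_year)
    if target is None:
        return None
    return (this_year - last_year) >= target

# A school moving from 50% to 57% beat its 6-point expectation even
# though it remains below the "acceptable" bar.
print(made_expected_progress(50, 57))  # -> True
```

This is what lets the district credit a low-performing school for real improvement before it crosses into a higher rating category.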

Student Performance

We examined data from the state and the district to answer several questions about Houston’s performance. First, has student performance in Houston improved over time? Here we looked at data for the entire district as well as data broken out by students’ race/ethnicity and socioeconomic status. Second, how does Houston perform relative to schools statewide and in other urban districts? Third, to what extent is Houston closing the gap between minority students and white students?

Measuring Houston against the state of Texas is holding the city to a high standard. Research from the RAND Corporation, among others, shows that, after adjusting for student background, Texas and North Carolina made greater leaps on the National Assessment of Educational Progress from 1990 to 1997 than any other states. This research also shows that Texas appears to have been particularly successful in closing the achievement gap between minority students and white students. While other research from RAND suggests that the TAAS results may overstate the amount of “real” learning gains in Texas, the issue here is not whether the TAAS captures all the areas of learning assessed, for example, by the NAEP, but the extent to which performance is improving on those areas measured by the TAAS, which is keyed to the state’s curriculum and serves as the basis of the state’s accountability system. We did, however, examine how closely the TAAS results correlate with results from the Stanford 9, a nationally normed test. An analysis of school-level data by grade for reading and math in 1999 and 2000 showed large and highly significant correlations, suggesting that schools that perform well on the TAAS are also likely to perform well on nationally normed tests. We confine our discussion here to the TAAS results, using the passing rates set by the state, and make comparisons between Houston and the state and between Houston and other urban districts using this one measure.

Comparing Houston with the state. As measured by the TAAS, Houston has made great strides in student achievement. From 1994 to 2000, the percentage of Houston students passing the TAAS in math increased from 49 percent to slightly more than 80 percent. Houston posted large gains in reading as well, with pass rates rising from 65 percent to 81 percent (see Figure 1). These gains exceeded the gains statewide. In math, the pass rate statewide rose from 61 percent to 87 percent, an increase of 26 percentage points (compared with a 31 percentage point increase in Houston). The statewide reading pass rate rose from 77 percent to 87 percent, an increase of 10 percentage points (compared with a 16 percentage point increase in Houston).

Figure 1
Neither the district nor the state has experienced steady gains in achievement over time. Test scores at the district and state levels increased sharply in the beginning years and started to level off in 1998. Indeed, in 1999 Houston’s reading and math scores actually dropped, and the state showed a slight dip in reading. Houston’s drop and the leveling of state performance are likely due, at least in part, to changes in the pool of students required to take the test. As noted earlier, special-education students who were receiving instruction at grade level were included in the state’s testing system for the first time in 1999, and Houston imposed an even more inclusive policy. Similarly, Houston was less likely to exempt students with limited English language skills than was the state, a practice that may also have contributed to the 1999 drop in the district’s performance. In short, the 1999 data for Houston probably include a larger fraction of lower-performing students than those included in the state’s measure. It is important to note that Houston rebounded in 2000, posting a gain of nearly 6 percentage points in reading and math.


Comparing Houston with other urban districts. Arguably, Houston experienced more improvement than the state because from the beginning its scores were so low. In 1994 Houston’s pass rates in math and reading were 49 and 65 percent, respectively; the corresponding rates for Texas that year were 61 percent in math and 77 percent in reading. The presumption is that the higher the initial scores, the more difficult it is to elicit substantial gains; in other words, the state’s scores were approaching a “ceiling.” A behavioral argument could also be made: that the state’s reform policies, its public shaming and sanctions for low-performing schools, would most strongly influence the behavior of urban districts, which tend to have a history of low performance and mismanagement. It is thus important to examine how Houston fares relative to Texas’s other major urban districts.

In 1994 Houston’s performance on the TAAS in math placed it just below the middle of the pack of six large urban districts in Texas (see Figure 2). Three of the five other urban districts outperformed Houston in 1994, though some only slightly; in 2000, one district, El Paso, equaled Houston’s performance. A similar pattern held in reading. Houston’s improvement looks even better considering the city’s increases in passing rates. Houston, by far the largest of the six urban districts, made greater progress in reading and math than all but one of the major urban districts, San Antonio, which showed remarkable gains.

Figure 2

Closing the Achievement Gaps

The large gap in student performance between white students and minority students is one of the most serious problems facing the United States. The nation has watched Texas not only for the improving performance of its students across the state, but also for the shrinking achievement gap between white and minority students as measured by the TAAS. Here we examine the progress that Houston is making on closing the achievement gaps. Again, we examine Houston’s progress relative to the state and relative to other urban districts.

Comparing Houston with the state. In both reading and math, while the absolute performance of white students in Houston is higher than that of Hispanics and African-Americans, it is clear that the upward trend in performance for minority students in Houston is steeper than that of whites. The math scores of African-Americans in Houston, for instance, rose by 34 percentage points; for Hispanics, by 36 percentage points; and for whites, by only 14 percentage points (see Figure 3). Houston’s results roughly mirror those of the state.

Figure 3
Note that the pass rates for white students are very high, well above 90 percent, suggesting that the skills tested by the TAAS are not too challenging and that many students are close to “topping out” on the test. However, this does not diminish the significance of the substantial increases in performance on the skills tested, especially for minority students. The gaps in passing rates between whites and African-Americans and between whites and Hispanics in both math and reading declined from 1994 to 2000. Houston reduced the white/Hispanic gap in math from 36 percentage points in 1994 to 14 percentage points in 2000, a narrowing of 22 percentage points. The state as a whole reduced the same gap by 15 percentage points. Still, the gap statewide between whites and minorities is somewhat smaller than it is in Houston.


Comparing Houston with other urban districts. In both reading and math, all six urban districts included in our analysis reduced the gap in TAAS passing rates between whites and African-Americans and between whites and Hispanics. Houston’s performance is especially noteworthy. Houston decreased the white/African-American achievement gap in math by 21 percentage points, more than all the other districts. It also reduced the white/African-American gap in reading by more than all the other districts. Houston’s reductions in the white/Hispanic gap were equally impressive: the gap in math dropped by 22 percentage points, more than any of the other urban districts.

Whatever one thinks of the TAAS, Houston is clearly doing something right. Its progress on the TAAS, for the most part, has outstripped the gains of the state and most other urban districts. The district at least has begun to solve one of society’s most intractable problems: the achievement gap between white and minority students, at least as measured by the TAAS. Without the information provided by the Texas accountability system, few people would have noticed what was happening. What a loss that would have been!

-Jane Hannaway is the director and principal researcher of the Education Policy Center at the Urban Institute. Shannon McKay is a research associate in the Education Policy Center.



