Monster Hype


School violence, the media's phantom epidemic



By JOEL BEST


Summer 2002 / Vol. 2, No. 2

Illustrations by James Yang

Contemporary discussions about social issues, especially within education, almost always involve statistics. Numbers have become an essential element in policy rhetoric, a form of evidence needed to persuade others. Statistics let us claim that we can measure the size of our problems and the effectiveness of our solutions.

Yet even as we rely on numbers, we are bedeviled by innumeracy, the mathematical equivalent of illiteracy. Too often, we fail to think critically about the statistics we encounter, to ask even the most basic questions. This is important, because accepting numbers uncritically may cause us to badly misunderstand our problems. There are few better examples of this failing than some of the recent figures regarding school violence.

The March 5, 2001, shooting spree at Santana High School in Santee, California, which left 2 dead and 13 injured, revived concerns over the seeming escalation of school violence and its potential links to the age-old schoolyard tradition of bullying. School shootings first became a serious issue in the wake of a series of tragic incidents, the most famous being Dylan Klebold and Eric Harris’s April 1999 rampage at Columbine High School, in Littleton, Colorado, during which they murdered 12 students and a teacher before turning their weapons on themselves. Of particular interest was that nearly every shooting was accompanied by reports that the teenagers involved were marginalized in some way; the Santee shooter especially appears to have been a victim of bullying.

Both school shootings and bullying have become subjects of extensive media coverage, featuring the pontification of assorted politicians, activists, and experts. This is how contemporary Americans create new social problems. Typically, the process involves a three-part recipe:

1) Illustrate the problem with an awful example (e.g., the mass murder at Columbine High School).

2) Give the problem a name (“school shootings”).

3) Use statistics to suggest the problem’s size and importance.

Statistics play a crucial role in this process, because we tend to assume that numbers are factual–that somebody has counted something, that the problem has been measured and therefore is as big as the claims suggest. Coupled with dramatic, headline-grabbing incidents, they have created the impression that both school violence and bullying are on the rise. This may make for compelling television, but the oversaturated media coverage can portray a few isolated incidents as a national trend. Take CBS anchor Dan Rather’s post-Santee warning: “School shootings in this country have become an epidemic.” Such claims have become commonplace among journalists who haven’t thought carefully enough about the evidence. The statistics on violence and bullying that are trotted out by pundits and activists often exaggerate or distort the case. The result is that the public and policymakers tend to overreact as they look for solutions to problems that appear to be out of control. A closer look at the statistics, however, reveals a more complicated and hopeful picture.

A Phantom Epidemic

Of course, the phenomenon of adolescents’ bringing guns to school and randomly shooting their peers ought to be a source of genuine worry. The 1997-98 school year alone saw tragedy strike West Paducah, Kentucky (3 dead, 5 more wounded); Jonesboro, Arkansas (5 dead, 10 wounded); and Springfield, Oregon (2 dead and 21 wounded at the school, after the shooter first killed his parents at home). These crimes, along with the Columbine massacre, seemed to be without rational motivation; what could possibly have driven these adolescents to lash out at the world in such bloody fashion? Kids have always divided into cliques and subjected the nonconforming to verbal and physical abuse; only recently, it seemed, had the social drama of high school resulted in mass casualties. It’s legitimate to wonder whether these incidents represent a deep-seated change in youth culture.

Nevertheless, these tragic events masked the overall trend: a good deal of evidence indicates that school violence has actually been declining in recent years. When researchers at the National School Safety Center (NSSC) combed media reports from the school years 1992-93 to 2000-01, they identified 321 violent deaths at school. However, not all of these incidents involved student-on-student violence: they included, for example, 16 accidental deaths and 56 suicides, as well as incidents involving nonstudents, such as a teacher killed by her estranged husband, who then shot himself, and a nonstudent killed on a school playground during the weekend. Even if we include all 321 deaths, the average fell from 48 violent deaths per year during the school years 1992-93 through 1996-97 to 32 per year from 1997-98 to 2000-01. If accidental deaths and suicides are eliminated from the data, the decline remains: from an average of 31 deaths per year in the earlier period to 24 per year in the later one. Moreover, the later period includes all of the heavily publicized cases mentioned above. And the later figure may be further inflated by the likelihood that the media were more apt to report school shootings after the topic vaulted to public attention (see Figure 1).

This decline is consistent with the evidence suggesting that crime rates were declining nationwide. During the 1990s, the overall crime rate fell, as did the rates of major violent crimes such as homicide, robbery, and aggravated assault. The crime rate, which is the Federal Bureau of Investigation’s tally of crimes reported to the police, is only one of two national measures of criminal activity. The second, less familiar measure is the rate of victimization reported in the National Crime Victimization Survey. Researchers with the victimization survey interview a large national sample and ask respondents whether they or anyone in their households have been victims of crime. This survey showed instances of criminal victimization falling during the 1990s. Moreover, reports of teenagers being victimized by violent crimes at school dropped. The data also showed that instances of victimization were less common at school than elsewhere; in other words, teenagers were safer at school.

The federal Centers for Disease Control and Prevention’s Youth Risk Behavior Survey also found steadily declining percentages of high-school students who reported fighting or carrying weapons on school property during the 1990s. It is also important to recognize that the risks of school violence are extremely low. For every million children who attend school, there is less than one violent school-related death per year. Moreover, only about 1 percent of children killed by violence are hurt at school, despite the large amount of time they spend there.
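The arithmetic behind the “less than one violent school-related death per million children” claim is easy to reproduce. A minimal sketch, using the later-period average of 32 deaths per year reported above and assuming U.S. K-12 enrollment of roughly 50 million around 2000 (the enrollment figure is an assumption; the article does not state it):

```python
# Rough risk calculation behind the "less than one violent death
# per million students per year" claim.
deaths_per_year = 32     # average violent deaths at school, 1997-98 to 2000-01
students = 50_000_000    # assumed K-12 enrollment, circa 2000

# Deaths per million students per year
rate_per_million = deaths_per_year / (students / 1_000_000)
print(rate_per_million)  # 0.64 -- well under one per million
```

Even if enrollment were somewhat lower, or the death count somewhat higher, the rate would remain far below one in a million.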

None of these data are especially hard to come by; all of them were readily available–and the trends they showed were apparent–before, during, and after the various school-shooting incidents that became subjects of extensive news coverage. All of this evidence flatly contradicted the claims that there was a wave, trend, or epidemic of school violence. In other words, the wave of school shootings was a phantom–that is, a nonexistent trend. What accounts for this misperception? Why did the press and the public assume that school shootings were increasing?

In large part, media coverage promoted this distorted view of the problem. The Columbine killings in particular became a huge story. Columbine involved many victims, and the story unfolded over hours. Because the crime occurred in the suburbs of a major city, there were plenty of reporters nearby, and they had time to arrive on the scene for live coverage. The result was dramatic video footage that would be replayed many times. Furthermore, Columbine was a bastion of suburban privilege; it challenged stereotypes about inner-city violence. It was a story made for television.

The Columbine coverage also reflected recent media transformations. Most Americans now have access to cable or satellite television systems; they are no longer limited to receiving broadcasts from a handful of local stations. Most viewers now can choose among several all-news or public-affairs channels, and those channels constantly need content to fill their airtime. In the aftermath of the Columbine shootings, broadcasters like CNN, Fox News, and MSNBC devoted hours not just to reporting the story and commentary about the violence, but also to live coverage of many funeral and memorial services. Columbine remained a major story for days, and during that period, politicians, activists, and commentators used it as evidence to justify their calls for a wide range of measures, including tougher gun laws, restrictions on adolescents’ access to violent popular culture, and so on.

The Columbine killings were a terrible event, but we are accustomed to thinking about such incidents as instances–that is, as examples of some larger problem. The extraordinary level of media coverage reinforced the interpretation that these killings must have had some larger significance. It also gave people the sense that school shootings must be a large and growing problem, regardless of what the available statistics actually showed.

Making Bullies Seem Big

Concern with school bullying usually concentrates on mundane, everyday cruelty, but the issue has taken on new significance with reports that several school shooters were reacting to a history of taunts and shoves from their peers. All of us have witnessed–if not experienced–bullying, usually during our own school days. The phenomenon has long been recognized as undermining efforts to maintain discipline in schools.

The problem with trying to measure the extent of schoolyard bullying should be obvious: the term lacks any clear definition. What is bullying? Any attempt to measure bullying requires not just defining bullying in the abstract, but also devising an operational definition–that is, a set of procedures or operations whereby one can identify and count cases of bullying. In general, those who are trying to provoke a policy response to a particular social problem favor broad, inclusive definitions. Their purpose is to make the problem seem as widespread as possible. If bullying is defined narrowly–say, as involving only physical assaults–it will seem like less of a problem than if the definition includes many other forms of hurtful behavior.

Consider “Bullying Behaviors Among U.S. Youth,” an article published in the April 25, 2001, issue of the Journal of the American Medical Association (JAMA). The article reported results from a large (nearly 16,000 respondents) representative sample of students in grades 6 through 10; the authors were associated with the National Institute of Child Health and Human Development (which supported the survey). Its major finding, widely reported by the media, was that nearly 30 percent of youths “reported moderate or frequent involvement in bullying.” This was just one of many studies of bullying, but few other studies feature samples as large and well drawn. In addition, this article appeared in JAMA, an especially prestigious journal, so we might assume that it represents the best work on bullying.

Of course, the proportion of surveyed students who report being involved in bullying will depend on how bullying is defined and how the questions are framed. The questionnaire’s section on bullying began with an explanation: “Here are some questions about bullying. We say a student is BEING BULLIED when another student, or a group of students, say or do nasty and unpleasant things to him or her. It is also bullying when a student is teased repeatedly in a way he or she doesn’t like. But it is NOT BULLYING when two students of about the same strength quarrel or fight.” The students were then asked how frequently they bullied others or were bullied during the current school term. They had to choose from the following responses: “I haven’t . . . ,” “once or twice,” “sometimes,” “about once a week,” or “several times a week.” In presenting their results, the authors defined incidents that occurred at least weekly as frequent involvement in bullying, and responses of “sometimes” as moderate involvement. In addition, the students were asked to “report the frequency with which they were bullied in each of five ways–belittled about religion/race, belittled about looks/speech, hit/slapped/pushed, subject of rumors or lies, and subject of sexual comments/gestures.”

It is difficult to know what to make of the students’ responses to these questions, because they were not completely consistent. For example, in response to the general questions, 41 percent of the sample reported having been bullied at least once. However, substantially larger proportions of the same students reported experiencing four specific sorts of bullying: 62 percent reported having been belittled about their looks or speech; 56 percent said they had been hit, slapped, or pushed; 60 percent claimed they had been the subjects of rumors; and 52 percent indicated they had been subjected to sexual comments or gestures. In other words, a large share of students, when asked a general question, denied being bullied, yet they then acknowledged having been bullied in specific ways. This confusion suggests that we should not place too much confidence in the significance of the students’ responses to particular questions.

Nevertheless, the authors conclude that “bullying is a serious problem for U.S. youth,” and that “the prevalence of bullying observed in this study suggests the importance of preventive intervention research targeting bullying behaviors.” But the data are much less clear than these conclusions suggest. The central finding–that 30 percent of youths are involved in bullying–depends on three methodological choices. First, students could be involved either as a bully (13 percent), as a victim (11 percent), or as both (6 percent) (see Figure 2). If the article’s authors had chosen to count only the victims, they would have found that 17 percent, rather than 30 percent, were subject to bullying. Second, the authors included both “moderate” bullying (those who reported bullying as occurring “sometimes”–more than once or twice during the term, but less than weekly) and “frequent” bullying (occurring at least weekly). Adopting a narrower definition would have made the findings less dramatic; only 8 percent of the respondents reported being targets of frequent bullying. Third, there were separate questions about bullying in and out of school, but those responses were combined for the JAMA article, so readers are not told how much of the reported bullying actually occurred in schools. We can assume that at least some of the students who reported being frequently bullied were targeted outside of school. If so, the percentage of students who reported frequent bullying in school should have been even lower than 8 percent.
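How sensitive the headline figure is to these definitional choices can be shown with a quick back-of-the-envelope calculation. The percentages are those reported in the JAMA article; treating bully, victim, and both as mutually exclusive categories of respondents is an assumption consistent with the breakdown above:

```python
# Back-of-the-envelope check of how definitional choices change
# the headline "30 percent involved in bullying" figure.
# Percentages are from the JAMA article; the three roles are
# assumed to be mutually exclusive categories of respondents.
bully_only = 13   # % who bullied others (moderately or frequently)
victim_only = 11  # % who were bullied but did not bully
both = 6          # % who both bullied and were bullied

involved = bully_only + victim_only + both  # broadest definition
victims = victim_only + both                # counting only victims
frequent_victims = 8                        # % bullied at least weekly

print(involved)          # 30 -- the figure the media reported
print(victims)           # 17 -- a narrower, victim-only definition
print(frequent_victims)  # 8  -- narrower still: frequent victims only
```

The same survey thus yields a 30 percent problem or an 8 percent problem, depending entirely on where the definitional lines are drawn.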

In other words, the authors made a series of choices that allowed them to estimate that bullying significantly affects 30 percent of students; different choices–say, looking only at victims of frequent bullying in schools–would have produced a figure that was perhaps a quarter as large. The point is not that this is a bad piece of research, nor that bullying can’t have serious consequences (remember the Santee shooter). Rather, it is that the numbers that emerge from social-science research need to be handled with some care. How the survey’s questions were worded, the order in which they were asked, and the choices made in interpreting and summarizing the data for publication all shaped the results.

Perceptions that school violence is a national crisis have clearly affected educational policy. Resources have been directed toward purchasing and operating metal detectors, stationing police officers in schools, and enhancing other security measures, restrictions that many schools expanded in the aftermath of the September 11 terrorist attacks (see Education Matters to Me essay “In the Shadow of Terror”). These are not necessarily bad choices, but they may have been based in part on inaccurate perceptions of the nature and level of school violence.

-Joel Best is a professor of sociology and criminal justice at the University of Delaware. His latest book is Damned Lies and Statistics: Untangling Numbers from the Media, Politicians, and Activists (University of California Press, 2001).



