Thursday, August 24, 2006

WHAT CHARTER SCHOOL RESEARCH IS, AND IS NOT, TELLING US

This is the best rebuttal I've seen of the recent foolish study that purports to show that charter schools are underperforming regular public schools (by Nelson Smith, head of the National Alliance for Public Charter Schools). In addition to the study's failure to adequately capture charter schools' high proportion of the most disadvantaged students, I had no idea that the data the study is based on cover only FIVE PERCENT of charter schools!
Only 150 charter schools participated in the 2003 NAEP, which was about 5 percent of the charters open in the 2002-03 school year. Given the wide diversity in the types of charters that are open, it’s doubtful that a sample of 5 percent of charters nationally tells one much of anything about their performance as a sector.
In addition, Smith cites studies beyond the 20 in my slide (26 in total), but the conclusion is exactly the same: students at charter schools are making much bigger gains.
Bryan Hassel of Public Impact has looked at 44 such studies that include both snapshot and longitudinal perspectives. Of the 18 snapshot studies, 12 found that charters did worse than non-charters, and 6 found that charters did the same or better. Of the 26 change studies, 16 found that charters did better than non-charters, 6 found roughly comparable results between charters and non-charters, and 4 found that charters did worse. Of the 16 positive change studies, the most common theme was that charter students start out behind and catch up and surpass non-charter students over time.
------------------------

WHAT CHARTER SCHOOL RESEARCH IS, AND IS NOT, TELLING US

Statement by Nelson Smith, president, National Alliance for Public Charter Schools

WASHINGTON, Aug. 22, 2006 — Perceptions of charter schools have largely been based on "he said, she said" news reports since the movement began nearly 15 years ago. Yet if we are to truly understand how well charter schools are serving more than a million students in 40 states, it is important to understand properly what charter school research is, and is not, telling us.

The report released today by the National Center for Education Statistics provides the fourth analysis of the same data from the 2003 National Assessment of Educational Progress (NAEP) about the performance of 4th graders in charter and other public schools. Not only is this old news (the first three analyses produced similar results), it is old data. The most current available data are from the 2005 NAEP, which was released last October.

In either case, both the 2003 and 2005 NAEP have limitations that can lead to misinterpretation and misunderstanding of charter school performance. Here’s why:

• NAEP was designed to illustrate the nation's progress on student achievement; to establish nationwide historical trends in student achievement; and to see how states compare to one another. Yet chartering is a state policy innovation, so the national context is less important than the state context. How charters perform in a particular state is affected by the resources, regulatory environment, and quality of authorizing in that state. That said, the recent record of state-level achievement studies is quite encouraging.

• NAEP only provides a snapshot analysis for one point in time. These snapshot analyses look at the results by various demographic factors (e.g., race and ethnicity) but say nothing about the most important issue – prior student achievement. Studies of charters in three of the biggest charter states (CA, FL, and TX) show that students who attend charters tend to enter significantly behind their non-charter counterparts. A snapshot study that doesn’t take into account students’ starting points provides no sense of the school’s effect on these students.

• There is not one data set that tells all. In fact, there are dozens of studies on charter schools. Bryan Hassel of Public Impact has looked at 44 such studies that include both snapshot and longitudinal perspectives. Of the 18 snapshot studies, 12 found that charters did worse than non-charters, and 6 found that charters did the same or better. Of the 26 change studies, 16 found that charters did better than non-charters, 6 found roughly comparable results between charters and non-charters, and 4 found that charters did worse. Of the 16 positive change studies, the most common theme was that charter students start out behind and catch up and surpass non-charter students over time.

• Only 150 charter schools participated in the 2003 NAEP, which was about 5 percent of the charters open in the 2002-03 school year. Given the wide diversity in the types of charters that are open, it’s doubtful that a sample of 5 percent of charters nationally tells one much of anything about their performance as a sector.

• The major indicator used to identify low-income students is seriously flawed when applied to charter schools. Some of the findings in each of the four analyses of the 2003 NAEP focus on the performance of students who are eligible for free and reduced-price lunch. (Each found little difference between charters and non-charters.) However, a significant number of charter schools don't participate in the federal free and reduced-price lunch program, even though they have students who are eligible for it. This understates the number of disadvantaged students in charter schools and skews any conclusion about their academic performance.

The National Alliance for Public Charter Schools is working to grow both the number and quality of charter schools. Precisely because we support strong schools and don't excuse poor performance, we think it important to be clear and rigorous in discussing data. We do not take issue with the agencies or researchers involved in today's release – but must call attention to the fact that this is the fourth time the public has been greeted with essentially the same "news" derived from a source that is singularly ill-suited to the measurement of charter schools’ performance.

We encourage you to contact us as well as the third-party researchers below with your questions.

Brian Gill, senior social scientist, Rand Corporation, 412-683-2300 or briang@rand.org

Bryan Hassel, co-director, Public Impact, 919-967-5102 or Bryan_Hassel@publicimpact.com

Gary Miron, professor of education, Western Michigan University, 269-387-5895 or gary.miron@wmich.edu

Alliance staff:

Nelson Smith, president, 202-289-2700 or nelson@publiccharters.org

Todd Ziebarth, senior policy analyst, 303-329-4648 or todd@publiccharters.org
