This post is copied from an April 2016 post, with links updated.
The Fraser Institute has published its annual, controversial ranking of BC elementary schools.
While parents want data about schools and want to see how they are doing, this is not a terribly useful, accurate, or helpful report.
There is a BC government website that allows parents to access data about elementary schools: http://www.bced.gov.bc.ca/reporting/school.php
Here are two web posts that give some background information on how these results are calculated:
http://donaldgutstein.com/eight-distortions-and-other-problems-in-the-fraser-institutes-report-card/
“…
2. Twenty percent of a school’s ranking comes from differences between the results achieved by boys and girls. This artificially depresses the scores of schools with students of lower socio-economic status where, typically, gender differences are more pronounced.
Worse, and inexplicably, the Fraser gives more weight to gender differences than to the actual results. Gender differences in Grade 7 numeracy and reading tests (what happened to writing?) account for 10 percent each. The actual test results account for only 7.5 percent each.
3. Twenty-five percent of a school’s ranking comes from the percentage of tests “not meeting expectations.” This result penalizes low-performing schools by accounting for their low scores twice.
4. Ten percent of a school’s ranking comes from the percentage of tests not written in a school. This indicator was added in 2007 “to encourage schools to ensure a high level of participation in the FSA testing program.” It is a not-so-veiled attack on the BC Teachers Federation and parents who don’t want their children to write the tests.
That punishing the BCTF is the purpose of this component of the rankings can be seen by comparing the Fraser Institute’s BC and Alberta elementary schools rankings. This component does not exist in the Alberta report card where the union is not as activist in opposing mandatory testing.
…”
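To make the weighting in the excerpt concrete, here is a minimal sketch of how a composite score built from those components might be computed. The weights are the ones Gutstein lists; the component scores are hypothetical, and the remaining roughly 30 per cent of the weight (components not named in the excerpt) is omitted — so this is an illustration of the structure, not the Fraser Institute's actual formula.

```python
# Illustrative sketch only: the weights below are those named in the
# excerpt above; the per-school scores are hypothetical. The components
# not named in the excerpt (~30% of the total weight) are omitted.

weights = {
    "gr7_numeracy_gender_gap": 0.10,   # gender difference, Grade 7 numeracy
    "gr7_reading_gender_gap": 0.10,    # gender difference, Grade 7 reading
    "gr7_numeracy_result": 0.075,      # actual Grade 7 numeracy result
    "gr7_reading_result": 0.075,       # actual Grade 7 reading result
    "pct_below_expectations": 0.25,    # share of tests "not meeting expectations"
    "pct_not_written": 0.10,           # share of tests not written
}

# Hypothetical component scores for one school, normalized so 1.0 is best.
# Strong actual results, but pronounced gender gaps and some unwritten tests.
school = {
    "gr7_numeracy_gender_gap": 0.4,
    "gr7_reading_gender_gap": 0.5,
    "gr7_numeracy_result": 0.9,
    "gr7_reading_result": 0.9,
    "pct_below_expectations": 0.8,
    "pct_not_written": 0.7,
}

partial_composite = sum(weights[k] * school[k] for k in weights)
print(f"Partial composite (covers 70% of the total weight): {partial_composite:.3f}")
```

Note how the structure bakes in Gutstein's second objection: each gender-gap component carries more weight (10 per cent) than each actual-results component (7.5 per cent), so a school with strong results but pronounced gender gaps is pulled down regardless of how well its students actually performed.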
The annual Fraser Institute ranking of B.C. elementary schools is out, showing that — shock! — private schools perform better than those where the kids arrive hungry and get stacked up like cordwood in the classroom.
Of course the Saint Whoever schools rank well, is the standard response. Children are screened before being accepted, special-needs kids have better support and, as a retired teacher pointed out in a letter to the editor, class sizes “are smaller than most grade-school birthday parties.” If a parent is paying both taxes and tuition, the results better justify the extra outlay.
Sure enough, this year’s report showed that 19 of the 20 schools that tied for first place — including Victoria’s Saint Michaels University School — were independents. West Vancouver’s Cedardale was the lone public institution. The other end of the scale was just as predictable: inner-city and remote schools that might as well be named Sisyphus Elementary, the students destined to push boulders uphill that always roll back on them.
If the Fraser Institute results never vary, neither does our reaction: we all A) complain that the rankings are statistics-twisting nonsense, then B) rush to see how our kids’ school placed. Nature abhors a vacuum; parents know the report’s methodology leaves a lot to be desired, but in the absence of a more comprehensive way to measure the quality of their children’s education, they’ll seize on this one. To which Helen Raptis says “Don’t.”
Ditto for David Johnson.
Raptis is associate dean of education at UVic. Johnson is an economics prof at Wilfrid Laurier University in Waterloo, Ont., and the education policy scholar at another think tank, the C.D. Howe Institute.
Both think the standardized testing on which the Fraser Institute rankings are partially based is useful — just not in the way the Fraser Institute is using it. The tests were never meant to be used as the education equivalent of TripAdvisor.
The rankings rely in part on the Foundation Skills Assessment taken by all B.C. students in Grades 4 and 7 to test their knowledge of numeracy, reading and writing (though note that in Greater Victoria, most elementary schools don’t go to Grade 7). If a parent really wants to use a yardstick to measure school performance, go to the Education Ministry website to look up that data, Raptis says.
But those tests account for just 45 per cent of an elementary school’s Fraser Institute ranking, she says. The balance of the weighting is based on indicators that haven’t been proven to affect school performance, but that are skewed against schools with a lot of kids of lower socio-economic status. The result is that a school full of poorer kids can be ranked below one with inferior test results.
Forget all the public-versus-private school talk, Raptis says. This is just an Orwellian exercise that pulls down good schools by measuring them with tools of uncertain usefulness. The low rankings of low socio-economic schools are inevitable, discouraging progress. It’s actually counter-productive, which is why the Times Colonist decided to stop publishing the Fraser Institute list a few years ago, she notes.
Johnson’s objections are different — and somewhat contradictory. He developed a more complete measuring system for the C.D. Howe Institute that incorporates socio-economic variables the Fraser Institute ignores, he says. That allows schools in similar circumstances to be compared, so that improvements can be made. “What you really want to do is look at schools that outperform similar schools and see what you can learn from that.”
Even then, forget saying with a straight face that School X, in 132nd place, is better than 445th-ranked School Y. We all like Top 10 lists, and there’s a sexiness to ranking schools one through 982, but Johnson scoffs at the idea of rating them that finely, particularly when doing so by focussing on year-to-year changes in the average FSA scores. In a small school, a handful of students who test particularly poorly or well can shift the marks dramatically. Better to put more weight on longer-term trends and the percentage of students who achieve at an acceptable level.
As it is, Johnson simply doesn’t find much value in the annual fuss. “I think it just annoys people.”