As promised, I return for a third time to the Fraser Institute’s dubious School Report Card series. But first, a quick question: raise your hands if you think the basic socioeconomic difference between a family with an income of $25 000 per year and a family with an income of $250 000 per year is roughly the same as the difference between a family earning $450 000 per year and a family earning $625 000.
I hope I know my reader pool well enough that I can safely assume you didn’t raise your hand. Oddly enough, though, at least six people did: Simon Fraser University economist Stephen Easton, Fraser Institute marketing consultant Peter Cowley, private school scholarship fund administrator Michael Thomas, and their three anonymous reviewers (not to mention four dead guys on the Editorial Advisory Board).
One of the most important amendments to the Fraser Institute report card series in recent years has been the inclusion of a “socioeconomic indicator.” There are many misleading things about this number, but the most misleading, as I’ve just discovered, is the Institute’s ludicrously skewed view of socioeconomic factors. What follows is going to be a little dry, and I apologize for that in advance. But here is the summary: I believe the socioeconomic indicator is skewed in such a way that it unfairly punishes poor schools (exclusively public ones) and unfairly rewards wealthy schools (more often private ones).
The Fraser Institute doesn’t actually incorporate socioeconomic factors into school scoring, contrary to what many people seem to think. Instead, what they’ve done is “predict” each score based on average parental income in the neighbourhoods the students come from, compare the predicted score with the actual score, and then give each school a second score. A “0” on the socioeconomic scale means they guessed right. A positive number (say, “2”) means the school is doing better than you would expect given parental income. A negative number (say, “-2”) means the school is doing worse than you would expect given parental income. Not surprisingly, the schools at the very top and very bottom of the scale tend to be outliers: good schools that could have done much worse, and bad schools that should have done much better.
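If I’ve understood the method correctly, the indicator is nothing more than the actual score minus the income-predicted score. Here is a minimal sketch; it’s my reconstruction, not the Institute’s published formula, and since the Elbow Park example later in this post quotes an indicator of -8.8, I round to one decimal rather than to a whole number:

```python
def socioeconomic_indicator(actual: float, predicted: float) -> float:
    """My reconstruction of the Fraser indicator: the actual overall
    score minus the score predicted from average parental income.
    Positive = better than predicted, negative = worse, 0 = spot on."""
    return round(actual - predicted, 1)

# Elbow Park example from later in this post: predicted 17.5 out of 10,
# actual roughly 8.7 (inferred from the quoted indicator of -8.8).
print(socioeconomic_indicator(8.7, 17.5))  # -8.8
```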
This is where things get a little weird, though. How exactly does one predict a score based on neighbourhood average parental income? (I’m going to leave aside for the moment all the other factors this completely ignores: local rather than provincial costs of living, single- versus two-parent households, and so on.) The Fraser Institute claims to have conducted a regression analysis, although they don’t show their math. Whatever they did, it led to a very strange view of social and economic class:
What we’re looking at is a chart that shows the Fraser Institute’s predicted relationship between the class of a student’s family and her school’s results. I’ve calculated it by starting at the beginning of the report and typing in data from “A” up to the end of “Calgary” (205 schools in total). I could have continued, but this limited data set will do to illustrate my complaints, and unlike the Fraser Institute, I’m not being paid for my time.
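For what it’s worth, the regression the Institute describes needn’t be anything fancy. An ordinary least-squares fit of score against income, sketched here with invented (income, score) pairs purely for illustration, produces exactly the kind of straight line their predictions imply:

```python
# Ordinary least-squares fit of school score on average parental income.
# These (income, score) pairs are invented purely for illustration.
schools = [(30_000, 5.1), (60_000, 5.9), (120_000, 6.8), (250_000, 8.0)]

n = len(schools)
mean_x = sum(x for x, _ in schools) / n
mean_y = sum(y for _, y in schools) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in schools)
         / sum((x - mean_x) ** 2 for x, _ in schools))
intercept = mean_y - slope * mean_x

def predict(income: float) -> float:
    """Predicted score for a school with the given average income."""
    return intercept + slope * income
```

Whatever coefficients the Institute actually used, the crucial point is that a single straight line like this keeps climbing forever, which is where the trouble below comes from.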
This graph prompts several bizarre observations. First, the predicted scores start at about 5 out of 10. I’m not sure that’s even possible. The Institute’s scoring system is actually a sliding scale based on deviation from the mean: in other words, schools get higher or lower scores not based on how they measure up to an objective grading rubric, but on how they compare to the provincial average. In any case, it seems to be the Institute’s equivalent of the “nobody can fail” rule currently in force in BC. According to the Fraser Institute’s socioeconomic model, schools start at average and move up from there. That can’t work: by definition, some schools have to be below average.
The first consequence of this is that poor schools, which are exclusively public schools, tend to get socioeconomic scores suggesting they did much worse than they should have. In my sample, there were 40 schools where average parental income was $50 000 or less. Of the 79 schools that underperformed (according to the Fraser Institute), 27 were poor schools. So about 2 in every 3 poor schools did worse than the Fraser Institute thought they should. Put another way, although poor schools accounted for only 20% of my sample, they made up 34% of the schools that supposedly should have done better.
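Those proportions are easy to verify from the counts in my sample (the variable names here are mine):

```python
total_schools = 205        # "A" through the end of "Calgary"
poor_schools = 40          # average parental income of $50 000 or less
underperformers = 79       # negative socioeconomic indicator
poor_underperformers = 27  # poor schools among the underperformers

print(f"{poor_underperformers / poor_schools:.0%}")     # 68%: about 2 in 3 poor schools missed
print(f"{poor_schools / total_schools:.0%}")            # 20%: poor schools' share of the sample
print(f"{poor_underperformers / underperformers:.0%}")  # 34%: poor schools' share of underperformers
```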
I want to take a moment to explain why this matters. I believe, and the Institute agrees, that socioeconomic status partially determines student success, and they developed a predictor that supposedly accounts for this. Of course, some schools will still do worse than predicted and some will do better, but those misses should be evenly distributed across economic class. If they’re not, then the Fraser Institute hasn’t fully accounted for the importance of family income. In short, they’ve fudged their numbers.
Now, as income climbs from the lower class into the middle and upper-middle class, you might expect scores to rise dramatically. That’s because, as you do so, you dramatically reduce the proportion of poor single parents, malnourished kids, homes without books, and so on. The Fraser Institute doesn’t think so, though. Warburg School, where families have a frighteningly low average income of just $12 500, is expected to score 4.7. A school whose families earn five times as much, a more comfortable $62 500, is only predicted to score one point higher, at 5.7. A school whose families earn nearly twice that again, upper-middle-class households raking in $115 000 a year, is supposed to score 6.7.
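Taken at face value, the three predictions just quoted sit almost exactly on a straight line of about one score point per extra $50 000 of family income. The slope and intercept below are my own back-of-envelope fit through the quoted points, not the Institute’s published coefficients:

```python
# Two of the quoted (income, predicted score) points pin down a line.
x1, y1 = 12_500, 4.7     # Warburg School
x2, y2 = 115_000, 6.7    # the upper-middle-class example

slope = (y2 - y1) / (x2 - x1)   # about 1 point per ~$51 000 of income
intercept = y1 - slope * x1

def implied_prediction(income: float) -> float:
    """Predicted score under the implied straight-line model."""
    return intercept + slope * income

print(implied_prediction(62_500))  # ~5.7, matching the middle quoted point
```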
This simply doesn’t pass the smell test. First, one would expect most of the benefits of socioeconomic status to arrive fairly early on and then tail off: kids from a household earning $115 000 a year and kids from an even wealthier household earning $230 000 a year have both (one assumes) escaped most of the obstacles facing poor kids. Second, the increase logically has to stop at some point. There’s an upper end to the scale: the point at which every student gets 100% and nobody fails.
But the Fraser Institute disagrees. In fact, its model has the scale climbing at the same rate all the way up the income ladder. By the socioeconomic predictor, the “worst” school in the province is Elbow Park public school. Its average parental income was almost $700 000 a year, so the Fraser Institute says its students should have earned an implausible 17.5 out of 10. They didn’t, so the school received a large negative score: -8.8. The top three schools in my sample, and indeed every school where parental income tops $300 000 (not many schools, I acknowledge), literally can’t measure up on the Fraser Institute report card, because their predicted scores all exceed a perfect 10.
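Using the same back-of-envelope line, you can ask at what income the predicted score crosses a perfect 10, i.e., the point past which a school cannot possibly meet its prediction. With my fitted coefficients (again, not the Institute’s published ones) it lands around $284 000, which fits my observation that every school over $300 000 is doomed to “underperform”:

```python
# The same implied line as before: score = intercept + slope * income.
x1, y1 = 12_500, 4.7
x2, y2 = 115_000, 6.7
slope = (y2 - y1) / (x2 - x1)
intercept = y1 - slope * x1

# Income at which the predicted score reaches a perfect 10 out of 10.
impossible_income = (10 - intercept) / slope
print(f"${impossible_income:,.0f}")  # roughly $284,000
```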
Going down one rung, from the super-rich to the highly affluent, we arrive at the next problem. Recall that, because of the way the predictions were made, a disproportionate number of the underperforming schools turned out to be poor ones. The reverse is true at the upper income level. There are 63 schools where parental income topped $100 000 per year. Of those, only 14 underperformed, or 22%. So a poor school was roughly three times as likely as a rich one to miss its Fraser Institute socioeconomic standard. That suggests, once again, that the Fraser Institute’s attempt at a socioeconomic predictor is highly misleading.
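Checking the two miss rates from my tallies:

```python
poor_total, poor_missed = 40, 27   # schools with average income <= $50 000
rich_total, rich_missed = 63, 14   # schools with average income > $100 000

poor_rate = poor_missed / poor_total   # 0.675
rich_rate = rich_missed / rich_total   # about 0.222

print(f"poor-school miss rate: {poor_rate:.0%}")  # 68%
print(f"rich-school miss rate: {rich_rate:.0%}")  # 22%
```

By these counts, a poor school was about three times as likely as a rich one to fall short of its prediction.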
There are real consequences to fudging the socioeconomic numbers like this. Poor schools, which are almost exclusively public schools, will appear to do worse than they should have, because they are being held to an unfairly high standard. Wealthy schools, which are more often private schools, will appear to do much better than they should have, because they are being held to an unfairly low standard. Given that the Fraser Institute is known to favour private schools, and that one co-author even administers a financial aid fund for private schools, it’s hard to believe this result is purely accidental.
Now, I could certainly accept that first author Peter Cowley doesn’t really understand what his own method does here. He’s only a professional marketer, after all, with no real experience in either advanced research or education. Michael Thomas has an undergraduate degree in political science and probably took a stats course at some point, though he may not have been paying attention. Stephen Easton, though, is a professional economist and a Simon Fraser University professor. It’s stupefying to think he didn’t realize the attempt at a socioeconomic status predictor was a failure.
Next time, I’d like to take another look at the same data set and see if I can suggest a more realistic way of viewing the Fraser Institute’s school report card results.