(New York Times) By Frank Bruni
Willard Dix is one of the crankiest observers of the college admissions process I know; he’s also one of the smartest. He worked at Amherst, his alma mater, then advised college-bound students at a private secondary school in Chicago. He now blogs about higher education.
I asked him on the phone the other day about the dizzying proliferation of college rankings beyond those by U.S. News & World Report, each using its own methodology and emphasizing different metrics. If a tone of voice can approximate an eye roll, his did.
“You can slice and dice it any way you like, but this isn’t like Consumer Reports, which tests something to see if it does or doesn’t work,” he said. “The interaction between a student and an institution is not the same as the interaction between a student and a refrigerator.”
I can’t improve on that quip. But I can explain it in terms of what rankings do and don’t reveal and how high school seniors, who are right now in the thick of figuring out where they want to apply, should approach them.
The Wall Street Journal's new ranking joined a jammed field of players, including The Economist, Forbes, and, yes, this newspaper, whose College Access Index looks narrowly at which of the country's top schools seem to be the most socioeconomically diverse.
Inasmuch as all of these rankings rely on, and compile, objective information about the schools they examine, they’re useful. But all of them also make subjective value judgments about what’s most important in higher education, and those judgments may or may not dovetail with a student’s interests. It’s crucial to look at precisely what’s being measured — which is easy to do, if you read the fine print.
Some don’t really try that hard to get at the question of how satisfied a school’s students are. Others do, but take varying routes to the answer. Some look in meaningful ways at diversity, which can greatly influence campus life and classroom discussions and says something about administrators’ priorities. Others don’t.
Over the last few years, there has been a movement toward ranking colleges in terms of how much money their graduates go on to make — something that U.S. News has never directly factored in but that The Wall Street Journal, The Economist, Forbes and Money Magazine, among others, do. My Times colleague James Stewart recently examined this development.
But here, too, there are necessary caveats. Graduates’ incomes probably have more to do with dynamics that precede college — their parents’ wealth, their childhood opportunities, their innate gifts — than with the particular seasoning of a given institution, and not all salary-oriented rankings pay careful attention to this.
The economist Jonathan Rothwell found a way to reward colleges whose graduates achieved more than their backgrounds might have predicted, with a set of “value-added” rankings that he produced for the Brookings Institution early last year. His inaugural list differed markedly from U.S. News’s, with Colgate University, Washington and Lee University, Clarkson University and Manhattan College appearing in the Top 10, above any Ivy League school. He later tweaked and adapted this list for a column in The Times by Stewart last October.
But there are also problems with these income-oriented approaches (beyond their implication that money equals contentment and success). One of the two principal sources for income figures is PayScale, a company that collects salary information. It relies on self-reported numbers from people who use its database, and is by no means a comprehensive, definitive survey.
The other source is the federal government’s College Scorecard, but its figures are only for people who received federal aid and reflect what they’re earning in the earliest years of their careers. Schools whose students move quickly into professions with high starting salaries fare better by this yardstick than do schools whose students choose careers that tend to develop slowly.
My larger point is this: For almost every well-intentioned measurement, there's either a fundamental shortcoming or a possible glitch. Take the Wall Street Journal rankings, which significantly factor in how a school's current students, in a survey, evaluate their experience.
This would seem to be — and perhaps is — an excellent idea. But in visiting colleges over time, I’ve noticed that the ones with the loftiest reputations sometimes marinate in their own mythology, sending students all sorts of messages about what an extraordinary opportunity they’re enjoying. This self-congratulation surely colors the survey responses, which may wind up saying as much about a school’s status as about anything else.
Rothwell, who is now at Gallup, conceded that even the best rankings were “deeply flawed.” “They don’t measure learning outcomes,” he told me, “and it seems to me that that’s probably the chief goal of higher education: to teach people.”
The best way to use rankings is to focus on discrete assessments that speak to distinct concerns. For instance, if you care about socioeconomic diversity, consider Washington Monthly’s rankings. They pay heed to that while also trying to determine how potent an agent of social mobility a school is and how broadly and deeply its students subscribe to an ethos of community service. Washington Monthly is judging institutions’ characters as much as their clout.
The ScholarMatcher, in its second year, is an interactive tool designed to show students from households with incomes of less than $50,000 which schools are most likely to be affordable and to stand them in good financial stead.
In an utterly different vein, the Heterodox Academy, a group of professors concerned about ideological diversity, has just begun rating schools on their apparent commitment to it. It ranks the University of Chicago highest among large institutions; Purdue is the runner-up.
But rankings cannot take into account, and thus ignore, the most consequential part of the equation, which isn’t some spell that a given school casts on a student but a student’s commitment, curiosity, daring. An obsession with rankings obscures and invariably minimizes this essential truth.
“We should not overlook the effort that it takes to be a serious student,” Janet Napolitano, the president of the University of California system, told me recently. She went to Santa Clara University not because of how it was ranked on some list but because it was her father’s alma mater, and California sounded cool to a girl who’d grown up in New Mexico. Once there, she studied hard, she recalled, and emerged as the school’s first female valedictorian.
“You get out of it what you put into it,” she said. I guess the same does apply to a refrigerator, but only if you’re talking about condiments.