There are certainly some good reasons for
some people to take the U.S. News college rankings seriously. Presidents of
schools that went up a notch or two can trumpet the fact to their trustees
while noting modestly, of course, that “we don’t really pay them any heed.” But
if you are a college-bound student or the parent of one, there are lots of
reasons not to give them any
credence. As a starter, and in the spirit of my editorial friends at U.S. News,
here are my Top 10:
1. U.S. News asks the wrong question.
The question is not what is the “best” school by some abstract standard, but
what is the best school for you. Harvard
is a great place for lots of super-bright students, but if at the age of 17 you
could still benefit from a bit of stroking, try Swarthmore, Carleton or Pomona.
2. They only look at
inputs. U.S.
News asks colleges about the resources they enjoy, including the academic
quality of the students they admit. But the formula pays no attention to
results. Does the college do a good job of educating its students? The rankings
have no way of saying.
3. They don’t tell
you anything really important. When you get to the point of actually
deciding among several schools, you want to know about things like whether
faculty care about teaching, how competitive the academic climate is and
whether you are likely to be comfortable with the kind of students each school attracts.
Don’t look to the rankings for any help here.
4. The rankings are
really measures of institutional wealth. By emphasizing factors such as
endowment per student and faculty resources, the formula favors private
schools, especially smaller ones, with big endowments. If you think that
institutional wealth automatically translates into academic quality, I have a football
stadium that you might like to buy.
5. They are biased
against public universities. U.S. News used to have a good mix of public
and private institutions in its lists of top schools, but over the years the
proportion of publics, which educate more than 75 percent of college students,
has declined. Are we really to believe, as the rankings released today suggest,
that there are no publics among the
top 20 national universities in this country? Of course, one reason for the
anti-public bias is obvious (see #4).
6. Statistical
differences are trivial. By listing schools in one-two-three order, U.S. News
creates an illusion of precision. Even if you think that the rankings provide
useful information, the actual differences between College #7 and College #19
are trivial.
7. Year-to-year
changes are meaningless. It would be nice if colleges and universities
could improve dramatically from one year to another, but the culture of Academe
doesn’t work that way. Take California Institute of Technology, ranked #10 this
year. In 1989 its ranking soared from 21st to 3rd. Then, after cruising in 9th place
for a few years, it jumped to #1, but then fell back to 4th place.
Does Caltech change its institutional spots this frequently?
8. Colleges game the
system.
U.S. News claims that colleges can’t improve their rankings by tactics such as
getting more alumni to contribute to the Annual Fund. But this doesn’t stop
colleges from trying. Within the last year, two quality schools, Claremont
McKenna College and Emory University, acknowledged submitting inflated data in
areas such as test scores and GPAs. Several years ago Baylor even paid entering
freshmen hundreds of dollars to retake the SAT and boost their scores.
9. U.S. News also games
the system.
If you knew that the list of top schools would be the same year after year, why
would you buy the magazine? So U.S. News tweaks the formula each year (sometimes
for very good reasons), which guarantees some churn in the rankings.
10. Who’s your Daddy?
U.S.
News actually does two separate things. First, it presents a huge amount of
data about lots of schools, much of which can be quite useful. For example, it
allows you to compare the SAT ranges or relative selectivity of a handful of
schools in which you are interested. Second, the editors make judgments about
the relative importance of each of these numbers and build those judgments into
a formula. But why should you accept the value judgments of a bunch of editors
sitting in Washington, DC? You can take the numbers, assign your own weights and
devise your own rankings.
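For anyone inclined to try, here is a minimal sketch of what "devise your own rankings" could look like. The schools, statistics and weights below are invented purely for illustration; the point is that the value judgments are yours, not an editor's.

```python
# Toy "build your own rankings" sketch. All names, figures and weights are
# hypothetical; substitute the published numbers you actually care about.
schools = {
    "College A": {"grad_rate": 0.93, "small_classes": 0.70, "net_price": 18000},
    "College B": {"grad_rate": 0.88, "small_classes": 0.55, "net_price": 12000},
    "College C": {"grad_rate": 0.91, "small_classes": 0.62, "net_price": 9000},
}

# Your priorities, expressed as weights that sum to 1.0.
weights = {"grad_rate": 0.5, "small_classes": 0.2, "net_price": 0.3}

def normalize(values, higher_is_better=True):
    """Scale a column of numbers to 0..1 so different units can be combined."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    scaled = [(v - lo) / span for v in values]
    return scaled if higher_is_better else [1.0 - s for s in scaled]

names = list(schools)
scores = {name: 0.0 for name in names}
for metric, weight in weights.items():
    higher_is_better = metric != "net_price"  # a lower price is better
    column = [schools[n][metric] for n in names]
    for name, scaled in zip(names, normalize(column, higher_is_better)):
        scores[name] += weight * scaled

# Print your personal ranking, best score first.
for rank, (name, score) in enumerate(sorted(scores.items(), key=lambda kv: -kv[1]), 1):
    print(f"{rank}. {name}  ({score:.2f})")
```

Change the weights and the ordering changes with them, which is exactly the point: the "rankings" are only as authoritative as the judgments baked into the formula.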