Americans are competitive people and therefore obsessed with rankings, even geeky ones. Higher-ranked colleges attract better students and more grants. Rankings might even push administrators to focus more resources on mere undergraduate teaching, a mission long ghettoized in research universities.
So how can we make college rankings better? Rankings do a decent job of reporting a school's resources and student candlepower. What existing rankings fail to do is capture whether a college or university takes seriously its charge of developing young minds. As Stanford social psychologist Carol Dweck explains in Mindset, one can view intelligence as a variable or a constant. If intelligence is something you are born with, then there is no point working to get smart: you either have brains or you don't. Unfortunately, this seems to be the mindset of many universities, which work hard to attract smart kids for prestige and then do little or nothing to make them smarter.
Like many parents, I want to send my children to a university that will challenge them academically. Yet too often higher education does no such thing. After all, faculty time spent grading undergraduate papers is time not spent publishing and applying for grants, the sort of prestige-seeking behavior rewarded in the rankings.
Indeed, college rankings may actually harm undergraduate education. To raise their rankings, which reflect graduation rates, some colleges have slashed requirements and pushed faculty to pass anyone with a pulse. As anyone who has been around any state university knows, athletes are not the only students taking no-show courses.
Accordingly, I would keep U.S. News and other such rankings, but add to them a new set of numbers, Maranto's Measures, calculated from four items:
*As Thomas Sowell pointed out in his classic Choosing a College (available free online at http://www.leaderu.com/alumni/sowell-choosing/toc.html), one measure of college success is the percentage of alumni who go on to earn doctorates. This is not to say that you would want your kid to get a Ph.D.; merely that those who do probably gained considerable intellectual preparation and inspiration from their undergraduate days. As Sowell notes, many unheralded colleges like Wabash and Wooster had more than 7% of their alumni earn doctorates, something not attained by any of the Ivies despite their more talented student bodies.
*A second good measure is whether the faculty is politically diverse. Particularly in the social sciences and humanities, if faculty and guest speakers all or nearly all lean left, students might not be exposed to a wide range of ideas, harming their education. (This is an occasional theme of this publication.) Of course, faculty ideology can be hard to pinpoint, so this should be a job for organizations like NAS, ISI, and the Manhattan Institute, which could do a public service by tracking and perhaps captive breeding the remaining academic Republicans. (To this end, when asked for alumni contributions from my undergraduate and doctoral institutions, as happens on a regular basis, I always promise to write a check as soon as the political science department hires a Republican faculty member. So far my money has been safe.)
*Third, Minding the Campus, the U.S. Department of Education, or somebody needs to start surveying students to see how often their faculty assign papers, and whether faculty provide feedback on said papers. As Richard Arum and Josipa Roksa point out in Academically Adrift, evidence suggests that at most large campuses the answers are "rarely" and "never."
*Finally, the ultimate test. As a professor, I often ask faculty members a simple question: would you send your own child to study at the college where you teach? Of course, that may be a trick question, since faculty know which of their peers to seek out and which to avoid for a real education. Still, a large, confidential survey would give a good sense of what one would need to know.
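To make the idea concrete, here is a minimal sketch of how the four items might be rolled into one number. Everything in it is hypothetical: the article proposes no weighting scheme, so this simply normalizes each measure to a 0-1 scale and takes an equal-weight average, with invented field names and sample data.

```python
# Hypothetical sketch of a Maranto's Measures composite score.
# Field names, weights, and sample data are invented for illustration;
# the article itself specifies no formula.

def normalize(value, lo, hi):
    """Scale a raw value onto [0, 1], clamping out-of-range inputs."""
    if hi == lo:
        return 0.0
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def composite_score(school):
    """Equal-weight average of four measures, each normalized to [0, 1]."""
    parts = [
        # Share of alumni earning doctorates; 10% treated as a ceiling,
        # since Sowell's standouts cleared 7%.
        normalize(school["phd_rate"], 0.0, 0.10),
        # Faculty political diversity, assumed pre-scaled to [0, 1].
        school["faculty_political_diversity"],
        # Share of courses assigning papers with faculty feedback.
        school["papers_with_feedback"],
        # Share of faculty who would send their own child to their school.
        school["faculty_would_send_own_child"],
    ]
    return sum(parts) / len(parts)

# Example with made-up numbers for a hypothetical small college.
example_college = {
    "phd_rate": 0.07,
    "faculty_political_diversity": 0.4,
    "papers_with_feedback": 0.8,
    "faculty_would_send_own_child": 0.9,
}
print(round(composite_score(example_college), 3))  # prints 0.7
```

An equal-weight average is only one choice; a real ranking would need to justify its weights and, as noted below, guard against schools gaming whichever measures become high-stakes.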
Interestingly, the four elements of Maranto’s Measures are never included in any college ranking, nor in any accreditation instrument of which I’m aware. Yet each gets at the core of what makes a college or university a teacher of men and women. Maybe that is why they are not included.
Robert,
I actually wanted to praise you folks, but lacked the space. The Washington Monthly rankings are quite useful, just as I would expect from the magazine founded by Charles Peters. I've toyed with the idea of doing some simple number crunching to test the proposition that HBCUs and Catholic colleges and universities do particularly well on your measures, which ask what a college does for the country.
Bob Maranto
My name is Robert Kelchen and I'm the methodologist for Washington Monthly magazine's annual college rankings (http://www.washingtonmonthly.com/college_guide/about_the_rankings.php). We use the first measure (the percentage of alumni who go on to attain doctorates) in our rankings, and have done so for years.
I’d love to have data on the other measures, although it would be difficult to get objective data if rankings became high-stakes. (Maybe surveying faculty to see where their kids actually went would help with the final question?)