While I applaud the Forbes authors for their attempt to look out for the common college applicant, their methodology is questionable. In an introduction to the Forbes list, the authors discuss the questions they sought to answer and also take a deliberate jab at the loved/hated US News & World Report rankings:
"To our way of thinking, a good college is one that meets student needs. While some college rankings are based partly on school reputation as evaluated by college administrators and on the amount of money spent, we focus on things which directly concern incoming students: Will my courses be interesting and rewarding? Will I get a good job after I graduate? Is it likely I will graduate in four years? Will I incur a ton of debt getting my degree?"
The research questions are actually effective counseling tools. College applicants should look at graduation rates, graduate outcomes, and course descriptions, among other evaluative criteria. Every college publishes an annual catalogue explaining major requirements, department missions, and course descriptions. These thick books are written for current college students selecting their upcoming courses. High school students interested in really learning about the culture of a college should start by reading the catalogue! Luckily, almost all colleges now publish the catalogue as a PDF accessible through the college's website. These course descriptions can offer great perspective on pedagogy, departmental structure, and expectations for students who take those classes.
But back to rankings bashing.... The Forbes methodology is flawed, to say the least. Here is a quotation from the Forbes article detailing how the rankings components were compiled:
"They based 25% of the rankings on 4 million student evaluations of courses and instructors, as recorded on the Web site RateMyProfessors.com. Another 25% is based on post-graduate success, equally determined by enrollment-adjusted entries in Who's Who in America, and by a new metric, the average salaries of graduates reported by Payscale.com. An additional 20% is based on the estimated average student debt after four years. One-sixth of the rankings are based on four-year college graduation rates--half of that is the actual graduation rate, the other half the gap between the average rate and a predicted rate based on characteristics of the school. The last component is based on the number of students or faculty, adjusted for enrollment, who have won nationally competitive awards like Rhodes Scholarships or Nobel Prizes. (Click here for the complete methodology.)"
I was a college student not too long ago, and I have visited the aforementioned websites. To say the least, these sites are neither statistically sound nor strictly regulated. Students who post on ratemyprofessors.com tend to come from the two poles of the spectrum: those with strong positive feelings about a professor and those with strong negative ones. That is not a statistically appropriate sample and cannot in any way represent the broader student population. As for the Who's Who list... what overachieving student hasn't received a letter in the mail asking for $40 in exchange for a copy of the book?! On the plus side, the final criterion listed (faculty who have won nationally recognized awards) is an admirable field of examination. It is certainly important for colleges to maintain a competitive edge by boasting an impressive faculty roster. It is more important, however, for students to find out whether those professors are accessible at the undergraduate level.
The Forbes report also raises broader questions about the entire ranking system. How does Duke University rank 8th in the US News & World Report (in 2008) but 109th in this week's Forbes list? This further supports my earlier point that rankings are in the eye of the beholder.