RateMyProfessors.com

From Wikipedia, the free encyclopedia

A screenshot from www.ratemyprofessors.com

RateMyProfessors.com (RMP) is a review site that allows college and university students to anonymously rate professors at American, Canadian, British, New Zealand, and Australian institutions. It was founded in May 1999 by John Swapceinski, a software engineer from Menlo Park, California, and contains more than six million ratings of more than half a million professors. The site was originally launched as TeacherRatings.com and was renamed RateMyProfessors in 2001.

All ratings are intended to be anonymous, and instructors have no input into whether or how they are posted. According to the site's privacy policy, RateMyProfessors.com will reveal a user's personal information only in response to a court order or a subpoena.[1]

In November 2005, RateMyProfessors was sold to Patrick Nagle and William DeSantis.[2] The site was sold again in January 2007, this time to MTVu, a subsidiary of Viacom.


Ratings and reviews on RateMyProfessors

Anyone who visits the website with cookies enabled may post a rating and review of any professor already listed on the site, and may create a listing for any individual not already listed. For a rating to be posted, the rater must score the course and/or professor on a 1-5 scale in the following categories: "easiness", "helpfulness", "clarity", the rater's "interest" in the class prior to taking it, and the degree of "textbook use" in the course. The rater may also rate the professor's "appearance" as "hot" or "not", and may include "comments" of up to 350 characters in length.
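
The posting rules just described can be sketched as a simple validation check. This is only an illustration of the constraints the article lists (fixed 1-5 categories, an optional hot/not vote, a 350-character comment limit); the function and constant names are invented for the example and are not the site's actual code.

```python
# Rating categories the site requires, per the description above.
REQUIRED_CATEGORIES = {"easiness", "helpfulness", "clarity",
                       "interest", "textbook use"}

def is_valid_rating(scores, appearance=None, comment=""):
    """Check a submission against the posting rules: every required
    category scored as an integer from 1 to 5, appearance (if given)
    either "hot" or "not", and the comment at most 350 characters."""
    if set(scores) != REQUIRED_CATEGORIES:
        return False
    if not all(isinstance(s, int) and 1 <= s <= 5 for s in scores.values()):
        return False
    if appearance not in (None, "hot", "not"):
        return False
    return len(comment) <= 350
```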

According to the website's FAQ page, "The Overall Quality rating [that the professor ends up with] is the average of a teacher's Helpfulness and Clarity ratings...." The Overall Quality rating determines whether the professor's name, on the list of professors, is accompanied by a smiling face (meaning "Good Quality"), a frowning face ("Poor Quality"), or an in-between, expressionless face ("Average Quality"). A professor's name is accompanied by a chili pepper icon if the sum of his or her appearance ratings is greater than zero (one "hot" rating counts as +1, one "not" as −1).
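
The two scoring rules quoted above amount to a few lines of arithmetic, sketched here in Python. This illustrates the calculation the FAQ describes, not the site's actual code, and the function names are invented for the example.

```python
def overall_quality(helpfulness_ratings, clarity_ratings):
    """Overall Quality per the site's FAQ: the average of a
    teacher's Helpfulness and Clarity ratings."""
    helpfulness = sum(helpfulness_ratings) / len(helpfulness_ratings)
    clarity = sum(clarity_ratings) / len(clarity_ratings)
    return (helpfulness + clarity) / 2

def has_chili_pepper(appearance_votes):
    """The chili pepper icon appears when the sum of appearance
    ratings is greater than zero: each "hot" vote counts +1,
    each "not" vote counts -1."""
    total = sum(1 if vote == "hot" else -1 for vote in appearance_votes)
    return total > 0
```

For example, two "hot" votes and one "not" vote sum to +1 and earn the pepper, while one "hot" cancelled by one "not" sums to zero and does not.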

Michael Hussey, who helped design RateMyProfessors, sums up its purpose: "All we're doing is taking chatter that may be in the lunchroom or the dorm room and organizing it so it can be used by students."[3]

Although one RateMyProfessors CEO, Patrick Nagle, told Christine Lagorio of the Village Voice, "we are for students, by students," he also justified the site's existence by explaining: "This is to enhance your college experience so you don't end up taking classes that aren't worthwhile."[4]

Professors' rebuttals

Since MTVu took over the website, RateMyProfessors.com has added a "new rebuttal feature which allows professors to write blog-like comments. These comments will appear in a new window that will link from here [the 'User Comments' section of the professor's page on RMP] once a rebuttal has been added."[5] Professors must register with the website using an ".edu" e-mail address in order to post rebuttals. The site also links to a new website called "Professors Strike Back", which features videos of professors responding to specific ratings they received on RateMyProfessors.[6]

Criticism

How accurate are the ratings?

  • Perhaps the main criticism of RMP is that there is little reason to think the ratings accurately reflect the quality of the professors rated.[7][8] For one thing, there may be no good reason to think that raters are even attempting to represent the truth in their ratings or comments. Even when they are, there may be insufficient reason to think that the facts raters have in mind are an accurate measure of a professor's quality. RMP's format seems to presume that raters are all well-informed, competent, and impartial judges of the quality of teaching in general, of individual teachers, and even of the "quality" of their appearance. More importantly, the range of areas on which ratings are allowed is narrow: clarity and helpfulness may not be the only, or even the major, components of quality instruction,[9] and in some cases high-quality instruction may be given even when those two qualities are lacking.[10] Edward Nuhfer says that both Pickaprof.com and RMP "are transparently obvious in their advocacy that describes a 'good teacher' as an easy grader. ... Presenter Phil Abrami...rated the latter as 'The worst evaluation I've seen' during a panel discussion on student evaluations at the 2005 annual AERA meeting."[11] A study of RMP ratings conducted by James Felton found that "the hotter and easier professors are, the more likely they'll get rated as a good teacher."[12]

Who's doing the rating?

  • Some have criticized the passive method of data collection. Clearly there are few, if any, cases where a professor listed on RMP has been rated by every one of his/her students, or by every student in a single course, or even by most of the students in a single course. It may therefore be doubted whether the ratings represent more than the opinions of a tiny minority. Furthermore, due to a phenomenon known as "selection bias", it may be that students who rate a professor have very strong feelings (positive or negative), while students who do not have strong feelings will tend not to bother rating at all. RMP itself admits that the ratings are "not really" statistically valid.[13]
  • Also, studies of research methodology have shown that in formats where people are able to post opinions publicly, group polarization often occurs, so that a given instructor will often receive very positive comments, very negative comments, and little in between; those who would have been in the middle are either silent or pulled to one extreme or the other. So the ratings may not represent a true consensus.
  • Another potential problem is that a single individual may post multiple separate ratings of a single professor. While RMP states[14] that it does not allow multiple ratings from any one IP address, it has no control over raters who use several different computers or who "spoof" IP addresses, and in practice it polices ratings only minimally, frequently allowing multiple ratings from even a single IP address.[15] There is also no way of knowing whether those who rate a professor's course have actually taken the course in question, making it possible for professors to rate themselves and each other,[16] and for posters to rate professors based purely on hearsay. As recently as May 2006, the FAQ page on RMP itself said:
Who can rate? Is it limited to students? We prefer that you only rate teachers you have first-hand knowledge of. However, it is not possible for us to verify which raters had which teachers, so always take the ratings with a grain of salt. Remember, we have no way of knowing who is doing the rating — students, the teacher, other teachers, parents, dogs, cats, etc.[17]

How relevant are raters' comments?

  • Another complaint about RMP is that, unlike other methods of rating teacher performance, it gives most teachers no feedback they consider helpful.[18] Rather, they say it leads only to the harassment or denigration of particular instructors.[19][20] Critics state that many of the ratings focus on qualities irrelevant to teaching, such as physical appearance. Furthermore, whereas anyone hired as an instructor usually has a great deal of experience in and knowledge of the field in question, students generally do not, and so are rarely in a position to make well-informed criticisms of the teaching they receive.[21]
  • Also overlooked is that faculty at universities and colleges (especially junior faculty) are commonly called on by their departments to teach courses outside their area(s) of expertise, which can earn them poor ratings on RMP that do not reflect their ability to teach the subjects they are better qualified to teach. Although RMP lets students identify the course they took with a professor, it lumps together the ratings for all courses taught by that professor instead of providing a separate ratings average for each course.

Missing the point of education?

  • Instructors have voiced concern that RMP and similar rating methods are counterproductive to the educational process: they minimize the importance of quality instruction and instead encourage students to see themselves as consumers filling out product-satisfaction surveys, a disheartening view of student values.[22] This sentiment has been articulated by Mark Edmundson[23] of the University of Virginia, among others, in what has been described as an influential essay[24] on the topic of course evaluations.

References

  1. ^ "Privacy Policy", RateMyProfessors.com.
  2. ^ John Swapceinski, announcement of the sale of RateMyProfessors.com.
  3. ^ "Everybody's a Critic", The New York Times, 23 April 2006.
  4. ^ Christine Lagorio, "Hot for Teacher: Students Rate Profs Online -- and Vice Versa", The Village Voice, 11 January 2006. Retrieved 11 January 2006.
  5. ^ ratemyprofessors.com, accessed 19 December 2007.
  6. ^ "Professors Strike Back", http://www.mtvu.com/professors_strike_back/
  7. ^ Sacha Pfeiffer, "Ratings sites flourish behind a veil of anonymity", The Boston Globe, 20 September 2006.
  8. ^ Kenneth Westhues, "Stephen Berman: Scapegoat", December 2006.
  9. ^ James M. Lang, "RateMyBuns.com", The Chronicle of Higher Education, 1 December 2003.
  10. ^ See Fritz Machlup and T. Wilson, cited in Paul Trout, "Deconstructing an Evaluation Form", The Montana Professor, vol. 8, no. 3, Fall 1998, accessed 7 May 2008.
  11. ^ Edward B. Nuhfer, "A Fractal Thinker Looks at Student Evaluations", 2005, accessed 10 May 2008.
  12. ^ David Epstein, "'Hotness' and Quality", Inside Higher Ed, 8 May 2006, accessed 10 May 2008.
  13. ^ www.ratemyprofessors.com/faq.jsp, accessed 16 May 2007.
  14. ^ Pfeiffer, "Ratings sites flourish behind a veil of anonymity".
  15. ^ Westhues, "Stephen Berman: Scapegoat".
  16. ^ Gabriela Montell, "The Art of the Bogus Rating", The Chronicle of Higher Education, 27 September 2006.
  17. ^ www.ratemyprofessors.com/faq.jsp, 18 May 2006, archived at http://www.archive.org/web/web.php; also quoted in Westhues, "Stephen Berman: Scapegoat".
  18. ^ Lang, "RateMyBuns.com".
  19. ^ Steve Giegerich (Associated Press), "Web warnings: Sites extol, slam professors", February 2003, CNN.com.
  20. ^ David Epstein, "Rate Your Students", January 2006, Inside Higher Ed.
  21. ^ Robert Sproule, "Student Evaluation of Teaching", Education Policy Analysis Archives, vol. 8, no. 50, 2 November 2000, accessed 7 May 2008.
  22. ^ "The Hottest Professor on Campus".
  23. ^ Mark Edmundson, "On the Uses of a Liberal Education: As Lite Entertainment for Bored College Students", Harper's Magazine, vol. 295, September 1997, pp. 39-49, accessed 30 May 2008.
  24. ^ Book Reviews, Averett Library News, Winter 2005.
