January 27, 2008

Behind the Scenes of Rate My Professor

Posted by Michael Arnzen at 11:35 in Praxis and Reviews.

As I mentioned in my first entry after returning from hiatus, RateMyProfessors.com has grown since I first looked at it a few years ago, particularly in the ways professors can interact and respond to student comments. Inspired by the video responses from teachers, I decided to join the site as a professorial member, and since I'm guessing other profs out there rarely join it (or probably only access it anonymously once in a while to read their own ratings or those of their colleagues), I thought I'd open the curtain a little so you can see what it's like once you sign up. Consider this a website review rather than an endorsement or direct encouragement to join.

In fact, you might not want to encourage the site by giving it a hit to begin with. If you haven't seen Rate My Professors, it is an independent website where college students can post comments anonymously (virtually without responsibility, save for community enforcement of the rules). These students fill out forms that "rank" their professors on such criteria as level of difficulty...and "hotness." Indeed, besides "highest ranked" charts for schools and teachers, the site sports a master chart of the "50 hottest professors" on its front page, which probably tells you all you need to know about the site's academic legitimacy.

If it doesn't, a good overview of this issue appears in Christine Lagorio's article "Hot for Teacher," published in the Village Voice in January 2006 -- a highly recommended read that brilliantly compares RMP and other websites of its ilk to "the slosh of a giant virtual spitball smacking the ivory tower" while reminding us that there may be some merit in the site's purpose.

Terry Caesar, in his lucid IHE article on the significance of the site in the landscape of higher ed, is also enlightening, comparing it to American Idol and musing over the consequences.

In my reading of the site, students mostly use it to recommend their favorite teachers and advisors, often with hyperbolic-yet-kind praise. Even so, a great number of professors have railed against the anonymous postings, which leave students free to virtually libel a professor (or at least bias others against taking their classes, soiling reputations in the process) without accountability, and to post comments and ratings entirely outside the context of the usual "course evaluation," where such feedback might actually help the teacher review and revise the class. In other words, the site seems geared more toward personality and popularity points than anything related to learning. Some profs have gone so far as to retaliate by rating their students in like fashion, as the fascinating blog rateyourstudents makes clear. That may be going too far (or maybe even sinking to the sophomoric level of the students on RMP), because Rate My Professors does allow visitors to "flag" inappropriate postings...and now allows profs to "rebut" them. By the same token, though, unless a professor visits the site and does these things herself, it is unlikely that a student will police any professor's profile.

So whether you're a tenured college teacher, grad student instructor, or adjunct, you might want to join the site and keep an eye on what people are saying about you.

Indeed, as Towson University professors James Otto, Douglas A. Sanford Jr., and Douglas N. Ross pointed out in "Does Ratemyprofessor.com Really Rate My Professor?" -- a thorough empirical analysis of the site that appeared in the October 2007 issue of Assessment & Evaluation in Higher Education -- the numbers on this site may actually correlate statistically with teaching performance, despite the occasional flames of student outburst that call them into question.

So perhaps RMP and others of its ilk -- Professorperformance.com, Pickaprof.com, or Studentsreview.com -- might have some merit. If you're interested in joining, what follows is a preview of what you'll find. I'll share my opinions and warnings along the way, and conclude with some passing ideas about how this might be turned into something teachable or work for faculty self-development.

I was happy to learn that the site runs a verification process to make sure that someone who claims to be a prof is actually employed by an institution before allowing them to join as one. This gives ratemyprofessors.com a modicum of credibility. They asked for my phone number and warned that it might take a few days to verify me; perhaps they heard my voice mail, I'm not sure. I don't know exactly how they confirmed my identity, but my info is online on this page as well as in the faculty directory on the Seton Hill University website, so I'm sure it wasn't too hard. Regardless, it took a few days, so I'm assuming it was a real verification, either by phone or online. I was ultimately confirmed via an e-mail message that asked me to sign in from my campus e-mail account -- the same way many e-mail lists verify their subscribers to prevent spam. So the free enrollment process seems to function well at preventing anonymous students from posing as teachers -- something other social networking sites could probably work harder to do.

After joining, being verified as a prof, and signing in for the first time, the site asks you to fill in the typical "personal profile" questionnaire, with a few questions that seem almost too personal for the purposes of this site. The professor's sign-in page is really the same as the student sign-in page, so perhaps that accounts for the personality questions; I had to hunt for a button that said "I'm a Professor" to bypass some of the opening screens. But clearly there's more going on here than just evaluating teachers. Like most social networking sites, there's a degree of demographic harvesting at work. I kept my answers pared down to the bare minimum, plus a few outright falsehoods. (I don't think students need to know my birthday, for example...but I'm sure it helps MTVu's marketers understand the age of their aggregate users. Still: isn't it obvious that most users of RMP are likely 18-24 -- i.e., college-aged?! That demographic is built into the very concept of the site! No matter: as far as they're concerned, I am a 94-year-old professor whose birthday falls on Xmas day. If I start getting geriatric foot powder spam or gift cards to senior discount drug stores, I'll know something's amiss!)

We all know that websites collect information and have privacy policies that anyone signing up should read first, and RMP's is worth reviewing before you enroll. But I raise the matter because, browsing around the site, I noticed something interesting: when students rate professors, they input a lot more information about the class than what you see on the main list of a teacher's profile. For example, students are asked to enter whether attendance was mandatory and what grade they got in the course. Where does this info go? How is it used? You don't see it on the public pages of the site. We know whether students think teachers are "hot," but we don't know whether they took attendance? That's odd. This info may go into a screen that no one besides the student sees -- I'm not sure, because I didn't actually rate anyone; I just went to the first page that opens when you do so. In any case, I think that course info should be reported to the public, not kept private, because it might help readers interpret the student ratings.

One part of the rating/evaluation form that raised my eyebrow was the question about textbooks, which asked students to rank "Textbook Use" on a scale from 1 to 5 -- and then also asked for an ISBN. Hmmm...are they sharing this information with book publishers or online booksellers? I'm not sure, because, again, this isn't reported on the public screens along with the course ratings, and it isn't clear why they're asking for it. Regardless, I seriously doubt many students look up the ISBNs of their books when rating and commenting on professors' classes, so maybe it's a moot point.

There's also a place on this rating screen to mark whether a prof is "still teaching" or "retired." I find this odd and wonder why it's there, because I question how many students know a professor's employment status if they're writing ratings about past classes out of nostalgia or long-term grudges. Instead, the form should ask for the "year taken" or something like that.

I really don't know where all this info goes or what it signifies, but there's more going on in the ratings than meets the public eye. And given the advertising everywhere on the site (from credit card banners on the left to deceptive text-only sponsored links at the bottom of the page), it's fair to infer a commercial interest in some of this information.

Though it didn't work for me for some reason, the site promises that you can subscribe to your own page on ratemyprofessors.com as an RSS feed. This might be the best way to go if you want to keep up with new postings about your work but don't want to succumb to the lure of checking your ratings as obsessively as some writers I know check their amazon.com sales rankings. Plus it will keep you away from browsing your college on the site, where the temptation to read your colleagues' rankings is really quite strong. You probably shouldn't do that, especially if a time may come when you are in a position to evaluate a colleague for promotion or tenure. A little empathy goes a long way here: just as you probably wish your own rankings and comments had more context, if you read a colleague's info, you are doing so out of context and shouldn't leap to any particular conclusions. Sometimes the best teachers get the worst ratings, simply because they are challenging. Any given sampling of students on an RMP entry is probably not representative of the entire class -- it is simply a collection of rankings by those who use RMP, which is not necessarily a random sample of the student population.
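For the technically inclined: even when a site's built-in subscription button is flaky, any RSS 2.0 feed is just XML, so you can pull the item titles out yourself with a few lines of Python. This is only a sketch under assumptions -- the sample feed below is invented for illustration, and a real check would first fetch your profile's feed URL (whatever address the site actually exposes) with urllib.request before parsing:

```python
# Minimal sketch: extract item titles from an RSS 2.0 feed.
# SAMPLE_FEED is a made-up stand-in for whatever XML the site serves.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """\
<rss version="2.0">
  <channel>
    <title>Ratings for Prof. Example</title>
    <item><title>New rating posted Feb 1</title></item>
    <item><title>New rating posted Jan 15</title></item>
  </channel>
</rss>"""

def item_titles(feed_xml: str) -> list[str]:
    """Return the title of every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title", default="") for item in root.iter("item")]

if __name__ == "__main__":
    for title in item_titles(SAMPLE_FEED):
        print(title)
```

Run from a weekly cron job, something like this would tell you about new postings without ever tempting you to browse your colleagues' pages.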

With over 6,000 schools from five nations, some 1 million profs, and 6 million opinions listed about them, RMP is clearly a "hot" site with a lot of content and data. The student opinions are often genuinely felt, even if they are sometimes irresponsible, hostile, or rife with empty praise. On the site's own criteria, and using a 4.0 grading scale, I give RMP a 3.5 for ease of use, a 2.5 for helpfulness, a 3 for clarity, and a 3 for rater interest. Teaching is not a popularity contest, but if you are interested in student feedback on your own teaching, this is but one of many ways to look for it. I caution you against rebutting, because this could encourage future student raters to bait you just to see what you'll say next.

Of course, you can and should still get "anonymous" feedback from your students by passing out a handout or doing your own evaluations in class. That's a better way to go, because such evaluations occur within a specific context and along a direct line of communication between teacher and student, rather than student and student. You could easily borrow the criteria from RMP, make your own in-class handout, and -- something I would think is best -- have a class discussion about these things. In the classroom, I think it is important to separate evaluation from the politics of judgment whenever possible, and instead to turn evaluation into a method of inquiry -- an inquiry into both the subject being evaluated and the criteria used to evaluate it. In our "American Idol" culture, this understanding and skill might be more imperative than ever to teach.

We can only learn from engaging in such evaluative inquiry; rating alone is just snap judgment.



Just spotted another "college review" site that I thought I'd post a link to here -- it's called Communiversity... a really cool word that probably should be used in some other contexts, as well. (Like "strategery" ;-) )...

Here's the link to SHU's page (empty as of this posting):


Posted by Mike Arnzen at 16:02 on February 23, 2008. #

Some professors just aren't any good, and other than knowing every student who took a class from that professor, there are no real alternatives. The school gets responses from students about what they think of the professor but does not release this information, nor does it collect data from students who withdraw, drop out, or switch classes. We pay a lot to go to school and deserve to get fair value out of our investment. I had a choice of 3 professors for a summer trig class; [One prof]'s class fit my schedule best, but with 35 ratings on a scale of 1 to 5 he had an easiness of 1.2 and a helpfulness of 1.6. I read all of the reviews, and it seems he is rather mean, openly insults students who ask questions in class, and doesn't give adequate time to take tests; all of which were reported several times. So I ended up taking an evening class from another professor.

Posted by Anime Master at 14:24 on April 2, 2008. #

You will probably score very high on this site. It's the teachers who do not care that will perpetually score low. Obviously, you took the time to write out this big huge diatribe, so you DO care!

Posted by Satan at 15:15 on May 6, 2008. #
