Students filling out end-of-term course evaluations during the next three weeks will be able to see their peers' reviews for the first time beginning next semester.
The Arts and Sciences' Educational Policy and Planning Committee has moved to open course evaluations, creating a standardized set of questions for students to answer at the end of the semester. Students' feedback will then be published and accessible through the online course planning guide Vergil—though faculty can choose to opt out of making their evaluations public for the next two years.
Students have called for open course evaluations for years. The University Senate passed a recommendation in 2012 in support of the initiative after the Student Affairs Committee produced a report recommending the implementation of open course evaluations. Only the School of Engineering and Applied Science currently publishes course evaluations.
Because course evaluations have not been public, students have instead relied on third-party websites, the Columbia Underground Listing of Professor Ability (CULPA) and Courses@CU, a site owned by the Spectator Publishing Company, to solicit their peers' advice in choosing classes.
But in calling for open course evaluations, some students have criticized CULPA, arguing that the site often provides outdated reviews and does not cover every course and professor.
"Open course evaluations promote a culture of transparency, accountability, and self-examination consistent with the highest ideals of scholarship; provide valuable information for students as they plan their academic careers; and signal to students that their opinions are taken seriously, leading to more thoughtful and higher quality feedback for instructors," the Student Affairs Committee's report said.
The EPPC began debating the issue in the fall of 2014 and voted in May to open evaluations. Currently, the committee is in the process of finalizing the wording of several standardized questions.
According to a proposal the committee created in May, four responses will always be published: what students have learned in the course, their overall assessment of the course, whether or not they would recommend the course to fellow students, and a comparison of the course to others the student has taken.
[Graphic by Ivy Chen]
Departments and individual instructors will be able to customize the evaluations by adding up to eight questions specific to their courses, according to EPPC chair and professor Brent Stockwell. However, the answers to those questions will not necessarily be made available to students.
"There is some information [the EPPC members] want to make available to students to assist them in selecting their courses, and there's another goal of the evaluations, which is to provide feedback to the instructors and the departments," Stockwell said. "And those are both valuable, kind of overlapping, but not identical things. … So the idea was to decide what is helpful to students, publish that and what's useful for feedback for the instructor, and give them that information."
Columbia College Dean of Academic Affairs Kathryn Yatrakis told Spectator that the questions for the evaluations were rewritten in a way that would lead to more effective feedback for both students and faculty.
"Our previous evaluation was just too long, too convoluted," Yatrakis said. "We started to think about the evaluation and think about reshaping it, so we're really asking students more about their academic, intellectual experience in the class."
There will be limitations to the published evaluations: The EPPC will allow faculty to opt out of publishing evaluations for their courses for the next two years during the pilot period of the new system. After those two years, faculty teaching a course for the first time will also be allowed to opt out of publishing their evaluations, in response to a recurring faculty concern that harsh evaluations might discourage younger professors or those who are teaching experimental courses.
"It's very important that faculty teaching a course for the first time or trying something out, that evaluations not have a chilling effect on faculty innovation and on teaching innovation," former EPPC chair and professor Susan Pedersen said.
Faculty must choose to opt out before they read their evaluations. But Yatrakis predicts that few faculty members will ultimately choose to opt out of the published evaluations.
"I think everyone understands that this should happen, so I'd be very surprised if many faculty [opted out]," Yatrakis said.
The EPPC is also working to determine how to minimize bias in evaluations of women and minority instructors, which research the committee has taken into account suggests is common among students.
"In the short term, we want to go ahead with the implementation, have a large number of evaluations published, and then as we go ahead establish the policy for what could be redacted," Stockwell said. "And then, how do we in a longer-term way minimize bias in evaluations?"
In addition to these restrictions, the EPPC has decided not to publish evaluations from a class that has such a low enrollment that the anonymity of students would be at risk. The EPPC is also in the process of establishing a committee that will review faculty requests to have certain comments redacted.
"The sense is that the bar would have to be high [to redact comments], but what that will be exactly still has to be determined," Stockwell said.
Stockwell also noted that the subcommittee would have to decide whether to redact a single inflammatory comment or the student's entire evaluation.
Despite these restrictions, Yatrakis believes that the changes to the evaluation system will greatly improve the quality of feedback.
"It will be interesting to see," Yatrakis said. "My own guess is that [the new policy] will just make the evaluation system much better and more valuable for the faculty members as well as for the students."