Illustration by Vangie Shue

It was the first session of a lecture class, and I didn't think twice about bringing my laptop to take notes. With a magically petite MacBook Air at my disposal and the nimble fingers of one who has churned out many a 4:00 a.m. paper, what need—or want—had I of a pen? By the time the professor had introduced himself and the course, I had already answered three emails, refreshed my Facebook page a disturbing number of times, and Gchatted (that's a verb now, right?) a friend about dinner plans. The professor was reading through an ironic handout of negative CULPA reviews from semesters past, among them: "Professor X will kick you out of class for using a laptop." I slinked back in my seat, pushed my computer away as though it were contagious, and bummed a Bic off of the guy next to me.

But ... it's a lecture, I thought. What did he expect? Computer use is rare and often discouraged in seminars, and the distractions afforded by a laptop are difficult to mask in an active discussion of 10 or 15 people. A quick glance around a lecture hall in Schermerhorn or Mudd, on the other hand, will give you an idea of how easily the act of taking notes on a laptop turns into the fine art of multitasking.

I took an informal and terribly flawed survey of the number of laptop users in my psychology lectures, estimating that more than three-quarters had multiple windows open and switched between them multiple times per minute. Why are we so comfortable blatantly engaging in these distracter tasks in large lectures? Is it simply the fact that our expected contribution is minimal compared to that of a seminar setting? If that's the case, it implies that we're only paying attention in seminars because we have someone checking up on us—a seemingly unlikely lack of motivation for Columbia students.

My faulty data aside, the drawbacks of constant connection are becoming increasingly evident in studies on the human capacity to execute two or more tasks at once. In reality, multitasking more often approximates performing tasks serially rather than actually processing multiple stimuli at once. Take, for example, an instance of tweeting while listening to a lecture: It's tempting to think that we can divide our attention between a professor's analysis of the Cold War and a clever 140 characters, but it would be more accurate to think of this as a series of micro-episodes, as we alternate brief bursts of attention between the two competing stimuli. Each switch is a distraction that impedes our performance on the primary task.

Professors can stipulate the conditions for the use of devices in their classrooms—whether that regulation is the product of a Facebook-induced blow to the academic ego or more simply due to "old school" tendencies, as was the case with the professor mentioned above. It's unlikely, I'd venture, that laptops will be banned from lecture halls, or that the University will block our access to the Internet in certain rooms. And I don't think either of these things should be the case (especially in 614 Schermerhorn, please), or that glancing at an email during class is a terribly grave act. Any seminar will show you that a student can face innumerable distractions without the help of technology—among them, his own thoughts. I do, however, think that we should be aware of the real-world applications of studies on divided attention.
Consider the following scenario: You are half-listening to a history lecture while reading a particularly captivating Times story online, or making a mental shopping list for an anticipated Westside trip. Later in the month, you have a midterm with a multiple-choice question about an aspect of that lecture. You know you've seen two of the names before—you can even picture them on the slide in the room—but you can't remember how they go together.

Researchers have replicated this issue of peripheral attention in the laboratory: Subjects are shown a series of faces with dots drawn on them, and half are told to pay attention to the faces while half are told to focus on the dots. When shown a "test" series of faces later, including some previously viewed faces and some combinations of facial features from the original images, the participants who focused on the dots and had the faces in their periphery were more likely to "false alarm," or erroneously claim to remember the blended faces. The reason? They had seen the features before, but without attentional focus, they were unable to properly bind the features together in memory.

Professors, too, can take a few lessons from the lab when it comes to indirectly engaging students in a lecture setting in ways that avoid the ire-provoking technology ban. One simple trick is to ask a rhetorical question, pause, and give students a few seconds to mull over an answer. Research indicates that how we encode information is crucial to learning and remembering that material: If we have to produce information rather than just having it shown to us, we're significantly more likely to remember the content later. In other words, lecturers: Make us work a little harder. We'll better learn what you're teaching, and we might—just might—learn to focus.

Caitlin Brown is a Columbia College senior majoring in psychology and comparative literature and society. Pick My Brain runs alternate Tuesdays. To respond to this column, or to submit an op-ed, contact opinion@columbiaspectator.com.
