Friday, March 27, 2009

Multitasking: From Unlikely to Unavoidable?

Today is March 27th. This is not an April Fool's joke.

According to Science Daily, in the upcoming April 14th issue of Current Biology, researchers at the University of Texas-Houston Medical School will publish a study titled "Attention Alters Visual Plasticity during Exposure-Based Learning." The study's lead author, Valentin Dragoi, says that "[I]gnoring the stimuli presented over days of exposure [is] more effective than actually attending them... [T]his finding can be explained by the fact that, typically, attention filters out unwanted stimuli so they are not consciously processed. However, in the absence of attention, stimuli are able to escape the attentional mechanisms" that would have otherwise filtered them out "and induce robust learning after multiple exposures" (Science Daily, 2009, para. 8).

You must be wondering, "Did I understand that right? Trying to devote my undivided attention to something causes me to miss lots of details, and not paying attention enables my mind to perceive and learn more? Huh?" That was my reaction too.

On the basis of this study, it seems that focused attention functions like a microscope: It allows people to see a subset of details within a very narrow field of view. In the periphery beyond the microscope's lens, of course, there is much more that could be observed, but it lies beyond one's conscious attention. All that data might not completely escape notice, though. The study suggests that one's sensory system and mind are capable of concurrently perceiving some unknown portion of the many stimuli beyond the object of one's focused attention.

Sounds too good to be true, doesn't it? When you look at the details of the study, you realize that it takes quite a bit of extrapolation to get from the experimental conditions to anything resembling "real" learning situations. Here's a rough summary of the procedures:

Six subjects fixated on a particular position on a computer screen where circles filled with parallel lines flashed rapidly. From time to time the circle's orientation or color changed, and subjects were instructed to report these changes by pressing a button. Concurrently, in the periphery, another set of similar circles flashed, but subjects effectively had to ignore them in order to catch the changes to the circle on which they were instructed to fixate. After many sessions of watching the circles and becoming familiar with the angle at which the lines slanted, the subjects were tested to see whether they could identify circles whose lines were slanted at a variety of angles relative to the orientation on which they had been trained. That's where the interesting thing happened: the subjects could discriminate a much wider range of changes in the orientation of the peripheral circle. Without trying, they had learned about its orientation, and their knowledge of it was more generalizable than their knowledge of the circle on which they had fixated. There is more to the experiment, but that gives you a general idea of how it worked.
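Just to make that trial structure concrete for myself, here is a rough sketch in Python of how a single exposure block might be scheduled, based only on my summary above. The trial counts, angles, change probability, and function names are my own illustrative assumptions, not the authors' actual parameters or software.

```python
import random

# Hypothetical sketch of the exposure phase described above.
# All numbers (trial count, angles, change probability) are illustrative
# assumptions, not values taken from the actual study.

TRAINED_ANGLE = 45      # orientation of the ignored peripheral circle
ATTENDED_ANGLE = 135    # baseline orientation of the attended central circle
CHANGE_PROB = 0.2       # chance the attended circle changes on a given trial

def run_exposure_block(n_trials=50, seed=0):
    """Simulate one block: the subject watches the central circle and presses
    a button when its orientation or color changes; the peripheral circle is
    always present at TRAINED_ANGLE but is never task-relevant."""
    rng = random.Random(seed)
    trials = []
    for t in range(n_trials):
        change = rng.random() < CHANGE_PROB
        trials.append({
            "trial": t,
            "attended_orientation": ATTENDED_ANGLE + (10 if change else 0),
            "attended_change": change,             # subject should respond here
            "ignored_orientation": TRAINED_ANGLE,  # repeated, unattended exposure
        })
    return trials

if __name__ == "__main__":
    block = run_exposure_block()
    changes = sum(t["attended_change"] for t in block)
    print(f"{len(block)} trials, {changes} required responses to the attended circle")
```

The point of the sketch is simply that the peripheral circle appears at the same orientation on every trial without ever being the target of a response, which is the repeated, unattended exposure the study credits with the more generalizable learning.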

Once you know the details of the experiment, it doesn't seem quite as dramatic as Dragoi's characterization made it sound. It's not dramatic that my senses monitor my surroundings beyond the focus of my attention. It's also not dramatic that repeated exposure to certain stimuli, even unattended, would cause my sensory system to send messages to my brain, yielding progressively stronger neural connections. I suppose in the strictest sense these phenomena "count" as multitasking and learning, but I am interested in more demanding forms of both. I am struggling to think of any practical ways I might apply the findings of this study. Here's a crazy idea: instead of explicit teaching, I could broadcast educational subliminal messages in my classroom while we all watch television and play video games! (Just kidding, of course.)

1 comment:

  1. I think this may explain why I tend to remember a lot of what I read if I skim or read at a brisk pace but not when I read slowly and deliberately, like when you are trying to absorb the academic basics of HCI, for example. In all seriousness, I wonder what percentage of subjects showed the ability to concurrently perceive the orientation changes. If a subject could not perceive the changes, could their perception be trained or improved?

    Have you considered how social networking interfaces, MySpace, Facebook, etc., may capitalize on this effect inadvertently? I have seen many MySpace profiles that are so busy graphically, with so much going on, that it is almost literally impossible to figure out where to start reading. They come across as personal-profile mashups where it is distinctly difficult to find a particular piece of information. Are social networkers inadvertently picking up on more information than one might think?

    Going on nothing but instinct, it seems to me that one would find varying levels of perceptive ability or multitasking. As there are generalized tests for measuring intelligence, personality, or mental health, it would be interesting to know how perceptive abilities, especially multisensory perception, could be measured or linked to multitasking. I think you’ve opened a can of worms.
