Sunday, December 6, 2009

Decontextualization, Part 1: Sum Ergo Cogito

In my last post I asked "what is wrong with the mainstream cognitive science view of internal representation?" (more or less). The answer in a word: decontextualization. There are many directions the Western intellectual tradition might have taken in building on the Greeks' skepticism and their attempts to explain how we manage to perceive and act (largely) adaptively while connected to the world only by our senses. One interesting direction might have been to notice that we are also always acting in the world, and that our senses are exquisitely tuned to our particular needs and niches in the environment we share with other living things. We could have looked to a broader context, asking ourselves what it is about living creatures' interaction with their surroundings that allows them to do what they do. For one thing is certainly true - all of us are always doing something, and often the stakes are high.

Instead, with a few brief diversions like the Romantics, we went into our heads. We interpreted veridical perception as a "problem" and created centuries of Gedanken experiments to try to puzzle it out. In short, we fell in love with the kind of formal reasoning that at least seems to be unique to us humans. Later, in response to the difficulty of applying the strict experimental approach of Behaviorism to our "higher" faculties, practitioners of the emerging field of cognitive science sought the same kind of abstract intellectual modeling tools that are so helpful in physics. When I was in graduate school many of us used to call this "physics envy." In trying to understand our own cognitive capabilities in terms of internal representation and other formal structures, we have created huge edifices of "black box" mechanisms in apparent violation of Occam's razor. How can the "simplest explanation" include so many constructs for the operation of perception, memory, logical reasoning, language, attention, and even computation, when the physiological operations of the brain that MUST underlie these phenomena are completely unknown, or at best still poorly understood? And why do we continue to deny many of these cognitive capacities to other animals, when their success in navigating the world - in spite of problems that have vexed us since the Greeks - is self-evident?

In my own graduate work, I had the privilege of working with animal subjects before I started working with people. I know from personal experience that what you learn about the cognitive capabilities of a subject can be very different depending on whether you approach cognition from an abstract perspective or from a more contextual and naturalistic one. I personally spent many hours showing pigeons various slides to see if they could learn simple categories such as "green bar" or the more challenging "green bar without a red dot". Well, it turns out pigeons can take a really long time to learn things that seem so elementary to us. On the other hand, I had a colleague who studied pigeons' perceptual capabilities in situations more ecologically relevant to them. He took a handful of red wheat and tossed it onto a tile of multi-colored gravel he used in his experiments. The wheat completely disappeared - I stared at the tile for minutes without finding a single grain. He said that a pigeon can clear that handful of grain off the tile in a few seconds, its beak a complete blur as it strikes at one kernel after another.

But wait, you say, that's just seeing the wheat - there is no categorization or higher function involved. Well, I was really impressed to learn about a researcher who was having problems with a categorization experiment more realistic than my own. He was using outdoor scenes, showing pairs of slides for which the pigeons had to learn to respond "same" or "different". To get a large batch of identical slides, he took his camera into the woods, fixed a spot, and then reeled off a couple of rolls of film. Once he came back into the lab and started running his experiments, he discovered that the pigeons simply could not learn that any of these pictures were the same - they kept insisting that slides of the identical scene were different, even though the cost to them must have been a missed food reward. Only after very close analysis of the slides did he realize that they were in fact different in their small details - in between the individual shutter clicks, the breeze had moved leaves and branches around slightly. That was much more apparent to the pigeons than to the researchers, who really don't have to bother much about the direction of the wind in the trees. But it sure mattered to the pigeons. Another researcher had trouble with a particular slide in an experiment in which he was trying to get pigeons to distinguish street scenes with and without people in them. Sure enough, the pigeons were seeing a person in a fourth-floor office window that the researcher had not noticed. So when people ask me, "are pigeons smart at all?" - assuming the answer is "no" - I generally say "well, they are really, really good at being pigeons."

So, what have we bought for ourselves by denying minds to our fellow creatures, ignoring our own bodies and embeddedness in the world, and embracing whole-heartedly a mostly abstract, formalist interpretation of cognition? Embarrassingly little, really. Our understanding of ourselves, and of how we and other animals navigate the world, is not very different from what we had in the days of Freud and William James. The big 'C' - consciousness - is still a mystery, as is the problem of qualia (when you see red, do you see and feel the same red that I do?). There have been few dramatic improvements in education, training, or moral reasoning. Psychotherapy has benefited to some degree, but there have been no unequivocal breakthroughs in treatment for most cognitive and emotional disorders. There are a couple of practical applications where research into the nature of memory, perception and attention has made a more substantial impact. One is the improved design and usability of airplane cockpit control panels, software, and other complex technology, in which the work of Don Norman and many other cognitive scientists has been crucial. The other is the evaluation of eyewitness testimony in jury trials, where Elizabeth Loftus has shown that the traditional "gold standard" of courtroom evidence can be marred by suggestive questioning and by the often reconstructive nature of memory itself. And while these advances are real, they are hardly universal: most software and complex devices are still very hard to use, and most people still believe that our memories are "true" and that eyewitness evidence must be sound.

Perhaps it's time to at least entertain a different perspective on who gets to count as a thinking being, and what thinking is anyway ...

Not quite next, but soon: If not computation on stored internal representations, then what?

Sunday, July 5, 2009

Why I hate (some of) the Ancient Greeks

It's not uncommon now (especially where I live in the California Bay Area) for people to feel that there is something wrong with drawing a sharp separation between the mind and the body, rationality and emotion, thinking and feeling. Usually in these discussions the seventeenth-century philosopher Descartes gets the blame, with his depiction of animals as mindless automata, and of people living in separate bodily / physical and mental / spiritual worlds that interact with each other through some piece of the brain (for Descartes, the pineal gland). But some form of these separations has been with us since the earliest recorded speculations of the Greek natural philosophers.

As the ancient natural philosophers tried to make sense of the world around them, there was a constant tension in their thoughts between the ideal or rational and the sensual and direct aspects of our experiences. Some argued that the world was in such constant flux that we could not trust the evidence of our senses to tell us anything lasting about the nature of reality. Others stressed the felt and concrete immediacy of the evidence of our senses, and claimed that to understand reality we would have to learn the order of nature underlying the seemingly chaotic changes.

We in the West have been stuck with these opposing points of view ever since - to the great detriment, I would argue, of our understanding of the role that motivation, emotion, judgment and value play in our ability to interpret and act in the world around us. For, while both ways of thinking have been with us since the Greeks, it has not been a fair fight since Plato. In an astonishing feat of intellectual prowess, Plato took up the rationalistic fragments of his predecessors and forged them into a worldview that remains the core of mainstream thinking in the philosophy of mind and cognitive science today. Plato resolved the tension between consistency and change or variation in our perception of the "world of sensed things" by positing a parallel world of abstract, "ideal" objects that is the true home of our mental life.

For example, consider the fact that we are easily able to recognize a particular animal as a horse, even though the horses of our experience come in many colors, sizes and temperaments. Plato would say that while our senses may report an ever-changing parade of differences between these animals, each individual is just a specific instance of the "ideal" concept of a horse from the world of ideas. Just as our body and senses allow us to see the particular features of an individual horse, our mind -- connected to the generic, conceptual realm of rational representations -- allows us to recognize each horse as an example of the ideal. With this single masterstroke, Plato created the foundation of modern theories of internal representation of knowledge and doomed our raw, immediately sensed experience to also-ran status in the search for our understanding of understanding itself.

Next up: What's wrong with that?