
representation?" (more or less). The answer in a word: decontextualization. There are many directions the Western intellectual tradition might have taken in building on the Greek's skepticism and attempts to explain how we manage to perceive and act (largely) adaptively while connected to the world only by our senses. One interesting direction might have been to notice that we are also always acting in the world, and our senses are exquisitely tuned for successful interaction with our particular needs and niches in the environment which we share with other living things. We could have looked to a broader context, asking ourselves what about living creatures' interaction with their surroundings allows them to do what they do. For one thing is certainly true - all of us are always doing something, and often the stakes are high.

In my own graduate work, I had the privilege of working with animal subjects before I started working with people. I know from personal experience that what you learn about the cognitive capabilities of a subject can be very different depending on whether you approach cognition from an abstract perspective or from a more contextual, naturalistic one. I personally spent many hours showing pigeons various slides to see if they could learn simple categories such as "green bar" or the more challenging "green bar without a red dot". Well, it turns out pigeons can take a really long time to learn these things that seem so elementary to us. On the other hand, I had a colleague who studied pigeons' perceptual capabilities in situations more ecologically relevant to them. He took a handful of red wheat and tossed it onto a tile of multi-colored gravel he used in his experiments. The wheat completely disappeared - I stared at the tile for minutes without finding a single grain. He said that a pigeon can clear that handful of grain off that tile in a few seconds, its beak a complete blur as it strikes at one kernel after another.

So, what have we bought for ourselves by denying minds to our fellow creatures, ignoring our own bodies and embeddedness in the world, and embracing whole-heartedly a mostly abstract, formalist interpretation of cognition? Embarrassingly little, really. Our understanding of ourselves, and of how we and other animals navigate the world, is not very different from what we had in the days of Freud and William James. The big 'C' - consciousness - is still a mystery, as is the problem of qualia (when you see red, do you see and feel the same red that I do?). There have been few dramatic improvements in education, training, or moral reasoning. Psychotherapy has benefited to some degree, but there have been no unequivocal breakthroughs in treatment for most cognitive and emotional disorders.

There are a couple of practical applications where research into the nature of memory, perception, and attention has made a more substantial impact. One is the improved design and usability of airplane cockpit control panels, software, and other complex technology, in which the work of Don Norman and many other cognitive scientists has been crucial. The other is the evaluation of eyewitness testimony in jury trials, where Elizabeth Loftus has shown that the traditional "gold standard" of courtroom evidence can be marred by suggestive questioning and by the reconstructive nature of memory itself. And while these advances are real, they are hardly universal: most software and complex devices are still very hard to use, and most people still believe that our memories are "true" and that eyewitness evidence must be sound.
Perhaps it's time to at least entertain a different perspective on who gets to count as a thinking being, and what thinking is anyway ...
Not quite next, but soon: If not computation on stored internal representations, then what?