At least, consciousness is something we know we have. According to Descartes, that we are conscious is the only thing we can know for sure. This certainty formed the basis for Descartes' insights, Cogito, ergo sum ("I think, therefore I am") and sum res cogitans ("I am a thing that thinks").
From the time of Descartes on, introspection remained the primary -- no, the only -- method for investigating consciousness. After all, the philosophical method consists of introspection and reasoning.

How did Descartes know he was conscious? How do we know that we are? Because we experience ourselves as observing, sensing, perceiving, knowing, remembering, thinking, intuiting, feeling, wanting, willing, intending, and doing. These paradigm cases of the monitoring and controlling aspects of consciousness are what consciousness is all about.

Consciousness is the totality of sensations, perceptions, memories, ideas, attitudes, feelings, desires, activities, etc. of which we are aware at any given time. Consciousness consists in our awareness of events and of the meaning we give to them, and of the strategies that we plan and execute to deal with them.
Actually, the word "consciousness" means a lot of different things. Thomas Natsoulas of UC Davis published a very useful paper in which he analyzes the seven different definitions of consciousness provided by the Oxford English Dictionary (American Psychologist, 1978):

- joint or mutual knowledge,
- internal knowledge or conviction,
- awareness,
- direct awareness,
- personal unity,
- normal waking state, and
- double consciousness.

A later paper (Journal of Personality & Social Psychology, 1981) analyzed the many different "problems" of consciousness studied by philosophers and psychologists:

- conscious experience,
- Intentionality,
- imagination,
- awareness,
- introspection,
- personal unity,
- the subject, "consciousness" (as more or less),
- the normal waking state, conscious behavior, and
- explicit consciousness.

Although these papers were purely exegetical in nature, and contained no empirical data, they were milestones in what we can call the "Consciousness Revolution" in cognitive psychology.
William James (1842-1910) -- trained as a physician, employed as a professor of philosophy, pioneering American psychologist -- serves as a link between strictly philosophical and psychological analyses of consciousness. Called "the greatest of the 19th-century introspective psychologists" (Farthing, 1992, p. 25), James nonetheless had little interest in the tightly controlled, experimental or "analytical" introspection of Wundt and Titchener. James assembled a collection of "brass instruments" for experimental introspection at the Harvard psychological laboratory, but he himself never used them, and as soon as he could he arranged for a new colleague, Hugo Munsterberg, to be hired to take over the laboratory work so that he could get back to his writing, based "first and foremost and always" on the method of "looking into our own minds and reporting what we there discover" (James, 1890, p. 185).
The introspective analysis of "the stream of consciousness" that James offered in his Principles of Psychology (1890/1980, Chapter 9) has never been equaled. So here it is in full.
The Stream of Thought

The first fact for us, then, as psychologists, is that thinking of some sort goes on. I use the word thinking... for every form of consciousness indiscriminately. If we could say in English 'it thinks,' as we say 'it rains' or 'it blows,' we should be stating the fact most simply and with the minimum of assumption. As we cannot, we must simply say that thought goes on.

FIVE CHARACTERS IN THOUGHT

How does it go on? We notice immediately five important characters in the process, of which it shall be the duty of the present chapter to treat in a general way: 1)
Every thought tends to be part of a
personal consciousness.

1) Thought tends to Personal Form

...In this room -- this lecture-room, say -- there are a multitude of thoughts, yours and mine, some of which cohere mutually, and some not. They are as little each-for-itself and reciprocally independent as they are all-belonging-together. They are neither: no one of them is separate, but each belongs with certain others and with none beside. My thought belongs with my other thoughts, and your thought with your other thoughts. Whether anywhere in the room there be a mere thought, which is nobody's thought, we have no means of ascertaining, for we have no experience of its like. The only states of consciousness that we naturally deal with are found in personal consciousnesses, minds, selves, concrete particular I's and you's [sic]. Each of these minds keeps its own thoughts to itself. There is no giving or bartering between them. No thought even comes into direct sight of another thought in another personal consciousness than its own. Absolute insulation, irreducible pluralism, is the law. It seems as if the elementary psychic fact were not thought or this thought or that thought, but my thought, every thought being owned.

2) Thought is in Constant Change

I do not mean necessarily that no one state of mind has any duration -- even if true, that would be hard to establish. The change which I have more particularly in view is that which takes place in sensible intervals of time; and the result on which I wish to lay stress is this, that no state once gone can recur and be identical with what it was before.

3) Within each personal consciousness, thought is sensibly continuous

I can only define 'continuous' as that which is without breach, crack, or division. I have already said that the breach from one mind to another is perhaps the greatest breach in nature. The only breaches that can well be conceived to occur within the limits of a single mind would either be interruptions, time-gaps during which the consciousness went out altogether to come into existence again at a later moment; or they would be breaks in the quality, or content, of the thought, so abrupt that the segment that followed had no connection whatever with the one that went before.... Consciousness, then, does not appear to itself chopped up in bits.... It is nothing jointed; it flows. A 'river' or a 'stream' are the metaphors by which it is most naturally described. In talking of it hereafter, let us call it the stream of thought, of consciousness, or of subjective life.
...The judgment that my thought has the same object as his thought is what makes the psychologist call my thought cognitive of an outer reality. The judgment that my own past thought and my own present thought are of the same object is what makes me take the object out of either and project it by a sort of triangulation into an independent position, from which it may appear to both. Sameness in a multiplicity of objective appearances is thus the basis of our belief in realities outside of thought.... The first spaces, times, things, qualities, experienced by the child probably appear, like the first heartburn, in this absolute way, as simple beings, neither in or out of thought. But later, by having other thoughts than this present one, and making repeated judgments of sameness among their objects, he corroborates in himself the notion of realities, past and distant as well as present, which realities no one single thought either possesses or engenders, but which all may contemplate and know.... A mind which has become conscious of its own cognitive function, plays what we have called 'the psychologist' upon itself. It not only knows the things that appear before; it knows that it knows them. This stage of reflective condition is, more or less explicitly, our habitual state of mind.
The phenomena of selective attention and of deliberative will are of course patent examples of this choosing activity.... Looking back, then, over this review, we see that the mind is at every stage a theatre of simultaneous possibilities. Consciousness consists in the comparison of these with each other, the selection of some, and the suppression of the rest by the reinforcing and inhibiting agency of attention. The highest and most elaborated mental products are filtered from the data chosen by the faculty next beneath, out of the mass offered by the faculty below that, which mass in turn was sifted from a still larger amount of yet simpler material, and so on.

-- From William James, Principles of Psychology (1890), Chapter 9
Obviously, introspection is limited to conscious mental life, raising the question of whether there is an unconscious mental life consisting of percepts, memories, thoughts, feelings, and desires of which we have no phenomenal awareness. James's position on unconscious mental life was complicated. Because he identified consciousness with thought, the notion of unconscious mental states (as opposed to unconscious brain processes) struck him as a contradiction in terms. Further, James adopted the doctrine of esse est sentiri: the essence of consciousness (its "to be") is to be sensed. Mental states are felt; therefore they cannot be unconscious.
In Chapter 6 on "The Mind-Stuff Theory" (which otherwise was devoted to a critique of structuralism), James considered and rebutted 10 "proofs" of the existence of unconscious mental states.
Some of these refutations, frankly, strike me as strained, glib, hand-waving. They are not, in my view, James at his best. And in some cases, James has simply been proved wrong. There is, now, good evidence of subliminal perception, of the automatization of mental processes, and of unconscious inference in perception. There are dissociations between explicit and implicit memory, etc., in hysteria and hypnosis. There is some evidence of incubation in problem-solving. All of these empirical facts seem to show that some of James's refutations were empirically wrong, and that there is "something it is like" to be an unconscious mental state after all.
In fact, James was already well aware of some of this evidence, in 1890, and even in the Principles he describes in positive terms evidence of apparent "unconsciousness" in hypnosis, hysteria, and multiple personality. For example, in hysterical blindness, the person claims to be unable to see, while continuing to respond to visual stimuli. This looks like "unconscious" vision.
James accepted the evidence of hypnosis and hysteria as legitimate, but his interpretation was different. Rather than postulate unconscious mental states, he referred to mental states of which we were unaware as co-conscious, subconscious, or as representing a secondary or tertiary (etc.) consciousness. This is not just playing with words. Remember that, for James, "thought tends to personal form". For James, consciousness could be divided into parallel streams, each associated with a representation of the self. Each of them is a fully conscious condition, but each of them is unaware of the others. When we ask what a person is aware of, the result of the inquiry will depend on which stream is being tapped. If we tap the primary stream, which is usually the case, the person will seem unaware of what is in the secondary stream(s); but if we tap one of the secondary streams, one of the other selves, we will see immediately that consciousness is there. Esse est sentiri, still, but it depends on who's being asked -- or, put another way, who's doing the feeling.
All of this sounds a little odd, but it's what seems to happen in hypnosis and "hysteria" -- about which more later.
The "Stream of Consciousness" Before James

James' phrase "the stream of consciousness" is commonly sourced to the Principles of 1890, but Garry Wills has found it in an earlier paper, "On Some Omissions of Introspective Psychology", published in Mind for January 1884 ("An American Hero" [review of William James: In the Maelstrom of American Modernism by Robert D. Richardson], New York Review of Books, 07/19/2007). However, the phrase popularized by James was not original with him. Sandra Tropp found the phrase in Physiological Aesthetics by Grant Allen (1877, p. 200), with which James may have been familiar (New York Review of Books, 08/16/2007). But William Waterhouse (New York Review of Books, 11/22/2007) found it even earlier, in The Physiology of Common Life by George Henry Lewes (1859, p. 61); and even earlier than that, in The Senses and the Intellect by Alexander Bain (1855, p. 359). James may have read Allen, but he was certainly familiar with both Lewes and Bain.
Introspection, the philosopher's traditional method for investigating consciousness, became the psychologist's method as well. And not just James (who, after all, was a philosopher -- and physiologist -- before he became a psychologist). In the hands of Wundt, Titchener (Wundt's most famous American student), and other "Structuralists", introspection came to be the method for a "mental chemistry" by which complex conscious states could be analyzed into their constituent elements (for comprehensive reviews, see Boring, 1953; Danziger, 1980).
To quote from E.B. Titchener's Text-Book of Psychology (1910):
Scientific method may be summed up in the single word 'observation'.... The method of psychology, then, is observation. To distinguish it from the observation of physical science, which is inspection, a looking-at, psychological observation has been termed introspection, a looking-within. But this difference of name must not blind us to the essential likeness of the methods.
In principle, then, introspection is very like inspection. The objects of observation are different: they are objects of dependent, not of independent experience; they are likely to be transient, elusive, slippery. Sometimes they refuse to be observed while they are in passage; they must be preserved in memory, as a delicate tissue is preserved in hardening fluid, before they can be examined. And the standpoint of the observer is different; it is the standpoint of human life and of human interest, not of detachment and aloofness. But, in general, the method of psychology is much the same as the method of physics.
Titchener (1898) also laid out the general rules for introspection (there were also specific rules, depending on the nature of the mental state being introspected). Or, as Titchener advised: "The rule of psychological work is this: Live impartially, attentively, comfortably, freshly, the part of your mental life you wish to understand."
The big rule, however, was to avoid what Titchener (1905; Boring, 1921) called the stimulus-error. That is, the introspective observer should not confuse the sensation with the stimulus and its meaning. Observers were to base their reports on "mental material", not on the objects which gave rise to their mental states. The stimulus-error consists in describing the objects of perception and their meanings. But, for Titchener, the description of the stimulus, independent of experience, reflects the point of view of physics, not psychology.
In any event, as Boring (1953) made clear, classical experimental introspection, as practiced by Wundt (1896), Titchener (1905, 1910), and other Structuralists, was a kind of mental chemistry (Boring should know, as he was a student of Titchener's and knew Wundt). Consciousness contains complexes, analogous to molecules, which are composed of sensory elements, analogous to atoms. Oswald Kulpe, another Structuralist, identified these elements as intensity, extensity, duration and, most important -- because it was inherently psychological in nature -- quality. The quest for identifying the basic qualities of sensation is discussed in the lectures on Psychophysics, to which we will turn shortly.
Titchener was clear that, to quote James (1890/1980, p. 187), all introspection is retrospection (later, Jean-Paul Sartre said much the same thing in Being and Nothingness (1957), p. 11). The Structuralists understood clearly that observing and reporting on experience would necessarily interfere with having the experience -- a kind of psychological anticipation of Heisenberg's (1927) uncertainty principle in physics. Accordingly, observers were carefully trained to have the experience first, and then report it from memory. This training, like training in avoidance of the stimulus-error, was painstaking, and involved as many as 10,000 trials (an anticipation of Anders Ericsson's "10,000 Hour Rule").
Titchener was also clear that experimental introspection involved going above and beyond mere verbal reports. Verbal reports, in his terms, were responses to the stimulus. Introspections were observations of experience.
In the final analysis, the psychologist's introspection was distinguished from the philosopher's introspection by the "scientific" means by which it was conducted.
James' analysis of mental life relied primarily on introspection. He had a collection of "brass instruments" in his teaching laboratory, but he rarely used them. He preferred to introspect and then psychologize. However, there were some differences between James's approach to introspection and that of the structuralists. (1) He believed that introspection was essentially memory-based, rather than on-line (i.e., "All introspection is retrospection"); the implication is that the introspective mental state (saying "I feel tired") is different from the pre-introspective mental state (feeling tired). (2) He believed that introspection was unreliable, and had to be checked by other means.
To this end, James outlined a number of methods to supplement introspection: (1) connecting conscious states with physical conditions; (2) analyzing space perception; (3) measuring the duration of mental processes; (4) reproducing sensory experiences and intervals of space and time; (5) studying how mental states influence each other (e.g., excitation and inhibition; span of apprehension); and (6) studying the laws of memory.
Still, introspection remained James' preferred method of psychological analysis -- and he thought that its results far outweighed those obtained (so far) by experimental analyses employing "brass instruments". But James was not entirely persuasive on this score, and as psychology developed, three quite different critiques of introspection emerged.
Even the Structuralists understood that there were methodological problems with introspection.
Aside from these methodological problems, which investigators like Titchener did their best to surmount, there was One Big Problem with introspection -- which was that scientific psychology was gradually abandoning introspection in favor of an emphasis on human performance.
But the decisive critique of introspection came in John B. Watson's manifesto for behaviorism.
The behaviorist critique of introspection is pretty straightforward: mental states are subjective and private, and science is based on objective, publicly available observations. Therefore you can't have a science based on introspection. You can only have a science based on what's observable, which is behavior and the stimulus circumstances under which it occurs.
Watson had other criticisms of introspection, such as the endless controversies over such topics as whether there was imageless thought (about which Karl Buhler and Wundt battled endlessly). Watson actually didn't object to introspection in studies of sensation and perception, where the stimuli can be controlled by the experimenter. The problems really arose when introspection was applied to the "higher" mental processes. If someone is going to introspect on thought processes, how could we be sure that two different observers were actually introspecting on the same thought? But these were merely methodological objections. The behaviorist critique of introspection was principled: you can't base a science on introspection; and psychology should be redefined as a science of behavior rather than as a science of mental life.
Watson's critique was echoed by B.F. Skinner, who wrote (among many other things) Science and Human Behavior (1953), intended to be an introductory textbook of psychology based on strict, radical behaviorism.
After the behaviorists were overthrown in the cognitive revolution, you'd think that introspection would be let back in. And in some sense it was.
In the first place, the basic data for cognitive psychology are self-reports and response latencies -- that is, how fast people make their self-reports.
More substantively, introspections provided the data for one of the landmarks of the cognitive revolution, Allen Newell and Herbert Simon's (1972) "General Problem Solver". One of the first examples of artificial intelligence, GPS employed means-end analysis to solve all sorts of mathematical and scientific problems, and was explicitly based on subjects' reports of how they went about solving various kinds of problems -- a technique known as protocol analysis, which is basically introspective in nature. (Simon won the Nobel Prize in Economics in part for this work.) Later, K. Anders Ericsson, who was a student of Simon and Newell (Ericsson & Simon, 1990, 1993), introduced the "Model of Verbalization of Thinking" -- a refinement of protocol analysis that is, again, essentially introspective in nature.
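For readers who want a concrete sense of the control strategy behind GPS, here is a minimal sketch of means-end analysis: compare the current state to the goal, pick an operator that reduces the difference, recursively satisfy that operator's preconditions, and repeat. The set-of-facts representation, the operator format, and the toy "library" problem are illustrative assumptions of this sketch, not Newell and Simon's actual formalism.

```python
# Illustrative sketch only: a toy means-end analysis in the spirit of GPS.
# States are frozensets of facts; each operator is (name, preconditions, adds, deletes).

def means_end_analysis(state, goal, operators, depth=8):
    """Return a list of operator names that transforms state into goal, or None."""
    if goal <= state:
        return []                       # no difference left to reduce
    if depth == 0:
        return None                     # cut off runaway recursion
    for name, preconds, adds, deletes in operators:
        if not adds & (goal - state):
            continue                    # operator doesn't reduce the current difference
        # Subgoal: first achieve the operator's preconditions (the means-end recursion).
        prefix = means_end_analysis(state, preconds, operators, depth - 1)
        if prefix is None:
            continue
        # Re-apply the prefix plan to compute the state in which the operator fires.
        current = state
        for step in prefix:
            _, _, step_adds, step_dels = next(op for op in operators if op[0] == step)
            current = (current - step_dels) | step_adds
        current = (current - deletes) | adds
        suffix = means_end_analysis(current, goal, operators, depth - 1)
        if suffix is not None:
            return prefix + [name] + suffix
    return None

ops = [
    ("walk-to-library", frozenset({"at-home"}), frozenset({"at-library"}), frozenset({"at-home"})),
    ("check-out-book", frozenset({"at-library"}), frozenset({"have-book"}), frozenset()),
]
print(means_end_analysis(frozenset({"at-home"}), frozenset({"have-book"}), ops))
# -> ['walk-to-library', 'check-out-book']
```

Protocol analysis worked in the other direction: subjects' think-aloud reports were used to infer which differences and operators they were actually attending to as they solved problems.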
But that didn't mean that there weren't still problems with introspection that worried investigators (including Simon and Ericsson).
The cognitive revolution made consciousness a legitimate topic of scientific research again, but -- as we'll see later -- it also legitimized the study of unconscious mental life -- that is, percepts, memories, thoughts, and the like of which we have no awareness. This, in turn, drew attention to a further limitation of introspection -- which is that introspection, by definition, offers us a view limited to conscious mental life. You simply can't introspect on unconscious mental life. And if the scope of unconscious mental life is broad and deep, rather than narrow and shallow, introspection may miss as much as, or even more than, it hits.
This argument was made expressly by Richard Nisbett and Timothy Wilson, in a paper entitled "Telling More Than We Can Know: Verbal Reports on Mental Processes" (Psych. Review, 1977), which argued that people simply have "little or no direct introspective access to higher order cognitive processes". They reviewed old evidence, and presented new studies, in support of this claim.
For example, Nisbett and Wilson conducted one of their studies in a department store, under the guise of a consumer survey. In one version of the study, the subjects -- actual shoppers, or at least window-shoppers -- were asked to evaluate four different nightgowns; in another version, they were asked to evaluate four pairs of women's stockings; in each case, the items were actually identical. Both studies revealed a marked position bias, such that items on the right-hand side of the display were much more likely to be preferred than those on the left. But when asked why they had their preferences, not a single subject mentioned its position. So, it seems, subjects were unaware of the connection between the position of the objects and their preferences. Nisbett and Wilson argue that this is the case more often than not.
What's the problem? Nisbett and Wilson distinguish between content and process. It's one thing, they say, to be aware of some mental state, like our preference for one nightgown over another, and it's quite another to be aware of the processes by which that mental state is constructed. And in general, they argue that mental processes are largely inaccessible to conscious awareness. So, if we want people to tell us what they like, they can do that (usually). But if we want people to tell us why they like it, we may be asking them to tell us more than they can know.
It might be said that the "nightgown" study and its like have certain methodological problems. For example, the study described doesn't really allow subjects a rational basis for their decisions. In the stocking version, for example, the four pairs presented for evaluation were, in fact, identical, so there was no way to choose between them. But the subjects were forced to express a choice, and they did. To be sure, they didn't seem to realize that their choices were biased by position -- and, more to the point, even if they did they would never have said so. Position is a ridiculous basis for preferring one pair of stockings over another, and subjects might think that, if they referred to position, they would be accused of not taking their job seriously. So even if they were aware of the influence, they wouldn't admit it. Distinguishing between what people are genuinely aware of, and what people are aware of but won't report, is a serious (but not unmanageable) problem in the scientific study of unconscious mental life.
Still, the content-process distinction is one that turns out to be important. As will be discussed later, in the lectures on "Attention and Automaticity", a lot of mental operations appear to be performed automatically, and it's a property of automatic mental processes that they are unconscious in the strict sense that they are simply unavailable to introspective phenomenal awareness under any circumstances. Nisbett and Wilson do not explicitly refer to automaticity in their paper -- it was written before the distinction between automatic and controlled processes really took off. But if the argument is that we only have introspective access to controlled processes, but not to automatic processes, Nisbett and Wilson were onto something.
In addition, the philosopher Jerry Fodor (The Modularity of Mind, 1983) has argued that some cognitive systems are modular in nature, performed by dedicated mental systems that are associated with a fixed neural architecture. Cognitive modules take some input, perform some transformation on it, and output this transformation to other parts of the cognitive system. According to Fodor's doctrine of modularity, the internal operations of these modules are inaccessible to other parts of the cognitive system -- which means, essentially, that these processes are inaccessible to introspection. Moreover, in the course of performing these transformations, the information may pass through one or more distinct states. Although these distinct states count as mental contents, and so might be accessible to introspection (by virtue of the process-content distinction discussed earlier), Fodor argues that these contents are also inaccessible to phenomenal awareness, and thus to introspection, precisely because they are encapsulated in these modules.
And finally, as Nisbett and Wilson also point out, there are some stimuli that are "subliminal" -- too weak, or too briefly presented, to be consciously perceptible. There is now a considerable amount of evidence that such "subliminal" stimuli can have palpable effects on experience, thought, or action. We'll discuss this evidence later, in the lectures on "The Explicit and the Implicit".
So, Nisbett and Wilson were onto something, which is that there are limits to introspection. We can't introspect on subliminal stimuli, and we can't introspect on automatic processes, and we can't introspect on the inner workings of cognitive modules. But that doesn't mean that introspection is always invalid -- that we're always, or even often, telling more than we can know. We know a lot, about the stimuli in our environment, and our responses to them, and about what comes in between.
Many philosophers identify consciousness with phenomenal experience. As the philosopher Thomas Nagel argued in a famous essay, "What is it like to be a bat?" (1979), there is something it is like to be conscious. Conscious organisms have certain subjective experiences. This phenomenal experience, in turn, comes in several forms -- but how many?
Actually, some cognitive ethologists have tried to figure out what it's like to be a bat -- well, if not a bat, exactly, some other kind of nonhuman animal.
Nagel's point, that there's something it's like to be conscious, directly inspired Bird Sense: What It's Like to Be a Bird (2012) by Tom Birkhead, an English behavioral ecologist. In an earlier book, The Wisdom of Birds: An Illustrated History of Ornithology (2008), Birkhead traced the evolution of our understanding of bird behavior. In Bird Sense, he tries to get inside the head of birds, to develop some idea of what their sensory experience is like. For example, birds can see in the ultraviolet range of the electromagnetic spectrum, meaning that a bird who appears quite drab to us may look spectacular to another bird. And the asymmetrical placement of an owl's ears permits it to triangulate on noisy prey in a way that is not possible for us. Never mind the special magnetic sense that may enable birds to navigate over long migratory distances. Birkhead understands that it's not really possible to know what the bird sees in the ultraviolet range, or how it feels the pull of geomagnetism. But from his objective standpoint, he takes us closer than anyone before to the subjective life of another species.
- Rachel Carson, author of Silent Spring (1962), the book that raised the alarm about environmental pollution and triggered the environmentalist movement, was a marine biologist who, earlier in her career, wrote a trilogy of books about the world's oceans and their inhabitants: Under the Sea-Wind (1941), The Sea Around Us (1951), and The Edge of the Sea (1955) -- all reissued by the Library of America in 2022 (reviewed by Rebecca Giggs in "The Sea, the Sea", New York Review of Books, 11/22/2022). Giggs writes:
The vantage Under the Sea-Wind takes on its characters is close-range, but it is not internalized and so the animals' feelings remain inaccessible.... The effect of this is that, though Under the Sea-Wind takes no overt stance on animal consciousness, the outlook on the ocean is inflected by whichever creature Carson places us next to. What we see gets tinted by a sensibility seeped out of nonhuman bodies and minds, as though color gels are being affixed to a lens. The ocean is manifold and unalike, as it turns out, to an owl, a raven, and a sanderling, or to a trout, an eel, or an anglerfish. To borrow an expression from the American critic Lawrence Buell, Carson's writing in Under the Sea-Wind proves an exercise in "disciplined extrospection" -- the studied relinquishment of a self-centered perspective, guided by reaching out toward, but never quite enclosing, the viewpoint of another species.
Carl Safina, in Beyond Words: What Animals Think and Feel (2015), approvingly quotes Voltaire on Descartes: "What a pitiful, what a sorry thing to have said that animals are machines bereft of understanding and feeling".
Thomas Thwaites, for example, realized that "to inhabit the mental life of a goat, he would need to relate to his surroundings in a goatlike way", and built a kind of prosthetic device which enabled him to do so, after a fashion -- an experience he wrote about in GoatMan: How I Took a Holiday from Being Human (2016). Actually, he initially wanted to be an elephant, and the Wellcome Trust, a British foundation that supports scientific research, approved his proposal. But when he consulted a Dutch shaman, she said his proposal was "idiotic", and that he should become a deer, or a sheep, or a goat instead. And so he did.
Charles Foster tried to accomplish much the same goal simply by living like a badger, as well as he could, for six weeks in the woods -- a story he told in Being a Beast (2016; reviewed by Vicki Constantine Croke in "'I Want to Know What It Is Like to Be a Wild Thing'", New York Times Book Review, 07/17/2016). Foster also tried his hand at being a fox, a red deer, and an otter -- the last a project in which he also enrolled his children.
Goats and badgers are, at least, mammals. In What a Fish Knows: The Inner Lives of Our Underwater Cousins (2016), Jonathan Balcombe tries to understand how the world appears to a fish -- and, more importantly, marshals anecdotal and scientific evidence about intelligent problem solving in various species of piscines.
And, for good measure, Andrew Barron and Colin Klein argue that insects and other invertebrates also possess at least the limited degree of self-awareness that comes with knowing where their bodies are in space and what they are doing ("What Insects Can Tell Us About the Origins of Consciousness", PNAS 2016).
From a more conventionally scientific basis, Frans de Waal, in Are We Smart Enough to Know How Smart Animals Are? (2016), argues that the whole enterprise of comparative psychology mistakenly tries to compare other animals to humans. Instead, we should recognize that each animal species has its own unique, self-centered, subjective world -- what Jakob von Uexkull calls its Umwelt -- which cannot be fully comprehended by any other species (reviewed by Elizabeth Kolbert in "He Tried To Be a Badger", New York Review of Books, 06/23/2016, which also reviews Foster's and Balcombe's books).
See also An Immense World: How Animal Senses Reveal the Hidden Realms Around Us by Ed Yong, a science writer (reviewed by Elizabeth Kolbert in "Contact", New Yorker, 06/13/2022, in an essay that also covers recent books on animal communication). Yong notes that different species have very different sensory capacities, leading them to perceive the world quite differently from the way we do (Yong refers to Nagel's essay, and uses von Uexkull's word Umwelt to refer to an animal's subjective world). Scallops, for example, have dozens or hundreds of eyes, but apparently don't see anything. Rather, their eyes function more as motion detectors, so that when a large enough object passes slowly enough through the water, they send a signal that opens the scallop's shell to catch some food. Yong covers a wide range of species, including the black ghost knifefish. Kolbert writes:
The black ghost knifefish is, as its name implies, a nocturnal hunter. By firing a specialized organ in its tail, a knifefish creates an electric field that surrounds it like an aura. Receptors embedded in its skin then enable it to detect anything nearby that conducts electricity, including other organisms. One researcher suggests to Yong that this mode of perception, known as active electrolocation, is analogous to sensing hot and cold. Another posits that it's like touching something, only without making contact. No one can really say, though, since humans lack both electric organs and electroreceptors. "Who knows what it's like for the fish?"....
***
Yong's response to Nagel... runs along the lines of "Yes, but...". Yes, we can never know what it's like for a bat to be a bat (or for a knifefish to be a knifefish). But we can learn a lot about echolocation and electrolocation and the many other methods that animals use to sense their surrounds. And this experience is, for us, mind-expanding.
- On the occasion of the publication of his book, Yong wrote an OpEd piece in the New York Times entitled "How Animals See Themselves", which also cites Nagel's essay (06/21/2022). Yong is critical of nature documentaries, even the best of them, which always seem to portray animals' lives through a filter of human narratives:
"An elephant family searches for water.... A lonely sloth swims in search of a mate...." The result is a subtle form of anthropomorphism, in which animals are of interest only if they satisfy familiar human tropes of violence, sex, companionship and perseverance. They're worth viewing only when we're secretly viewing a reflection of ourselves.

Again referring to the idea of Umwelt, Yong instead suggests that we try to view animals "through their own eyes".
A tick’s Umwelt is limited to the touch of hair, the odor that emanates from skin and the heat of warm blood. A human’s Umwelt is far wider but doesn’t include the electric fields that sharks and platypuses are privy to, the infrared radiation that rattlesnakes and vampire bats track or the ultraviolet light that most sighted animals can see.

The Umwelt concept is one of the most profound and beautiful in biology. It tells us that the all-encompassing nature of our subjective experience is an illusion, and that we sense just a small fraction of what there is to sense. It hints at flickers of the magnificent in the mundane, and the extraordinary in the ordinary....
By thinking about our surroundings through other Umwelten, we gain fresh appreciation not just for our fellow creatures, but also for the world we share with them. Through the nose of an albatross, a flat ocean becomes a rolling odorscape, full of scented mountains and valleys that hint at the presence of food. To the whiskers of a seal, seemingly featureless water roils with turbulent currents left behind by swimming fish — invisible tracks that the seal can follow. To a bee, a plain yellow sunflower has an ultraviolet bull’s-eye at its center, and a distinctive electric field around its petals. To the sensitive eyes of an elephant hawk moth, the night isn’t black, but full of colors.
Reviewing a number of such books, Martha Nussbaum, an ethics philosopher at the University of Chicago, writes:
The world we share with the other animals is stranger and more wondrous than humans have typically realized.... As de Waal puts it...:

We used to think in terms of a linear ladder of intelligence with humans on top, but nowadays we realize it is more like a bush with lots of different branches, in which each species evolves the mental powers it needs to survive.

Nussbaum continues:

Animals who look sort of like us are stranger and more complicated than we thought, and those who look nothing like us (whales, birds) turn out to have some of the most sophisticated cognitive equipment. Nor are humans "at the top" of any ladder. Some animals have senses that we utterly lack. Many birds have a strong sense of magnetic fields and, through that, can navigate the world with an accuracy of which we can only dream. Dolphins have a capacity for echolocation, a form of perception akin to sonar, which can inform them not only of the contours of an object but also of its insides.
The new learning about animal lives and their complexity has large ethical implications. At the most general level we must face up to the fact that many, if not most, animals are not automata or "brute beasts" but creatures with a point of view on the world and diverse ends toward which they strive -- and that we interfere with these forms of life in countless ways, even when we do not directly cause pain.
For more on animal consciousness, see the lectures on "The Origins of Mind"; also "The Metamorphosis" by Joshua Rothman (New Yorker, 05/30/2016), from which some of these quotes are drawn.
Technically, Brian Farrell, a British philosopher, first posed the question of "what it would be like to be a bat" in a paper entitled "Experience" (Mind, 1950). But Nagel popularized the question, and it's his essay with that title that has entered the canon of philosophical examinations of consciousness.
In the late 18th century, the philosopher Immanuel Kant offered a tripartite classification of "irreducible" mental faculties: knowledge, feeling and desire -- or, as the 20th-century psychologist Ernest R. ("Jack") Hilgard (1980) put it, the "trilogy of mind": cognition (having to do with knowledge and belief), emotion (having to do with feeling, affect, and mood), and motivation (having to do with desires, goals and drives). Cognition, emotion, and motivation are three different mental functions, but they also serve as three broadly different types of mental states. According to this view, perceiving and remembering are different mental states, but they have in common that they are cognitive states of knowing; anger and fear are also different mental states, but they have in common that they are emotional states of feeling; hunger and thirst are different mental states, but they have in common that they are motivational states of desire.
Based on the Kant-Hilgard analysis, then, as a first pass we can identify three different qualitative states of mind, each corresponding to one of Kant's "irreducible" mental faculties. Put another way, Kant asserted that the trilogy of mind was irreducible, in that states of feeling and desire, for example, could not be further reduced to states of knowledge and belief. However, this point is controversial. Within both psychology and cognitive science, some theorists believe that cognition is the fundamental faculty, and that emotional and motivational states are reducible to cognitive states. Put another way, emotions and motives are cognitive constructions. In this view, the basic mental state is one of belief, and feelings and desires are actually beliefs about our feelings and desires. I call this situation the hegemony of the cognitive, and it is not a figment of my imagination. Cognitive psychology and cognitive science are both full of theorists who take the view that feelings and desires are cognitive constructions. Chief among these are Stanley Schachter (of Columbia University) and Richard Lazarus (late of UC Berkeley).
What Kant Really Said

My own favorite version of the Kant quote is as follows:

There are three absolutely irreducible faculties of the mind, namely, knowledge, feeling, and desire.
It's taken from the Introduction to the first edition of The Critique of Judgment (1790), Part III, "The Critique of Judgment as a Means of Connecting the Two Parts of Philosophy in a Whole". The Critique of Judgment was the last of a three-book project dealing with each of the faculties in turn: the Critique of Pure Reason (knowledge), the Critique of Practical Reason (desire), and the Critique of Judgment (feeling).
It's a great quote -- especially the first sentence. Unfortunately, that's not quite what Kant said. Judging by the various translations, Kant -- however brilliant a philosopher he certainly was -- seems to have been a very bad writer. The "quote" above is actually taken from The Philosophy of Kant, as Contained in Extracts from His Own Writings -- a kind of Reader's Digest Condensed Book, selected and translated by John Watson, and first published in 1888 (p. 311). For those who want something closer to the original source, I provide three different translations below. All are considered authoritative. I think you'll agree that Watson got it right the first time.
- Translation by James Creed Meredith (1911).
- Translation by Werner S. Pluhar (1987).
- Translation by Paul Guyer (2000).
Where Kant Got It

Just for the record, Kant's trilogy of mind wasn't original with him. It can be found in Plato's doctrine of the triune soul, according to which the brain was associated with the intellect; the heart with the "higher" passions, like anger and fear; and the spleen with the "lower" passions, like greed or desire.
Whether you believe that cognition, emotion, and motivation are irreducible, or that emotion and motivation are the products of cognitive construction, has implications for academic organization that lie at the heart of the relations between psychology and cognitive science. If everything boils down to cognition, then cognitive science can be a complete science of the mind. But if emotional and motivational states are independent of cognition, then it follows that cognitive science can't do it all, and must be supplemented by affective and conative sciences.
In fact, at the end of the 20th century a new interdisciplinary field began to emerge known as affective neuroscience, modeled on cognitive neuroscience but dedicated to the proposition that the principles of emotion are different from the principles of cognition -- otherwise, you wouldn't need a new field, would you? If affective neuroscience takes hold, can a conative neuroscience be far behind? If you're going to have separate fields for cognition, affection, and conation, why don't you just do psychology, which already encompasses all three?
It's important to remember that cognitive science arose in reaction to the dominance of behaviorism within psychology, its rejection of mentalistic concepts, and its unwillingness to consider mental processes as mediating between stimulus and response. In the final analysis, however, there's nothing that cognitive science does that psychology can't do, and psychology provides coverage of the mind that cognitive science can't.
If the term "cognitive" in cognitive science is really a euphemism, and cognitive science is really concerned with the mind in its entirety, including emotion and motivation, then it might just as well be called psychology.
Philosophers often describe mental states in terms of qualia, or the phenomenal qualities of conscious experience -- "raw feels", if you will. These are the conscious experiences that Descartes could not bring himself to doubt -- the bundles of distinct sensory qualities that make up our conscious experiences. Qualia (singular quale) refer to the distinctive states of mind associated with various sensory experiences. There is "something it is like" to see rather than hear, or to smell rather than taste, and there is "something it is like" to see red as opposed to blue, or to taste sweet as opposed to sour, and these "somethings" are the differences between and among qualia.
In his essay, "Quining Qualia", the philosopher Daniel Dennett has listed four ostensible properties of qualia: they are ineffable, intrinsic, private, and directly or immediately apprehensible in consciousness.
It's important to note that Dennett doesn't actually believe that qualia exist. We'll go into the reasons for this in the lectures on "Mind and Body", but for now understand that Dennett is simply summarizing what the traditional view of qualia is.
Of these qualities, perhaps the most important is ineffability (the others are either implications or effects of ineffability): There is no linguistic description of an experience such that understanding the description would enable someone who has never had the experience to know what that experience is like.
To illustrate this point, the philosopher Frank Jackson (1982, 1986) asks us to imagine the experience of "Mary, the color-blind scientist". Mary, a visual neuroscientist, is raised from birth in an achromatic chamber, so that she is completely deprived of exposure to all color stimuli. Meanwhile, she learns all there is to know about the nervous system, and in particular, all there is to know about color vision. What happens if Mary should leave her chamber, and be exposed to color stimuli? Will she have an altogether new experience, of color? Jackson proposes that she will have entirely new experiences of color. Knowing how physical and neural processes give rise to the experience of color does not enable us to know what the experience itself is like.
Jackson offered yet another thought experiment, this time of "Fred, the Scientist with Super-Vision" (not his actual title, but it will do). Imagine Fred, another vision neuroscientist, who has a range of visual sensitivity that extends beyond the normal -- into the infra red (> 780 nm), say, or into the ultraviolet (<380 nm). An observer -- a colleague of Mary and Fred, for example -- knows all there is to know about the visual system, and all there is to know about color. Does that observer have any knowledge of what Fred's infra-red or ultra-violet vision looks like?
And Along Comes Mary

Jackson's thought experiment has become an important focus of debate about the nature of qualia and of conscious experience. Jackson's arguments are dissected by a number of his colleagues, and by Jackson himself, in There's Something About Mary: Essays on Phenomenal Consciousness and Frank Jackson's Knowledge Argument, ed. by P. Ludlow, Y. Nagasawa, & D. Stoljar (MIT Press, 2004).
Jackson's story is one of those thought experiments that philosophers love to pose, and debate, but there is actually anecdotal evidence that bears on the question.
Knut Nordby is a scientist who actually studies color vision, even though he himself is congenitally color blind and has no idea what the colors look like. As Nordby has written (1990):
"Although I have acquired a thorough theoretical knowledge of the physics of colors and the physiology of the color receptor mechanisms, nothing of this can help me to understand the true nature of colours.... From the history of art I have also learned about the meanings often attributed to colours and how colours have been used at different times, but this too does not give me an understanding of the essential character or quality of colours.
Oliver Sacks (1995; Sacks & Wasserman, 1987) has reported on Patient J.I., a painter with excellent conceptual knowledge of colors who suffered damage to Area V4 in his occipital lobe -- an area known to be involved in processing color vision -- in an automobile accident. As a result of this damage, J.I. lost his ability to see colors; he also gradually lost his ability to imagine colors, or his ability to dream in color. He retained his conceptual knowledge of color, including his knowledge of how various objects are colored, the laws of color mixture, and the like. But he lost his ability to experience color.
Forms of Color Blindness

- Deuteranopia, a form of red-green colorblindness in which both reds and greens look greenish-gray.
- Protanopia, another form of red-green colorblindness, in which reds and greens are difficult to distinguish. Reds appear dark, and purple appears blue.
- Tritanopia, a blue-yellow colorblindness in which blues are dim, yellows look white, and purples look red.
The point of Jackson's stories, and of the two actual cases, is that our objective linguistic, conceptual, and scientific knowledge of color and color vision is not enough to give rise to the subjective experience of color. It is in this sense that the experience of red and green, yellow and blue is ineffable. It also implies that objective, third-person descriptions of color are not sufficient to yield the subjective, first-person experience of color.
Analyses of qualia are important in the study of consciousness, but we rarely experience disembodied reds and blues, sweets and sours. Rather, "red" and "sweet" are properties of the things we see and taste. For this reason, mental states are also often described in terms of their Intentionality. We don't think, or believe, or feel or desire in the abstract. Rather, we always think (etc.) something. Put another way, consciousness is representational -- it is always about something other than itself.
In fact, the 19th-century philosopher Franz Brentano argued that Intentionality is the mark of the mental. Specifically, Intentional states, in turn, are represented by propositional attitudes -- the term was coined by the English philosopher Bertrand Russell -- which state a relation between a person and some proposition P.
The notion of Intentionality is a little confusing, because the word intention has a double meaning. In philosophical discourse, as in Brentano's aphorism that "Intentionality is the mark of the mental" cited above, Intentionality refers to the object toward which a mental state is directed, or its directedness; in ordinary discourse, however, it simply refers to a property of an action, as in the familiar excuse that "I didn't intend to do it". Unfortunately, both these senses are relevant to the study of consciousness: Intentionality is a property of mental states, and it is also a feature of deliberate, conscious action. In order to keep the two senses separate, following the practice introduced by the UCB philosopher John Searle (in Intentionality: An Essay in the Philosophy of Mind, 1983), I will write Intentionality with an initial capital "I" when I am referring to the philosophical concept, and intentionality in all lower case when referring to the term in ordinary language. This makes some linguistic sense: Brentano wrote in German, and in German all nouns -- like Intentionalität, meaning "directedness" -- are capitalized; the German word for intention, in the ordinary-language sense, is Absicht, or perhaps the verb wollen ("want").
The notion of Intentionality has been pretty well worked out for cognitive states like believing and knowing, but the corresponding solution for emotional and motivational states has always been a little unsatisfactory.
As already noted, Brentano argued that Intentionality is the mark of the mental: all mental states are Intentional in nature, and only mental states are Intentional in nature. Later, Bertrand Russell argued that Intentional states are represented by propositional attitudes, which state a relation between a person and some proposition P -- the person believes, knows, feels, or wants (etc.) that P. For John Searle, Intentional states are the means by which our minds relate us to the world.
Thus, in the statement John believes that it is raining outside, the proposition about the world is that it is raining, and John's relation to that proposition is an attitude of belief. From this point of view, propositions have truth value -- they are either true or false (it is either raining or not) -- or, as Searle prefers, truth conditions -- that is, they are true under certain conditions (i.e., when it really is raining outside).
The point is important because if consciousness is about the mental, and if all mental states are Intentional states, then all conscious mental states must be Intentional in nature. The problem is that some conscious mental states don't seem to have propositional content. In the emotional state John is happy, for example, there doesn't seem to be any proposition toward which John's attitude is directed.
Similarly, in the motivational state John wants pizza there doesn't seem to be any propositional content either. Certainly there isn't any proposition; thus there is no truth value and there are no truth conditions.
The implication is that emotional and motivational states aren't mental states, because they aren't Intentional in nature. But that doesn't seem to be right either: feelings and desires are epitomes of conscious mental states.
One solution to this problem is to re-frame the emotional and motivational states as beliefs, the way cognitive constructivists do. Thus, the emotional state John likes pizza becomes the cognitive state John believes that he likes pizza which includes a propositional attitude, and a proposition that has truth value or truth conditions. In the same way, John believes he is happy.
We can pull the same trick with the motivational state John wants pizza by transforming it into the cognitive state John believes that he wants pizza. And, John believes that he's hungry.
This is fine, but the upshot of this tack is that emotional and motivational states are not irreducible after all, as Kant had argued they are, because they can be reduced to cognitive states -- to beliefs about our feelings and desires. This is fine if you're the kind of person -- a cognitive constructionist -- who approves of the "hegemony of the cognitive" in psychology and cognitive science, but it's bad if you're Immanuel Kant, who argued that knowledge (cognition), feeling (emotion), and desire (motivation) are irreducible faculties of mind. And the conclusion is also going to make other people, who think that emotional and motivational processes are at least partially independent of cognition, a little nervous. So what to do?
One solution is suggested by Searle's reanalysis of Intentionality, which downplays the importance of propositional content. From his point of view, all Intentional states have four components: a psychological mode or attitude (believing, wanting, feeling), a representative content (typically a proposition), a direction of fit, and conditions of satisfaction.
Within this framework, cognitive and motivational (conative) states are clearly distinguishable.
Cognitive states (e.g., beliefs, percepts, and memories) have propositional content, of course, but they have what Searle calls mind-to-world direction of fit, by which he means that the mind is describing a current reality that exists independently of it. Thus, in the cognitive state George believes that Martha likes him the question is whether the mental state is an accurate reflection of the world outside the mind, and the condition of satisfaction is whether the description of the world is true or false -- or, more precisely, the conditions under which the description is true.
Conative states (e.g., motivational states of want, need, and desire) also have propositional content, but they have a world-to-mind direction of fit. That is, the mind is anticipating a future reality that does not presently exist. In the conative state George wants Martha to like him the question is whether the world can be brought to match the mental state, and the condition of satisfaction is whether the desire is satisfied -- or, more precisely, the conditions under which the desire might be satisfied.
Thus, both cognitive and conative states have propositional content (something about Martha liking George), but differ in terms of direction of fit: mind-to-world for cognitive states, world-to-mind for conative states.
In this, Searle takes up Brentano's position that Intentional states are the means by which our minds relate us to the world: an Intentional state takes the form John believes that P, where P is some proposition about the world and believing is, in this instance, John's attitude toward it.
Within this framework, emotional (affective) states differ from both cognitive and conative states: they do not aim to fit the world in either direction. Thus, in the affective state George was glad that Martha liked him there is propositional content (something about Martha liking George), but no direction of fit, because the propositional content is already satisfied. It's just true (or, at least, George believes that it's true) that Martha likes him. So emotional states differ from cognitive and motivational states because there are no conditions of satisfaction: they just are what they are.
This is especially the case for our most abstract emotional states, such as John was happy.
Given this analysis, cognition, emotion, and motivation may be irreducible after all, just as Kant asserted: they are clearly distinguished from each other by their conditions of satisfaction (emotional states don't have any) and direction of fit (cognitive states fit mind to world, and conative states fit world to mind).
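To make this three-way distinction concrete, here is a minimal sketch -- my own illustration, not a formalism found in Searle (1983) -- that represents an Intentional state as a small record with a psychological mode, a propositional content, and a direction of fit. The class and field names are assumptions introduced purely for exposition.

```python
# A toy representation of Searle-style Intentional states -- an illustrative
# sketch only, not a formalism found in Searle (1983).
from dataclasses import dataclass
from enum import Enum

class DirectionOfFit(Enum):
    MIND_TO_WORLD = "mind-to-world"   # beliefs: the mind should match the world
    WORLD_TO_MIND = "world-to-mind"   # desires: the world should come to match the mind
    NONE = "none"                     # emotions: the fit is presupposed

@dataclass
class IntentionalState:
    subject: str   # whose state it is
    mode: str      # psychological mode: believing, wanting, being glad...
    content: str   # propositional content
    fit: DirectionOfFit

# The three example states discussed in the text:
belief = IntentionalState("George", "believes", "Martha likes George", DirectionOfFit.MIND_TO_WORLD)
desire = IntentionalState("George", "wants", "Martha likes George", DirectionOfFit.WORLD_TO_MIND)
gladness = IntentionalState("George", "is glad", "Martha likes George", DirectionOfFit.NONE)

for state in (belief, desire, gladness):
    print(f"{state.subject} {state.mode} that {state.content} "
          f"[direction of fit: {state.fit.value}]")
```

The point of the sketch is simply that the same propositional content can figure in all three kinds of state; only the mode and the direction of fit differ.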
This analysis (if it's actually correct) will satisfy the Kantians among us, but it creates something of a paradox. Brentano argued that Intentionality is the mark of the mental, and Searle argues that all Intentional states have conditions of satisfaction. But emotional states don't have conditions of satisfaction, so they can't be Intentional states; and if they're not Intentional, then they're not mental. But emotional states are mental states, aren't they?
The solution suggested by Searle is that emotional states are partly or largely constituted by beliefs and desires. Therefore they have propositional content after all, and conditions of satisfaction, and direction of fit. Thus, in an affective state like George likes Martha the belief is something like George believes that Martha is likable and the desire is something like George wants Martha to like him in return. And in an affective state like George is glad that Martha likes him the belief is something like George believes that he is a likable person and the desire is something like George wants Martha to like him.
So, in terms of Searle's formulation at least, affective states are reducible after all, to combinations of cognitive and conative states. That's bad for those who want to develop an independent affective (neuro)science, but it still opposes the hegemony of cognition, because something besides belief (i.e., desire) is needed, and that something is not reducible to cognition (because cognitive and conative states differ in direction of fit). So perhaps Kant was wrong after all, and there are only two absolutely irreducible faculties of mind: knowledge and desire. If so, somebody better get started developing a conative (neuro)science, because we don't have one right now.
Searle's Analysis of Intentionality: Searle, J.R. (1983). Intentionality: An Essay in the Philosophy of Mind.
The problem is worse than this, however, because there are some affective and motivational states that don't have any propositional content at all.
Consider, for example, George is happy: there is no propositional content to be satisfied in either direction; and therefore there is no direction of fit. And in another example, Martha is hungry, again, there's no propositional content to be satisfied, no direction of fit, and no conditions of satisfaction. George is just happy, and Martha is just hungry. These are clearly mental states, aren't they? And if they are mental states, they are mental states that lack propositional content, direction of fit, and conditions of satisfaction.
Still and all, Intentionality and propositional attitudes lie at the heart of the philosophical doctrine of mentalism, which in turn lies at the core of psychology: psychological explanations of behavior invoke mental states as causal entities. The behaviorist movement in psychology rejected mentalism, arguing that behavior is caused by environmental stimuli without any intervening mental states. Some philosophers (and, for that matter, some self-hating psychologists) also argue that mental states are irrelevant to behavior, leading to the positions that Owen Flanagan has described as conscious inessentialism and epiphenomenalism.
So, we don't perceive qualia in the abstract; rather, our states of mind are "about" something. Intentionality has to do with this "aboutness", or the fact that consciousness is representational. Brentano proposed that Intentionality is the mark of the mental. Intentional states are represented by propositional attitudes (a term coined by Bertrand Russell), which state a relation (of believing, etc.) between a person and some proposition about the world.
Actually, of course, there has to be something else that stands between my belief and my action -- for example, a desire to achieve, or avoid, certain consequences. Thus, according to the Doctrine of Mentalism in philosophy, these propositional attitudes cause us to behave the way we do; according to the contrary Doctrine of Epiphenomenalism, however, propositional attitudes are actually irrelevant to our behavior.
The discussion of Intentionality has been a little misleading, because it has been framed in the third person, illustrated by the mental states of other people -- namely, George, Martha, and John.
But, as James put it so well, "Thought tends to personal form.... The universal conscious fact is not 'feelings and thoughts exist' but 'I think' and 'I feel'."
Or, as Thomas Nagel put it, "there is something that it is like" to be conscious.
Or, as John Searle has put it, "Conscious states exist only as they are experienced by a human or animal subject".
This is the element of subjectivity. Consciousness is inherently subjective, and any analysis of consciousness that leaves it out misses the mark.
This brings us to the matter of subjectivity. There has been considerable ambiguity and confusion about the distinction between the objective and the subjective, which Searle has been at pains to try to straighten out.
In the first place, there is a distinction between objective and subjective ontology. Conscious states would not exist if there were no one to experience them, so they have a subjective ontology: their existence depends on an observer who experiences them.
In the second place, there is a distinction between objective and subjective epistemology: an epistemically objective claim can be settled as a matter of fact (for example, that water boils at 100 degrees Celsius at sea level), while an epistemically subjective claim is a matter of opinion or taste (for example, that vanilla ice cream tastes better than chocolate).
The situation is further complicated, as Searle notes, by the distinction between observer-independent and observer-dependent (or observer-relative) features of the world. Ordinarily, we'd think of ontologically objective entities as observer-independent and ontologically subjective entities as observer-relative, and that would be that. But Searle argues that consciousness is both ontologically subjective and observer-independent.
Conscious mental states are ontologically subjective, in that they do not exist independently of an observer. That is the challenge for a "scientific" approach to consciousness, and what makes some cognitive scientists nervous about the whole topic -- part of Owen Flanagan's "conscious shyness" is the "positivist suspicion" that consciousness cannot be studied scientifically precisely because it is subjective and private, while science is public and objective.
But as Searle points out, "Ontological subjectivity of the subject matter does not preclude an epistemically objective science of that same subject matter". The whole point of a course entitled "Scientific Approaches to Consciousness" is to achieve epistemically objective knowledge about ontologically subjective states of mind. If that's not possible, then we should all go home. But of course, it is possible. As Searle notes, psychology especially, but also neurology, cognitive science, and cognitive neuroscience are all dedicated to developing an epistemically objective knowledge of mind, including consciousness.
One common scientific approach is to reduce consciousness to brain processes (think of Lord Rutherford, who is said to have declared that "all science is either physics or stamp collecting"). But, Searle argues, this tack must fail. According to him, you can't reduce ontologically subjective facts (e.g., about consciousness) to ontologically objective facts (e.g., about brain processes), because any such reduction leaves out subjectivity -- which is the very thing the reduction is supposed to explain.
Nor, for that matter, is it possible to explain consciousness with observer-relative facts. This would be circular, because observer-relative facts already presuppose consciousness.
One final point about observer-relativity and science. Observer-independent entities lie, generally, in the domain of the natural sciences, such as physics, chemistry, and biology. Observer-relative entities, which include all of the phenomena created by consciousness, lie in the domain of the social sciences, such as history and sociology. Psychologists sometimes think that they have to choose between these domains, allying themselves either with biology or with the social sciences. But they don't, because psychology, as the science of mental life, is both a natural science and a social science.
Another philosopher, Ned Block, has made a distinction between two kinds of consciousness, based on subjectivity: phenomenal consciousness (P-consciousness), the subjective experience itself -- what it is like to be in a given mental state -- and access consciousness (A-consciousness), the availability of information for reasoning, verbal report, and the control of behavior.
For Block, you can have P-consciousness without A-consciousness, as when there is background noise, but you don't pay any attention to it, so it doesn't interact with what you're thinking. And you can have A-consciousness without P-consciousness, as in cases of blindsight, where a person can make judgments about the visual properties of an object without consciously seeing that object. A-consciousness without P-consciousness is a characteristic of unconscious mental life, about which we will have more to say later.
Let's return to James for a moment, and his idea that thought tends to personal form. Just as Intentionality suggests that a focus on qualia is not enough, so subjectivity suggests that a focus on Intentionality is not enough either. That is, a description of a mental state such as George believes that Martha likes him isn't an adequate description of consciousness, because it leaves out personal subjectivity. The appropriate description is I believe that Martha likes me. All conscious thoughts, feelings, and desires are personal thoughts, feelings, and desires: they take the self as their subject. Paraphrasing James, the universal conscious fact is not that "George thinks" and "George feels" but rather "I think" and "I feel".
Going beyond syntax, it appears that self-reference takes one of four forms, depending on what the UCB linguist Charles Fillmore (1968; see also Brown & Fish, 1983) referred to as semantic role -- by which he meant that the subject of any sentence can play one of four semantic roles.
As noted earlier, the property of subjectivity seems to make a scientific study of consciousness impossible: how do you make an objective study, based on public observations, of something that is inherently private and subjective? How can we know what a person is really seeing, remembering, thinking, or feeling?
This epistemological issue is brought to a head by synesthesia. In this phenomenon, a stimulus in one modality elicits sensation in another: for example, presentation of a sound may elicit the visual experience of color. Alternatively, the modality of experience may remain the same, but some unpresented quality may be added to the perceptual experience: for example, letters or digits presented in black-and-white may be experienced in color. (Apparently, different synesthetes have different item-color relations.)
In 1883, Sir Francis Galton took note of the individuality of synesthetic experiences: "To ordinary individuals one of these accounts seems just as wild and lunatic as another but when the account of one seer is submitted to another seer, the latter is scandalized and almost angry at the heresy of the former".
Roman Jakobson, the linguist, described a multilingual woman with phoneme-color synesthesia, who saw colors when she heard certain consonants and vowels: "As time went on words became simply sound, differently colored, and the more outstanding one color was, the better it remained in my memory. That is why, on the other hand, I have great difficulty with short English words like jut, jug, lie, lag, etc.: their colors simply run together." For her, Russian has "a lot of long, black and brown words", while German scientific expressions "are accompanied by a strange, dull yellowish glimmer".
Subject MLS displays letter-color synesthesia (Mills et al., 2002). Each letter of the alphabet is associated with a different color. In fact, MLS is multilingual, fluent in Russian as well as German, French, English, and Polish. In her synesthesia, she has one set of colors for Roman letters, and another set for Cyrillic.
Subject C displays digit-color synesthesia (Dixon et al., 2000, 2001, 2002a, 2002b). C was first studied for her extraordinary memory, as reflected in her ability to remember lists of 9-digit strings over intervals of as long as 2 months. In the course of investigating how she accomplished this feat, she happened to mention that she sees color whenever she sees, hears, or thinks of digits. When digits are presented in conventional black-on-white form, the color overlays the printed item.
Cases of synesthesia are often labeled in pairwise combinations of the stimulus inducer and the concurrent experience: thus, in grapheme-color synesthesia, certain units of written language elicit color. In this way, we can distinguish between synesthetic experiences and other anomalies of perception:
According to Cytowic (1989, 1993), synesthetic experiences tend to have a number of features in common:
In theory at least, the relationship between the inducer and the concurrent is automatic (for a detailed discussion of automaticity, see the lectures on Attention and Automaticity). This can be demonstrated with a variant on the Stroop test, in which color names are replaced with the inducer. For example, a subject who sees the letter A as red would be shown a string of As -- e.g., AAAAA -- printed either in the concurrent color (red) or in some other color (green). The subject is then asked simply to name the color in which the string is printed. Response latencies are shorter when the string is printed in the concurrent color, and longer when it is printed in a different color, which suggests that the synesthetic subject cannot help seeing the As as red -- and this involuntary percept interferes with perceiving, or at least naming, the green ink (Mattingley, 2001).
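As a concrete illustration of the logic of this paradigm, here is a minimal sketch of how such trials might be scored. It is not code from the actual studies; the trial records, color assignments, and response times are hypothetical numbers invented for the example.

```python
# A minimal sketch (not from the original studies) of scoring a synesthetic
# Stroop variant: strings of a synesthete's inducer letter are printed either
# in the concurrent color (congruent) or another color (incongruent), and the
# subject names the ink color. All values below are hypothetical.
from statistics import mean

# Hypothetical trial records: (ink_color, synesthetic_concurrent, response_time_ms)
trials = [
    ("red",   "red", 520),  # congruent: ink matches the synesthetic color
    ("green", "red", 655),  # incongruent: synesthetic red competes with green ink
    ("red",   "red", 505),
    ("green", "red", 690),
]

def congruent(ink, concurrent):
    """A trial is congruent when the ink color matches the synesthetic concurrent."""
    return ink == concurrent

congruent_rts   = [rt for ink, conc, rt in trials if congruent(ink, conc)]
incongruent_rts = [rt for ink, conc, rt in trials if not congruent(ink, conc)]

# If the synesthetic color is perceived automatically, incongruent naming
# should be slower -- a Stroop-like interference effect.
print("mean RT, congruent trials:  ", mean(congruent_rts), "ms")
print("mean RT, incongruent trials:", mean(incongruent_rts), "ms")
print("interference effect:        ", mean(incongruent_rts) - mean(congruent_rts), "ms")
```

Real studies, of course, use many trials per condition and compare synesthetes with non-synesthetic controls, for whom congruency should make no difference.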
It would be easy to suggest that synesthetic relations are metaphoric in nature, such as when we describe the taste of cheese as sharp, or the feeling of sadness as "being blue". However, synesthetes insist that their experiences are perceptual in nature.
According to another theory, synesthetic relations reflect associative learning during childhood. For example, the associations between letters and numbers on the one hand, and colors on the other, might have been learned through experience with blocks. But if that is the case, why aren't there more synesthetes around?
Perhaps the most interesting theory of synesthesia, proposed by Ramachandran and Hubbard (2001), is that it is the result of sensory linkage -- a kind of "cross-wiring" between brain centers that generate the different modalities of sensation, and the different qualities of sensation within each modality. This proposal is consistent with Muller's Doctrine of Specific Nerve Energies (discussed in the lectures on Psychophysics) -- and especially with the view that sensory experience is generated by the projection areas where sensory impulses end up. So, for example, sound-color synesthesia might be generated by cross-wiring that carries neural impulses from the auditory projection area in the temporal lobe to Area V4 of the occipital lobe, which is involved in color perception.
An alternative theory, proposed by Grossenbacher and Lovelace (2001), is in terms of disinhibited feedback from the association areas to the sensory projection areas of the cortex. Under normal circumstances, it is proposed, the sensory projection areas send information to the association areas, but not the other way around: that direction of influence is inhibited. But if there is disinhibition, then information being processed in the association areas -- e.g., for reading words -- can leak back to the sensory areas (like the color area in V4), causing letters to have colors.
In either case, synesthesia illustrates the Doctrine of Specific Nerve Energies and the associated Doctrine of Specific Fiber Energies: conscious experience is not tied to the stimulus (if it were, then synesthetes could never perceive letters printed in black-and-white to be colored). Instead, conscious experience is tied to the brain that processes the stimulus. For example, studies have sometimes (but not always) found that synesthetic color activates area V4.
Most theories of synesthesia assume that inter-modal sensation reflects some sort of "crossing of the wires", along with the further assumption that the neural organization giving rise to synesthesia is in some sense innate -- synesthetes' brains are just wired that way. However, Witthoft and Winawer (2013) identified 11 synesthetes whose color-grapheme correspondences were identical. The investigators attributed this coincidence to the fact that each of the subjects had been exposed in childhood to the same set of magnetic letters, in which different letters and numbers were printed in different colors. This means that synesthesia can be learned, presumably incidentally. All of the subjects met the standard criteria for synesthesia -- their cross-modal experiences were specific, automatic, and stable over time. So they were real synesthetes, not just mnemonists. Any theory of synesthesia will have to make room for learning and memory as well as neural connections.
As interesting as the sensory linkage theory is, what synesthesia needs now is more experimental research to answer fundamental questions about the phenomenon -- such as whether it's really perceptual. While past studies of synesthesia were mostly clinical and impressionistic in nature, more recent work has applied carefully controlled experimental paradigms to determine what synesthetic subjects actually experience. Among the most interesting of these was a pioneering set of studies by Ramachandran and Hubbard (2000) at UCSD (Hubbard majored in Cognitive Science as a UCB undergraduate).
Ramachandran & Hubbard (2000) studied two subjects who experience digit- and letter-color synesthesia:
In one experiment, the subjects performed a perceptual grouping task with an array of digits printed in black and white. When random digits are spaced out evenly in the array, by chance we would expect roughly half of subjects to group them into vertical columns, and the others to group them into horizontal rows -- and to do so pretty much arbitrarily. However, rows of similar digits, like 3 and 8, tend to bias grouping toward the horizontal, as a reflection of the familiar Gestalt principle of organization by similarity of shape.
As it happened, subject ER saw 3s and 7s as red, and 8s and 0s as blue. As a result, ER grouped the array by columns instead of rows. Taken together, the synesthetic subjects tended to organize the arrays by similarity of color, while controls tended to organize the arrays by similarity of shape.
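To see how the logic of this demonstration works, here is a toy model -- my own sketch, not the authors' analysis -- that predicts grouping orientation from whichever similarity relation (shape for controls, synesthetic color for ER) binds neighboring digits more strongly. The display layout, shape-similarity set, and color mapping below are simplified assumptions.

```python
# A toy model (an illustrative sketch, not Ramachandran & Hubbard's analysis)
# of grouping in a digit array: controls group by shape similarity, a
# synesthete like ER by synesthetic color. Layout and mappings are simplified.

# Shape similarity: 3 and 8 share most of their strokes; 7 and 0 do not.
SHAPE_SIMILAR = {frozenset({"3", "8"})}

# ER's reported synesthetic colors (from the text): 3s and 7s red, 8s and 0s blue.
COLOR_OF = {"3": "red", "7": "red", "8": "blue", "0": "blue"}

# Two adjacent rows of the display: one alternates 3s and 8s, the other 7s and 0s,
# so shape-similar digits lie in rows and same-colored digits lie in columns.
row_a = ["3", "8", "3", "8"]
row_b = ["7", "0", "7", "0"]

def predicted_grouping(use_color):
    """Predict 'rows' or 'columns' from which neighbors count as belonging together."""
    def same(a, b):
        if use_color:
            return COLOR_OF[a] == COLOR_OF[b]
        return a == b or frozenset({a, b}) in SHAPE_SIMILAR

    horizontal = (sum(same(x, y) for x, y in zip(row_a, row_a[1:])) +
                  sum(same(x, y) for x, y in zip(row_b, row_b[1:])))
    vertical = sum(same(x, y) for x, y in zip(row_a, row_b))
    return "rows" if horizontal > vertical else "columns"

print("Control (shape-based) grouping:   ", predicted_grouping(use_color=False))
print("Synesthete (color-based) grouping:", predicted_grouping(use_color=True))
```

The toy prediction mirrors the finding reported above: shape-based similarity favors grouping by rows, while ER's color-based similarity favors grouping by columns.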
In another experiment, on visual search, they exploited the phenomenon of pop-out, in which distinctive targets are identified automatically. The digits 2 and 5 have pretty much the same features, and so it is hard to find 2s embedded in an array of 5s -- especially when, as in this experiment, the array is presented only for a single second. Moreover, even when they identify one or more 2s, most people usually fail to detect patterns of targets that may be embedded in the array.
However, the task is quite different for digit-color synesthetes, for whom the target is perceived in a color that distinguishes it from the other digits in the array. Accordingly, the synesthetic subjects were more likely to detect the target, and were more likely to detect the hidden pattern as well. Their synesthetic experience made the target "pop out" in a way that was not the case for the control subjects.
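Here, similarly, is a minimal sketch -- my own construction, not the actual stimuli or analysis from these studies -- of the embedded-figure display: a few 2s hidden among 5s, arranged in a simple pattern. For a digit-color synesthete, each digit carries its own color, so the targets can be picked out in a single color-based pass rather than by effortful item-by-item shape inspection; the color assignments are hypothetical.

```python
# A sketch of a 2s-among-5s search display (illustration only; not the
# original stimuli). Hypothetical synesthetic colors are assigned to digits.

ROWS, COLS = 6, 10
SYNESTHETIC_COLOR = {"2": "red", "5": "green"}

# The target 2s form a hidden diagonal pattern in the array.
TARGET_CELLS = {(r, r + 3) for r in range(4)}

grid = [["2" if (r, c) in TARGET_CELLS else "5" for c in range(COLS)]
        for r in range(ROWS)]

def pop_out(grid, color):
    """Return all cells whose synesthetic color matches -- a single parallel pass."""
    return [(r, c)
            for r, row in enumerate(grid)
            for c, digit in enumerate(row)
            if SYNESTHETIC_COLOR[digit] == color]

for row in grid:
    print(" ".join(row))
print("Cells that 'pop out' as red for the synesthete:", pop_out(grid, "red"))
```

For a non-synesthetic observer, by contrast, finding the 2s requires checking the shapes one by one, which is why the hidden pattern is usually missed under brief exposure.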
Research on synesthesia is just beginning (the first international conference on synesthesia was held at UCB in Fall 2004). However, the studies of Ramachandran and Hubbard show what some of the possibilities are -- possibilities that are constrained only by the ingenuity of the experimenters.
At one level, synesthetic experiences represent anomalies in qualia, because the subject perceives a sensory quality that is not "in" the stimulus. Synesthesia reminds us that conscious experience is not given by the stimulus -- it's constructed in the perceiver's mind, by the perceiver's brain.
In any event, the discussion of Intentionality permits us to frame certain questions about consciousness more precisely, so that they become somewhat more tractable:
These are questions for philosophical debate, but they are also questions for scientific research, so we can hope that empirical research on consciousness will shed light on these difficult issues. For now, however, we turn to the earliest scientific approach to consciousness:
This page last modified 07/21/2023.