In the introductory lecture, I noted that the existence of the self creates a qualitative difference between social and nonsocial cognition.
The self lies at the center of mental life. As William James (1890/1981, p. 221) noted in the Principles of Psychology,
Every thought tends to be part of a personal consciousness.... It seems as if the elementary psychic fact were not thought or this thought or that thought but my thought, every thought being owned.... On these terms the personal self rather than the thought might be treated as the immediate datum in psychology. The universal conscious fact is not "feelings and thoughts exist" but "I think" and "I feel"....
In other words, conscious experience requires that a particular kind of connection be made between the mental representation of some current or past event, and a mental representation of the self as the agent or patient, stimulus or experiencer, of that event (Kihlstrom, 1995). It follows from this position that in order to understand the vicissitudes of consciousness, and of mental life in general, we must also understand how we represent ourselves in our own minds, how that mental representation of self gets linked up with mental representations of ongoing experience, how that link is preserved in memory, and how it is lost, broken, set aside, and restored.
James went on to distinguish between three aspects of selfhood:
My Avatar, My Selfie, My Shelfie, My Self?

To James's list we might now add one's avatar -- the image chosen to represent oneself in video and online games like World of Warcraft (no, I'm not a gamer, unless you count cribbage and backgammon). An avatar, after all, is nothing less than a digital representation of yourself. And if you configure a close enough resemblance, you're dealing with what's known as a digital doppelganger (Magid, 1998; Chimielewski, 2005). And, it turns out, watching your digital doppelganger behave in a game environment can actually have effects on your self-concept. There are studies to be done here (hint, hint).
James was onto something with his concept of the material self: our possessions, like our avatars, are expressions of our selves. I'm the kind of person who buys a Subaru, and I'm not the kind of person -- whatever that is -- who would buy a Lamborghini, even if I could afford it. At least some of our possessions are, quite literally, expressions of ourselves -- what we might call the material culture of the self. There are studies to be done here (hint, hint). One possibility, if you could get around the obvious confidentiality problem, would be a study of people's passwords, and why they chose them. Yes, your passwords are supposed to be random numbers, letters, and symbols; but they aren't. Yes, you're supposed to use different passwords for different sites; but you don't. Yes, you're supposed to change them all the time; but you never do. And it's not just a matter of laziness, or the demands on memory. It's that our passwords are part of us; they represent something about ourselves. For more on "keepsake" passwords, see "The Secret Life of Passwords" by Ian Urbina, who coined the term (New York Times Magazine, 11/23/2014).
The Covid-19 pandemic offered another opportunity to study the material culture of the self. During the lockdown, with many people working from home and social distancing strongly encouraged, many people communicated with each other via Skype, Zoom, and similar utilities. This circumstance allowed people to look inside friends' and coworkers' homes for the first time; and, accordingly, many people deliberately arranged the backgrounds of their videos in such a way as to make a statement about themselves -- by displaying favorite books, mementoes, and the like. Not only that: Room Rater, an account on Twitter, emerged with screenshots of various backgrounds, such as those appearing on news programs, accompanied by brief comments and a 1-10 rating. The trend was noted by popular magazines such as House Beautiful ("Room Rater is the Best Thing to Come Out of Lockdown" by Hadley Keller, June 2020).

Developmental psychologists (and parents) note that object attachment begins very early in life (think of your childhood teddy bear); and while attachment to specific objects (like that teddy bear) may drop off in later childhood, our attachment to certain objects remains very strong throughout life. Our possessions help define who we are. Russell Belk, a specialist in consumer behavior, calls this the extended self (e.g., Belk & Tian, 2005). To some extent, we are what we own -- so much so that we can be traumatized when we lose our possessions, or when they are taken from us. Certainly we are reluctant to give them up. In a classic study, Kahneman and his colleagues (1990) gave college students coffee mugs embossed with their college logo, and then allowed them to trade them in a kind of experimental marketplace. Interestingly, they found that the students were very reluctant to sell their mugs -- even though they hadn't owned one before the experiment, and they hadn't paid anything for them in the first place. Selling prices were very high, and buying offers were very low. This is known as the endowment effect -- a reflection of loss aversion. The subjects simply didn't want to lose something that they now owned.
This puzzling problem arises when we ask: "Who is the I who knows the bodily me, who has an image of myself and sense of identity over time, who knows that I have propriate strivings? I know all these things and, what is more, I know that I know them. But who is it who has this perspectival grasp...? It is much easier to feel the self than to define the self."
One way to define the self is simply as one's mental representation of oneself (Kihlstrom & Cantor, 1984) -- a knowledge structure that represents those attributes of oneself of which one is aware. As such, the self contains information about a variety of attributes:
Within personality and social psychology, the self-concept is commonly taken as synonymous with self-esteem, but within the social-intelligence framework on personality (Cantor & Kihlstrom, 1987) the self-concept can be construed simply as one's concept of oneself, a concept no different, in principle, than one's concept of bird or fish. From this perspective, the analysis of the self-concept can be based on what cognitive psychology has to say about the structure of concepts in general (Smith & Medin, 1981).
As a first pass, we may define the self as a list of attributes that are characteristic of ourselves, and which serve to differentiate ourselves from other people.
One possible way to assess the self-concept is simply to give people an adjective checklist and have them indicate the degree to which each item is self-descriptive. But such a procedure does not distinguish between those attributes which are shared with other people, and those that are truly distinctive about oneself. Nor does it distinguish those attributes that are trivial from those that are truly critical to one's self-concept.
For that reason, Hazel Markus (1977) introduced the notion of the self-schema. In her research, she presented her subjects with the usual sort of adjective checklist, but asked them to make two different ratings for each item:
Markus' notion of "self-schematicity" (pardon the neologism) was an important advance in the assessment of the self-concept, but it was not entirely satisfactory. As Robert Dworkin and his colleagues noted, her method essentially confounds self-descriptiveness and self-importance. Subjects who have a moderate standing on a particular trait (e.g., midway between dependence and independence) must be classified as aschematic for that trait, even if their moderate standing is extremely important to their self-concept. This doesn't seem right. The implication of Dworkin's argument is that the only rating that really matters, so far as the self-concept is concerned, is self-importance.
Of course, even the fact that subjects give high self-importance ratings to some attribute doesn't mean that it's really part of their self-concept. Strictly speaking, the self-concept should focus on how people naturally, spontaneously, think about themselves. Following this logic, William McGuire and his colleagues put forward the notion of the spontaneous self-concept -- in which subjects define themselves in their own terms. In his procedure, McGuire essentially presents subjects with a blank piece of paper with the instruction "Tell me about yourself". This results in a free listing of attributes that is entirely idiographic in nature -- that is, subjects are not forced to use terms chosen by the experimenter, or terms that might also be used by other subjects. They are allowed to define their self-concepts freely.
In one study of sixth graders, McGuire and his colleagues performed a content analysis of the spontaneous self-concept. Interestingly, self-esteem played a relatively minor role, accounting for only about 7% of the attributes listed. Habitual activities, and other people, played a much larger role in the way these children thought about themselves.
In his research, McGuire has been particularly interested in testing what he has called the distinctiveness postulate:
In testing the distinctiveness postulate, McGuire and his colleagues classified their subjects (again, sixth graders) as typical or atypical on various objectively measured attributes, such as age or birthplace. Across a large variety of such attributes, they found that children were more likely to mention a feature if they were atypical on that feature with respect to their classmates. For example, left-handed children are more likely to mention handedness in their spontaneous self-concepts than are right-handed children.
An interesting feature of distinctiveness is that "atypicality" is not defined in the abstract, with respect to population statistics, but rather concretely, with respect to the immediate social context. This was shown clearly by McGuire's analysis of the appearance of sex or gender in subjects' spontaneous self-descriptions. Roughly 50% of the population is male and 50% female, so gender can't really be atypical in the abstract. But it turned out that gender was mentioned more frequently by sixth-graders who were atypical for gender with respect to the distribution of the sexes in their classrooms or households.
Thus, we can redefine distinctiveness for purposes of exploring the distinctiveness postulate further:

More on Distinctiveness

There's lots more research to be done on McGuire's distinctiveness postulate. For example:
But here's one idea: One could employ an adjective checklist such as Markus used, and use the self-descriptiveness ratings as a basis for assessing typicality, and the self-importance ratings as a proxy for the spontaneous self-descriptions. The prediction is that subjects are likely to be self-schematic for psychosocial attributes on which they describe themselves as atypical. Anybody who wants to do such a study: YOU READ IT HERE FIRST!
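As a rough illustration, here is a minimal sketch of how the analysis might run, with invented ratings and subject labels (nothing here comes from Markus's or McGuire's actual materials): atypicality on each attribute is scored as distance from the sample mean of the self-descriptiveness ratings, and the distinctiveness postulate predicts a positive correlation between atypicality and self-importance.

```python
# Hypothetical sketch of the proposed study (all names and data invented).
# For each checklist adjective, compute how atypical each subject's
# self-descriptiveness rating is relative to the sample, then test whether
# atypicality predicts self-importance (a proxy for self-schematicity).

from statistics import mean, stdev, correlation  # correlation: Python 3.10+

# ratings[subject][adjective] = (self_descriptiveness 1-11, self_importance 1-11)
ratings = {
    "s1": {"independent": (10, 9), "thrifty": (6, 2)},
    "s2": {"independent": (6, 3),  "thrifty": (9, 8)},
    "s3": {"independent": (5, 2),  "thrifty": (6, 3)},
}

adjectives = ["independent", "thrifty"]
atypicality, importance = [], []
for adj in adjectives:
    descr = [ratings[s][adj][0] for s in ratings]
    m, sd = mean(descr), stdev(descr)
    for s in ratings:
        d, imp = ratings[s][adj]
        atypicality.append(abs(d - m) / sd)  # z-distance from the sample mean
        importance.append(imp)

# The distinctiveness postulate predicts a positive correlation.
print(f"r(atypicality, importance) = {correlation(atypicality, importance):.2f}")
```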
From the time of Aristotle until only just recently, concepts were characterized as proper sets: summary descriptions of entire classes of objects in terms of defining features which were singly necessary and jointly sufficient to identify an object as an instance of a category. Thus, the category birds includes warm-blooded vertebrates with feathers and wings, while the category fish includes cold-blooded vertebrates with scales and fins.
In principle, at least, a classical proper-set structure could be applied to the self-concept. Thus, the self-concept could be a summary description of oneself, whose defining features consist of those attributes that are singly necessary and jointly sufficient to distinguish oneself from all others. This is possible in principle, but in practice it seems not terribly useful, as defining features would probably be restricted to the individual's birth date, place, and time, and the names of his or her parents.
But both philosophical considerations and the results of experiments in cognitive psychology have persuaded us not to think about concepts in terms of proper sets and defining features, but rather in terms of family resemblance, in which category members tend to share certain attributes, but there are no defining features as such. According to this view of categories as fuzzy sets, category instances are summarized in terms of a category prototype which possesses many, but not necessarily all, of the features which are characteristic of category membership.
In the late 1970s and early 1980s the idea that the self, too, was represented as a category prototype was popular, and some very interesting experiments were done based on this assumption (Rogers, 1981). But category prototypes are abstracted over many category instances. How does one talk about family resemblance, or abstract a prototype, when there is only one member of the category -- oneself? The notion of self-as-prototype, taken literally, seems to imply that the self is not unitary or monolithic. We do not have just one self: rather, each of us must have several different selves, the characteristic features of which are represented in the self-as-prototype.
In fact, the idea that we have a multiplicity of selves can be traced to the very beginnings of social psychology.
Taken to the extreme, the self-concept as a set of exemplars is exemplified (sorry) by dissociative identity disorder, formerly known as multiple personality disorder. In this exceedingly rare psychiatric syndrome, the patient possesses two or more different identities, each associated with a different set of autobiographical memories. The different identities, in turn, are separated by an interpersonality amnesia, which is typically asymmetrical. For example, in the famous case of the Three Faces of Eve, there were three different "personalities", or identities, within a single Georgia housewife:
But one does not have to have a dissociative disorder to have a multiplicity of selves. Traditional personologists assume that behavior is broadly stable over time and consistent over space, and that this stability and consistency reflect traits which lie at the core of personality. Viewed cognitively, the self might be construed as the mental representation of this core. But social psychologists have argued that behavior is extremely flexible, varying widely across time and place.
Accordingly, a person might have a multiplicity of context-specific selves, representing what we are like in various situations, and reflecting our awareness of the contextual variability of our own behavior. For example:
This is all well and good, but maybe there is not a prototype after all. Another trend in cognitive psychology has been to abandon entirely the notion that concepts are summary descriptions of category members (Medin, 1989). Rather, according to the exemplar view of categories, concepts are only a collection of instances, related to each other by family resemblance perhaps, and with some instances being in some sense more typical than others, but lacking a unifying prototype at the highest level. Some very clever experiments have lent support to the exemplar view, but as yet it has not found its way into research on the self-concept. Nevertheless, the general idea of the exemplar-based self-concept is the same as that of the context-specific self, only lacking hierarchical organization or any summary prototype.
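The difference between the two views can be made concrete with a small sketch. The features and context-specific selves below are invented for illustration; the point is only that the prototype model compares a candidate against one summary representation, while the exemplar model compares it against every stored instance, with no summary at the top.

```python
# A minimal sketch (invented features) contrasting prototype-based and
# exemplar-based similarity, applied to context-specific selves.

# Each context-specific self is a set of characteristic features.
selves = {
    "self_at_work":      {"organized", "reserved", "punctual"},
    "self_at_home":      {"relaxed", "playful", "punctual"},
    "self_with_friends": {"talkative", "playful", "relaxed"},
}

def jaccard(a, b):
    """Family-resemblance similarity: shared features / total features."""
    return len(a & b) / len(a | b)

# Prototype view: summarize the category with features shared by most members.
all_features = set.union(*selves.values())
prototype = {f for f in all_features
             if sum(f in s for s in selves.values()) >= 2}

candidate = {"relaxed", "playful", "organized"}

# Prototype model: one comparison, against the summary representation.
print("prototype similarity:", jaccard(candidate, prototype))

# Exemplar model: compare against every stored instance; no summary needed.
for name, s in selves.items():
    print(name, jaccard(candidate, s))
```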
Regardless of whether this "family" of context-specific selves is united by a summary prototype, or exists simply as a set of exemplars, it is possible for these multiple selves to come into conflict -- as when, speaking metaphorically, the angel on your right shoulder tells you to do one thing, and the devil on your left shoulder tells you to do something else. T.C. Schelling, an economist, has suggested that these alternate selves can work against each other in a process he calls self-binding: one of your selves wants to eat that Little Debbie, the other wants to stay thin, and they duke it out in your head.
The three views of categorization presented so far -- proper sets, fuzzy sets, and exemplars -- all assume that the heart of categorization is the judgment of similarity. That is, instances are grouped together into categories because they are in some sense similar to each other. But similarity is not the only basis for categorization. It has been proposed that categorization is also based on one's theory of the domain in question; or, at least, that people's theories place constraints on the dimensions which enter into their similarity judgments (Medin, 1989).
Epstein's views have not been translated into programmatic experimental research on the self, but we can perhaps see examples of theory-based construals of the self in the variety of "recovery" movements in American society today (Kaminer, 1992). Whether we are healing our wounded inner child, freeing our inner hairy man, dealing with codependency issues, or coping with our status as an adult child of alcoholics or a survivor of child abuse, what links us to others, and literally constitutes our definition of ourselves, is not so much a set of attributes as a theory of how we got the way we are. And what makes us similar to other people of our kind is not so much that they resemble us but that they went through the same kind of formative process, and hold the same theory about themselves as we do of ourselves. Dysfunctional or not, it may well be that we all have theories -- we might call them origin myths -- about how we became what we are, and these theories are important parts of our self-concept. Such self-theories could give unity to our context-specific multiplicity of selves, explaining why we are one kind of person in one situation, and another kind of person in another (Kihlstrom et al., 1995).
The Self and Identity Politics

For many people, a major part of their self-concept concerns their identification with some sociodemographic group -- usually, though not always, a minority group or other outgroup. Partly, this reflects McGuire's distinctiveness postulate -- that people include in their self-concepts those features that tend to distinguish them from others, which infrequent features do almost by definition. But since the 1970s, as a result of the civil rights and feminist movements in the United States and elsewhere, group identity has also taken on a political dimension, as social and political activists drew on Marxist notions of class analysis and consciousness-raising. See, for example, the work of Arthur Schlesinger, Jr., a historian who argued in The Disuniting of America that identity politics threatened to destroy the common culture that was, in his view, necessary for liberal democracy to thrive. Identity politics can infect majorities and ingroups as well, as we can see in the predominantly white, Euro-centric "Tea Party" movement that came on the scene in the wake of the 2008 financial crisis. As a result of identity politics, terms like "black", "feminist", and "LGBT" (for Lesbian-Gay-Bisexual-Transgender) stand for political identities as well as for features of one's own personal self-identity. And, accordingly, one's identification with these political movements can become incorporated into one's self-concept. And a person's identification with some group becomes an important element in how that person will be perceived by others.

A dramatic example of this came in 2010, when President Barack Obama filled out his form for the U.S. Census. Viewed objectively, Obama is of mixed race, and (beginning with the 2000 census) he could have identified himself as such on his census form. But he didn't -- he checked only the box for "Black, African-American, or Negro". Obama was criticized for this in some quarters, on the view that such an act betrayed his self-presentation as someone who is "post-racial". It's objectively true, of course, that Obama is the product of a mixed marriage, with a white mother and a black father. But that's not the way he subjectively identifies himself. Despite being raised by his white maternal grandparents, Obama has always cultivated an identity as a black person -- as indicated clearly in his memoir, Dreams From My Father. And, as he quipped to David Letterman, he "actually was black before the election". What makes Obama "post-racial", if anything, is that he is able to express pride in his black heritage without making whites uncomfortable.

Transgender people who are objectively male can identify themselves as female ("I'm a woman trapped in a man's body"). Mixed-race people can identify themselves as black -- or white. Even more than gender, race is a social construction. And, for that matter, there are cases of transracial identification -- not just light-skinned Blacks "passing" for white (as discussed in the lectures on Social Categorization), but also white individuals who identify themselves as Black. An interesting example is that of Rachel Dolezal, a white woman who served as president of the Spokane, Washington chapter of the National Association for the Advancement of Colored People. Now, there's no problem with a white person being a member, or even an officer, of the NAACP. The issue is that Dolezal identified herself as black. Both her parents are white, with no African ancestry; interestingly, they adopted two Black children, Izaiah and Ezra. When her deception was discovered, Dolezal argued that, although she was "biologically born white", she had long considered herself to be Black.
It's important to note, though, that neither of these cases is a simple one of someone claiming minority-group status in order to take advantage of affirmative action. Neither of these individuals simply "talked the talk". They both also "walked the walk", as activists on behalf of their putative groups. Their identities, as Black or Indigenous, were put into relevant action. In large part because of the Dolezal case, the topic of transracialism -- e.g., being biologically "white" but identifying oneself as black -- has begun to be discussed by both psychologists and philosophers (see, e.g., "In Defense of Transracialism" by Rebecca Tuvel, Hypatia, 2017). If biology is not destiny when it comes to gender, and everyone agrees that race is to a large extent a social construction, then why can't people be transracial in the same way that they can be transgender?

One contrary argument is that race, while not a strictly biological concept, is an ancestral one. That is, to a large extent being black entails having parents, grandparents, or great-grandparents who were Black. The point applies particularly to African-Americans, whose ancestry includes the heritage of slavery. For this reason, some scholars consider Michelle Obama, whose ancestors were slaves, to be African-American in ways that Barack Obama, whose ancestors remained in Africa, is not (these same scholars use "Black" as a superordinate term for anyone who has African ancestry). And this is why some African-Americans and Hispanic-Americans object when affirmative-action positions are offered to candidates from Africa or Latin America (or Spain or Portugal, for that matter) who do not have ancestors who lived in the United States, or who did not share a history of racial or ethnic discrimination. On the other hand, a similar point could be made about individuals who are transgender: you can be "a woman trapped in a man's body", but as a man, you did not suffer the kinds of discrimination faced by women who were always biologically female. It's complicated, and it will be interesting to see how the debate plays out.
Anyway, the basic point is that the self-concept concerns how we identify ourselves -- how we perceive ourselves, not how others perceive us. (We can incorporate others' perceptions of us into our self-concepts, but that is not at all the same thing.)

Another recent turn in identity politics concerns intersectionality, a term coined by Kimberle Crenshaw (University of Chicago Legal Forum, 1989), a legal scholar, to label the situation where an individual finds him- or herself a member of two or more groups. For example, an African-American woman might find herself in conflict between her identity as a woman (distinguishing herself from other black people) and her identity as a black person (distinguishing herself from other women), and identify herself instead as a black woman -- a different category entirely. In Crenshaw's argument, black women constitute a multiply-burdened [sic] class. Intersectionality isn't just a matter of a subordinate category (black woman) inheriting the features associated with two superordinate categories (blacks and women); rather, it has special features that inhere in its intersectionality. Put bluntly, black women confront issues that are different from those that confront either black people in general, or women in general. Although the issue of intersectionality originally arose out of discussions of feminist theory, it's easy to imagine other points of intersection -- for example, a gay African-American man -- where a person possesses the "markers" of two or more different minority identities.
Our discussion of the self as concept and as story illustrates a strategy that we have found particularly useful in our work: beginning with some fairly informal, folk-psychological notion of the self-concept, we see what happens when we apply our technical understanding of what that form of self-knowledge looks like. Much the same attitude (or, if you will, heuristic) can be applied to another piece of ordinary language: the self-image.
Schilder (1938, p. 11) defined the self-image as "the picture of our own body which we form in our mind, that is to say, the way in which the body appears to ourselves". What follows from this?
First, there is the question of whether, in talking about our mental images of ourselves, we should be talking about mental images at all. Beginning in the 1970s, a fairly intense debate raged about the way in which knowledge is stored in the mind. At this point, most cognitive psychologists are comfortable distinguishing between two forms of mental representation: meaning-based and perception-based (Anderson, 1983).
Perception-based representations represent the physical appearance of an object or event -- including the spatial relations among objects and features (up/down, left/right, back/front), and the temporal relations among objects and features (before/after). Perception-based representations are analog representations, comprising our "mental images" of things.
Meaning-based representations store knowledge about the semantic relations among objects, features, and events, that is abstracted from perceptual detail, such as their meaning and category relations. Meaning-based representations take the form of propositions -- primitive sentence-like units of meaning which omit concrete perceptual details.
The self-concept is a meaning-based representation of oneself -- regardless of whether it is organized as a proper set, a fuzzy set, a set of exemplars, or a theory. The self-as-story is also meaning-based.
The self-image is a perception-based representation of the self, which stores knowledge about our physical appearance.
Perception-based representations are relatively unstudied in social cognition, but it is quite clear that we have them.
In the laboratory, studies of the self-image qua image are very rare. One exception is a fascinating study by Mita, Dermer, and Knight (1977) on the mere exposure effect (Zajonc, 1968), in which subjects view a series of unfamiliar objects (e.g., nonsense polygons or Turkish words), and later make preference ratings of these same objects and others which had not been previously presented. On average, old objects tend to be preferred to new ones, and the extent of preference is correlated with the number of prior exposures. In Mita et al.'s (1977) experiment, subjects were presented with pairs of head-and-shoulder photographs of themselves and their friends, and asked which one they preferred. In each pair, one photo was the original, and the other was a left-right reversal.
Some sense of the procedure in Mita et al.'s experiment is given by considering these two photographs of Marilyn Monroe. The left-hand photo is a true photograph of the actress, with the conspicuous beauty mark on her left cheek. The right-hand photo is mirror-reversed, so that the beauty mark appears on her right cheek. According to Zajonc's prediction, Monroe herself, if she were given the choice, would prefer the reversed photo on the right, because it reflects (sorry) the way she would see herself in the mirror. But other people would prefer the original photo on the left, because that is the way they have seen her in movies, on television, and photographs.
The result was as predicted. When viewing photos of their friends, subjects preferred the original view (that is, the view as seen through the lens of the camera); but when viewing photos of themselves, the same subjects preferred the left-right reversal (that is, the view as would be seen in a mirror). Thus, our preferences for pictures match the way we typically view ourselves and others. Mita took this as evidence for the mere exposure effect, which it is; but it is also evidence that we possess a highly differentiated self-image which preserves information about both visual details and the spatial relations among them.
The Mita et al. study demonstrates clearly that we possess a highly detailed image of our own (and others') faces, but the point probably applies to the rest of our bodies as well.
A number of psychometric instruments have been devised for assessing aspects of the body image.
Historically speaking, perhaps the most popular assessment method has been the Goodenough-Harris Draw-a-Person Test, in which the subject is asked to draw pictures of a man, a woman, and other figures. The traditional assumption behind this projective technique is that the subject will project his own body image onto the drawings of other people. But this is a dubious assumption, unproven at best. Moreover, the assessment relies heavily on the unwarranted assumption that people can draw well (just try, in the privacy of your own home, to draw a picture of a man and a woman; then shred the results before any of your friends can see the products of your efforts!).

The DAP, like most projective tests, is a crummy psychometric instrument, pretty much lacking anything resembling standardization, norms, reliability, or validity.
Loren and Jean Chapman, two prominent schizophrenia researchers, developed a questionnaire method, the Body-Image Aberration Scale, for assessing body-image aberrations (hence the title) in patients with schizophrenia and other psychoses, and in individuals hypothetically at risk for schizophrenia. The scale consists of a number of subscales:
Note to readers prone to medical-student syndrome. The BIAS is actually not a particularly good predictor of later psychosis (it was an interesting idea that didn't really work out -- not least because, as we'll see a little later, body-image aberrations don't seem to be characteristic of schizophrenia), so don't worry too much if you said "yes" to most or all of the sample items given above. For details, see Chapman, Chapman, & Raulin, Journal of Abnormal Psychology, 1978.
By far the most popular instrument used in research, the body-image assessment (BIA) consists of line drawings of males and females, clad in swimsuits, ranging from thin to not-so-thin. The drawings are connected by a continuous scale, on which subjects indicate their:
A "generational" study by Rozin & Fallon (1988), compared college men and women to their mothers and fathers. On average, mothers showed larger Current-Ideal discrepancies than their daughters -- and the fathers even more so!
As with the self-concept, the self-image can be illustrated with clinical data.
Patients in the acute stage of schizophrenia often complain of distortions in their perception of their own bodies.
Some classic studies of body image in acute schizophrenia employed adjustable mirrors, such as used to be found (perhaps they still are) in amusement-park "fun houses". These mirrors can be bent in three planes to produce distorted reflections of the person. In a series of studies by Traub, Orbach, and their colleagues, subjects were presented with distorted reflections of themselves, produced by bending certain portions of the mirror concave or convex, and were asked to adjust the mirror until they looked right.
The general finding was that the patients' mirror images remained distorted even after adjustment -- suggesting that schizophrenics really did have distorted body images. But then the investigators made the "mistake" of running a control condition, in which a rectangular picture frame was placed in the mirror, and the subjects were asked to adjust that image until it appeared normal. The schizophrenics showed an aberration in perception of the frame as well, suggesting that their perceptual aberration was not limited to their body image.
Much the same procedure can be used with computer "morphing" software.
In autotopagnosia (also known as body-image agnosia or somatotopagnosia), a neurological syndrome associated with focal lesions in the left parietal lobe, the patient can name body parts touched by the examiner, but cannot localize body parts on demand.
In phantom limb pain, amputees perceive their lost arms and legs as if they were still there -- often painfully so.
In body dysmorphic disorder, the patient complains of bodily defects where there really aren't any.
In eating disorders such as anorexia and bulimia, the sufferer sees fat where the body is objectively normal, lean, or even gaunt.
But, outside of the Traub-Orbach studies of body image in schizophrenia, little of this clinical folklore has been studied experimentally.
One exception is in the study of eating-disordered women (eating-disordered men haven't been studied much, though they do exist).
A study by Zellner et al. (1989), using the BIA, found that eating-disordered women showed a bigger current-ideal discrepancy than non-eating disordered women, or males in a comparison group.
Another BIA study, by Williamson et al. (1989), of women with bulimia, a special form of eating disorder, confirmed this difference. Even when women with bulimia were statistically matched with non-bulimic women on actual weight, the bulimic women showed a much greater current-ideal discrepancy than the normals, and gave higher current body-size ratings as well -- indicating not just that they have an exaggerated ideal body image, but that they have an exaggerated current body image as well.
Perhaps inspired by the Traub-Orbach studies with the adjustable-mirror paradigm, more recent investigators have made use of computer software for image morphing to study body image in normal and eating-disordered individuals.
One such study employed the Body-Image Assessment Software developed at the University of Barcelona by Letosa-Porta, Garcia-Ferrer, and their colleagues (2005). They take numerous biomorphic measurements of subjects' bodies, and use these to create a computer "avatar" that mimics the shape of the subject's body. The subject is then asked to modify the image so that it corresponds to his/her real and ideal body image. The discrepancy between the original, objective avatar and the subject's real body image is taken as a measure of perceptual distortion; the discrepancy between the subject's real and ideal body images is taken as a measure of body-image dissatisfaction. A later study by Ferrer-Garcia et al. (2008) showed that women with a diagnosis of eating disorder, or at risk for ED, scored higher on both measures than women who were not at risk.
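A minimal sketch of the two discrepancy measures just described, with invented numbers and variable names of my own (not the Barcelona software's):

```python
# Sketch of the two discrepancy measures described above (figure widths in
# arbitrary units; all values and names are hypothetical).

objective_avatar = 42.0   # avatar built from the subject's actual measurements
perceived_body   = 47.0   # subject's adjustment to match her "real" body
ideal_body       = 38.0   # subject's adjustment to match her ideal body

perceptual_distortion = perceived_body - objective_avatar  # sees self as larger
body_dissatisfaction  = perceived_body - ideal_body        # wants to be smaller

print(perceptual_distortion, body_dissatisfaction)
```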
Another approach employs the Adolescent Body Morphing Tool developed by Aleong et al. (2007) at the University of Montreal. They first developed an Adolescent Body-Shape Database from front and side photographs of 160 male and female Canadian adolescents. The subjects were first photographed (front and side views) while dressed in a body suit and ski mask (to protect anonymity). Then trained judges applied virtual tags to various points on the body image. A factor analysis of various measurements yielded a multidimensional representation of the "average" adolescent body. The resulting images can be morphed by increasing or decreasing the size of these dimensions by a certain percentage.
In a later study, Aleong et al. (2009) employed height, weight, and body-mass index to match each of 182 normal (i.e., non-eating-disordered) males and females to one of the images stored in the Adolescent Body-Shape Database, and then distorted the image, especially around the hips, thighs, and calves. The subjects were then asked to return the image to "normal". In psychophysical terms, the point of subjective equality, reached when the subject believed that the image had been returned to normal, is a measure of the accuracy of the body image: females were less accurate than males, but only with the side image. The difference limen measured how much morphing was required for the subject to detect a difference from normal: females were more sensitive than males -- again, especially when viewing side images.
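For readers unfamiliar with the psychophysical terms, here is a toy computation of both measures from hypothetical adjustment data; the difference-limen convention used below (half the range of settings) is one common choice among several, not necessarily the one Aleong et al. used.

```python
# Sketch of the two psychophysical measures from the morphing task
# (hypothetical settings; distortion is % change from the true image).

from statistics import mean

# Each value is where one adjustment trial ended, in % distortion remaining
# after the subject "returned the image to normal" (0 = perfectly accurate).
settings = [2.0, -1.0, 3.0, 1.0, 2.5, 0.5]

# Point of subjective equality: the average setting; its distance from zero
# indexes the (in)accuracy of the body image.
pse = mean(settings)

# Difference limen: half the spread of settings -- roughly, how much morphing
# is needed before the subject reliably notices a change. (One common
# convention; others use the interquartile range.)
dl = (max(settings) - min(settings)) / 2

print(f"PSE = {pse:.2f}%  DL = {dl:.2f}%")
```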
But you don't have to suffer from mental illness in order to have a self-image that is wildly discrepant from objective reality. A clever demonstration of this took the form of the "Dove Beauty Sketches", an advertising campaign mounted in 2013 by Dove, a unit of Unilever that makes a popular brand of bath soap. Dove commissioned Gil Zamora, a forensic artist who has worked with the San Jose Police Department and the FBI, to prepare sketches of ordinary women (i.e., not movie stars or other celebrities) from their self-descriptions (the women sat behind a screen, so that Zamora was able to work only from their verbal descriptions). Then Zamora prepared another sketch of the same women, based on verbal descriptions of them by a randomly selected perceiver. When the sketches were placed side by side, the sketch based on the observer's description was more attractive than the one based on the woman's own description. The moral of the exercise, according to Dove: "You're more beautiful than you think you are" ("Ad About Women's Self-Image Creates a Sensation" by Tanzina Vega, New York Times, 04/19/2013).
How do we know what we look like? We look in the mirror (which reverses left and right), and we look down toward our feet (which gives us a somewhat distorted view of what lies between), and we feel our hearts beating (but not our blood pressure). Like video, only more so, new virtual-reality technology allows us to see ourselves as others see us, and also allows us to experience what it would be like to live in other bodies -- a process known as "VR embodiment". VR embodiment also enables subjects to have "out of body experiences" in which the self (at least the conscious self) appears to leave the body.
Interoception, Proprioception, and the Proto-Self
For most of its history the study of memory has been the study of verbal learning. And accordingly, many psychologists have come to think of memory as a set of words (or phrases or sentences), each representing a concept, joined to each other by associative links representing the relations between them, the whole kit and kaboodle forming an associative network of meaning-based knowledge (Anderson, 1983) -- Schank and Abelson's (1995) theory of knowledge as stories is explicitly opposed to this conventional thinking. It is also commonplace to distinguish between two broad types of verbal knowledge stored in memory (Tulving, 1983). Episodic memory is autobiographical memory for a person's concrete behaviors and experiences: each episode is associated with a unique location in space and time. Semantic memory is abstract, generic, context-free knowledge about the world. Almost by definition, episodic memory is part of the self-concept, because episodic memory is about the self: It is the record of the individual person's past experiences, thoughts, and actions. But semantic memory can also be about the self, recording information about physical and psychosocial traits of the sort that might be associated with the self-concept.
Within the verbal-learning tradition, knowledge about other people has been studied extensively in a line of research known as person memory (Hastie, Ostrom, Ebbesen, Wyer, Hamilton, & Carlston, 1980). Several different models of person memory have been proposed (Kihlstrom & Hastie, 1993), and some of these have been appropriated for the study of memory for one's self (Kihlstrom & Klein, 1994; Klein & Loftus, 1993). The simplest person-memory model is an associative network with labeled links. Each person (perhaps his or her name) is represented as a single node in the network, and knowledge about that person is represented as fanning out from that central node. The person-nodes are also connected to each other, to represent relationships among them, but that is another matter. The point is that in these sorts of models the various nodes are densely interconnected, so that each item of knowledge is associatively linked to lots of other items. In theory, the interconnections among nodes form the basis for associative priming effects, in which the presentation of one item facilitates the processing of an associatively related one.
Of course, knowledge about a person can build up pretty fast: consider how much we know about even our casual acquaintances. According to the spreading activation theory that underlies most associative network models of memory (Anderson, 1983), this creates a liability known as the fan effect: the more information you know about someone or something, the longer it takes to retrieve any particular item of information.
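A toy simulation makes the fan effect concrete. This is a simplification in the spirit of spreading-activation models generally, not anyone's published implementation; the network contents and timing constants are invented.

```python
# Toy associative network with labeled links, plus the fan effect: the more
# facts fan out from a node, the weaker the activation reaching any one of
# them, and so the slower its retrieval.

import math

network = {
    "Self":          {"owns": "Subaru", "trait": "thrifty", "likes": "cribbage"},
    "JamesBartlett": {"trait": "honest"},
}

def retrieval_time(person, base=0.4, scale=0.1):
    """Latency grows with the log of the fan (number of links from the node)."""
    fan = len(network[person])
    return base + scale * math.log(fan + 1)

print("Self:", round(retrieval_time("Self"), 3), "sec")  # larger fan, slower
print("JamesBartlett:", round(retrieval_time("JamesBartlett"), 3), "sec")
```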
Is there any way around the fan effect in person memory? One possibility which has been suggested is that our knowledge about ourselves and others (especially those whom we know well) is organized in some way -- perhaps according to its trait implications. There is some evidence that organization does abolish the fan effect (Smith, Adams, & Schorr, 1978), but this evidence is rather controversial, and some have concluded that memory isn't really organized in this manner after all (Reder & Anderson, 1980). Nevertheless, a hierarchically organized memory structure is so sensible that many person-memory theorists, such as Hamilton and Ostrom, have adopted it anyway (Hastie et al., 1980; Kihlstrom & Hastie, 1993).
How can we generalize from person memory to the structure of memory for the self? The simplest expedient is to take a generic associative-network model of person memory, which has nodes representing knowledge about a person fanning out from a node representing that person him- or herself -- following the "James Bartlett" example discussed at length in the lecture supplement on Social Memory -- and substitute a "Self" node for the "Person" node.
Research on person memory -- i.e., memory for other people -- depicts a model in which episodic knowledge is encoded independently of semantic knowledge -- or, put another way, in which knowledge of behaviors is represented separately from knowledge of traits. If the self is a person just like any other, we would expect that the representation of self in memory would have the same structure as memory representations of other people.
On the other hand, the self may be exceptional, in that memory for specific behavioral episodes might be organized by their trait implications. In this model, nodes representing traits fan off the node representing the self, and nodes representing specific episodes which exemplify these traits fan off the trait-nodes. This hierarchical model implies that retrieval has to pass through traits to access information about behaviors. Thus, traits will be activated in the course of gaining access to information about behaviors.
Alternatively, Bem's self-perception theory denies that we retain any knowledge about our traits and attitudes (because we don't really have any traits or attitudes); when we are asked about our traits and attitudes, we infer what they might be from knowledge of relevant behaviors. From this point of view, the self contains only episodic knowledge about experiences and behaviors; semantic knowledge about traits is known only indirectly, by inference. One such inferential process would involve sampling autobiographical memory, and integrating the trait implications of the memories so retrieved. In this computational model, retrieval must pass through behaviors in order to reach traits. Put another way, nodes representing behaviors will be activated in the course of recovering -- or, more precisely, constructing -- information about traits.
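That inferential route can be sketched in a few lines of code. The episodes and their trait codings below are invented; the sketch just shows the claimed direction of processing, from sampled behaviors to an inferred trait.

```python
# Sketch of the inferential route Bem's theory implies (invented episodes
# and trait codings): sample autobiographical memories, then aggregate
# their trait implications to answer "Am I generous?"

import random

episodes = [
    {"event": "lent roommate my car",         "generous": +1},
    {"event": "skipped the office gift pool", "generous": -1},
    {"event": "tipped 25% at lunch",          "generous": +1},
    {"event": "donated old textbooks",        "generous": +1},
]

def infer_trait(trait, n_samples=3):
    sample = random.sample(episodes, n_samples)        # retrieval passes through behaviors...
    score = sum(e[trait] for e in sample) / n_samples  # ...and integrates their implications
    return score  # > 0 suggests "yes, I'm generous"

print(infer_trait("generous"))
```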
An extensive series of studies by Klein and Loftus (1992) has produced a compelling comparative test of these models. These studies adapted for the study of the self the priming paradigm familiar in studies of language and memory, in which presentation of one item facilitates the processing of another, associatively related item. Subjects were presented with trait adjectives as probes, and performed one of three tasks. In the define task, they simply defined the word; in the describe task, they rated the degree to which the term described themselves; in the recall task, they remembered an incident in which they displayed behavior relevant to the trait. For each probe, two of these tasks were performed in sequence -- for example, describe might be followed by recall, or define by recall, or recall by describe. There were nine possible sequences, and the important data were the subjects' response latencies when asked the second question of each pair.
Because priming occurs as a function of overlap between the requirements of the initial task and the final task, systematic differences in response latencies will tell us whether activation passes through trait information on the way to behaviors, or vice-versa, or neither. When the two processing tasks were identical, there was a substantial repetition priming effect of the first one on the second. But when Klein and Loftus (1992) examined the effect of recall on describe, they saw no evidence of semantic priming compared to the effects of the neutral define task. Nor was there semantic priming when they examined the effect of describe on recall (again, compared to the effects of the neutral define task). Contrary to the hierarchical model, the retrieval of autobiographical memory does not automatically invoke trait information. And contrary to the self-perception model, retrieval of trait information does not automatically invoke memory for behavioral episodes. Because self-description and autobiographical retrieval do not prime each other, Klein and Loftus conclude that items of episodic and semantic self-knowledge must be represented independently of each other.
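The logic of the comparison can be summarized in a small sketch with made-up latencies (the real data are in Klein & Loftus, 1992): independence predicts that recall-then-describe and describe-then-recall latencies should sit at their define-task baselines, i.e., near-zero priming.

```python
# Sketch of the logic of the Klein & Loftus analysis (made-up latencies in
# ms). Priming = how much the initial task speeds the second task, relative
# to the neutral DEFINE baseline.

second_task_latency = {             # (initial_task, second_task): mean RT
    ("define",   "describe"): 1900,  # baseline for DESCRIBE
    ("recall",   "describe"): 1890,  # ~= baseline -> no priming of traits
    ("describe", "describe"): 1500,  # repetition priming (sanity check)
    ("define",   "recall"):   2600,  # baseline for RECALL
    ("describe", "recall"):   2610,  # ~= baseline -> no priming of episodes
}

for second in ("describe", "recall"):
    base = second_task_latency[("define", second)]
    for first in ("recall", "describe"):
        if first == second or (first, second) not in second_task_latency:
            continue
        priming = base - second_task_latency[(first, second)]
        print(f"{first} -> {second}: priming = {priming} ms")
```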
Parallel findings have been obtained in case studies of amnesic patients' self-knowledge.
The Case of K.C.

The first such study was, in fact, inspired by Klein's (Klein & Loftus, 1993) claim, based on his priming studies, that episodic self-knowledge is encoded independently of semantic self-knowledge. In a commentary on Klein's paper, Tulving reported an experiment that he conducted with Patient K.C., who was rendered permanently amnesic as a result of a motorcycle accident at age 30. K.C. is an especially interesting case of amnesia, because he has a complete anterograde and retrograde amnesia: he has no conscious recollection of anything he has done or experienced throughout his entire life, both before and after his accident. Interestingly, K.C. also underwent a marked personality change as a result of his accident. Whereas his "premorbid" personality was rather extraverted (he was injured riding a motorcycle, after all!), his "postmorbid" personality became rather introverted.
Tulving was interested in whether K.C. had any knowledge of what he was like as a person, despite his dense amnesia in terms of explicit episodic memory. Employing an adjective checklist supplied by Klein, Tulving asked K.C. and his mother to rate both their own and each other's personality.
Of course, much of this apparent accuracy could be achieved if K.C. and his mother simply said good things about each other. Accordingly, Tulving conducted a more rigorous test in which K.C. and his mother were asked to select the more descriptive adjective from pairs of adjectives that were matched for social desirability.
The Case of W.J.

Klein himself obtained similar findings from an 18-year-old college student who temporarily lost consciousness following a concussive blow to the head, received when she fell out of bed during the second quarter of her freshman year (Klein, Loftus, & Kihlstrom, 1996). Although a medical examination revealed no neurological abnormalities, W.J. did show an anterograde amnesia covering the 45 minutes that elapsed after her injury (very common in cases of concussion), as well as a traumatic retrograde amnesia that covered the preceding 6-7 months -- essentially, the entire period of time between her high-school graduation and her concussion (organic amnesias that are bounded by personally significant events are rare, but they do occur, and they are very intriguing). This retrograde amnesia remitted over the next 11 days, but not before Klein was able to complete extensive memory and personality testing. The personality testing was particularly interesting because W.J., like many college students, showed rather marked changes in personality as a college student, compared to what she had been like in high school.
Memory testing of W.J. confirmed her retrograde amnesia. Employing the Galton-Crovitz cued-recall technique, in which subjects are presented with a familiar word and asked to retrieve a related autobiographical memory, Klein et al. showed that she was much less likely than control subjects to produce memories from the most recent 12 months of her life, and much more likely to produce memories from earlier epochs. After W.J.'s retrograde amnesia lifted, her performance on this task was highly similar to that of the controls.
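Scoring on the Galton-Crovitz task reduces to binning retrieved memories by their age. A minimal sketch with invented data, just to show the shape of the analysis:

```python
# Sketch of scoring the Galton-Crovitz task (invented data): cue words
# elicit autobiographical memories, which are binned by how long ago they
# occurred; a retrograde amnesia shows up as a deficit in the recent bin.

memories_months_ago = {
    "W.J.":     [14, 20, 31, 48, 60, 72],   # few memories from the last year
    "controls": [2, 5, 8, 11, 26, 40],
}

for who, ages in memories_months_ago.items():
    recent = sum(a <= 12 for a in ages) / len(ages)
    print(f"{who}: {recent:.0%} of memories from the most recent 12 months")
```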
Personality testing revealed a pattern of performance similar to that observed in patient K.C.
Taken together, these neuropsychological case studies support the conclusions of Klein's priming studies of self-knowledge. Amnesic patients, who lack autobiographical memory for their actions and experiences, nevertheless retain substantial knowledge of their own personalities. This indicates that "semantic" trait information and "episodic" behavioral information are represented independently in memory.
Semantic Self-Knowledge as Implicit Memory

Technically, in both cases the semantic memory about the self that is preserved in amnesia accompanied by personality change probably reflects spared implicit memory for the patient's past personal experiences.

Another perspective on the relationship between memory and identity is provided by Encircling, a novel in three volumes by Carl Frode Tiller (2007; translated from the Norwegian by Barbara Haveland, 2015; reviewed in "The Possessed" by Clair Wills, New York Review of Books, 07/22/2021). David, a Norwegian in his thirties who has lost most of his memory, places a newspaper advertisement asking people who knew him to write to him and fill in the gaps in his memory. Vol. 1 covers his teenage years; Vol. 2, childhood; Vol. 3, young adulthood. David's recent memories have been spared (so far), so in all there are nine different accounts of his life (as well as nine different perspectives on the recent social history of rural Norway). The result is what Endel Tulving (and I) have called remembering by knowing -- abstract knowledge of the past, knowing the events of one's own life much as one can recount the major battles of the Civil War. Because autobiographical memory is an important part of identity, David constructs (or maybe reconstructs) his self-concept at the same time. On the jacket of Vol. 3, a blurb states that "Identity is not a monolith but a collage", built up of fragments. On the other hand, David's therapist suggests that (quoting Wills quoting the book) "we are little more than 'situation-appropriate personas', with no coherent identity at all". Wills herself has a different take: that Tiller "is not so much interested in how we are formed by the perspectives of other people as in how we are destroyed by them. Again and again his characters battle to maintain a sense of self in their encounters with others, and again and again they lose the battle".
The Case of D.B.: Projecting Oneself into the Future
Another amnesic patient illustrates the same basic points. Patient D.B. (Klein, Loftus, & Kihlstrom, 2002) was a 78-year-old retired man who went into cardiac arrest while playing basketball (this patient is not the D.B. of "blindsight" fame). As a result, he experienced anoxic encephalopathy, or brain damage due to a lack of oxygen supply to the brain. A CT scan revealed no neurological abnormalities, but upon recovery he displayed a pattern of memory loss similar to that of patient K.C.: a dense anterograde amnesia covering memories for episodes since his accident, plus a dense retrograde amnesia covering his entire life before the accident.
D.B. was tested with both episodic and semantic versions of the Galton-Crovitz procedure. When asked to recall cue-related personal experiences, D.B. showed a profound deficit compared to control subjects. But when he was asked to recall cue-related historical events, his performance improved greatly. Again, these results are consistent with the idea that amnesia affects episodic self-knowledge, but spares semantic knowledge.
In another part of the study, D.B. was asked to imagine the future as well as to remember the past, with respect to both personal experiences and historical events. For example:
Compared to controls, D.B. displayed almost no knowledge of his past experiences, but also no ability to project himself into the future. By contrast, he showed fairly good ability to recall public issues from the past, and to give reasonable predictions of issues that might emerge in the future.
These findings on the experience of temporality are consistent with the conclusion that amnesia impairs episodic self-knowledge, but spares other forms of memory, including semantic knowledge about the self and semantic knowledge about the historical past. But D.B.'s data also make the intriguing suggestion that the ability to imagine the future is related to the ability to remember the past. Both are components of chronesthesia, or mental time travel -- defined by Tulving (2005) as the ability to project oneself, mentally, into the past and the future.
Both narrating the personal past and projecting the personal future entail storytelling -- which brings up yet another form of mental representation, knowledge as stories. Recently, Schank and Abelson (1995, p. 80) have asserted that "from the point of view of the social functions of knowledge, what people know consists almost exclusively of stories and the cognitive machinery necessary to understand, remember, and tell them".
Schank and Abelson concede that knowledge also can be represented as facts, beliefs, lexical items like words and numbers, and rule systems (like grammar), but they also argue that, when properly analyzed, non-story items of knowledge actually turn out to be stories after all; or, at least, they have stories behind them; or else, they turn out not to be knowledge (for example, they may constitute indexes used to organize and locate stories). Their important point is that from a functional standpoint, which considers how knowledge is used and communicated, knowledge tends to be represented as stories.
The idea of knowledge as stories, in turn, is congruent with Pennington and Hastie's (1993) story model of juror decision-making. According to Pennington and Hastie, jurors routinely organize the evidence presented to them into a story structure with initiating events, goals, actions, and consequences. According to Schank and Abelson (1995), each of us does the same sort of thing with the evidence of our own lives. From this point of view, the self consists of the stories we tell about ourselves -- stories which relate how we got where we are, and why, and what we have done, and what happened next. We rehearse these stories to ourselves to remind ourselves of who we are; we tell them to other people to encourage them to form particular impressions of ourselves; and we change the stories as our self-understanding, or our strategic self-presentation, changes. When stories aren't told, they tend to be forgotten -- a fact dramatically illustrated by Nelson's (1993) studies of the development of autobiographical memory in young children. Furthermore, when something new happens to us, the story we tell about it, to ourselves and to other people, reflects not only our understanding of the event, but our understanding of ourselves as participants in that event. Sometimes, stories are assimilated to our self-understanding; on other occasions, our self-understanding must accommodate itself to the new story. When this happens, the whole story of our life changes.
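Pennington and Hastie's story structure is easy to render as a data structure. The sketch below uses my own field names (following their four components) and an invented self-narrative episode:

```python
# Sketch of the story structure Pennington & Hastie describe (field names
# follow their components; the example episode is invented).

from dataclasses import dataclass

@dataclass
class StoryEpisode:
    initiating_event: str
    goals: list[str]
    actions: list[str]
    consequences: list[str]

# A self-narrative episode organized the same way a juror organizes evidence:
episode = StoryEpisode(
    initiating_event="I failed my first chemistry midterm",
    goals=["prove I belonged in the major"],
    actions=["joined a study group", "went to every office hour"],
    consequences=["aced the final", "came to see myself as persistent"],
)
print(episode.goals)
```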
The twin ideas of self as
memory, and of self as story, bring us inevitably to the topic
of autobiographical memory (ABM) -- that is, a real
person's memories for his own actions and experiences, which
occurred in the ordinary course of everyday living.
Memory and Identity
ABM is an
obviously important aspect of the self, as it contains a record of the individual's own
actions and experiences.
In his Essay Concerning Human Understanding (1690), the philosopher John Locke went so far as to assert that memory formed the basis for the individual's identity -- our sense that we are the same person, the same self, now as we were at some previous time. Locke's idea sounds reasonable, but other philosophers raised objections to his equation of identity with memory.
Thomas Reid's point is a logical one, based on the principle of transitivity: Suppose a brave officer to have been flogged when a boy at school for robbing an orchard, to have taken a standard from the enemy in his first campaign, and to have been made a general in advanced life; suppose, also, which must be admitted to be possible, that, when he took the standard, he was conscious of his having been flogged at school, and that, when made a general, he was conscious of his taking the standard but had absolutely lost consciousness of the flogging. On Locke's criterion, the general both is the same person as the boy (by transitivity, through the young officer) and is not the same person as the boy (since he has no memory of the flogging) -- a contradiction.
Note for Coincidence-Collectors: The day that Millvina Dean (the last Titanic survivor; see below) died, May 31, 2009, happened to be the 98th anniversary of the launching of the Titanic. And her older brother died on April 14, 1992, which was the 80th anniversary of the shipwreck. These are examples of what is sometimes called an anniversary reaction, in which people die on the anniversary of some important event in their lives. The deaths of John Adams and Thomas Jefferson, who both died on July 4, 1826, the 50th anniversary of the signing of the Declaration of Independence, are two other examples.
For more about memory and identity, see "Memory and the Sense of Personal Identity" by Stanley B. Klein and Shaun Nichols (Mind, 2012).
Autobiographical
memory is
technically classified as episodic memory -- itself
a subset of declarative memory, consisting of
factual knowledge concerning events and experiences that
have unique locations in space and time (two events can't
occur at the same time in the same space). Episodic
memory is commonly studied with
variations on the verbal-learning paradigm, which is explicitly
intended as a laboratory analogue of autobiographical memory:
each list, and each word on a list, constitutes a discrete
event, with a unique location in space and time. And, as
we'll see, ABM can be studied with variants on verbal-learning
procedures.
ABM is episodic memory, as opposed to semantic
memory or procedural knowledge, but ABM isn't just
episodic memory -- there's more to it than a list of items
studied at particular places and particular times (Kihlstrom,
2009).
I expand on these points below.
Self-Reference
Autobiographical memories are episodic memories, but they're not just episodic
memories. In an important essay, Alan Baddeley (1988) put
his finger on the difference: "Autobiographical
memory...
is particularly concerned with the past as it relates to
ourselves as persons" (p. 13). To really qualify
as autobiographical, a memory ought to have some auto in it, so
that the self is really psychologically present -- in
a way that it's not present in a memory like "The hippie
touched the debutante."
Taking a leaf from Fillmore's case-grammar approach to linguistics (Fillmore, 1968; see also Brown & Fish, 1983), it seems that in every autobiographical memory the self is represented in one or another case role -- as the agent or patient of a remembered action, or the stimulus or experiencer of a remembered event.
But what is self-reference a reference to? As discussed earlier, from a cognitive point of view, the self is, simply, one's mental representation of oneself -- no different, in principle, from the person's mental representation of other objects. This mental representation can be variously construed as a concept (think of the "self-concept"), an image (now think of the "self-image"), or as a memory structure. For present purposes, we can think of the self simply as a declarative knowledge structure that represents factual knowledge of oneself. And, like all declarative knowledge, self-knowledge comes in two forms: semantic and episodic.
Within the framework of a
generic network model of memory (such as ACT), we can
represent the self as a node in a memory network, with links
to nodes representing other items of declarative
self-knowledge. Although this illustration focuses on
verbal self-knowledge, it should be clear that the self, as a
knowledge structure, contains both meaning-based and
perception-based knowledge about the self. Examples
follow:
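Since the network diagram itself is not reproduced here, the following is a minimal sketch -- in Python, with entirely hypothetical node labels -- of the kind of data structure such a model implies: a SELF node linked to meaning-based, perception-based, and episodic items of self-knowledge. It illustrates the idea of the self as a node in an associative network; it is not an implementation of ACT or any other formal model.

```python
from collections import defaultdict

class MemoryNetwork:
    """A toy associative network: nodes joined by labeled links."""

    def __init__(self):
        self.links = defaultdict(set)   # node -> {(relation, neighbor)}

    def link(self, a, relation, b):
        """Add a labeled link; associations run in both directions."""
        self.links[a].add((relation, b))
        self.links[b].add((relation, a))

    def fan(self, node):
        """The number of links radiating from a node (its 'fan')."""
        return len(self.links[node])

net = MemoryNetwork()

# Meaning-based (semantic) self-knowledge -- hypothetical examples:
net.link("SELF", "trait", "kind")
net.link("SELF", "occupation", "psychologist")

# Perception-based self-knowledge, as a pointer to a perceptual record:
net.link("SELF", "appearance", "<image of own face>")

# Episodic self-knowledge: the self linked, as agent, to an event node:
net.link("SELF", "agent-of", "EVENT: gave a lecture last Tuesday")

for relation, node in sorted(net.links["SELF"]):
    print(f"SELF --{relation}--> {node}")
print("fan of SELF:", net.fan("SELF"))
```

On this representation, "knowing more about oneself" just means that the SELF node carries a larger fan of links -- a point worth keeping in mind when we come to the self-reference effect below.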
For present purposes,
we're going to focus on verbalized recollections.
The Autobiographical Knowledge Base
Perhaps the most thorough
cognitive analysis of ABM has been provided by Michael Conway
and his colleagues (Conway, 1992, 1996; Conway &
Pleydell-Pearce, 2000) in terms of what Conway calls the autobiographical
knowledge base, which is presented in the form of a
generic associative memory network (although without the
operating computer simulation of ACT and similar formal
models).
In Conway's theory, individual ABMs are represented as nodes linked to various other elements in the network, and organized in various ways.
What Conway calls the self-memory system
reflects the conjunction of the autobiographical knowledge
base with the working self. The working self is
analogous to working memory, consisting of an activated
self-schema together with current personal goals and the
current emotional state.
Conway and Pleydell-Pearce (2000) argue that ABMs are constructed (note the Bartlettian term) in two ways.
Autobiographical memory is not just about episodes, and it is not just about auto: it is also biographical. It is not enough to construe autobiographical memory as memories of one's own experiences, thoughts, and actions, strung together arbitrarily as if they were items on a word-list. Autobiographical memory is the story that the person tells about him- or herself or, at the very least, it is part of that story. As such, we would expect autobiographical memory to have something like an Aristotelian plot structure (see his Poetics): an "arrangement of the incidents" into a chronological sequence.
We can distinguish between two aspects of temporal organization: the internal organization of events within a single memory, and the external organization of one memory relative to others. A good example (if I may say so) is recall of the events of hypnosis, which typically begins at the beginning of the session and proceeds in a more or less chronological succession to the end. Kihlstrom (Kihlstrom & Evans, 1973; Kihlstrom, Evans, Orne, & Orne, 1980) found that the temporal sequencing of recall was disrupted during suggested posthypnotic amnesia. The implication is (1) that episodic memories are marked with temporal tags; (2) that episodic memories are organized according to the temporal relationships between them; and (3) that the retrieval process enters the sequence at the beginning and follows it until the end.
It should be noted, however, that in the hypnotic case subjects are given a retrieval cue that specifies the beginning of the temporal sequence. For example, the subject is asked to "recall everything that happened since you began looking at the target" (a fixation point used in the induction of hypnosis by eye closure). If the instructions had been different -- for example, a request to recall "everything that happened while you were hypnotized" -- a somewhat different pattern of recall might be observed.
For current purposes, we'll focus on external organization, or how one autobiographical memory is related to others.
Viewed across the lifespan, the distribution of memories shows a clear temporal gradient, which was summarized by Conway & Pleydell-Pearce (2000) as follows. As a general rule, ABM shows the usual pattern of time-dependent forgetting. Most ABMs are of very recent events, and the frequency of ABMs drops off progressively. This is just a recency effect, and needs no special explanation. However, there are two somewhat unusual features of the distribution.
Despite infantile and childhood amnesia and the
reminiscence bump, the major feature of ABMs is the temporal
gradient. This is universally observed, no matter the
means by which ABMs are sampled.
Employing the Crovitz-Robinson technique -- in which subjects retrieve an autobiographical memory in response to a cue word, and then date the remembered event -- Robinson (1976) observed that the distribution of response latencies followed an inverted-U-shaped function. Recall of very recent memories occurred relatively quickly; as the age of the memory increased, so did response latency. The exception was the rather short latencies associated with very remote memories. Robinson suggested that these very remote memories might be unrepresentative of ABMs in general: because they are highly salient, they are quickly retrieved. Otherwise, the distribution suggests that the retrieval of ABMs follows a serial, backward, self-terminating search. That is, the retrieval process begins with the most recent ABMs, and searches backward until it identifies the first memory that satisfies the requirements of the cue.
The way to get around this problem is to present subjects with retrieval cues that constrain the time period from which the memory is to be retrieved. This is what Chew (1979) did in an unpublished doctoral dissertation from Harvard University (see also Kihlstrom et al., 1988). Chew found that cues with high imagery value produced shorter response latencies than did those with low imagery value, and that imagery value interacted with the epoch (remote or recent) from which the memory was drawn. But still, memories drawn from the more recent epoch were associated with shorter response latencies than those drawn from the more remote one. Moreover, within each epoch the distribution of memories was characterized by a reverse J-shape. That is, fewer memories were retrieved from the early portions of each epoch. These findings are consistent with the hypothesis that retrieval begins at the most recent end of a temporal period, and proceeds backwards in time until it reaches an appropriate memory.
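To make the proposed retrieval process concrete, here is a minimal sketch of a serial, backward, self-terminating search, with an optional epoch constraint of the sort provided by Chew's temporally constrained cues. The memory list, the feature-based matching rule, and the step count standing in for response latency are all illustrative assumptions, not a fitted model.

```python
# A sketch of serial, backward, self-terminating retrieval from a
# chronologically ordered store of autobiographical memories.
# Dates are "days since first memory"; all contents are hypothetical.

def retrieve(memories, cue, epoch=None):
    """memories: list of (date, features), sorted oldest to newest.
    cue: set of features the target memory must contain.
    epoch: optional (start, end) dates constraining the search.
    Returns the first matching memory, plus a step count that
    stands in for response latency."""
    steps = 0
    for date, features in reversed(memories):   # enter at the recent end
        if epoch and not (epoch[0] <= date <= epoch[1]):
            continue                             # outside the cued period
        steps += 1                               # one comparison per memory
        if cue <= features:                      # self-terminating: stop at match
            return (date, features), steps
    return None, steps                           # no match anywhere

memories = [
    (10,   {"school", "orchard"}),               # remote, hypothetical
    (4000, {"college", "exam"}),
    (7300, {"beach", "vacation"}),               # recent, hypothetical
]
print(retrieve(memories, {"exam"}))              # older target -> more steps
print(retrieve(memories, {"orchard"}, epoch=(0, 100)))  # cued epoch
```

Note that even with an epoch cue, the search in this sketch still proceeds backward from the recent end of the cued period -- consistent with the reverse-J distribution Chew observed within each epoch.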
At the same time, the idea that ABMs are
organized along a single continuous chronological sequence
seems unlikely to be true. For one thing, it would be very
inefficient: a single chain might avoid a fan effect, but as
subjects aged the memories would pile up, and a serial search
through them would become slower and slower. Rather, it seems likely that
the chronological sequence of ABMs is organized into chunks
representing smaller segments of time. There's an
analogy to the English alphabet. Yes, it's a single
temporal sequence, from A and B through L and M to X, Y, and
Z. But this large temporal string is also broken up into
smaller chunks.
Consider, for example, a telephone keypad (or, for those old enough to remember, a rotary dial), which breaks the alphabet up into 3- or 4-letter strings corresponding to the numbers 2-9.
Or, better yet, "The Alphabet Song", sung almost universally by native English-speaking children as they learn their alphabet, which breaks the alphabet up into 6 or 7 chunks (depending on how you count).
Klahr et al. (1983), perhaps inspired by "The Alphabet Song", reported a study of an alphabetic search task that may well serve as a model for chunking in the temporal organization of autobiographical memory. Klahr et al. presented their subjects with a randomly selected letter of the alphabet, and asked them simply to report the letter which immediately followed, or preceded, the probe. Analysis of response latencies revealed a sawtooth pattern suggesting that subjects searched through sub-strings, not the whole 26-letter string.
Based on these results, they developed a model of the alphabetic search task, implemented as ALPHA, an operating computer simulation. In the model, the letters of the alphabet are represented in -- forgive me -- alphabetical order, but that ordered string is also subdivided into chunks roughly following "The Alphabet Song" learned by probably every English-speaking schoolchild. The search enters the string at the beginning, then moves from chunk to chunk until it finds the chunk, or subsidiary ordered string, that contains the probe item; search then proceeds through that chunk until it locates the probe. ALPHA successfully mimicked the performance of real subjects on the alphabetic search task.
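By way of illustration, here is a toy two-stage search of the sort ALPHA embodies. This is not Klahr et al.'s actual code; the chunk boundaries below are just one common parsing of "The Alphabet Song", and the step count is an illustrative stand-in for response latency.

```python
# Toy two-stage alphabetic search: first locate the chunk containing
# the probe, then search serially within that chunk.

CHUNKS = ["ABCDEFG", "HIJK", "LMNOP", "QRS", "TUV", "WX", "YZ"]

def letter_after(probe):
    """Return (the letter following probe, step count)."""
    steps = 0
    for c, chunk in enumerate(CHUNKS):
        steps += 1                        # one step per chunk examined
        if probe in chunk:
            i = 0
            while chunk[i] != probe:      # serial search within the chunk
                i += 1
                steps += 1
            if i + 1 < len(chunk):        # successor in the same chunk
                return chunk[i + 1], steps
            if c + 1 < len(CHUNKS):       # successor starts the next chunk
                return CHUNKS[c + 1][0], steps
            return None, steps            # probe was Z
    raise ValueError("probe must be a capital letter")

for probe in "ABCDEFGHIJKLMNOPQRSTUVWXYZ":
    print(probe, *letter_after(probe))
```

Step counts rise with position inside a chunk and drop back at each chunk boundary -- tracing the sawtooth pattern that Klahr et al. observed in their subjects' response latencies.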
Along similar lines, Skowronski and his colleagues (2007) proposed that ABM is also organized into chunks. In their experiments, subjects first listed and dated their ABMs for high school and college. Then pairs of these memories were presented to subjects in a judgment-of-recency task -- in which, as its name implies, subjects are simply asked to say which of two remembered events was the more recent. Skowronski et al. then divided the memories into various epochs: high school vs. college, year, quarter of the school year, freshman-sophomore vs. junior-senior, and academic year vs. summer. On some trials, both memories were drawn from the same epoch; on other trials, the memories were drawn from different epochs. Subjects were highly accurate on this task, making the correct choice 82.5% of the time. When Skowronski et al. looked at response latencies, however, they found that, in general, response latencies were shorter for between-epoch judgments than for within-epoch judgments. The exception was the freshman-sophomore vs. junior-senior division, which did not show this between-epoch advantage. They concluded that the chronological sequence of autobiographical memory was indeed divided into smaller temporal chunks, promoting a retrieval process not unlike that uncovered by Klahr et al. in the alphabetic search task.
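The between-epoch advantage is just what one would expect if a judgment of recency consults coarse epoch tags before fine-grained dates, in the spirit of the chunked search above. Here is a minimal sketch of that decision rule; the epoch labels, the example events, and the step counts standing in for latency are all hypothetical.

```python
# Judgment of recency over chunked autobiographical memories: compare
# epoch tags first; only when both memories share an epoch is a finer
# within-epoch comparison needed. Fewer steps ~ shorter latency.

EPOCHS = ["HS freshman-sophomore", "HS junior-senior",
          "college freshman-sophomore", "college junior-senior"]

def judge_recency(mem_a, mem_b):
    """Each memory is (epoch, date_within_epoch).
    Returns (the more recent memory, comparison steps)."""
    (ep_a, d_a), (ep_b, d_b) = mem_a, mem_b
    steps = 1                                  # compare epoch tags
    if ep_a != ep_b:                           # between-epoch: tags suffice
        later = mem_a if EPOCHS.index(ep_a) > EPOCHS.index(ep_b) else mem_b
        return later, steps
    steps += 1                                 # within-epoch: compare dates
    return (mem_a if d_a > d_b else mem_b), steps

prom    = ("HS junior-senior", 300)            # hypothetical events
midterm = ("college freshman-sophomore", 50)
rush    = ("college freshman-sophomore", 10)

print(judge_recency(prom, midterm))            # between-epoch: 1 step
print(judge_recency(midterm, rush))            # within-epoch: 2 steps
```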
Although Skowronski et al. imposed the same chunking scheme on all of their subjects, it seems likely that there will be some idiosyncratic variation in this -- variation which, itself, reflects the vicissitudes of the life cycle. A student who switched schools in the middle of high school might well, for example, differentiate between his freshman and sophomore years at Elmira Free Academy and his junior and senior years at Notre Dame. A child whose parents divorced, or remarried, might use those events to mark out epochs in memory.
And for that matter, the markers might well be psychosocial in nature -- perhaps along the lines of Erik Erikson's "Eight Ages of Man".
The point here is that any
temporal epochs in an individual's memory are likely to be individual,
not universal, and themselves reflect his or her
self-concept. Put another way, the epochs which break up
an individual's autobiographical memory are likely to be
subjective, not objective, in character.
Aristotle lists a number of causal relations
that might be represented in a drama, and these might be
recorded in ABM as well:
Along the same lines, but much more recently, David Pillemer (1998, 2001) provided a list of causal events in the individual's life story:
Aristotle suggested that the events in a good drama should reveal something about character, and this is likely to be true of ABMs as well. Not every remembered episode reveals our tragic flaws, not least because not every life is a tragedy: still, our memories say something about ourselves, and about the other people in the events we remember -- which is perhaps just another way of saying that they say something about us (McAdams, 1993). This is a big subject, but in general we can distinguish two broad points of view.
Millvina Dean, the last survivor of the Titanic disaster, died on May 31, 2009 -- interestingly, on the 98th anniversary of the ship's launching. In her later years, especially after the release of James Cameron's movie Titanic (1997), she enjoyed some degree of celebrity, but she had no personal recollection of the event -- not least because she was only nine weeks old when the ship went down, and she only learned that she had been on the ship at age eight. She knew she was a Titanic survivor, and that fact played an important part in her life, but she had no recollections of the event at all. It appears, however, that remembering and knowing do not exhaust the varieties of recollective experience. At the very least, both "remembering" (viewed as full-fledged conscious recollection of an event as part of one's subjective autobiography) and "knowing" (viewed as retrieval from semantic memory of an event as part of one's objective biography) can be further distinguished from an intuitive "feeling" that something happened in the past. The "feeling of knowing" state is well documented in studies of verbal learning and retrieval from semantic memory, but the same sort of feeling occurs in genuine autobiographical memory, as when we have a feeling that we have met someone somewhere before, but cannot say where or when. I have a feeling that I saw Woody Allen's A Midsummer Night's Sex Comedy at the New Yorker Theatre in Manhattan in 1982, soon after its premiere, but -- with apologies to the friends who must have been with me at the time -- I don't actually remember it; and I know full well that Woody Allen movies premiered at the Beekman Theatre, not the New Yorker. Perhaps the memory is, at least in part, the product of priming: I spent a lot of time in
Most research on ABM is on conscious
recollections, reminding us (sorry) that there are several
varieties of recollective experience observed in episodic
memory: remembering, knowing, and feeling, as discussed above.
This focus on the conscious experience of ABM raises the question of whether there are unconscious ABMs as well. Certainly Freud thought so: the "reminiscences" that "hysterics" ostensibly "suffered from" were unconscious, because they had been repressed. A more recent take on this formula is the trauma-memory argument, which holds that certain forms of mental illness and other problems in living are caused by repressed (or dissociated) memories of trauma, especially of childhood sexual abuse. I discuss this problem at length in the "Memory" course lecture supplements on "Emotion and Memory".
"Autobiographical memory... is particularly concerned with the past as it relates to ourselves as persons.... [It] is important because it acts as a repository for those experiences that constitute one's self-concept. If you lose contact with your past, then presumably you lose contact with much of yourself" (p. 13).
Among our many autobiographical memories are what are called flashbulb memories. In a groundbreaking paper, Brown and Kulik (1977) asked subjects to remember the circumstances under which they learned about a surprising, consequential, affect-laden event. The classic example, for people of my generation, and for the subjects in Brown & Kulik's experiment, is the assassination of President John F. Kennedy. Other events included in the survey were:
Brown
& Kulik performed a content analysis on the subjects'
responses, and found that they commonly contained the following
information:
They suggested that "flashbulb memories" represent a richly detailed, almost sensory memory of the circumstances in which a highly surprising, affect-laden, and consequential event occurred. Such emotionality induces a great deal of overt and covert rehearsal, leading to a highly elaborate episodic memory trace. They also invoked Livingston's (1967) "Now Print!" mechanism, by which it is as if the mind takes a picture of what is going on at the time of a surprising, consequential, affect-laden event. Like Livingston, they suggested that such flashbulb memories might have evolutionary significance, in that they produced a prompt, enduring record of critical events in the organism's life.
Moreover, the rich detail in such memories seemed inconsistent with Bartlett's view that memory is reconstructive, and thus inaccurate, in nature. In these cases, anyway, the memories appear to be reproduced "verbatim".
Following
in the wake of Brown and Kulik's study, other investigators have
studied flashbulb memories for various surprising,
consequential, and affect-laden news events:
Ratings of the free-recall narratives showed that most of these features (except for Time) occurred with a high frequency. When subjects were specifically asked about each detail in a subsequent questionnaire, the frequencies shot up to almost 100% in all categories.
Most subjects' free-recall narratives included at least four of the features; and the distribution was even more skewed for the questionnaires.
The average subject's memory had more than four of the six "canonical" categories employed by Brown & Kulik -- and on the questionnaire, the average subject's memory had all six. Flashbulb memories are, indeed, very rich representations of the event in question -- but are they accurate?
Some of these subsequent studies compared memories collected immediately after the event with those collected after some delay. For example, Neisser and Harsch (1992) studied college students' flashbulb memories for the 1986 Challenger disaster (probably the most studied of all flashbulb memories). When the subjects were retested in 1988, Neisser and Harsch found that many were highly confident that their memories were accurate; but in fact, most of the memories were quite inaccurate, and many bore no resemblance to the accounts that the same subjects had given shortly after the event occurred.
Similar findings have been obtained by other investigators studying the Challenger disaster, and from studies of other flashbulb memories. Apparently flashbulb memories are not as accurate as they appear to be.
Along the same lines, Larsen (1992) performed a "self-study" -- that is, a
study of his own memories -- to compare the accuracy of
recollections of personal and public events (there is a long
history of such studies, beginning with one performed by Marigold
Linton).
Still, whether they are accurate or not, our memories, including our flashbulb memories, are -- well, they're our memories -- our mental representations of the past. And because they typically represent surprising, consequential, emotional events, they're relevant to the question of emotion and memory.
Regardless of whether they're accurate, flashbulb memories are our recollections of important events. And "importance" is something that differs from group to group (recall that, in the Brown & Kulik study, relatively few white subjects had flashbulb memories for the assassination of Medgar Evers). It's also something that differs from individual to individual. I have a flashbulb memory for the time I first met my spouse, but I don't have flashbulb memories for everyone I ever met for the first time. One uninvestigated aspect of flashbulb memory concerns those idiosyncratic events that are important to us as individuals, even if they're not important to the wider public or to history. For that reason, our flashbulb memories -- not necessarily of 9/11 and the Challenger disaster, but the more "mundane" memories for things like our first kiss -- are important expressions of our personality. And, it turns out, they are also important for social interaction. So, we'll take them up again in the lectures on Personality and Memory and on Social Memory.
But if flashbulb memories are not veridical representations of some past event, what are they? Neisser (1992) noted that flashbulb memories are, first and foremost, stories that follow certain narrative conventions -- they indicate who did what, where, when, and why. Rather than being "pictures" that the mind has taken of certain events, he suggested that flashbulb memories serve as benchmarks marking the intersection of private and public history. As Neisser put it:
"They are the places where we line up our own lives with the course of history itself and say 'I was there'."
People's flashbulb memories of the 9/11 terror attacks provide some evidence for Neisser's hypothesis. In 2002, the Pew Research Center for the People and the Press released the results of a survey which showed that 97% of Americans could remember exactly where they were or what they were doing the moment they heard of the attacks -- thus fulfilling the basic criterion for a flashbulb memory (Pew Center, 09/05/02). The survey also asked respondents to describe "the biggest life event of the past year". In past surveys, this question had elicited a fairly routine miscellany of births and deaths, marriages and divorces. But in this case, fully 38% of those surveyed cited 9/11. September 11, 2001, was certainly an important date for public history; but it was also, apparently, an important date in personal history -- an intersection of public and private marked by a "flashbulb" memory.
9/11 may be a
benchmark, and other flashbulb memories may represent
benchmarks too, but apparently not all of them are benchmarks,
as data from our study of flashbulb memories for the Challenger
disaster show (Mullane, Swar, Glisky, & Kihlstrom,
2002). Before subjects reported their memories of the
Challenger incident, we asked them to remember events that
occurred in the surrounding 12-month period, from August 1, 1985
to July 31, 1986 (the Challenger disaster occurred on January
28, 1986).
Philosophers, neuroscientists, and even some psychologists often ask what "the self" looks like -- sometimes betraying a nervousness about matters like consciousness and agency, and raising the spectre of the homunculus -- "the little man in the head".
To
this question, cognitive psychology offers four straightforward
answers:
Development can be viewed in two ways: ontogenetically, in terms of changes in individual organisms across the life cycle from birth to death; and phylogenetically, in terms of changes in species across evolutionary time.
Locke viewed a sense of self as essential for personhood, but nonhuman animals may also have a sense of self. In a classic study, Gordon Gallup painted an odorless, nontoxic red mark on the foreheads of anesthetized chimpanzees. In the absence of a mirror, the chimps showed no awareness that their appearance had been altered. When exposed to their reflections in a mirror, however, the animals often examined the spot in the mirror, touched the spot on their foreheads, and then inspected and smelled their fingers. They appeared to recognize a discrepancy between what they thought they looked like, and what they actually looked like -- suggesting, in the process, that they possessed at least a rudimentary self-image. The same effect has been found in some orangutans and bonobos, but not in gorillas (except perhaps for the famous Koko), monkeys, and other primates, or in non-primate species. However, it should be noted that not all chimpanzees pass the self-recognition test, and alternative means of testing may well reveal self-recognition in other species.
By the time they are 18-24 months old, most human infants also pass the mirror-recognition test. However, if the infants are shown a videotape of themselves after a delay as short as three minutes, most fail to recognize themselves on the monitor; most four-year-olds pass this more difficult test. By the age of two, then, human infants have at least a minimal sense of self, but it takes a while longer for them to develop a narrative sense of themselves as extended in time -- that they are the same person now that they were a while ago. Similarly, children younger than four years old seem unable to recognize that their current knowledge and beliefs differ from those they held in the past. Interestingly, age four is about the time that children achieve a capacity for episodic memory -- the ability to recognize that a current mental state is in fact a representation of a past mental state.
There's also a third developmental approach, which might be called the cultural view -- i.e., how the self develops over the course of historical time. Julian Jaynes (1979) famously argued that there was a time when humans didn't really have consciousness -- or, at the very least, didn't realize that they had it. In much the same way, there may have been a time when we didn't really have a self-concept -- or, if we did have one, it didn't matter very much.
In cultural terms, the self, at least in Western culture, may have experienced a radical alteration around the time of the Renaissance. As Jacob Burckhardt put it in The Civilization of the Renaissance in Italy (1860):
Man [previously] was conscious of himself only as a member of a race, people, party, family, or corporation -- only through some general category. In Italy this veil first melted into air; an objective treatment and consideration of the state and of all things of this world became possible. The subjective side at the same time asserted itself with corresponding emphasis; man became a spiritual individual, and recognized himself as such.
Burckhardt had the idea that this discovery of the self was uniquely a feature of the Italian Renaissance, and particularly of the Florentine Renaissance, based in the Italian city-state of Florence. In fact, the Italians got the idea that portraits of contemporary individuals should be painted at all, and that such portraits should portray their subjects in all their individuality, from the artists of the Northern Renaissance, especially Flanders, and especially the work of Jan Van Eyck (1395-1441). But more generally he argued that the Renaissance emphasis on "the subjective side", in which "man became a spiritual individual, and recognized himself as such" is central to European, and Western, culture.
Andrew Butterfield, reviewing "The Renaissance Portrait from Donatello to Bellini" (an exhibition at the Metropolitan Museum of Art in New York City, 12/21/2011-03/18/2012), notes:
Burckhardt's idea has had the most profound influence on the study of the Renaissance portrait.... Beginning in the late thirteenth century several changes greatly stimulated the advance of portraiture.... So portraits had existed before, but it was only in the fifteenth century that independent images of actual persons other than rulers and religious figures began to be made in large numbers. This new tendency started in Flanders, and then spread to Florence, where it reached unprecedented currency (reviewed in "They Clamor for Our Attention", New York Review of Books, 03/08/2012; see also "Faces Out of the Crowd" by Barry Schwabsky, The Nation, 03/26/2012).
Walter Benjamin, the 20th-century German philosopher and Marxist cultural critic famous for The Arcades Project (1927-1940), his study of Parisian bourgeois culture, gave a somewhat later date: sometime between 1830 and 1848. It was at this point, Benjamin claimed, that individuals first became private, in the sense that their work lives were separated (Marx would have said "alienated") from their domestic lives, and the public domain separated from the private. For the first time, as Merve Emre writes in an overview of the personal essay, which emerged at about the same time ("The Illusion of the First Person", New York Review of Books, 11/03/2022; see also Emre's contribution to the Oxford Companion to the Essay), a middle- or upper-class individual had the time and inclination to "probe what he believed to be his thoughts, lodged in his self, his mind, his body, and his home".
Whatever
the findings in infants and animals, a sense of self is part and
parcel of the conscious experience of all normal human adults.
However, a number of pathological conditions appear to involve
disruptions in self-recognition and self-awareness.
While
cognitive psychology tends to study mind in the abstract, social
psychology studies mind in action. Mental representations of
self, others, and situations do not exist for themselves, but
rather as guides to social behavior. How we behave towards
others depends not just on how we perceive them, but also on how
we perceive ourselves. Erving Goffman, E.E. Jones, and others
have argued that people often engage in strategic
self-presentation to shape others' impressions of them, in an
attempt to gain or retain control over the social situation.
Many social interactions are characterized by what Robert K.
Merton would call a self-fulfilling prophecy -- in which, for
example, a person who believes that another person is aggressive
may treat him or her in a manner that evokes aggressive behavior
that may not have occurred otherwise. A strong sense of self may
promote strategic self-presentation, but it may also militate
against others' self-fulfilling prophecies concerning oneself.
If one does not define oneself as aggressive, perhaps one will
be less likely to act aggressively, regardless of how he or she
is treated. From a social-psychological perspective, then, the
self is not just something that knows, and is known. It is also
something that one does.
As philosophers and psychologists have become interested in the
biological substrates of mental life, and brain-imaging
techniques have permitted us to watch the brain in action,
cognitive science has evolved into cognitive neuroscience.
Antonio Damasio has argued that the self consists of three different representational levels, each associated with a different brain system (see his book The Feeling of What Happens: Body and Emotion in the Making of Consciousness, 2000).
Taking cognitive neuropsychology as a model, Klein and Kihlstrom have argued that neuropsychological studies of brain-injured patients, and brain-imaging studies of normal subjects, may provide new solutions to old problems, and afford new theoretical insights, for personality and social psychologists as well. Consider, for example, the relation between self and memory. If, as Locke argued, our sense of self is intimately tied up with our recollection of our past, what is the sense of self for an amnesic patient? H.M., the famous patient with the amnesic syndrome, cannot consciously remember anything that he did or experienced since the operation that destroyed his medial temporal lobes. Of course, H.M.'s amnesia is primarily anterograde in nature, and his sense of self may be confined to whatever memories he has from before his surgery. Moreover, Locke did not fully appreciate the distinction between episodic and semantic memory. Amnesic patients retain some ability to acquire new semantic knowledge, and this dissociation may permit their self-concepts to be based on "updated" semantic knowledge, even if they are lacking a complete record of autobiographical memory.
Such questions have not been asked of H.M. himself, but they have been asked of other patients. For example, the patient known as K.C., who suffered a severe head injury as a result of a motorcycle accident, has both a complete anterograde amnesia covering events since the accident, and a complete retrograde amnesia covering his life before the accident. K.C. has no autobiographical memory at all, but research by Endel Tulving reveals that he has a fairly accurate self-concept. The same accident that caused his amnesia also resulted in a profound personality change: the premorbid K.C. was quite extraverted, while the postmorbid K.C. is rather introverted. When asked to rate himself as he is now, K.C. rates himself as introverted, in agreement with his mother's ratings of him. Interestingly, his ratings of his premorbid personality do not agree with his mother's. K.C. has acquired semantic knowledge about himself, but he has not retained in episodic memory the experiences on which this self-knowledge is based; and his newly acquired semantic self-knowledge has effectively replaced that which he possessed before the accident.
Similar results were obtained by Klein and his colleagues in a study of W.J., a college freshman who suffered a temporary retrograde amnesia, covering the period since her high-school graduation, as a result of a concussive blow to the head. Asked to describe herself, W.J. showed a good appreciation of how she had changed since matriculating, as corroborated by her boyfriend's ratings of her. Findings such as these lend strength to the conclusion, based on experimental studies of priming, that semantic (trait) knowledge of the self is encoded independently of episodic (behavioral) knowledge.
Amnesic patients typically suffer damage to the hippocampus and related structures in the medial temporal lobes, leading to the conclusion that these structures constitute a module, or system, for encoding consciously accessible autobiographical memories. Is there a similar structure responsible for the sense of self? Recently, Craik and his colleagues (1999) used PET to image the brain while subjects rated themselves on a list of trait adjectives. As comparison tasks, subjects rated the Prime Minister of Canada on the same traits; they also judged the social desirability of each trait, and the number of syllables in each word. One analytic technique, statistical parametric mapping, revealed no differences in brain activation between the self- and other-ratings tasks. While this finding would be consistent with the proposition that the self is a person like any other, a partial least squares analysis showed significant self-other differences in the right and left medial frontal lobes, and the middle and inferior frontal gyri of the right hemisphere. Further studies of this sort are obviously in order.
So is the
mental representation of the self located somewhere in the right
hemisphere? Probably not. Self-referent processing may be
performed by a module or system localized in the right frontal
lobe, but the choice of control conditions is critical in such studies, and it may
well be that other-referent processing is performed by the same
system, provided that the other is well-liked and/or well-known.
Although cognitive neuroscience has generally embraced a
doctrine of modularity, the neural representation of individual
items of declarative knowledge is distributed widely across the
cerebral cortex. Self-reference may be localized, but
self-knowledge is widely distributed over the same neural
structures that represent knowledge of others. I discuss
this issue more in the lectures on Social Neuropsychology.
Lying
behind all four of these models of the self is the general idea
that oneself is a person like anyone else, and represented
accordingly. However, there may be both quantitative and
qualitative differences between self-perception and the
perception of other people.
On the quantitative side, it seems obvious that we simply have more knowledge about ourselves than we do about other people. Evidence for this proposition comes from studies of the self-reference effect in memory, by Rogers and his associates (1977).
Rogers made use of a popular procedure in the study of memory known as the depth-of-processing (DOP) paradigm, also known as the levels-of-processing (LOP) paradigm. In DOP studies, subjects are presented with a list of words (or other material), about which they have to make one of a number of judgments -- typically structural (e.g., is the word printed in capital letters?), phonemic (e.g., does it rhyme with some other word?), or semantic (e.g., does it mean the same as some other word?).
For more details on the DOP/LOP paradigm and its implications, see the page on Memory in the lecture supplements on General Psychology; and also the page on Encoding in the lecture supplements on Human Learning and Memory.
To the standard DOP paradigm, Rogers et al. added a self-referent judgment. They presented a number of trait adjectives and, in addition to the standard conditions, asked subjects to decide whether each adjective was self-descriptive. The finding was that self-reference produced a huge increment in memory. On the basis of this self-reference effect (SRE), and in line with the standard interpretation of the DOP, Rogers et al. suggested that the self was, perhaps, the largest knowledge structure in human memory.
The conclusion was quite provocative, and quickly a number of investigators stepped in to put it to the test. In particular, Keenan and Baillet (1980) performed a number of studies in which they compared self-referent processing to the effects of processing information with respect to other people. Most critically, they found that processing with respect to the subject's best friend yielded a DOP effect roughly equivalent to that of self-reference. Keenan and Baillet concluded that self-referent processing affords no privilege in memory. The implication is that self-reference is no different from other-reference (provided, perhaps, that the other person is liked and well-known); or, put another way, that the representation of the self in memory is no richer, no more elaborate, than the representation of other people that the individual knows well and likes. (Setting aside the very interesting question of the self-reference effect in depressive individuals, who may not like themselves very much; or in people with personality disorders, who may not know themselves very much!).
But, as they say in late-night television ads, there's more!
Stanley
Klein noticed a subtle confound in the standard SRE experiment
-- which is that self-reference and organization are confounded
in memory.
For more details on the Organization Principle and its relation to the Elaboration Principle, see the page on Memory in the lecture supplements on General Psychology; and also the page on Encoding in the lecture supplements on Human Learning and Memory.
The
problem, then, is to unconfound self-reference and
organization. Klein accomplished this in an ingenious
way. For his experiment, he shifted the stimulus materials
from trait adjectives to body parts.
Again, one of these conditions entails self-reference while the other does not; but both conditions encourage categorization -- sorting the target items into two dichotomous categories.
Klein argued that the standard SRE experiment compared an organized self-referent condition with an unorganized semantic condition. And, indeed, when you look at just those two conditions, you see a big SRE.
But when you look at all four conditions, you see that the SRE is matched by the semantic condition -- so long as the semantic condition is also organized. In fact, virtually 99.44% of the variance in memory performance was accounted for by the organization factor, and virtually none by the self-reference factor. As Klein suspected, the SRE is wholly an artifact of organizational activity, and has nothing to do with the self.
The bottom line is that the self may, indeed, be the largest knowledge structure in memory. Most of us do, after all, know more about ourselves than we do about others, and most of us probably care about ourselves more than we do about others as well. But the self-reference effect doesn't provide any evidence for this proposition -- because, it turns out, the SRE has nothing to do with the self.
With
respect to qualitative differences between cognition of self and
others, there is of course the large social psychological
literature on actor-observer differences -- which is to say, self-other
differences -- in causal attribution.
The first thing to be said about these two differences -- the actor-observer difference and the self-serving bias -- is that it never was clear that these biases were intrinsic to self-knowledge. Perhaps they apply to knowledge about others as well, so long as we like them (as we tend to like ourselves) and/or know them well (as we think we know ourselves). This is the lesson of the self-reference effect.
But we now know that early studies of causal attribution were misleading, because they made an inappropriate distinction between the Person and the Environment -- and, more critically, because they attempted to fit causal explanation into an inappropriate Internal-External framework. In fact, Malle's (2006) review indicated that there is no evidence for the Actor-Observer Difference, and precious little evidence for the self-serving bias either.
So is it true, that the self is "just another person" after all, and that there are no qualitative differences between self-perception and other-perception? Not so fast.
There is one difference between self and other that is absolutely qualitative: While we have direct introspective access to the contents of our own minds -- our beliefs, feelings, and desires -- we can know the minds of others only indirectly -- from what they tell us, and from observing their behavior.
Knowledge of our own minds is direct (at least in part);
Knowledge of other minds is (always) indirect.
So, in the final analysis, Cantor and I were perhaps too quick to offer a solution to Allport's problem.
The "puzzling problem" of the self is not that of self as object of knowledge. Viewed as an object of knowledge, the self may indeed just be a mental representation of oneself, no different from our mental representations of other people. (Though, frankly, I do think that the idea of the self as a mental representation of oneself was an awfully good idea.)
The "puzzling problem" of the self is, rather, that of self as knower. Viewed as a subject, as the person who has self-knowledge, the problems are of how self-knowledge arises, how we know what we know about ourselves -- the little man inside the head that captures the experience of consciousness.
And consciousness is a very puzzling problem indeed.
Further Reading
Those interested in more details may wish to consult the following articles:
Really, what did you expect in a Lecture Supplement on "The Self", except a bunch of self-citations? Author Tom Wolfe famously designated the 1970s the "Me Decade" (1976), at roughly the same time that the historian and social critic Christopher Lasch was writing about The Culture of Narcissism (1979). But it wasn't just the 1970s. The New York Times Magazine dubbed the entire second millennium of the Common Era the "Me Millennium" and devoted one of its six special "Millennium Issues" to the self, narcissism, and related issues. Link to the October 17, 1999 issue of the New York Times Magazine online.
This page last modified 07/18/2023.