Featured Minds: Porter Abbott

Porter Abbott is Research Professor Emeritus in the English Department at the University of California, Santa Barbara. He is the author most recently of The Cambridge Introduction to Narrative (2nd ed. 2008) and Real Mysteries: Narrative & the Unknowable (2013). An eclectic narratologist, his personal “turn” toward the evolutionary and cognitive sciences was marked by “The Evolutionary Origins of the Storied Mind: Modeling the Prehistory of Narrative Consciousness and its Discontents” (2000).

What are you currently working on in your research or teaching that relates to the mind?

Right now, along the cognitive line, I am finishing an essay titled “What Does It Mean to be Mad? Diagnosis, Narrative, Science, & the DSM” (2017). I will be test-driving it this month (Oct. 2016) at a Symposium in Nuremberg. Very briefly, I think mental disorder has a special status in pursuing the old conundrum of how mind emerges from matter. It also throws into sharp relief a disconnect between the felt reality of individual minds and the scientific understanding of the mind.

This essay also reflects a long-standing interest in narrative gaps, limits of narrative, and the limits of knowing. So, recently, there’s Real Mysteries, which is about how narrative handles what we simply cannot know, and my chapter “How Do We Read What Isn’t There to be Read: Shadow Stories and Permanent Gaps” in The Oxford Handbook of Cognitive Literary Studies (2014). This in turn relates to my interest in the incompatibility of emergent behaviors, like natural selection, with narrative modes of understanding and also the ways everyday discourse confidently masks emergence with narratable chains of causality.

How did you become interested in the field of Literature and the Mind?

In the late 1990s, I hung out with John Tooby & Leda Cosmides, who initiated the field of evolutionary psychology, and Paul Hernadi, and our own Lisa Zunshine who was then doing doctoral work at UCSB before becoming a cognitivist meteor currently lighting up the firmament at the University of Kentucky. The work of Steven Mithen, Merlin Donald, and Ellen Spolsky was also important for me. I guess my baptismal moment came when I was invited to give a presentation at an IHC conference, “Imagination & the Adapted Mind,” in the summer of 1999. It brought together scientists & humanists of almost every stripe. I then edited the double issue of SubStance, On the Origin of Fictions (2001), that came out of that conference. I am still a great fan of Ellen Spolsky, whose Gaps in Nature was a major early influence. My review of her latest, characteristically provocative, “cognitivist” book, The Contracts of Fiction, will appear shortly in Poetics Today.

What unique contributions are literary scholars (or scholars of the arts, or of the humanities in general) positioned to make to mind studies?

I think the major front, still, is work that demonstrates the “exchange value” that science and the humanities have for each other. In the early days of cognitivist literary work, much of the research went in one direction, science → humanities. There was also much condescension from scientists like E. O. Wilson and Steven Pinker, which didn’t help. Showing what science in return can gain from the humanities, our special intellectual leverage, has long been advocated by Meir Sternberg. An issue of Poetics Today (32.3, 2011) was dedicated to it, and now there are younger scholars, like Marco Bernini and Marco Caracciolo—and for sure our new colleague Sowon Park—who are opening up this approach in many ways.

What does literature do for minds?

This is such a huge question, I will simply rephrase it: How do different embodied minds engage with different fictional or nonfictional texts in different media in different contexts at different times? All this and more is what we are trying to find out.

What role does last year’s L&M research theme of improvisation play within your research?

To date, my major contribution to the study of improvisation has been as director for Rob Wallace’s wonderfully original dissertation, Improvisation and the Making of American Literary Modernism, which was almost immediately published by Bloomsbury (2010).

What relationship does intersubjectivity, understood as the study of interdependent relationships and interactions, have to your work?

A very difficult theme, I think. Intersubjectivity links up with several areas that have become more challenging the more we know: empathy, theory of mind, mirror neurons, distributed cognition, the extended mind. It is interestingly adjacent to work of mine that has focused on representations of subjectivity and how they engage the reader, especially in autography and the work of Samuel Beckett.

Where do you see this field heading? What’s unanswered (or just beginning to be answered) that you are curious about?

Well, there are no answers, nor will there ever be. Our minds and how we respond to literature and the arts are simply way too complex. So this field is all a wonderful work in progress. E. O. Wilson wrote that fields in the humanities exceed the complexity of physics by many orders of magnitude, which means that what we call “theories” are, from a scientific perspective, hypotheses, though indeed some have great and productive staying power.

Selections from Porter’s Work:

Below you will find an excerpt from the first few pages of Porter’s article “Humanists, Scientists, and the Cultural Surplus” (SubStance 30.1/2 [2001]: 203–219):

When E. O. Wilson chides humanists for invoking the idea of “processes too complex for reductionistic analysis” (“the white flag of the secular individual, the lazy modernist equivalent of The Will of God”), he is playing chicken in an inappropriate way. Of course we might find reductionistic analysis applying in cultural areas where we are unable to apply it now. But the scientist goes wrong in failing to understand the degree to which the humanist researcher must accept and work with ideas that are, by scientific standards, hypothetical. After all, so many of the really important things we have to think about in this life lie in areas where the question of causation is highly speculative and where an answer, if one exists, is far down the road—if it’s down there at all.


It may well be, for example, that there are narrative “functions” and “archetypes” that not only replicate across the entire range of cultures but that indicate, if not direct genetic causation, a significant degree of epigenetic constraint. And in some cases the epigenetic leash appears to be quite tight. What Freud called the “law of the talion”—the revenge imperative, an eye for an eye—may well be rooted in a genetic predisposition to strike back. This would make a kind of grim evolutionary sense. So it’s no surprise that in popular narrative, one finds its archetypes everywhere. Hollywood specializes in it, often exhaustingly, in narratives that ignite the genetic need, rousing audiences to a frenzy of unsatiated craving for revenge, and then satisfying it.


But you also find the revenge plot in Hamlet and Moby Dick, neither of which appears to give way to the genetic predisposition. As we follow the narrative, we are not so much roused to revenge as we are deeply conflicted. This is one of the many qualities they have that make them as absorbing as they are frustrating. They engage in what no amount of reduction will make finally clear: the immensely important task of reconciling the claims of knowledge and desire. This is a task that begins in articulation and that is extended by interpretation. It is a complex ethical business that no culture escapes, and it requires, not reduction, but thoughtful construction. Should such texts work as cautionary tales with powerful moral lessons that keep warring parties from killing each other off, so much the better. But the main object is larger than this and arguably more important: not just to survive, but to live.


The issue of the limits of a solely Darwinian accounting of things was provocatively extended by Gould and Lewontin’s idea of the “spandrel,” or the accidental consequence of evolutionary necessity. In architecture, a spandrel is the curved triangle caused by the struts in a dome. This structural necessity became, in its turn, a frame for art, some of it rather wonderful, but none of it necessary for the structure’s architectural integrity. In evolutionary bio/psychology, scholars have argued about specific applications of the theory of the spandrel to accidental consequences of human capabilities like language or narrative. Of course, accidental consequences have a way of becoming evolutionary necessities. Still, it would appear almost certainly the case that the in-fill Shakespeare and Melville created to elaborate the evolutionary givens of narrative and revenge is not present by evolutionary necessity. It is all quite spandrelesque.


O vengeance!

Why what an ass I am! This is most brave,

That I, the son of a dear father murdered,

Prompted to my revenge by heaven and hell,

Must like a whore unpack my heart with words

And fall a-cursing like a very drab,

A stallion! Fie upon’t, foh!


Hamlet is having a great deal of difficulty here casting himself in a simple tale of revenge, an old masterplot in which he is supposed to play the role of avenger with all the appropriate passion. He can’t squeeze himself into this reductive mold, which also requires reducing his mother, his uncle, even his “dear father” to their functions in the tale. As a sign of his own irreducible human complexity, a multitude of other considerations inflect his thinking on the subject. The inclusion of both heaven and hell as “prompters,” the metaphor of prostitution, the opposition of language and deeds are all at play in his thinking in this brief passage. It is a rich intellectual counterpoint, set off by the additional music of blank verse.


In other words, gifts that we have by virtue of our long struggle to survive and reproduce have given us other behaviors that do not seem to be dictated by Darwinian imperatives. They are way out there on the genetic leash. This point is critical if we are to avoid the naturalistic fallacy of passing from an “is” to an “ought.” Species survival may or may not be good, but if it is good, it is surely not the only good thing. Nor very distinguished. Worms do it. Moreover, they have been doing it much longer than we have and will be still doing it after we’re gone. The really important task is not survival but making a life out of the situation survival has landed us in, equipped as we are with individually specific self-awareness. What art and literature so often enable, then, and what interpretive work enables in its turn, is some alleviation of the burden of consciousness.



Below you will find an excerpt from the first few pages of Porter’s “Cognitive Literary Studies: The ‘Second Generation’” (Poetics Today 27.4 [2006]: 711–722):

A biologist, about to dissect a frog, is startled to observe certain remarkable features: the superior glossiness of its coat, the strength of its sinews, the speed of its reactions, a look of what one might almost call the light of understanding in its eye. He sets it aside in its own cage and as the days go by observes other facets of this remarkable frog. Its croak is almost musical in its sustained baritone, like the ancient low notes of certain Buddhist monks. Over time, this special frog begins to draw the biologist from his experimental work on the tibial reflex of Rana catesbeiana. He devotes more and more time to an appreciation and celebration of the achievements of Froggo Bullmeister, surely the most gifted and accomplished of its entire species.


Here’s an example closer to home: In an inquiry into the origins of racism, a sociologist has devised an experiment involving human subjects, each of whom is asked to write a page narrativizing what seems to be happening in a film clip depicting the interactions of characters of different apparent races. The data she seeks are the spontaneous recurrences of certain keywords. But one response astonishes her. It is so eloquent, so deeply felt, that she finds herself moved to tears. Rashly, she breaks the code of anonymous subjects and contacts this ragged, impoverished, untutored product of the urban ghetto. She encourages him to expand on what he has written. It turns out he has a trove of autobiographical fiction. Over time, she increasingly neglects her project as she devotes more of her time to encouraging this genius in a career that eventually dazzles the world. She in turn becomes his most brilliant interpreter, critic, and biographer.


You get the point. The academy has always housed what appear to be two fundamentally opposed objects of study that generate two fundamentally opposed ways of going about that study. The one is bound to the repeatable, the other to the unrepeatable; the one to the norm, the other to the exception; the one to the general, the other to the particular. This is why, post-1967, so many old-timers reacted viscerally to what was and is still called “theory”—not because they were old, or conservative, or incapable of abstract reasoning, but because what was common or predictable in their field bored them except insofar as it helped one appreciate the uncommon. In academic work in the humanities, as the gravitational influence of the scientific model has grown stronger, the market for books devoted to the appreciation of individual writers has waned. Yet even today any number of future scholars go into graduate work because they are just simply blown away by Shakespeare or Büchner or Neil Gaiman. Some adapt without abandoning their enthusiasms. Those who don’t adapt drop out or do “creative writing” or go into publishing or become columnists or, tellingly, accept jobs as generalists in smaller private institutions. . . .


Cognitive literary study emerged in the 1970s largely (though not entirely) as an extension of the “dynamic” (Sternberg) and reader-response (Iser) approaches to interpretation. As such, it sought, logically, to extend the understanding of the reader/text relation back into the mind, about which we know a little more now than we did then, though still much by inference.


For the second generation there was no “theory” of cognitive literary (or, more broadly, cultural) criticism, not even the spongy kind referenced by most humanists (much less the full-blooded, muscular scientific theory that supports every interpretive move of the literary Darwinists). It was, rather, an approach or, perhaps better, a stance, manned by a bunch of scholar-pirates who plundered for their purposes troves of hypotheses, bright ideas, and, yes, rigorous scientific work, dragging it into the work they do as still quite recognizable literary scholar-critics. If there was a danger, as Tony Jackson wrote in 2002, in the ease with which “the vocabulary of cognitive rhetoric” can be “plugged into the interpretation,” I think there is much less now. And having survived, allopatrically, a long constructivist era, cognitive study in the humanities is here for the duration—a constant reminder that cultural constructions require human universals, and that if we include the latter in our vision the whole subject of representation and interpretation becomes larger and more satisfyingly complex.



Featured Minds: Laura Otis


Laura Otis works as a neuroscientist-turned-literary scholar in her position as Samuel Candler Dobbs Professor of English at Emory University. Her most recent book, Rethinking Thought: Inside the Minds of Creative Scientists and Artists (Explorations in Narrative Psychology, 2015), describes her interviews with scientists and artists such as Temple Grandin and Salman Rushdie to illustrate how greatly the experience of conscious thinking can vary from person to person. Otis pays special attention to her creative interviewees’ relations with visual mental images and verbal language, since people differ in the ways they use words and pictures to solve problems and imagine other worlds. By showing how differently thinking can work, she aims to build respect for a diverse range of thinking styles.

UCSB and the L&M Initiative were treated to an in-depth look at Dr. Otis’ work on two occasions in Spring 2016. She led our reading group in an engaging discussion of Rethinking Thought, and she presented at UCSB’s Interdisciplinary Humanities Center during its series on “The Humanities and the Brain.”

What are you currently working on in your research or teaching that relates to the mind?

My current research project, Banned Emotions, analyzes metaphors for culturally unpopular emotions such as self-pity, spite, bitterness, grudge-bearing, and prolonged anger. I am trying to learn how bodily experiences and cultural ideologies combine in the ways that people talk and think about emotions. I compare emotion metaphors used in classic and recent novels, popular films, scientific articles, and religious texts. Last spring, I presented this research to UCSB scholars in a talk called “The Physiology and Politics of Emotion Metaphors.”

I am also currently earning a Master of Fine Arts degree in Fiction from Warren Wilson College. The craft analysis I have been doing in this program has led me to a new project on how fiction-writers use language to blend sensory experiences in order to create an illusion of lived reality. Neuroscientists who study sensory systems are challenged by the “binding problem”: How do people combine sensations of sight, sound, smell, taste, and touch to create a unified mental representation of a person, place, or thing? Fiction-writers may offer insight into this problem, and it is worth analyzing their solutions.

How did you become interested in the field of Literature and the Mind?

I came to the study of Literature and the Mind via an unusual route. I majored in Biochemistry in college, studied Neuroscience at UCSF, and worked in labs for eight years before deciding to earn a PhD in Comparative Literature. All of my research projects since the dissertation and first book, Organic Memory, have sought common patterns in the ways that laboratory scientists and literary writers use language to develop ideas. These books include Membranes, Networking, Müller’s Lab, Literature and Science in the Nineteenth Century: An Anthology, and a translation of the Spanish neuroscientist Santiago Ramón y Cajal’s Vacation Stories. I am interested in memory, identity, communication systems, and in ways that literary writing can shed light on scientific problems.

What unique contributions are literary scholars (or scholars of the arts, or of the humanities in general) positioned to make to mind studies?

Besides contributing to neuroscience and sensory physiology, Literature and the Mind as an emerging field may raise new questions for scholars in Disability Studies. Although neuroscience tends to focus on what human nervous systems have in common, scientists are showing increasing interest in individual variation, and literary representations of compelling minds suggest not just what human minds share, but how they vary. Many people read fiction to “enter” fascinating minds, and literary depictions can reinforce scientific studies by showing unusual minds struggling and thriving in the contexts that have shaped them.

How do you see your interests in literature and the mind intersecting with other fields of study in the humanities (such as environmental scholarship, gender and sexuality, race and ethnicity, etc.)?

Literature and the Mind promises to grow most hardily when scientists and literary scholars collaborate. I have benefited from team-teaching with Emory neurologist Krish Sathian, who studies the interaction of the visual and tactile systems and the neural basis of metaphor (http://neurology.emory.edu/faculty/neuro_rehab/sathian_krish.html). Together we have designed and taught two courses, “Images, Metaphors, and the Brain,” and “Language, Literature, and Mental Simulation,” and organized a one-day symposium, “Metaphors and the Mind.” Our classes bring students in Neuroscience, Psychology, English, and Comparative Literature into the same classroom and lead to surprising insights. Many laboratory scientists are eager to learn from literary scholars, and team-teaching can be an energizing learning experience.

What does literature do for minds?

What literature can do for human minds is a question for neuroscientists as well as literary scholars. The perspective of fiction-writers needs to be considered, too, because nothing shows you all the details a unique mental world involves better than trying to create one yourself. My undergraduate teaching now includes scientific, analytical, and creative assignments, because these approaches to Literature and the Mind offer complementary mental workouts. In my “Languages of Emotion” courses, students compare Sigmund Freud’s insights to relevant findings published in recent, peer-reviewed articles and create scenes in which they, as an attending physician in charge of an ER, have to call in experts such as Freud, William James, or Paul Ekman to evaluate a suffering patient. Literature and the Mind may be even more productive as a teaching field than as a research field, because it can inspire a new generation of scientists, doctors, scholars, and writers.


Selections from Laura’s Work:

Below you will find an excerpt from the first few pages of Laura’s Rethinking Thought (Oxford University Press, 2015):

Chapter One

Who’s the “You”?

One day I walked into the lab and cried. I’d been a graduate student in neuroscience for almost two years, and in that time, my feeling of foreignness had grown from queasy twinges to overwhelming nausea. I was in the wrong place, and my ashamed attempts to hide it were sapping the energy I needed for creative work. The monoclonal antibodies I had raised to identify developing neurons stuck to no proteins identifiable on a Western blot. I needed to start over, and I read the setback as a signal: it was time to get out. Resting my elbows on the white bench paper, I hid my wet face. How could I disappoint the people who’d invested so much time teaching me? Yet I sensed that by staying, I’d be committing a greater betrayal. As far as I could see, the place I’d chosen to work demanded things my mind couldn’t do and had little use for the things that it did.

Three decades later, I’ve come to know my mind better. It will never lose its potential to learn, but its strengths and weaknesses have emerged clearly. My mental world functions acoustically, and my passions for languages, music, and stories are supported by sensitivity to sound. On most days, I can pull an “A” out of thin air. To find my keys, I shake my purse once and know in which corner they’ve lodged. If a friend drops a coin, I know it’s a quarter. Sometimes I think I could echolocate like a bat. When I write dialogue, I transcribe the voices I hear—not because I’m schizophrenic, but because my mind works like an iTunes library. My memories consist of people’s voices replayed as they originally sounded, often without visual components. This system absorbs tones, phrases, and tales, which recur and recombine against a field of gray. What taxes this mind—causing me to collapse in tears—is trying to recall pictorial or spatial information.

I’ve lived in my apartment for ten years but couldn’t tell you which way to turn my echo-located key. Each time I bring my hand to the lock, it’s as if I’ve never done it before. I try it first one way, then the other—it’s a 50-50 shot. As I write this, I’m trying to picture my shower and am unsure whether the hot water is on the left or the right. I certainly couldn’t tell you which way to turn the knob to make the water flow. “Picture an N,” says Stephen Kosslyn, a psychologist who has shown that visual mental imagery can be studied scientifically. In his Harvard office, whose details I can’t recall, I do my absolute best. I close my eyes. I conjure a big, black N, just a little bit fuzzy, like a New York Times “N” under a magnifying glass. “Now rotate it,” says Kosslyn. “Does it form another letter?” I know that the only candidate is “Z.” To see whether the “N” can form a “Z,” I nudge it—clockwise, I think. (I have to think actively about which way a clock turns.) The N dissolves into dust. I try it again. Poof. Frustrated, I struggle to rotate the mental N, but as soon as it budges, it disintegrates. Looking now at all the “N’s” I’ve just typed, I see easily that if you tilt one 90 degrees, it forms a “Z.” But I couldn’t do that with my imagined “N.” Why, thirty years ago, did I want to study neuronal membrane proteins? How did I ever pass physics?

When I think about physics, a phrase comes to mind: F=ma. In my mental world, that’s what it is: a phrase. I recall it as a series of vowel sounds, a song that runs, “Eh-eh-ay.” I passed high school and college physics by memorizing these melodies and on tests, plugging in numbers for tones. When I read Richard Feynman’s descriptions of science, I realized I’d never understood physics. During a physics class in Brazil, Feynman observed, “The students were all sitting there taking dictation, and when the professor repeated the sentence, they checked it to make sure they wrote it down right. . . . There, have you got science? No! You have only told what a word means in terms of other words” (Feynman 1997, 213, 217). For me, as for the Brazilian students, the formulas didn’t correspond to anything real. A friend who majored in physics told me how he’d puzzled over “F=ma,” the second law of classical mechanics. Newton’s law dictates that force equals mass times acceleration, which at first seems counter-intuitive. Shouldn’t a force consist of a mass times its velocity? For weeks, my friend thought about it until one day he understood. I can’t imagine what went on in his head during those weeks, but maybe, like Feynman, he was picturing examples. The Nobel prize-winning physicist confessed, “I can’t understand anything in general unless I’m carrying along in my mind a specific example and watching it go” (Feynman 1997, 244). For me, there was nothing to understand and nothing to see, only a representation I took on faith. I never learned physics, and it wasn’t my teachers’ fault. At the time, I wouldn’t force myself to think in a way that didn’t come naturally.

Rethinking Thought

     In 27 years of teaching courses that combine science, literature, and writing, I’ve been struck by how differently people think. For the purposes of this book, I will define thought as the ways people consciously process information: how they plan, imagine, learn, reason, and remember. Most mental activity occurs without conscious awareness, but I am focusing on the lived experience of thought. People’s mental worlds vary astonishingly, as I’ve learned since trying to picture proteins’ shapes. Mystified, I used to stare at the twin, candy-like structures in my organic chemistry textbook, whose authors swore I should see one three-dimensional molecule. I never did. Skeptically, I listened to other students describe the virtual Calder mobiles they were viewing. In my mind’s eye, I’ve never seen anything in three dimensions, and I thought about “The Emperor’s New Clothes.” Seeing my cohorts’ conviction, though, I couldn’t believe that they were lying. Their minds were doing something mine couldn’t do. I felt inadequate and ashamed, but simultaneously, I was fascinated.

What goes on in other minds is a mystery that can evoke frustrated responses. Thinking, I joke, is like going to the toilet: we don’t know what the experience is like for other people, and we rarely talk about it. We presume that their experience is a lot like ours, but we don’t know for sure. When it comes to thinking, this premise is shaky.

The recent HBO film about engineer Temple Grandin depicts a breakthrough realization (Ferguson 2010). Noticing that the teenage Grandin has a good visual memory for horses, her science teacher asks her if she remembers common objects just as well—shoes, for instance. Representing the activity of Grandin’s mind—which she compares to the search engine Google Images—the film flashes pictures of shoes, increasing the pace as her excitement mounts. As fast as she can, Grandin names all the shoes she’s seeing, but her speech can’t keep up with her visual memory. “So you can picture every pair of shoes you’ve ever seen?” interrupts her science teacher. “Sure, can’t you?” she asks.

As someone who has moved from science to literature, I have experienced this moment repeatedly. Again and again, I’ve seen people astonished to learn what other people’s minds can and can’t do, such as mentally rotate the letter “N” 90 degrees and observe its new properties. I’ve felt the strength—and deadliness—of each person’s premise that other people have the same mental life and think just as she or he does. This assumption not only thwarts communication; it can lead unconventional thinkers to believe that they can’t think at all.

This book is for anyone who’s ever been told, “You’re not thinking!” All too often, thought that occurs in an unfamiliar form is mistaken for the absence of thought. As explanations emerge for the ways that thought works, we risk losing valuable knowledge if we impose pre-fabricated narratives on minds rather than letting them tell their own stories.

In a recent discussion at Emory University, a psychologist was telling some literature professors how human brains process language.

“When you hear speech,” he said, “there’s activity in your left cortex. You–”

“Wait a minute,” said Rosemarie Garland-Thomson. “Who’s the ‘you’?”

For Garland-Thomson, who has helped create the field of Disability Studies, every “you” is unique. She questions attempts to establish a normal “you,” since they can cause variant “yous” to be seen as inadequate. Temple Grandin’s skill with visual mental images has made her a creative designer and engineer. Variant ways of thinking that create disadvantages in some contexts can confer advantages in others.[1]

Until recently, many neuroscientists and psychologists have sacrificed intriguing studies of individual differences to build basic knowledge of human brains. This choice to focus on shared human traits has been a conscious, informed decision made in order to lay a foundation for emerging fields. Interest in personal variations has always been high, but until recently, laboratory scientists have had to concentrate on common features to produce data they can trust. For the most part, studying individual quirks has been a luxury they could not yet afford. Scholars in the humanities do experimenters an injustice when they criticize the “naïveté” of scientists seeking the “neural underpinnings” of complex phenomena such as telling jokes. No neuroscientist expects to learn everything about humor by studying functional magnetic resonance (fMRI) images; most are keenly aware of their methods’ strengths and limitations. Having worked in labs for nearly ten years, I appreciate the innovation, dedication, and bravery of experimental scientists. Designing controlled experiments to explore complex functions such as ways of thinking—and writing grants to get them funded—means having the courage to begin.

So far, building basic knowledge about the neural mechanisms underlying human thought has meant concentrating on neural features most humans share. In twenty-first-century science, however, individual variations are attracting increasing attention. Sharon L. Thompson-Schill, Todd S. Braver, and John Jonides have argued that cognitive neuroscientists will learn a great deal if they regard individual differences as data rather than noise (Thompson-Schill, Braver, and Jonides 2005, 115-16).[2] In an fMRI study of how practice affects performance on mental imagery tasks, Kosslyn and his colleagues found that “individual-differences analyses may be helpful in revealing brain areas that are overlooked in standard group analyses” (Ganis, Thompson and Kosslyn 2005, 245). The notion that science pursues universal truths, whereas literature illuminates particular situations, is crumbling fast. Like the opposition of science to literature, that of the universal to the particular may be blocking the understanding of human thought.[3] To learn how thinking works, scholars in every field that analyzes cognition need to combine their methods and insights. Together, we need to rethink thought. We need to develop the emergent science of “the” human brain into a science of human brains, since a body of knowledge restricted to what seven billion mental worlds share will create a severely limited, unrealistic picture of what human thinking involves.

This book contributes to this task by exploring differences in people’s thought experiences. Like Vera John-Steiner’s study of creativity, Notebooks of the Mind (1985), it aims to complement laboratory research by comparing and analyzing introspections.[4] As a narrative study, it offers a diastolic response to the driving systole of laboratory work. While only controlled, intelligently planned experiments produce generalizable data, studies examining personal introspections can provide insights that affirm, challenge, or trouble experimental results. Most significantly, narrative analyses of individual thinking can suggest new experiments to try.[5] By focusing on individual experiences, I have sacrificed any attempt to make universal claims in order to provide a glimpse of lived reality, at least as some people experience it. In the terms of psychologist Jerome Bruner, I am analyzing material offered in the narrative mode of thought (which aims to tell good stories) to shore up the paradigmatic mode (which seeks to explain), but I do not see these modes as opposed (Bruner 1986, 11-13). On a very small scale, I have tried to learn what thinking is by studying differences in the way thinking feels.

[1] The research underlying this book may contribute to neurodiversity studies, although the neurodiversity movement has often emphasized the experiences and perspectives of people diagnosed with disorders such as autism. To the best of my knowledge, all but one of my participants are neurologically “normal,” but analyzing the astonishing range of the so-called normal also reveals the diversity of human minds. For a review of the neurodiversity movement, see Kras 2010. I thank Adam Newman for pointing out the affinity of this project to recent studies of neurodiversity.

[2] The entire volume of Cognitive, Affective, and Behavioral Neuroscience in which Thompson-Schill’s, Braver’s, and Jonides’s editorial appears is dedicated to research illustrating what can be learned from fMRI studies that analyze individual differences. I thank Corey Inman for bringing this volume to my attention.

[3] Patrick Colm Hogan argues that, “universalism vs. particularism is a false dichotomy” (Hogan 2003, 16).

[4] In her study of creative thinking, John-Steiner wrote that she aimed “to complement and extend the analyses of thinking obtained from laboratory studies with a broad, theoretical, and interdisciplinary approach.” For the most part, however, John-Steiner did not bring her participants’ insights into dialogue with the outcomes of laboratory experiments (John-Steiner 1997, 3).

[5] I am grateful to psychologist Jessica Alexander for introducing me to this idea.