Daniel Dennett


From Wikipedia, the free encyclopedia

Daniel Clement Dennett
Full name: Daniel Clement Dennett
Born: March 28, 1942 (age 67)
Era: 20th-/21st-century philosophy
Region: Western philosophy
School: Analytic philosophy
Main interests: Philosophy of mind, philosophy of biology, philosophy of science
Notable ideas: Heterophenomenology, intentional stance, intuition pump, Multiple Drafts Model, greedy reductionism

Daniel Clement Dennett (born March 28, 1942 in Boston, Massachusetts) is an American philosopher whose research centers on the philosophy of mind, philosophy of science, and philosophy of biology, particularly as those fields relate to evolutionary biology and cognitive science. He is currently the co-director of the Center for Cognitive Studies, the Austin B. Fletcher Professor of Philosophy, and a University Professor at Tufts University. Dennett is a noted atheist and secularist, and a prominent advocate of the Brights movement.

Early life and education

Dennett spent part of his childhood in Lebanon, where, during World War II, his father was a covert counter-intelligence agent with the Office of Strategic Services posing as a cultural attaché to the American Embassy in Beirut.[1] The young Dennett and family returned to Massachusetts in 1947 after his father died in an unexplained plane crash.[2] His sister is the investigative journalist Charlotte Dennett.[1]

He attended Phillips Exeter Academy and spent one year at Wesleyan University before receiving his B.A. in philosophy from Harvard University in 1963, where he was a student of W. V. Quine. In 1965, he received his D.Phil. in philosophy from Hertford College, Oxford, where he studied under the ordinary language philosopher Gilbert Ryle.

Career in academia

Daniel Dennett in 2008

Dennett is currently (April 2009) the Austin B. Fletcher Professor of Philosophy, University Professor, and Co-Director of the Center for Cognitive Studies (with Ray Jackendoff) at Tufts University.[3]

Dennett describes himself as "an autodidact — or, more properly, the beneficiary of hundreds of hours of informal tutorials on all the fields that interest me, from some of the world's leading scientists."[4]

Dennett gave the John Locke lectures at the University of Oxford in 1983, the Gavin David Young Lectures at Adelaide, Australia, in 1985, and the Tanner Lecture at Michigan in 1986, among many others. In 2001 he was awarded the Jean Nicod Prize and gave the Jean Nicod Lectures in Paris. He has received two Guggenheim Fellowships, a Fulbright Fellowship, and a Fellowship at the Center for Advanced Studies in Behavioral Science. He was elected to the American Academy of Arts and Sciences in 1987. He was the co-founder (1985) and co-director of the Curricular Software Studio at Tufts University, and has helped to design museum exhibits on computers for the Smithsonian Institution, the Museum of Science in Boston, and the Computer Museum in Boston. He is a Humanist Laureate of the International Academy of Humanism and a Fellow of the Committee for Skeptical Inquiry. The American Humanist Association named him the 2004 Humanist of the Year.

Free will

Although Dennett is a confirmed compatibilist about free will, in "On Giving Libertarians What They Say They Want" (chapter 15 of his 1978 book Brainstorms),[5] he articulated the case for a two-stage model of decision making, in contrast to libertarian views.

The model of decision making I am proposing has the following feature: when we are faced with an important decision, a consideration-generator whose output is to some degree undetermined produces a series of considerations, some of which may of course be immediately rejected as irrelevant by the agent (consciously or unconsciously). Those considerations that are selected by the agent as having a more than negligible bearing on the decision then figure in a reasoning process, and if the agent is in the main reasonable, those considerations ultimately serve as predictors and explicators of the agent's final decision.[6]

While other philosophers, including William James, Henri Poincaré, Arthur Holly Compton, and Henry Margenau, have developed two-stage models, Dennett defends his model for the following reasons:

  1. First...The intelligent selection, rejection, and weighing of the considerations that do occur to the subject is a matter of intelligence making the difference.
  2. Second, I think it installs indeterminism in the right place for the libertarian, if there is a right place at all.
  3. Third...from the point of view of biological engineering, it is just more efficient and in the end more rational that decision making should occur in this way.
  4. A fourth observation in favor of the model is that it permits moral education to make a difference, without making all of the difference.
  5. Fifth — and I think this is perhaps the most important thing to be said in favor of this model — it provides some account of our important intuition that we are the authors of our moral decisions.
  6. Finally, the model I propose points to the multiplicity of decisions that encircle our moral decisions and suggests that in many cases our ultimate decision as to which way to act is less important phenomenologically as a contributor to our sense of free will than the prior decisions affecting our deliberation process itself: the decision, for instance, not to consider any further, to terminate deliberation; or the decision to ignore certain lines of inquiry.

These prior and subsidiary decisions contribute, I think, to our sense of ourselves as responsible free agents, roughly in the following way: I am faced with an important decision to make, and after a certain amount of deliberation, I say to myself: "That's enough. I've considered this matter enough and now I'm going to act," in the full knowledge that I could have considered further, in the full knowledge that the eventualities may prove that I decided in error, but with the acceptance of responsibility in any case.[7]
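Dennett's two-stage model has the shape of a simple algorithm: an indeterministic consideration-generator followed by a deterministic filtering and weighing of the surviving considerations. A minimal illustrative sketch (the function names, generator, and evaluator below are hypothetical examples, not Dennett's own formulation):

```python
import random

def two_stage_decision(generate, evaluate, n_considerations=5, seed=None):
    """Toy sketch of a two-stage decision: an (indeterministic)
    consideration-generator, then deterministic selection and weighing."""
    rng = random.Random(seed)
    # Stage 1: the consideration-generator's output is to some degree undetermined.
    considerations = [generate(rng) for _ in range(n_considerations)]
    # The agent rejects considerations with negligible bearing on the decision...
    relevant = [c for c in considerations if evaluate(c) > 0]
    # ...and the rest figure in a reasoning process that fixes the final choice.
    return max(relevant, key=evaluate) if relevant else None
```

Note that chance enters only at the generation stage; given the same surviving considerations, the second stage is determined by the agent's evaluations, which is where Dennett locates the agent's control.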

Leading libertarian philosophers such as Robert Kane have rejected Dennett's model, specifically its placing of random chance directly in the decision, which they argue cuts out the agent's motives and reasons, character and values, and feelings and desires. They claim that if chance is the primary cause of decisions, then agents cannot be held liable for the resulting actions.

Dennett and Kane agree that neither model yet locates randomness in the brain in a place where it would only help, and not hurt, a model of free will that provides agent control. Kane says:

[As Dennett admits,] a causal indeterminist view of this deliberative kind does not give us everything libertarians have wanted from free will. For [the agent] does not have complete control over what chance images and other thoughts enter his mind or influence his deliberation. They simply come as they please. [The agent] does have some control after the chance considerations have occurred.

But then there is no more chance involved. What happens from then on, how he reacts, is determined by desires and beliefs he already has. So it appears that he does not have control in the libertarian sense of what happens after the chance considerations occur as well. Libertarians require more than this for full responsibility and free will.[8]

Other philosophical views

Dennett has remarked in several places (such as "Self-portrait", in Brainchildren) that his overall philosophical project has remained largely the same since his time at Oxford. He is primarily concerned with providing a philosophy of mind that is grounded in empirical research. In his original dissertation, Content and Consciousness, he broke up the problem of explaining the mind into the need for a theory of content and for a theory of consciousness. His approach to this project has also stayed true to this distinction. Just as Content and Consciousness has a bipartite structure, he similarly divided Brainstorms into two sections. He would later collect several essays on content in The Intentional Stance and synthesize his views on consciousness into a unified theory in Consciousness Explained. These volumes respectively form the most extensive development of his views.[9]

In Consciousness Explained, Dennett's interest in the ability of evolution to explain some of the content-producing features of consciousness is already apparent, and this has since become an integral part of his program. He defends a theory known by some as Neural Darwinism. He also presents an argument against qualia; he argues that the concept is so confused that it cannot be put to any use or understood in any non-contradictory way, and therefore does not constitute a valid refutation of physicalism. Much of Dennett's work since the 1990s has been concerned with fleshing out his previous ideas by addressing the same topics from an evolutionary standpoint, from what distinguishes human minds from animal minds (Kinds of Minds), to how free will is compatible with a naturalist view of the world (Freedom Evolves). In his 2006 book, Breaking the Spell: Religion as a Natural Phenomenon, Dennett attempts to subject religious belief to the same treatment, explaining possible evolutionary reasons for the phenomenon of religious adherence.

Dennett self-identifies with a few terms:

[Others] note that my 'avoidance of the standard philosophical terminology for discussing such matters' often creates problems for me; philosophers have a hard time figuring out what I am saying and what I am denying. My refusal to play ball with my colleagues is deliberate, of course, since I view the standard philosophical terminology as worse than useless — a major obstacle to progress since it consists of so many errors.

Daniel Dennett, The Message is: There is no Medium

Yet, in Consciousness Explained, he admits, "I am a sort of 'teleofunctionalist', of course, perhaps the original teleofunctionalist". He goes on to say, "I am ready to come out of the closet as some sort of verificationist". In Breaking the Spell: Religion as a Natural Phenomenon he admits to being "a bright", and defends the term.

Role in evolutionary debate

Dennett sees evolution by natural selection as an algorithmic process (though he spells out that algorithms as simple as long division often incorporate a significant degree of randomness).[10] This idea is in conflict with the evolutionary philosophy of paleontologist Stephen Jay Gould, who preferred to stress the "pluralism" of evolution (i.e. its dependence on many crucial factors, of which natural selection is only one).
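Calling selection "algorithmic" means it is a substrate-neutral, mindless procedure of repeated variation and selective retention. A minimal illustrative sketch (the hill-climbing setup, fitness function, and target below are hypothetical examples, not Dennett's):

```python
import random

def evolve(population, fitness, mutate, generations=500, seed=None):
    """Natural selection as an algorithm: blind variation (random mutation)
    followed by non-random retention of the fitter variant, repeated."""
    rng = random.Random(seed)
    for _ in range(generations):
        offspring = [mutate(ind, rng) for ind in population]  # variation
        population = [max(p, o, key=fitness)                  # selection
                      for p, o in zip(population, offspring)]
    return population

# Toy example: numbers "evolve" toward a target value under a simple fitness.
target = 42
pop = evolve([0.0] * 4,
             fitness=lambda x: -abs(x - target),
             mutate=lambda x, rng: x + rng.gauss(0, 1),
             seed=1)
```

The randomness here lives entirely in the mutation step, illustrating Dennett's point that an algorithm can incorporate a significant degree of randomness while the selection step remains wholly non-random.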

Dennett's views on evolution are identified as strongly adaptationist, in line with his theory of the intentional stance and the evolutionary views of biologist Richard Dawkins. In Darwin's Dangerous Idea, Dennett showed himself even more willing than Dawkins to defend adaptationism in print, devoting an entire chapter to a criticism of the ideas of Gould. This stems from Gould's long-running public debate with E. O. Wilson and other evolutionary biologists over human sociobiology and its descendant, evolutionary psychology, which Gould and Richard Lewontin opposed but which Dennett advocated, together with Dawkins and Steven Pinker.[11] Gould and his supporters have in turn strongly disputed Dennett's account, alleging that he overstated his claims and misrepresented Gould's in order to reinforce what Gould describes as Dennett's "Darwinian fundamentalism".[12]

Dennett's theories have had a significant influence on the work of evolutionary psychologist Geoffrey Miller. He has also written about and advocated the notion of memetics as a philosophically useful tool, most recently in "Brains, Computers, and Minds", a three-part presentation in Harvard's 2009 MBB Distinguished Lecture Series.

Personal life

Dennett in Tahiti in 1984

Dennett lives with his wife in North Andover, Massachusetts, and has a daughter, a son, and two grandsons.[13] He is also an avid sailor.

In October 2006, Dennett was hospitalized due to an aortic dissection. After a nine-hour surgery, he was given a new aorta. In an essay posted on the Edge website, Dennett gives his firsthand account of his health problems, his consequent feelings of gratitude towards the scientists and doctors whose hard work made his recovery possible, and his complete lack of a "deathbed conversion". By his account, upon having been told by friends and relatives that they had prayed for him, he resisted the urge to ask them, "Did you also sacrifice a goat?"[14][15]

Selected books

  • Brainstorms: Philosophical Essays on Mind and Psychology (MIT Press 1981) (ISBN 0-262-54037-1)
  • Elbow Room: The Varieties of Free Will Worth Wanting (MIT Press 1984) — on free will and determinism (ISBN 0-262-04077-8)
  • The Mind's I (Bantam, Reissue edition 1985, with Douglas Hofstadter) (ISBN 0-553-34584-2)
  • Content and Consciousness (Routledge & Kegan Paul Books Ltd; 2nd ed. January 1986) (ISBN 0-7102-0846-4)
  • The Intentional Stance (MIT Press; reprint edition 1989) (ISBN 0-262-54053-3)
  • Consciousness Explained (Back Bay Books 1992) (ISBN 0-316-18066-1)
  • Darwin's Dangerous Idea: Evolution and the Meanings of Life (Simon & Schuster; reprint edition 1996) (ISBN 0-684-82471-X)
  • Kinds of Minds: Towards an Understanding of Consciousness (Basic Books 1997) (ISBN 0-465-07351-4)
  • Brainchildren: Essays on Designing Minds (Representation and Mind) (MIT Press 1998) (ISBN 0-262-04166-9) — A Collection of Essays 1984–1996
  • Freedom Evolves (Viking Press 2003) (ISBN 0-670-03186-0)
  • Sweet Dreams: Philosophical Obstacles to a Science of Consciousness (MIT Press 2005) (ISBN 0-262-04225-8)
  • Breaking the Spell: Religion as a Natural Phenomenon (Penguin Group 2006) (ISBN 0-670-03472-X).
  • Neuroscience and Philosophy: Brain, Mind, and Language (Columbia University Press 2007) (ISBN 978-0-231-14044-7), co-authored with Maxwell Bennett, Peter Hacker, and John Searle

References

  1. ^ a b Feuer, Alan (2007-10-23), "A Dead Spy, a Daughter’s Questions and the C.I.A.", New York Times, http://www.nytimes.com/2007/10/23/nyregion/23spydad.html, retrieved 2008-09-16 
  2. ^ Brown, Andrew (2004-04-17). "The semantic engineer". The Guardian. http://books.guardian.co.uk/departments/politicsphilosophyandsociety/story/0,6000,1193371,00.html. Retrieved 2010-02-01. 
  3. ^ Center for Cognitive Studies
  4. ^ Dennett, Daniel C. (2005-09-13) [2004]. "What I Want to Be When I Grow Up". in John Brockman. Curious Minds: How a Child Becomes a Scientist. New York: Vintage Books. ISBN 1-4000-7686-2. http://www.edge.org/books/curious_index.html. 
  5. ^ Brainstorms: Philosophical Essays on Mind and Psychology, MIT Press (1978), pp. 286–299
  6. ^ Brainstorms, p. 295
  7. ^ Brainstorms, pp. 295–297
  8. ^ Robert Kane, A Contemporary Introduction to Free Will, Oxford (2005), pp. 64–65
  9. ^ Guttenplan, Samuel (1994), A companion to the philosophy of mind, Oxford: Blackwell, p. 642, ISBN 0-631-19996-9
  10. ^ Darwin's Dangerous Idea: Evolution and the Meanings of Life (Simon & Schuster; reprint edition 1996) (ISBN 0-684-82471-X), pp. 52–60
  11. ^ Although Dennett has expressed criticism of human sociobiology, calling it a form of "greedy reductionism," he is generally sympathetic towards the explanations proposed by evolutionary psychology. Gould also is not one sided, and writes: "Sociobiologists have broadened their range of selective stories by invoking concepts of inclusive fitness and kin selection to solve (successfully I think) the vexatious problem of altruism—previously the greatest stumbling block to a Darwinian theory of social behavior. . . . Here sociobiology has had and will continue to have success. And here I wish it well. For it represents an extension of basic Darwinism to a realm where it should apply." Gould, 1980. "Sociobiology and the Theory of Natural Selection" In G. W. Barlow and J. Silverberg, eds., Sociobiology: Beyond Nature/Nurture? Boulder CO: Westview Press, pp. 257-269.
  12. ^ 'Evolution: The pleasures of Pluralism' — Stephen Jay Gould's review of Darwin's Dangerous Idea, June 26, 1997
  13. ^ Daniel C. Dennett's Home Page
  14. ^ Richard Dawkins: 'The Genius of Charles Darwin' (2008.).
  15. ^ 'Thank Goodness!', edge 195, Nov. 3, 2006

Further reading

  • Dennett: Reconciling Science and Our Self-Conception, Matthew Elton (Polity Press, 2003) (ISBN 0-7456-2117-1)
  • Daniel Dennett edited by Andrew Brook and Don Ross (Cambridge University Press 2000) (ISBN 0-521-00864-6)
  • Dennett's Philosophy: A Comprehensive Assessment edited by Don Ross, Andrew Brook and David Thompson (MIT Press 2000) (ISBN 0-262-18200-9)
  • Dennett, among others, is discussed in John Brockman's The Third Culture.
  • On Dennett John Symons (Wadsworth Publishing Company 2000) (ISBN 0-534-57632-X)
  • Philosophical Foundations of Neuroscience, P. Hacker and M.R. Bennett (Blackwell, Oxford, and Malden, Mass., 2003) (ISBN 1-4051-0855-X) has an appendix devoted to a strong critique of Dennett's philosophy of mind

Quotes

From Wikiquote

Daniel Dennett (born March 28, 1942) is a prominent American philosopher. Dennett's research centers on philosophy of mind and philosophy of science, particularly as those fields relate to evolutionary biology and cognitive science. He is also a prominent atheist.

Sourced

  • In fact, of course, science is an unparalleled playground of the imagination, populated by unlikely characters with wonderful names (messenger RNA, black holes, quarks) and capable of performing the most amazing deeds: sub-atomic whirling dervishes that can be in several places - everywhere and nowhere - at the same time; molecular hoop-snakes biting their own tails; self-copying spiral staircases bearing coded instructions; miniature keys searching for the locks in which they fit, on floating odysseys in a trillion synaptic gulfs.
    • "Reflections on 'A Conversation With Einstein's Brain'" in The Mind's I (1981), Douglas R. Hofstadter and Daniel C. Dennett, eds.
  • I think religion for many people is some sort of moral viagra.
    • "Atheism Tapes, part 6", BBC TV documentary by Jonathan Miller, produced by Richard Denton, recorded 2003, broadcast 2004
  • Not a single one of the cells that compose you knows who you are, or cares.
    • Sweet Dreams: Philosophical Obstacles to a Science of Consciousness, MIT Press, 2005, p. 2, ISBN 0262042258

Elbow Room (1984)

Elbow Room: The Varieties of Free Will Worth Wanting. The MIT Press. ISBN 0-262-54042-8

  • The distinction between responsible moral agents and beings with diminished or no responsibility is coherent, real, and important. It is coherent, even if in many instances it is hard to apply; it draws an empirically real line, in that we don't all fall on one side; and, most important, the distinction matters: the use we make of it plays a crucial role in the quality and meaning of our lives. [...] We want to hold ourselves and others responsible, but we recognize that our intuitions often support the judgement that a particular individual has "diminished responsibility" because of his or her infirmities, or because of particularly dire circumstances upon upbringing or at the time of action. We also find it plausible to judge that nonhuman animals, infants, and those who are severely handicapped mentally are not responsible at all. But since we are all more or less imperfect, will there be anyone left to be responsible after we have excused all those with good excuses? [...] We must set up some efficiently determinable threshold for legal competence, never for a moment supposing that there couldn't be intuitively persuasive "counterexamples" to whatever line we draw, but declaring in advance that such pleas will not be entertained. [...] The effect of such an institution [...] is to create [...] a class of legally culpable agents whose subsequent liability to punishment maintains the credibility of the sanctions of the laws. The institution, if it is to maintain itself, must provide for the fine tuning of its arbitrary thresholds as new information (or misinformation) emerges that might undercut its credibility. One can speculate that there is an optimal setting of the competence threshold (for any particular combination of social circumstances, degree of public sophistication, and so on) that maximizes the bracing effect of the law. 
A higher than optimal threshold would encourage a sort of malingering on the part of the defendants, which, if recognized by the populace, would diminish their respect for the law and hence diminish its deterrent effect. And a lower than optimal threshold would yield a diminishing return of deterrence and lead to the punishment of individuals who, in the eyes of society, "really couldn't help it." The public perception of the fairness of the law is a critical factor in its effectiveness.
    • chapter 7, "Why Do We Want Free Will?", pages 157-162

The Intentional Stance (1987)

The MIT Press. ISBN 0-262-54053-3

  • The trouble with the canons of scientific evidence [...] is that they virtually rule out the description of anything but oft-repeated, oft-observed, stereotypic behavior of a species, and this is just the sort of behavior that reveals no particular intelligence at all - all this behavior can be more or less plausibly explained as the effects of some humdrum combination of "instinct" or tropism and conditioned response. It is the novel bits of behavior, the acts that couldn't plausibly be accounted for in terms of prior conditioning or training or habit, that speak eloquently of intelligence; but if their very novelty and unrepeatability make them anecdotal and hence inadmissible evidence, how can one proceed to develop the cognitive case for the intelligence of one's target species?
  • Philosophers are never quite sure what they are talking about - about what the issues really are - and so often it takes them rather a long time to recognize that someone with a somewhat different approach (or destination, or starting point) is making a contribution.

Consciousness Explained (1991)

Little, Brown. ISBN 0-316-18065-3

  • The juvenile sea squirt wanders through the sea searching for a suitable rock or hunk of coral to cling to and make its home for life. For this task, it has a rudimentary nervous system. When it finds its spot and takes root, it doesn't need its brain anymore, so it eats it! (It's rather like getting tenure.)*

    * The analogy between the sea squirt and the associate professor was first pointed out, I think, by the neuroscientist Rodolfo Llinás.

  • I find it breathtaking [...] that when musical composition competitions are held, the contestants often do not submit tapes or records (or live performances) of their works; they submit written scores, and the judges confidently make their aesthetic judgements on the basis of just reading the scores and hearing the music in their minds. How good are the best musical imaginations? Can a trained musician, swiftly reading a score, tell just how that voicing of dissonant oboes and flutes over the massed strings will sound?
  • A neurosurgeon once told me about operating on the brain of a young man with epilepsy. As is customary in this kind of operation, the patient was wide awake, under only local anesthesia, while the surgeon delicately explored his exposed cortex, making sure that the parts tentatively to be removed were not absolutely vital by stimulating them electrically and asking the patient what he experienced. Some stimulations provoked visual flashes or hand-raisings, others a sort of buzzing sensation, but one spot produced a delighted response from the patient: "It's 'Outta Get Me' by Guns N'Roses, my favorite heavy metal [sic] band!"

    I asked the neurosurgeon if he had asked the patient to sing or hum along with the music, since it would be fascinating to learn how "high fidelity" the provoked memory was. Would it be in exactly the same key and tempo as the record? Such a song (unlike "Silent Night") has one canonical version, so we could simply have superimposed a recording of the patient's humming with the standard record and compare the results. Unfortunately, even though a tape recorder had been running during the operation, the surgeon hadn't asked the patient to sing along. "Why not?" I asked, and he replied: "I hate rock music!"

    Later in the conversation the neurosurgeon happened to remark that he was going to have to operate again on the same young man, and I expressed the hope that he would just check to see if he could restimulate the rock music, and this time ask the fellow to sing along. "I can't do that," replied the neurosurgeon, "since I cut out that part." "It was part of the epileptic focus?" I asked, and he replied, "No, I already told you - I hate rock music."

  • Philosophers' Syndrome: mistaking a failure of the imagination for an insight into necessity.
  • Up till now [the development of proto-consciousness], we can suppose, nervous systems solved the "Now what do I do?" problem by a relatively simple balancing act between a strictly limited repertoire of actions - if not the famous four F's (fight, flee, feed, or mate), then a modest elaboration of them.
  • We're all zombies.*

    *It would be an act of desperate intellectual dishonesty to quote this assertion out of context!

  • Minds are in limited supply, and each mind has a limited capacity for memes, and hence there is considerable competition among memes for entry in as many minds as possible. This competition is the major selective force in the memosphere, and, just as in the biosphere, the challenge has been met with great ingenuity. For instance, whatever virtues (from our perspective) the following memes have, they have in common the property of having phenotypic expressions that tend to make their own replication more likely by disabling or preempting the environmental forces that would tend to extinguish them: the meme for faith, which discourages the exercise of the sort of critical judgment that might decide that the idea of faith was, all things considered a dangerous idea; the meme for tolerance or free speech; the meme of including in a chain letter a warning about the terrible fates of those who have broken the chain in the past; the conspiracy theory meme, which has a built-in response to the objection that there is no good evidence of a conspiracy: "Of course not - that's how powerful the conspiracy is!" Some of these memes are "good" perhaps and others "bad"; what they have in common is a phenotypic effect that systematically tends to disable the selective forces arrayed against them. Other things being equal, population memetics predicts that conspiracy theory memes will persist quite independently of their truth, and the meme for faith is apt to secure its own survival, and that of the religious memes that ride piggyback on it, in even the most rationalistic environments. Indeed, the meme for faith exhibits frequency-dependent fitness: it flourishes best when it is outnumbered by rationalistic memes; in an environment with few skeptics, the meme for faith tends to fade from disuse.
  • As Akins observes, it is not the point of our sensory systems that they should detect "basic" or "natural" properties of the environment, but that they should serve our "narcissistic" purposes in staying alive; nature doesn't build epistemic engines.
    • citing Kathleen Akins. "On Piranhas, Narcissism and Mental Representations: An Essay on Intentionality and Naturalism" (1989)
  • In a Thumbnail Sketch here is [the Multiple Drafts theory of consciousness] so far:

    There is no single, definitive "stream of consciousness," because there is no central Headquarters, no Cartesian Theatre where "it all comes together" for the perusal of a Central Meaner. Instead of such a single stream (however wide), there are multiple channels in which specialist circuits try, in parallel pandemoniums, to do their various things, creating Multiple Drafts as they go. Most of these fragmentary drafts of "narrative" play short-lived roles in the modulation of current activity but some get promoted to further functional roles, in swift succession, by the activity of a virtual machine in the brain. The seriality of this machine (its "von Neumannesque" character) is not a "hard-wired" design feature, but rather the upshot of a succession of coalitions of these specialists.

    The basic specialists are part of our animal heritage. They were not developed to perform peculiarly human actions, such as reading and writing, but ducking, predator-avoiding, face-recognizing, grasping, throwing, berry-picking, and other essential tasks. They are often opportunistically enlisted in new roles, for which their talents may more or less suit them. The result is not bedlam only because the trends that are imposed on all this activity are themselves part of the design. Some of this design is innate, and is shared with other animals. But it is augmented, and sometimes even overwhelmed in importance, by microhabits of thought that are developed in the individual, partly idiosyncratic results of self-exploration and partly the predesigned gifts of culture. Thousands of memes, mostly borne by language, but also by wordless "images" and other data structures, take up residence in an individual brain, shaping its tendencies and thereby turning it into a mind.

    • p. 253–4.
  • I have grown accustomed to the disrespect expressed by some of the participants for their colleagues in the other disciplines. "Why, Dan," ask the people in artificial intelligence, "do you waste your time conferring with those neuroscientists? They wave their hands about 'information processing' and worry about where it happens, and which neurotransmitters are involved, but they haven't a clue about the computational requirements of higher cognitive functions." "Why," ask the neuroscientists, "do you waste your time on the fantasies of artificial intelligence? They just invent whatever machinery they want, and say unpardonably ignorant things about the brain." The cognitive psychologists, meanwhile, are accused of concocting models with neither biological plausibility nor proven computational powers; the anthropologists wouldn't know a model if they saw one, and the philosophers, as we all know, just take in each other's laundry, warning about confusions they themselves have created, in an arena bereft of both data and empirically testable theories. With so many idiots working on the problem, no wonder consciousness is still a mystery. All these charges are true, and more besides, but I have yet to encounter any idiots. Mostly the theorists I have drawn from strike me as very smart people – even brilliant people, with the arrogance and impatience that often comes with brilliance – but with limited perspectives and agendas, trying to make progress on the hard problems by taking whatever shortcuts they can see, while deploring other people's shortcuts. No one can keep all the problems and details clear, including me, and everyone has to mumble, guess and handwave about large parts of the problem.

"Time and the observer" (1995)

Dennett, D. C. & Kinsbourne, M. (1995). "Time and the observer: The where and when of consciousness in the brain". Behavioral and Brain Sciences 15 (2).

  • Wherever there is a conscious mind, there is a point of view. A conscious mind is an observer, who takes in the information that is available at a particular (roughly) continuous sequence of times and places in the universe. A mind is thus a locus of subjectivity, a thing it is like something to be (Farrell, 1950, Nagel, 1974). What it is like to be that thing is partly determined by what is available to be observed or experienced along the trajectory through space-time of that moving point of view, which for most practical purposes is just that: a point. For instance, the startling dissociation of the sound and appearance of distant fireworks is explained by the different transmission speeds of sound and light, arriving at the observer (at that point) at different times, even though they left the source simultaneously.
    • pp. 183–247
  • But if we ask where precisely in the brain that point of view is located, the simple assumptions that work so well on larger scales of space and time break down. It is now quite clear that there is no single point in the brain where all information funnels in, and this fact has some far from obvious consequences.

Darwin's Dangerous Idea (1995)

Darwin's Dangerous Idea: Evolution and the Meanings of Life. Simon & Schuster. ISBN 0-684-80290-2

  • If I were to give an award for the single best idea anyone has ever had, I'd give it to Darwin, ahead of Newton and Einstein. And everyone else. In a single stroke, the idea of evolution by natural selection unifies the realm of life, meaning, and purpose with the realm of space and time, cause and effect, mechanism and physical law. But it is not just a wonderful scientific idea. It is a dangerous idea.
    • chapter 1, "Is Nothing Sacred?", p. 21
    • another expression of the same idea, almost certainly by Dennett but unsourced, is:

      If I were to give a prize for the single best idea anybody ever had, I'd give it to Darwin for the idea of natural selection – ahead of Newton, ahead of Einstein. Because his idea unites the two most disparate features of our universe: The world of purposeless, meaningless matter-in-motion, on the one side, and the world of meaning, and purpose, and design on the other. He understood that what he was proposing was a truly revolutionary idea.

  • My admiration for Darwin's magnificent idea is unbounded, but I, too, cherish many of the ideas and ideals that it seems to challenge, and want to protect them. ... The only good way to do this - the only way that has a chance in the long run - is to cut through the smokescreens and look at the idea as unflinchingly, and dispassionately, as possible.
    • chapter 1, "Is Nothing Sacred?", pp. 21–22
  • The fundamental core of contemporary Darwinism, the theory of DNA-based reproduction and evolution, is now beyond dispute among scientists. It demonstrates its power every day, contributing crucially to the explanation of planet-sized facts of geology and meteorology, through middle-sized facts of ecology and agronomy, down to the latest microscopic facts of genetic engineering. It unifies all of biology and the history of our planet into a single grand story. Like Gulliver tied down in Lilliput, it is unbudgable, not because of some one or two huge chains of argument that might - hope against hope - have weak links in them, but because it is securely tied by thousands of threads of evidence anchoring it to virtually every other area of human knowledge. New discoveries may conceivably lead to dramatic, even "revolutionary" shifts in the Darwinian theory, but the hope that it will be "refuted" by some shattering breakthrough is about as reasonable as the hope that we will return to a geocentric vision and discard Copernicus.
  • The evidence of evolution pours in, not only from geology, paleontology, biogeography, and anatomy (Darwin's chief sources), but from molecular biology and every other branch of the life sciences. To put it bluntly but fairly, anyone today who doubts that the variety of life on this planet was produced by a process of evolution is simply ignorant - inexcusably ignorant, in a world where three out of four people have learned to read and write. Doubts about the power of Darwin's idea of natural selection to explain this evolutionary process are still intellectually respectable, however, although the burden of proof for such skepticism has become immense...
  • Much of the controversy and anxiety that has enveloped Darwin's idea ... can be understood as a series of failed campaigns to contain Darwin's idea within some acceptably "safe" and merely partial revolution. Cede some or all of modern biology to Darwin, perhaps, but hold the line there! Keep Darwinian thinking out of cosmology, out of psychology, out of human culture, out of ethics, politics, and religion! In these campaigns, many battles have been won by the forces of containment: flawed applications of Darwin's idea have been exposed and discredited, beaten back by the champions of the pre-Darwinian tradition. But new waves of Darwinian thinking keep coming.
  • [A] skyhook is ... an exception to the principle that all design, and apparent design, is ultimately the result of mindless, motiveless mechanicity. A crane, in contrast, is a subprocess or special feature of a design process that can be demonstrated to permit the local speeding up of the basic, slow process of natural selection, and that can be demonstrated to be itself the predictable (or retroactively explicable) product of the basic process. ... [T]he physicist Steven Weinberg, in Dreams of a Final Theory (1992) ... distinguishes between uncompromising reductionism (a bad thing) and compromising reductionism (which he ringingly endorses). Here is my own version. We must distinguish reductionism, which is in general a good thing, from greedy reductionism, which is not. The difference, in the context of Darwin's theory, is simple: greedy reductionists think that everything can be explained without cranes; good reductionists think that everything can be explained without skyhooks.
  • [I]f you want to reason about faith, and offer a reasoned (and reason-responsive) defense of faith as an extra category of belief worthy of special consideration, I'm eager to [participate]. I certainly grant the existence of the phenomenon of faith; what I want to see is a reasoned ground for taking faith as a way of getting to the truth, and not, say, just as a way people comfort themselves and each other (a worthy function that I do take seriously). But you must not expect me to go along with your defense of faith as a path to truth if at any point you appeal to the very dispensation you are supposedly trying to justify. Before you appeal to faith when reason has you backed into a corner, think about whether you really want to abandon reason when reason is on your side. You are sightseeing with a loved one in a foreign land, and your loved one is brutally murdered in front of your eyes. At the trial it turns out that in this land friends of the accused may be called as witnesses for the defense, testifying about their faith in his innocence. You watch the parade of his moist-eyed friends, obviously sincere, proudly proclaiming their undying faith in the innocence of the man you saw commit the terrible deed. The judge listens intently and respectfully, obviously more moved by this outpouring than by all the evidence presented by the prosecution. Is this not a nightmare? Would you be willing to live in such a land? Or would you be willing to be operated on by a surgeon who tells you that whenever a little voice in him tells him to disregard his medical training, he listens to the little voice? I know it passes in polite company to let people have it both ways, and under most circumstances I wholeheartedly cooperate with this benign agreement.
    But we're seriously trying to get at the truth here, and if you think that this common but unspoken understanding about faith is anything better than socially useful obfuscation to avoid mutual embarrassment and loss of face, you have either seen much more deeply into the issue than any philosopher ever has (for none has ever come up with a good defense of this) or you are kidding yourself.
  • A prosthetically enhanced imagination is still liable to failure, especially if it is not used with sufficient rigor.
  • There is a familiar trio of reactions by scientists to a purportedly radical hypothesis: (a) "You must be out of your mind!", (b) "What else is new? Everybody knows that!", and, later - if the hypothesis is still standing - (c) "Hmm. You *might* be on to something!" Sometimes these phases take years to unfold, one after another, but I have seen all three emerge in near synchrony in the course of a half-hour's heated discussion following a conference paper.
  • People ache to believe that we human beings are vastly different from all other species - and they are right! We are different. We are the only species that has an extra medium of design preservation and design communication: culture. ... We have language, the primary medium of culture... In a few short millennia - a mere instant in biological time - we have already used our new exploration vehicles to transform not only our planet but the very process of design development that created us.
  • Philosophers might care to ask themselves ... how often they are accomplices in increasing the audience for a second-rate article simply because their introductory course needs a simple-minded version of a bad idea that even freshmen can refute. Some of the most frequently reprinted articles in twentieth-century philosophy are famous precisely because nobody believes them; everybody can see what's wrong with them. ... The confirmation of this claim is left as an exercise for the reader. Among the memes that structure the infosphere and hence affect the transmission of other memes are the laws of libel.
  • When comparing the time scales of genetic and cultural evolution, it is useful to bear in mind that we today - every one of us - can easily understand many ideas that were simply unthinkable by the geniuses in our grandparents' generation!
  • Experience teaches...that there is no such thing as a thought experiment so clearly presented that no philosopher can misinterpret it...
  • From what can "ought" be derived? The most compelling answer is this: ethics must be somehow based on an appreciation of human nature - on a sense of what a human being is or might be, and on what a human being might want to have or want to be. If that is naturalism, then naturalism is no fallacy. No one could seriously deny that ethics is responsive to such facts about human nature. We may just disagree about where to look for the most compelling facts about human nature - in novels, in religious texts, in psychological experiments, in biological or anthropological investigations. The fallacy is not naturalism but, rather, any simple-minded attempt to rush from facts to values. In other words, the fallacy is greedy reductionism of values to facts, rather than reductionism considered more circumspectly, as the attempt to unify our world-view so that our ethical principles don't clash irrationally with the way the world is.
  • [I]n all mammalian species that have so far been carefully studied, the rate at which their members engage in the killing of conspecifics is several thousand times greater than the highest homicide rate in any American city.
    • citing the research of George Williams from "Huxley's Evolution and Ethics in Sociobiological Perspective" in Zygon (v.23/88)
  • If you have ever asked yourself if there are facts about yourself (about your health, your competence, your prospects) you would rather not know, and decided that there were, you should be prepared to consider seriously the suggestion that the best - perhaps the only - way to ensure that such facts are not imposed on people is by prohibiting investigations likely to discover them.
  • It is not "scientism" to concede the objectivity and precision of good science, any more than it is history worship to concede that Napoleon did once rule in France and the Holocaust actually happened. Those who fear the facts will forever try to discredit the fact-finders.
  • [T]here are no forces on this planet more dangerous to us all than the fanaticisms of fundamentalism, of all species: Protestantism, Catholicism, Judaism, Islam, Hinduism, and Buddhism, as well as countless smaller infections.
  • A faith, like a species, must evolve or go extinct when the environment changes. It is not a gentle process in either case. ... It's nice to have grizzly bears and wolves living in the wild. They are no longer a menace; we can peacefully co-exist, with a little wisdom. The same policy can be discerned in our political tolerance, in religious freedom. You are free to preserve or create any religious creed you wish, so long as it does not become a public menace. We're all on the Earth together, and we have to learn some accommodation. ... The message is clear: those who will not accommodate, who will not temper, who insist on keeping only the purest and wildest strain of their heritage alive, we will be obliged, reluctantly, to cage or disarm, and we will do our best to disable the memes they fight for. Slavery is beyond the pale. Child abuse is beyond the pale. Discrimination is beyond the pale. The pronouncing of death sentences on those who blaspheme against a religion (complete with bounties or rewards for those who carry them out) is beyond the pale. It is not civilized, and it is owed no more respect in the name of religious freedom than any other incitement to cold-blooded murder. ... That is - or, rather, ought to be - the message of multiculturalism, not the patronizing and subtly racist hypertolerance that "respects" vicious and ignorant doctrines when they are propounded by officials of non-European states and religions.
  • [T]he only meaning of life worth caring about is one that can withstand our best efforts to examine it.

Kinds of Minds (1996)

  • The task of the mind is to produce future, as the poet Paul Valéry once put it. A mind is fundamentally an anticipator, an expectation-generator. It mines the present for clues, which it refines with the help of the materials it has saved from the past, turning them into anticipations of the future. And then it acts, rationally, on the basis of those hard-won anticipations.
  • Evolution embodies information in every part of every organism. ... This information doesn't have to be copied into the brain at all. It doesn't have to be "represented" in "data structures" in the nervous system. It can be exploited by the nervous system, however, which is designed to rely on, or exploit, the information in the hormonal systems just as it is designed to rely on, or exploit, the information embodied in your limbs and eyes. So there is wisdom, particularly about preferences, embodied in the rest of the body. By using the old bodily systems as a sort of sounding board, or reactive audience, or critic, the central nervous system can be guided - sometimes nudged, sometimes slammed - into wise policies. Put it to the vote of the body, in effect. ...

    When all goes well, harmony reigns and the various sources of wisdom in the body cooperate for the benefit of the whole, but we are all too familiar with the conflicts that can provoke the curious outburst "My body has a mind of its own!" Sometimes, apparently, it is tempting to lump together some of the embodied information into a separate mind. Why? Because it is organized in such a way that it can sometimes make independent discriminations, consult preferences, make decisions, enact policies that are in competition with your mind. At such times, the Cartesian perspective of a puppeteer self trying desperately to control an unruly body-puppet is very powerful. Your body can vigorously betray the secrets you are desperately trying to keep - by blushing and trembling or sweating, to mention only the most obvious cases. It can "decide" that in spite of your well-laid plans, right now would be a good time for sex, not intellectual discussion, and then take embarrassing steps in preparation for a coup d'etat. On another occasion, to your even greater chagrin and frustration, it can turn a deaf ear on your own efforts to enlist it for a sexual campaign, forcing you to raise the volume, twirl the dials, try all manner of preposterous cajolings to persuade it.

  • Animals are not just herbivores or carnivores. They are, in the nice coinage of the psychologist George Miller, informavores.
  • [I]t makes little difference where we draw the line between the pruning and shaping [of behavior] by natural selection which is genetically transmitted to offspring (the wiring you are born with), and the pruning and shaping that later takes place in the individual (the rewiring you end up with, as a result of experience or training). Nature and nurture blend seamlessly together.
  • [T]he Capgras delusion [is] a bizarre affliction that occasionally strikes human beings who have suffered brain damage. The defining mark of the Capgras delusion is the sufferer's conviction that a close acquaintance (usually a loved one) has been replaced by an impostor who looks like (and sounds like, and acts like) the genuine companion, who has mysteriously disappeared! ... What is particularly surprising about these cases is that they don't depend on subtle disguises and fleeting glimpses. On the contrary, the delusion persists even when the target individual is closely scrutinized by the [Capgras sufferer], and is even pleading for recognition. Capgras sufferers have been known to murder their spouses, so sure are they that these look-alike interlopers are trying to step into their shoes - into whole lives - that are not rightfully theirs! There can be no doubt that in such a sad case, the [sufferer] in question has deemed true some very specific proposition of nonidentity: This man is not my husband; this man is as qualitatively similar to my husband as ever can be, and yet he is not my husband. Of particular interest to us is the fact that people suffering from such a delusion can be quite unable to say why they are so sure.
  • Unpredictability is in general a fine protective feature, which should never be squandered but always spent wisely. There is much to be gained from communication if it is craftily doled out - enough truth to keep one's credibility high but enough falsehood to keep one's options open. (This is the first point of wisdom in the game of poker: he who never bluffs never wins; he who always bluffs always loses.)
  • It is commonly observed - but not commonly enough! - that old folks removed from their homes to hospital settings are put at a tremendous disadvantage, even though their basic bodily needs are well provided for. They often appear to be quite demented - to be utterly incapable of feeding, clothing, and washing themselves, let alone engaging in any activities of greater interest. Often, however, if they are returned to their homes, they can manage quite well for themselves. How do they do this? Over the years, they have loaded their home environments with ultrafamiliar landmarks, triggers for habits, reminders of what to do, where to find the food, how to get dressed, where the telephone is, and so forth. An old person can be a veritable virtuoso of self-help in such a hugely overlearned world, in spite of his or her brain's increasing imperviousness to new bouts of learning... Taking them out of their homes is literally separating them from large parts of their minds - potentially just as devastating a development as undergoing brain surgery.
  • Of all the mind tools we acquire in the course of furnishing our brains from the stockpiles of culture, none are more important, of course, than words - first spoken, then written. Words make us more intelligent by making cognition easier, in the same way (many times multiplied) that beacons and landmarks make navigation in the world easier for simple creatures. Navigation in the abstract multidimensional world of ideas is simply impossible without a huge stock of movable, memorable landmarks that can be shared, criticized, recorded, and looked at from different perspectives.
  • Every human mind you've ever looked at ... is a product not just of natural selection but of cultural redesign of enormous proportions.

Brain Children (1998)

The MIT Press. ISBN 0-262-54090-8

  • The first stable conclusion I reached … was that the only thing brains could do was to approximate the responsivity to meanings that we presuppose in our everyday mentalistic discourse. When mechanical push comes to shove, a brain was always going to do what it was caused to do by current, local, mechanical circumstances, whatever it ought to do, whatever a God's-eye view might reveal about the actual meaning of its current states. But over the long haul, brains could be designed – by evolutionary processes – to do the right thing (from the point of view of meaning) with high reliability. … [B]rains are syntactic engines that can mimic the competence of semantic engines. … The appreciation of meanings – their discrimination and delectation – is central to our vision of consciousness, but this conviction that I, on the inside, deal directly with meanings turns out to be something rather like a benign "user-illusion".
    • chapter 25, "Self-Portrait"

"Postmodernism and truth" (1998)

Paper delivered at the 1998 World Congress of Philosophy (13 August 1998)

  • We alone can be wracked with doubt, and we alone have been provoked by that epistemic itch to seek a remedy: better truth-seeking methods. Wanting to keep better track of our food supplies, our territories, our families, our enemies, we discovered the benefits of talking it over with others, asking questions, passing on lore. We invented culture. Then we invented measuring, and arithmetic, and maps, and writing. These communicative and recording innovations come with a built-in ideal: truth. The point of asking questions is to find true answers; the point of measuring is to measure accurately; the point of making maps is to find your way to your destination. ... In short, the goal of truth goes without saying, in every human culture.
  • Scientists are just as vulnerable to wishful thinking, just as likely to be tempted by base motives, just as venal and gullible and forgetful as the rest of humankind. Scientists don't consider themselves to be saints; they don't even pretend to be priests (who according to tradition are supposed to do a better job than the rest of us at fighting off human temptation and frailty). Scientists take themselves to be just as weak and fallible as anybody else, but recognizing those very sources of error in themselves and in the groups to which they belong, they have devised elaborate systems to tie their own hands, forcibly preventing their frailties and prejudices from infecting their results.
  • The methods of science aren't foolproof, but they are indefinitely perfectible. Just as important: there is a tradition of criticism that enforces improvement whenever and wherever flaws are discovered. The methods of science, like everything else under the sun, are themselves objects of scientific scrutiny, as method becomes methodology, the analysis of methods. Methodology in turn falls under the gaze of epistemology, the investigation of investigation itself--nothing is off limits to scientific questioning. The irony is that these fruits of scientific reflection, showing us the ineliminable smudges of imperfection, are sometimes used by those who are suspicious of science as their grounds for denying it a privileged status in the truth-seeking department--as if the institutions and practices they see competing with it were no worse off in these regards. But where are the examples of religious orthodoxy being simply abandoned in the face of irresistible evidence? Again and again in science, yesterday's heresies have become today's new orthodoxies. No religion exhibits that pattern in its history.

Breaking the Spell (2006)

Breaking the Spell: Religion As A Natural Phenomenon

  • Since September 11, 2001, I have often thought that perhaps it was fortunate for the world that the attackers targeted the World Trade Center instead of the Statue of Liberty, for if they had destroyed our sacred symbol of democracy I fear we as Americans would have been unable to keep ourselves from indulging in paroxysms of revenge of a sort the world has never seen before. If that had happened, it would have befouled the meaning of the Statue of Liberty beyond any hope of subsequent redemption -- if there were any people left to care. I have learned from my students that this upsetting thought of mine is subject to several unfortunate misconstruals, so let me expand on it to ward them off. The killing of thousands of innocents in the World Trade Center was a heinous crime, much more evil than the destruction of the Statue of Liberty would have been. And, yes, the World Trade Center was a much more appropriate symbol of al Qaeda's wrath than the Statue of Liberty would have been, but for that very reason it didn't mean as much, as a symbol, to us. It was Mammon and Plutocrats and Globalization, not Lady Liberty. I do suspect that the fury with which Americans would have responded to the unspeakable defilement of our cherished national symbol, the purest image of our aspirations as a democracy, would have made a sane and measured response extraordinarily difficult. This is the great danger of symbols -- they can become too "sacred". An important task for religious people of all faiths in the twenty-first century will be spreading the conviction that there are no acts more dishonorable than harming "infidels" of one stripe or another for "disrespecting" a flag, a cross, a holy text.
  • [W]hat good to us is the gods' knowledge if we can't get it from them? How could one communicate with the gods? Our ancestors (while they were alive!) stumbled on an extremely ingenious solution: divination.

    We all know how hard it is to make the major decisions of life: should I hang tough or admit my transgression, should I move or stay in my present position, should I go to war or not, should I follow my heart or my head? We still haven't figured out any satisfactory systematic way of deciding these things. Anything that can relieve the burden of figuring out how to make these hard calls is bound to be an attractive idea.

    Consider flipping a coin, for instance. Why do we do it? To take away the burden of having to find a reason for choosing A over B. We like to have reasons for what we do, but sometimes nothing sufficiently persuasive comes to mind, and we recognize that we have to decide soon, so we concoct a little gadget, an external thing that will make the decision for us. But if the decision is about something momentous, like whether to go to war, or marry, or confess, anything like flipping a coin would be just too, well, flippant.

    In such a case, choosing for no good reason would be too obviously a sign of incompetence, and, besides, if the decision is really that important, once the coin has landed you'll have to confront the further choice: should you honor your just-avowed commitment to be bound by the flip of the coin, or should you reconsider? Faced with such quandaries, we recognize the need for some treatment stronger than a coin flip. Something more ceremonial, more impressive, like divination, which not only tells you what to do, but gives you a reason (if you squint just right and use your imagination).

    Scholars have uncovered a comically variegated profusion of ancient ways of delegating important decisions to uncontrollable externalities. Instead of flipping a coin, you can flip arrows (belomancy) or rods (rhabdomancy) or bones or cards (sortilege), and instead of looking at tea leaves (tasseography), you can examine the livers of sacrificed animals (hepatoscopy) or other entrails (haruspicy) or melted wax poured into water (ceroscopy). Then there is moleosophy (divination by blemishes), myomancy (divination by rodent behavior), nephomancy (divination by clouds), and of course the old favorites, numerology and astrology, among dozens of others.

  • The daily actions of religious people have accomplished uncounted good deeds throughout history, alleviating suffering, feeding the hungry, caring for the sick. Religions have brought the comfort of belonging and companionship to many who would otherwise have passed through this life all alone, without glory or adventure. They have not just provided first aid, in effect, for people in difficulties; they have provided the means for changing the world in ways that remove those difficulties. As Alan Wolfe says, "Religion can lead people out of cycles of poverty and dependency just as it led Moses out of Egypt".[1] There is much for religion lovers to be proud of in their traditions, and much for all of us to be grateful for.

    The fact that so many people love their religions as much as, or more than, anything else in their lives is a weighty fact indeed. I am inclined to think that nothing could matter more than what people love. At any rate, I can think of no value that I would place higher. I would not want to live in a world without love. Would a world with peace, but without love, be a better world? Not if the peace was achieved by drugging the love (and hate) out of us, or by suppression. Would a world with justice and freedom, but without love, be a better world? Not if it was achieved by somehow turning us all into loveless law-abiders with none of the yearnings or envies or hatreds that are wellsprings of injustice and subjugation.

    It is hard to consider such hypotheticals, and I doubt if we should trust our first intuitions about them, but, for what it is worth, I surmise that we almost all want a world in which love, justice, freedom, and peace are all present, as much as possible, but if we had to give up one of these, it wouldn't -- and shouldn't -- be love. But, sad to say, even if it is true that nothing could matter more than love, it wouldn't follow from this that we don't have reason to question the things that we, and others, love. Love is blind, as they say, and because love is blind, it often leads to tragedy: to conflicts in which one love is pitted against another love, and something has to give, with suffering guaranteed in any resolution.

  • If I were designing a phony religion, I'd surely include a version of this little gem -- but I'd have a hard time saying it with a straight face:

    If anybody ever raises questions of objections about our religion that you cannot answer, that person is almost certainly Satan. In fact, the more reasonable the person is, the more eager to engage you in open-minded and congenial discussion, the more sure you can be that you're talking to Satan in disguise! Turn away! Do not listen! It's a trap!

    What is particularly cute about this trick is that it is a perfect "wild card," so lacking in content that any sect or creed or conspiracy can use it effectively. Communist cells can be warned that any criticism they encounter is almost sure to be the work of FBI infiltrators in disguise, and radical feminist discussion groups can squelch any unanswerable criticism by declaring it to be phallocentric propaganda being unwittingly spread by a brainwashed dupe of the evil patriarchy, and so forth. This all-purpose loyalty-enforcer is paranoia in a pill, sure to keep the critics muted if not silent.

    Did anyone invent this brilliant adaptation, or is it a wild meme that domesticated itself by attaching itself to whatever memes were competing for hosts in its neighborhood? Nobody knows, but now it is available for anybody to use -- although, if this book has any success, its virulence should diminish as people begin to recognize it for what it is.

  • Here is a well-known trajectory: You begin with a heartfelt desire to help other people and the conviction, however well or ill founded, that your guild or club or church is the coalition that can best serve to improve the welfare of others. If times are particularly tough, this conditional stewardship -- I'm doing what's good for the guild because that will be good for everybody -- may be displaced by the narrowest concern for the integrity of the guild itself, and for good reason: if you believe that the institution in question is the best path to goodness, the goal of preserving it for future projects, still unimagined, can be the most rational higher goal you can define. It is a short step from this to losing track of or even forgetting the larger purpose and devoting yourself singlemindedly to furthering the interests of the institution, at whatever cost. A conditional or instrumental allegiance can thus become indistinguishable in practice from a commitment to something "good in itself." A further short step perverts this parochial summum bonum to the more selfish goal of doing whatever it takes to keep yourself at the helm of the institution ("who better than I to lead us to triumph over our adversaries?").

    We have all seen this happen many times, and may even have caught ourselves in the act of forgetting just why we wanted to be leaders in the first place.

  • [W]hat [is] the prevailing attitude today among those who call themselves religious but vigorously advocate tolerance? There are three main options, ranging from the disingenuous Machiavellian--

    1. As a matter of political strategy, the time is not ripe for candid declarations of religious superiority, so we should temporize and let sleeping dogs lie in hopes that those of other faiths can gently be brought around over the centuries.

    --through truly tolerant Eisenhowerian "Our government makes no sense unless it is founded on a deeply religious belief -- and I don't care what it is" --

    2. It really doesn't matter which religion you swear allegiance to, as long as you have some religion.

    --to the even milder Moynihanian benign neglect--

    3. Religion is just too dear to too many to think of discarding, even though it really doesn't do any good and is simply an empty historical legacy we can afford to maintain until it quietly extinguishes itself sometime in the distant and unforeseeable future.

    It is no use asking people which they choose, since both extremes are so undiplomatic we can predict in advance that most people will go for some version of ecumenical tolerance whether they believe it or not. ...

    We've got ourselves caught in a hypocrisy trap, and there is no clear path out. Are we like families in which the adults go through all the motions of believing in Santa Claus for the sake of the kids, and the kids all pretend still to believe in Santa Claus so as not to spoil the adults' fun? If only our current predicament were as innocuous and even comical as that! In the adult world of religion, people are dying and killing, with the moderates cowed into silence by the intransigence of the radicals in their own faiths, and many afraid to acknowledge what they actually believe for fear of breaking Granny's heart, or offending their neighbors to the point of getting run out of town, or worse.

    If this is the precious meaning our lives are vouchsafed thanks to our allegiance to one religion or another, it is not such a bargain, in my opinion. Is this the best we can do? Is it not tragic that so many people around the world find themselves enlisted against their will in a conspiracy of silence, either because they secretly believe that most of the world's population is wasting their lives in delusion (but they are too tenderhearted -- or devious -- to say so), or because they secretly believe that their own tradition is just such a delusion (but they fear for their own safety if they admit it)?

  • A philosopher is someone who says, "We know it's possible in practice; we're trying to work out if it's possible in principle!"
  • Evolution is all about processes that almost never happen. Every birth in every lineage is a potential speciation event, but speciation almost never happens, not once in a million births. Mutation in DNA almost never happens -- not once in a trillion copyings -- but evolution depends on it. Take the set of infrequent accidents -- things that almost never happen -- and sort them into the happy accidents, the neutral accidents, and the fatal accidents; amplify the effects of the happy accidents -- which happens automatically when you have replication and competition -- and you get evolution.
  • We used to think that secrecy was perhaps the greatest enemy of democracy, and as long as there was no suppression or censorship, people could be trusted to make the informed decisions that would preserve our free society, but we have learned in recent years that the techniques of misinformation and misdirection have become so refined that, even in an open society, a cleverly directed flood of misinformation can overwhelm the truth, even though the truth is out there, uncensored, quietly available to anyone who can find it.
  • [L]et your self go. If you can approach the world's complexities, both its glories and its horrors, with an attitude of humble curiosity, acknowledging that however deeply you have seen, you have only scratched the surface, you will find worlds within worlds, beauties you could not heretofore imagine, and your own mundane preoccupations will shrink to proper size, not all that important in the greater scheme of things. Keeping that awestruck vision of the world ready to hand while dealing with the demands of daily living is no easy exercise, but it is definitely worth the effort, for if you can stay centered, and engaged, you will find the hard choices easier, the right words will come to you when you need them, and you will be a better person. That, I propose, is the secret to spirituality, and it has nothing at all to do with believing in an immortal soul, or in anything supernatural.
  • Remember Marxism? It used to be a sour sort of fun to tease Marxists about the contradictions in some of their pet ideas. The revolution of the proletariat was inevitable, good Marxists believed, but if so, why were they so eager to enlist us in their cause? If it was going to happen anyway, it was going to happen with or without our help. But of course the inevitability that Marxists believe in is one that depends on the growth of the movement and all its political action. There were Marxists working very hard to bring about the revolution, and it was comforting to them to believe that their success was guaranteed in the long run. And some of them, the only ones that were really dangerous, believed so firmly in the rightness of their cause that they believed it was permissible to lie and deceive in order to further it. They even taught this to their children, from infancy. These are the "red-diaper babies," children of hardline members of the Communist Party of America, and some of them can still be found infecting the atmosphere of political action in left-wing circles, to the extreme frustration and annoyance of honest socialists and others on the left.

    Today we have a similar phenomenon brewing on the religious right: the inevitability of the End Days, or the Rapture, the coming Armageddon that will separate the blessed from the damned in the final day of Judgment. Cults and prophets proclaiming the imminent end of the world have been with us for several millennia, and it has been another sour sort of fun to ridicule them the morning after, when they discover that their calculations were a little off. But, just as with the Marxists, there are some among them who are working hard to "hasten the inevitable," not merely anticipating the End Days with joy in their hearts, but taking political action to bring about the conditions they think are the prerequisites for that occasion. And these people are not funny at all. They are dangerous, for the same reason that red-diaper babies are dangerous: they put their allegiance to their creed ahead of their commitment to democracy, to peace, to (earthly) justice -- and to truth. If push comes to shove, some of them are prepared to lie and even to kill...

  • Thanks to technology, what almost anybody can do has been multiplied a thousandfold, and our moral understanding about what we ought to do hasn't kept pace. ... You can have a test-tube baby or take a morning-after pill to keep from having a baby; you can satisfy your sexual urges in the privacy of your room by downloading Internet pornography, and you can keep your favorite music for free instead of buying it; you can keep your money in secret offshore bank accounts and purchase stock in cigarette companies that are exploiting impoverished Third World countries; and you can lay minefields, smuggle nuclear weapons in suitcases, make nerve gas, and drop "smart bombs" with pinpoint accuracy. Also, you can arrange to have a hundred dollars a month automatically sent from your bank account to provide education for ten girls in an Islamic country who otherwise would not learn to read and write, or to benefit a hundred malnourished people, or provide medical care for AIDS sufferers in Africa. You can use the Internet to organize citizen monitoring of environmental hazards, or to check the honesty and performance of government officials -- or to spy on your neighbors. Now, what ought we to do?
  • Surely just about everybody has faced a moral dilemma and secretly wished, "If only somebody -- somebody I trusted -- could just tell me what to do!" Wouldn't this be morally inauthentic? Aren't we responsible for making our own moral decisions? Yes, but the virtues of "do it yourself" moral reasoning have their limits, and if you decide, after conscientious consideration, that your moral decision is to delegate further moral decisions in your life to a trusted expert, then you have made your own moral decision. You have decided to take advantage of the division of labor that civilization makes possible and get the help of expert specialists.

    We applaud the wisdom of this course in all other important areas of decision-making (don't try to be your own doctor, the lawyer who represents himself has a fool for a client, and so forth). Even in the case of political decisions, like which way to vote, the policy of delegation can be defended. ... Is this a dereliction of [one's] dut[y] as a citizen? I don't think so, but it does depend on my having good grounds for trusting [the delegate's] judgment. ... That's why those who have an unquestioning faith in the correctness of the moral teachings of their religion are a problem: if they themselves haven't conscientiously considered, on their own, whether their pastors or priests or rabbis or imams are worthy of this delegated authority over their own lives, then they are in fact taking a personally immoral stand.

    This is perhaps the most shocking implication of my inquiry, and I do not shrink from it, even though it may offend many who think of themselves as deeply moral. It is commonly supposed that it is entirely exemplary to adopt the moral teachings of one's own religion without question, because -- to put it simply -- it is the word of God (as interpreted, always, by the specialists to whom one has delegated authority). I am urging, on the contrary, that anybody who professes that a particular point of moral conviction is not discussable, not debatable, not negotiable, simply because it is the word of God, or because the Bible says so, or because "that is what all Muslims [Hindus, Sikhs ...] [sic] believe, and I am a Muslim [Hindu, Sikh ...]" [sic], should be seen to be making it impossible for the rest of us to take their views seriously, excusing themselves from the moral conversation, inadvertently acknowledging that their own views are not conscientiously maintained and deserve no further hearing.

  • In spite of ferocious differences of opinion about other moral issues, there seems to be something approaching consensus that it is cruel and malicious to interfere with the life-enhancing illusions of others -- unless those illusions are themselves the cause of even greater ills. The disagreements come over what these greater ills might be -- and this has led to the breakdown of the whole rationale. Keeping secrets from people for their own good can often be wise, but it takes only one person to give away a secret, and since there are disagreements about which cases warrant discretion, the result is an unsavory miasma of hypocrisy, lies, and frantic but fruitless attempts at distraction.

"Thank Goodness" (2006)

"Thank Goodness!" (3 November 2006). Edge: The Third Culture (195). Retrieved on 2006-11-11.

(written from his hospital bed after surviving an aortic dissection)

  • Do I worship modern medicine? Is science my religion? Not at all; there is no aspect of modern medicine or science that I would exempt from the most rigorous scrutiny, and I can readily identify a host of serious problems that still need to be fixed. That's easy to do, of course, because the worlds of medicine and science are already engaged in the most obsessive, intensive, and humble self-assessments yet known to human institutions, and they regularly make public the results of their self-examinations. Moreover, this open-ended rational criticism, imperfect as it is, is the secret of the astounding success of these human enterprises. There are measurable improvements every day.
  • One thing in particular struck me when I compared the medical world on which my life now depended with the religious institutions I have been studying so intensively in recent years. One of the gentler, more supportive themes to be found in every religion (so far as I know) is the idea that what really matters is what is in your heart: if you have good intentions, and are trying to do what (God says) is right, that is all anyone can ask. Not so in medicine! If you are wrong -- especially if you should have known better -- your good intentions count for almost nothing. And whereas taking a leap of faith and acting without further scrutiny of one's options is often celebrated by religions, it is considered a grave sin in medicine. A doctor whose devout faith in his personal revelations about how to treat aortic aneurysm led him to engage in untested trials with human patients would be severely reprimanded if not driven out of medicine altogether. There are exceptions, of course. A few swashbuckling, risk-taking pioneers are tolerated and (if they prove to be right) eventually honored, but they can exist only as rare exceptions to the ideal of the methodical investigator who scrupulously rules out alternative theories before putting his own into practice. Good intentions and inspiration are simply not enough.

    In other words, whereas religions may serve a benign purpose by letting many people feel comfortable with the level of morality they themselves can attain, no religion holds its members to the high standards of moral responsibility that the secular world of science and medicine does! And I'm not just talking about the standards 'at the top' — among the surgeons and doctors who make life or death decisions every day. I'm talking about the standards of conscientiousness endorsed by the lab technicians and meal preparers, too. This tradition puts its faith in the unlimited application of reason and empirical inquiry, checking and re-checking, and getting in the habit of asking "What if I'm wrong?" Appeals to faith or membership are never tolerated. Imagine the reception a scientist would get if he tried to suggest that others couldn't replicate his results because they just didn't share the faith of the people in his lab! And, to return to my main point, it is the goodness of this tradition of reason and open inquiry that I thank for my being alive today.

The Genius of Charles Darwin (television, 2008)

  • The earth has grown a nervous system, and it's us.

Notes and references

  1. Wolfe, Alan. The Transformation of American Religion: How We Actually Live Our Faith (2003), p. 139.
