Wednesday, April 23, 2014

Open mouth, insert foot...Charles Murray's perspective on women in philosophy


"What Charles Murray Doesn't Get About Women and Philosophy"

The conservative author believes that women have contributed little to major philosophical traditions because men are better abstract thinkers.

by

Noah Berlatsky

April 17th, 2014

The Atlantic

"No woman has been a significant original thinker in any of the world's great philosophical traditions." So said the author Charles Murray in a 2005 essay titled "The Inequality Taboo," in which he argued that men are better at abstract thinking than women are. During a recent talk at the University of Texas at Austin, given to promote his new book The Curmudgeon’s Guide to Getting Ahead, a student asked Murray whether he stood by this claim. As Amanda Marcotte notes in a piece at Slate, Murray began with condescension (“tell me who you had in mind”), and then added, "Until somebody gives me evidence to the contrary, yeah, I'll stick with that statement."

There are a couple of ways to respond to Murray's claim that women have made no significant original contributions to philosophy. You could, first of all, produce a list of female philosophers. Marcotte takes this path, mentioning names like Hannah Arendt and Elizabeth Anscombe. The problem though, as Marcotte says, is that:

    If you know how the game is played, you'll know that if you start listing philosophers, Murray or any of his defenders will just muse about whether their work is wholly "original," since said women likely have read other philosophers—most of whom are male by virtue of women being squeezed out of educational opportunities and platforms to express their thoughts throughout most of history.

Rather than just throwing names around, then, it seems like it might be more useful to address Murray's question about philosophy more philosophically. When Murray says he is looking for "significant original thinker[s] in the world's great philosophical traditions," what does that mean? What intellectual preconceptions is he operating under? What isn't spoken when he speaks?

Feminist thinkers have actually spent a lot of time philosophizing about women's inclusion in and exclusion from lists of the most important this or that, especially in the context of literary canons. Murray is willing to acknowledge that there have been important women writers (since he believes literary thought is less abstract than philosophical thought), but he glosses over the fact that the literary canon, too, is tilted very male. On the Modern Library's list of the 100 greatest novels of the 20th century, for example, the top 14 are all by men (Virginia Woolf's To the Lighthouse is 15).

What accounts for this? In How to Suppress Women's Writing (1983), science-fiction author Joanna Russ looked at this imbalance and argued that when women are not included in the canon, the problem is not with the women. Instead, she said, "A mode of understanding literature which can ignore the private lives of half the human race is not 'incomplete': It is distorted through and through." She added that the statement, "This is a good novel," begs the questions, "Good for what? Good for whom?"

Russ' point is that the claim of universal value—or, in Murray's terms, of abstract thought—is duplicitous. It assumes a culture and an intellectual frame in which there is no power differential; in which everyone is the same as everyone else, and in which you can speak from nowhere to everyone. But that “view from nowhere” does not exist (as many important philosophers, from Derrida to Foucault to Irigaray, have pointed out). To say that all the best books are by men therefore says as much about the interests, and the relationship to power, of the list as it does about the books selected. It means, among other things, that experience coded as male (of manly old men catching fish, for example, or of lusting after young girls named Lolita) is treated as more important than experience coded as female. "When we all live in the same culture," Russ says, "then it will be time for one literature." But we don't, and it isn’t.

Again, most feminist discussions of canon have focused on literature. But some have expanded the argument to philosophy—at least implicitly. In her groundbreaking book Black Feminist Thought, Patricia Hill Collins argues that there is an important intellectual tradition among black women, but that this tradition is often ignored or erased because of the ways in which black women have been prevented from entering academic institutions—or even from attaining literacy. To uncover the intellectual tradition of black women, therefore, you have to look not just to philosophical tomes and the pronouncements of tenured pronouncers like Charles Murray, but to oral accounts, blues lyrics, and other marginal spaces. In this context, Collins highlights Sojourner Truth's famous "Ain't I a woman" speech:

    That man over there says that women need to be helped into carriages, and lifted over ditches, and to have the best place everywhere. Nobody ever helps me into carriages, or over mud-puddles, or gives me any best place! And ain't I a woman? Look at me! Look at my arm! I have ploughed and planted, and gathered into barns, and no man could head me! And ain't I a woman? I could work as much and eat as much as a man—when I could get it—and bear the lash as well! And ain't I a woman? I have borne thirteen children, and seen most all sold off to slavery, and when I cried out with my mother's grief, none but Jesus heard me! And ain't I a woman?

Truth here is thinking through, and challenging, ideas about masculinity and femininity—or, as Collins says, "Rather than accepting the existing assumptions about what a woman is and then trying to prove that she fit the standards, Truth challenged the very standards themselves." Are Truth's comments more or less original than Nietzsche's paranoid, misogynist assertion that "Woman has always conspired with the types of decadence, the priests, against the 'powerful', the 'strong', the men"? Of the two, whose take on masculinity and femininity is more subtle, more nuanced, more surprising? For that matter, whose views have been more influential (among what groups?) on that much-brooded-over philosophical question, "What is woman?" and its supposedly more universal correlate, "What is man?"

Murray's statements about women and about philosophy are based on a slew of preconceptions—about what philosophy is, about which intellectual traditions are significant (not feminism for him, apparently), about which communities get to define "philosophy," and about what influence is considered consequential. When he says that there have been no significant female contributors to philosophy, he thinks he has made a statement about women. But in fact, he is telling us about philosophy—or about his particular, limited view of it. The real question shouldn't be "can women do philosophy?" but rather "can philosophy make itself worthy of women like Sojourner Truth, Patricia Hill Collins, and Joanna Russ?" Is there a philosophy that can speak thoughtfully about equality, about injustice, and about women? Or does philosophy exist only in the cramped, querulous skulls of white men like Charles Murray, where it can celebrate its own insularity as originality, and its myopia as far-seeing genius?


Charles Murray [Wikipedia]

Vocabulary list #25


Here'sssssssss Tim with a new list of words.

achromic

ey-kroh-mik

adjective

Colorless, without coloring matter.


adscititious

ad-suh-tish-us

adjective

1. Derived or acquired from something on the outside.
2. Supplemental, additional.


agnize

ag-nahyz

verb

To recognize, acknowledge, own.


analphabetic

an-al-fuh-bet-ik

adjective

1. Not alphabetic: an analphabetic arrangement of letters.
2. Unable to read or write, illiterate: analphabetic peoples.


cadge

kaj

verb

Beg, sponge.


catharsis

kuh-thar-sis

noun

1. The purging of the emotions or relieving of emotional tensions, especially through
certain kinds of art, as tragedy or music.
2. [Medicine/Medical] Purgation.


cock-a-hoop

kah-kuh-hoop

adjective

1. Triumphantly boastful, exulting.
2. Awry.


collimate

kah-luh-mayt

verb

To make (something, such as light rays) parallel.



cuittle

ky-tl

verb

To wheedle, cajole, or coax.


epistolary

ih-pist-uh-lair-ee

adjective

1. Of, relating to, or suitable to a letter.
2. Contained in or carried on by letters.
3. Written in the form of a series of letters.


hippophile

ip-uh-fahyl

noun

Lover of horses.


hospitalist

hah-spih-tuh-list

noun

A physician who specializes in treating the hospitalized patients of other physicians.


immiscible

ih-mis-uh-buhl

adjective

Not miscible, incapable of being mixed.


infix 

in-fiks

noun

A derivational or inflectional affix appearing in the body of a word.


lodestar

lohd-stahr

noun

One that serves as an inspiration, model, or guide.


madeleine

mad-uh-lun

noun

1. A small rich shell-shaped cake.
2. One that evokes a memory.


maslin

maz-lin

noun

1. Mixture, medley.
2. A mixture of different grains, flour or meals, especially rye with wheat.


minutia

muh-noo-shee-uh

noun

A minute or minor detail.


oneiric

oh-nye-rik

adjective

Of or relating to dreams, dreamy.


opusculum

oh-pusk-yuh-lum

noun

A minor work (as of literature).


omphaloskepsis

om-fuh-loh-skep-sis

noun

Contemplation of one's navel as an aid to meditation.


passe-partout

pas-pahr-too

noun

1. Something that passes everywhere or provides a universal means of passage.
2. A master key, skeleton key.


perforce

per-forss

adverb

By force of circumstances.


picayune

pik-ee-yoon

adjective

1. Of little value or account, small, trifling: a picayune amount.
2. Petty, carping, or prejudiced: I didn't want to seem picayune by criticizing.


recondite

rek-un-dyte

adjective

1. Hidden from sight, concealed.
2. Difficult or impossible for one of ordinary understanding or knowledge to comprehend, deep.
3. Of, relating to, or dealing with something little known or obscure.


sallow

sal-oh

adjective

Of a grayish greenish yellow color suggesting sickliness.


tabula rasa

tab-yuh-luh-rah-zuh

noun

1. The mind in its hypothetical primary blank or empty state before receiving outside impressions.
2. Something existing in its original pristine state.


timorous

tim-uh-rus

adjective

1. Of a timid disposition, fearful.
2. Expressing or suggesting timidity.


virescent 

vuh-ress-unt

adjective

Beginning to be green, greenish.


xylography

zahy-log-ruh-fee

noun

The art of engraving on wood, or of printing from such engravings.


Tuesday, April 22, 2014

Deceased--John C. Houbolt

John C. Houbolt
April 10th, 1919 to April 15th, 2014

"John C. Houbolt dies at 95; NASA engineer made moon landing possible"

Houbolt's efforts convinced the space agency to focus on landing a module carrying a crew from lunar orbit rather than a rocket from Earth.

April 21st, 2014

latimes.com

John C. Houbolt, an engineer whose contributions to the U.S. space program were vital to NASA's successful moon landing in 1969, has died. He was 95.

Houbolt died April 15 at a nursing home in Scarborough, Maine, of complications from Parkinson's disease, said his son-in-law, Tucker Withington of Plymouth, Mass.

As NASA describes on its website, while under pressure during the U.S.-Soviet space race, Houbolt was the catalyst in securing U.S. commitment to the science and engineering theory that eventually carried the Apollo crew to the moon and back safely.

His efforts in the early 1960s are largely credited with convincing NASA to focus on launching a module carrying a crew from lunar orbit, rather than a rocket from Earth or some other method.

Houbolt argued that a lunar orbit rendezvous, or LOR, would not only be less mechanically and financially onerous than building a huge rocket to take man to the moon or launching a craft while orbiting the Earth, but LOR was the only option to meet President Kennedy's challenge to reach the moon before the end of the decade.

NASA describes "the bold step of skipping proper channels" that Houbolt took by pushing the issue in a private letter in 1961 to an incoming administrator.

"Do we want to go to the moon or not?" Houbolt wrote. "Why is a much less grandiose scheme involving rendezvous ostracized or put on the defensive? I fully realize that contacting you in this manner is somewhat unorthodox, but the issues at stake are crucial enough to us all that an unusual course is warranted."

Houbolt started his career with NASA's predecessor, the National Advisory Committee for Aeronautics, in Hampton, Va., in 1942. He left in 1963 to work for an aeronautical research and consulting firm in Princeton, N.J., then returned to NASA in 1976 as chief aeronautical scientist at Langley Field Center in Virginia. He retired in 1985 but continued private consulting work.

Born April 10, 1919, in Altoona, Iowa, Houbolt grew up in Joliet, Ill., and earned degrees in civil engineering from the University of Illinois at Urbana-Champaign. He received a doctorate from the Swiss Federal Institute of Technology at Zurich in 1957.

Fame, doubt--Lewis Carroll



"Lewis Carroll Hated Fame So Much, He Sometimes Regretted Writing Alice"

by

Lauren Davis
April 18th, 2014

io9

Charles Dodgson, the author and mathematician better known as Lewis Carroll, wrote about a young girl lost in surreal dreamscapes. But Dodgson had trouble navigating a treacherous landscape of his own: literary fame.

The author of Alice's Adventures in Wonderland and Through the Looking-Glass explained his particular discomfort with his fame in an 1891 letter to one Mrs. Symonds:


    I don't think I explained successfully my reasons for disliking letters of mine being put into autograph-collections. All that sort of publicity leads to strangers hearing of my real name in connection with the books, and to my being pointed out to, and stared at by, strangers, and treated as a 'lion.' And I hate all that so intensely that sometimes I almost wish I had never written any books at all.

The letter was recently purchased at auction through Bonhams for £11,875 by the University of Southern California, meaning the anti-fame missive has landed in a place very much obsessed with fame: Los Angeles.


"Fame-Hating Lewis Carroll Letter Lands in Los Angeles"

by

Jennifer Schuessler

April 17th, 2014

The New York Times

A letter by Lewis Carroll declaring his hatred for fame and sometime wish that he had never written “Alice’s Adventures in Wonderland” has now found a home in the epicenter of the global celebrity hellscape: Los Angeles.

The University of Southern California announced that it is the new owner of the letter, which was purchased anonymously at the London auction house Bonhams in March for $19,800, several times the estimate.

In the three-page letter, written to his friend Anne Symonds in 1891, Carroll (whose real name was Charles Dodgson) railed against collectors of his autograph letters.

“All of that sort of publicity leads to strangers hearing of my real name in connection with the books, and to my being pointed out to and stared at by strangers and being treated as a ‘lion,’” he wrote. “And I hate all of that so intensely that sometimes I almost wish I had never written any books at all.”

In a statement, Abby Saunders, the curator of the university’s collection of more than 3,000 books, pamphlets, games and other items relating to Carroll, acknowledged the dry humor that could be made of the new acquisition. “Here in Los Angeles, where celebrity culture goes hand in hand with the film industry,” she said, “Carroll’s thoughts on fame are especially poignant.”

Monday, April 14, 2014

Jon Pertwee and vacuum tubes


Yes, it's Jon Pertwee who "starred as the Third Doctor in the science-fiction series Doctor Who from 1970 to 1974".

"philosophical self-examination will continue to have a reason for being"


"Philosophy in the Popular Imagination"

by

Andrew Taggart

March/April 2014

Philosophy Now

In my life nothing good has ever come of the “What do you do?” question. Once off my lips, the line “I work on moral philosophy, on ethics,” can lead in only one of two directions. Either my acquaintance, unschooled in philosophy, will be almost preternaturally interested in what I have to say – as if she’s happened upon some sublime creature only thought to exist on blanched parchment – or she’ll be absolutely dumbstruck by the stupidity of a life well wasted. Although it could go either way, let’s suppose she’s alighted on the latter path. “Philosophy… It doesn’t get you anywhere,” she replies, reveling in a truth that she believes is as certain as the claim that night follows day. And I’ve yet to come up with a truly satisfying rejoinder, probably because there’s no such thing. Try a joke, you think? “Oh, I don’t know, it certainly gets you into debt.” Or a plea for clarification? “I suppose it depends on what you mean by ‘get you anywhere’.”

The truth is that neither response will do. For if my conversational partner already thinks philosophy a waste of time, perhaps through having a mistaken conception of philosophy, then she probably (as my former English landlady was fond of saying) “can’t be bothered” to listen to a full rebuttal, and she won’t brook a sharp counterexample either. Like so many others, she’s already made up her mind – or, better put, her mind has already been made up for her.

To do philosophy in the public sphere today is to be immediately put on the defensive, and in most cases, to stand in the wrong. And yet, how we got to this point where philosophy has been put on all fours – either fetishized as being beyond the real world or vilified for playing no part in it – still needs to be explained. A first modest step would be to get straight in our minds how many lay people conceive of philosophy, and why this (mis)conception should matter to those of us who believe, somewhat antiquely, in the life of the mind.

One place to begin is with my interlocutor’s saying that when philosophers discuss something, they never get anywhere. She could mean one of three things by this: first, that philosophers get mired in endless debate, never yielding anything in the way of concrete resolution; or second, that they continually make something out of nothing, causing all parties involved to be brought to a state of mental confusion due to the endless jostling over definitions and the petty squabbling over overnice distinctions; or third, that in the game of philosophy there’s no way to resolve who’s right and who’s wrong. These three doubts, collectively or individually, present considerable challenges to philosophy’s basic self-conception. The first doubt would have it that there can be no authoritative conclusions drawn from a set of competing claims; the second that no mental tranquility can be gained; and the third that there can be no certain judgments concerning winners and losers in the truth game.

Rather than respond to each of the doubts in turn, it occurs to me that it would be wiser to ask what assumptions lie behind my interlocutor’s worries. I suspect that she deeply feels the culturally-widespread loss of faith in the power of reason to help us understand ourselves and our world. She needn’t be a relativist or a deep skeptic to believe this. She may simply believe, for instance, that some combination of emotions, instincts, past experiences, hunches, friends’ advice and expectations, is better than reason at determining how we should act. Against this attitude the philosopher’s belief that reason has its own power to help (as well as its own inherent limitations) requires a profound attitudinal shift: humility must be cultivated where there was once impatience. The light of reason can only shine after we’ve discovered how to quiet our minds and distance ourselves from our ‘empirical selves’ – the chaos of our everyday experience. There’s a long education – an itinerary of sorts – which ultimately leads to this state of mind: a path that the uninitiated hasn’t known or hasn’t taken, and in consequence, can’t find value in.

My interlocutor might concede that if philosophy has any value, it’s philosophy in the sense that everyone has their own personal philosophy. A personal philosophy, she might suggest, is a fundamental set of beliefs that one lives by. Think of a book subtitle like ‘The Personal Philosophies of Remarkable Men and Women’. In this sense of the word, we’d be justified in saying that a coach has her own coaching philosophy, a company its corporate philosophy, a party its governing philosophy, etc.

I’m not so sure that the notion of personal philosophy gets us very far in vindicating philosophy per se, for three reasons. One is that it’s not clear to me how far anyone espousing a personal philosophy is really committed to that set of beliefs. How do we know that he lives his life so that it lines up with his own ‘philosophy’, or that when the sea looks stormy, he won’t jump ship? How far his beliefs line up with his actions has yet to be demonstrated. Another reason is that we would need to know whether his personal philosophy is worth standing by. Merely saying “This I believe!” can’t be the end point of any probing inquiry, but must instead be a starting point. And the last reason, already more than hinted at, is that, whatever it is, philosophy must be more than a doctrine, it must be a certain style of thought – a way of examining one’s life with the goal of determining whether the life I’m leading amounts to anything good, for example. But when someone expresses their personal philosophy, the question of why it’s a good thing to have a personal philosophy still remains unasked, as if it were enough just to have one. So personal philosophies are frequently unphilosophical in nature.

“All right. But if you’re going to dismiss talk of personal philosophy as hopelessly ‘unphilosophical’, you’ll have to come round to agreeing with me that philosophy is otherwise useless,” my non-philosophical friend might (philosophically) argue. “After all, the philosophy you’re talking about has no bearing on the real world. It’s mostly an academic pursuit full of puzzles, word games, and the kind of thing that’s done in universities. It’s up in the clouds, not down-to-earth, and nowhere else useful, either.”

“You’re right, contemporary professional philosophy has, in general, become unhinged from the concerns common to all of us,” I might reply. “And, yes, the worst of it has degenerated into logical puzzles and the search for ingenious counterexamples and knock-down arguments to theories no-one else is interested in. But, beyond these worries, I can hear in your voice the most potent criticism – that philosophy is worthless on the grounds that acting is more important than thinking. ‘Getting things done,’ you imply, should be ranked much higher than ‘pie-in-the-sky reasoning’.”

Suppose for a moment that my interlocutor is right. But then, aren’t there times when we don’t know how to act – as well as times when we’re completely at a loss concerning how to go on, or how we got to where we are – a place where we’d prefer not to be? Times when we’re in a crisis over which we seem to have no control? Times when our lives no longer seem to make any sense? At such times, wouldn’t it be wise for us to try to think our way through the situation, in order to come to some more complete understanding of ourselves and of our place in the order of things? It’s at such tragic moments that the moral philosopher Harry Frankfurt’s questions concerning what we care most about and what (and who) is worthy of our care might ring in our ears. At its best, philosophy asks us to be meticulously honest with ourselves. It impels us to look closely at the hand we’ve been dealt, to determine the extent to which we’ve helped or harmed others, to figure out what ultimately matters to us, and to assess how we’ve lived, in the most fundamental terms we can understand.

Were my questioner then to ask, “Why philosophy now?” my reply would be that we’re living through a historical period marked by great change. The institutions that make up the modern world – education and medicine, family and religion, home and work, among others – as well as the spheres of influence that define our existence – the economy, civil society, the state – are changing dramatically, resulting in new experiments in living being tried out. Some will get replicated and others will be ruled out. Insofar as our time has raised fundamental questions about the nature of our existence (what, for instance, is work? What is meaningful work? What is family? What is justice?), philosophy has returned in the form of a life-need – an activity that, although not fully understood, nor resounding with authority in the public imagination, is more than ever of vital importance. And yet philosophy is vital only if its conclusions are embodied, lived out in practice – that is to say, only so long as they’re taken in, sat with, mulled over, and integrated into our being. In this light, what people should find attractive about the philosophical life is a vision of an integrated soul – a person whose fundamental cares and concerns are integrated into a meaningful whole.

And if, at this late stage, she were to scoff at this seeming self-importance, I might offer the thought that one of the things I’ve learned is not to take myself too seriously. Jane Austen taught me that, along with my myriad failures. Other things I’m still learning include the art of speaking truthfully without bullying or boring; that of being honest without baring all to all; of being accurate without being self-righteous: also, how to be grateful when corrected, open-minded while committed, and, above all, how to love the small things, the grace of them all, and, not least, the sheer fact of my existence.

Reason, it turns out, is neither omnipotent nor impotent in matters of the head and heart. Philosophical wisdom is neither so rare as to be entirely extinct from the world we inhabit, nor so common as to be easily purchasable in the marketplace. Yet thanks to a mature recognition that things aren’t as they ought to be, philosophical self-examination will continue to have a reason for being, because it promises to ultimately bring us peace of mind about the things that matter most.

The progress of philosophy


"How Philosophy Makes Progress"

by

Rebecca Newberger Goldstein

April 14th, 2014

The Chronicle of Higher Education

Philosophy was the first academic field; the founder of the Academy was Plato. Nevertheless, philosophy’s place in academe can stir up controversy. The ancient lineage itself provokes dissension. Philosophy’s lack of progress over the past 2,500 years is accepted as a truism, trumpeted not only by naysayers but even by some of its most enthusiastic yea-sayers. But the truism isn’t true. Both camps mistake the nature of philosophy and so are blind to its progress. Let’s consider the yea-sayers first.

The structure of universities demands that a field be designated as a science, a social science, or one of the humanities. This structure has ill served philosophy. It’s not a science, and it’s not a social science. Therefore it belongs, by default, to the humanities, rubbing shoulders with English literature and art history. And what are the humanities? They are premised, according to one cultural critic, Leon Wieseltier, who is among the most impassioned contemporary defenders of the humanities, on "the irreducible reality of inwardness" and are, in fact, "the study of the many expressions of that inwardness." (Wieseltier’s words were written in response to an essay by my husband, Steven Pinker.)

This definition of the humanities is arguably apposite for the study of art and literature, but most philosophers would reject it, starting with Plato himself. In fact, it sounds like a course-catalog description of the shadow studies in which the prisoners of Plato’s cave are involuntarily enrolled. The man who banished the poets from his utopia would hardly acquiesce in a view of philosophy that rendered it a species of literature. If the arguments of Plato and Descartes, Spinoza and Hume, Kant and Wittgenstein yield us nothing but expressions of our irreducible inwardness, then we can judge them only on aesthetic grounds, as we do Sophocles and Dante, Shakespeare and Milton, Virginia Woolf and James Joyce. Some philosophers might agree to the aestheticizing of the field (Martin Heidegger? Richard Rorty?), but many more would not. Henri Bergson argued that the relentless flow of time captures the essence of reality, and that, therefore, all concepts, being static, distort reality. Proust channeled this conclusion into the literary techniques of In Search of Lost Time. But while we evaluate Bergson on the merits of his arguments, argumentative validity has no bearing on the accomplishment of Proust.

When it comes to philosophy’s progress, the inward-looking view of Wieseltier decrees that there is none: "The history of science is a history of errors corrected and discarded. But the vexations of philosophy and the obsessions of literature are not retired in this way. In these fields, the forward-looking cast backward glances." Literature and philosophy are crushed together in this hearty embrace. Plato would shudder.

Now for the naysayers. In the past, opposition to philosophy most often came from the pious, who protested the blasphemous arrogance of human reason seeking to supplant revelation. But nowadays the most vociferous of the naysayers are secular and scientific. While the yea-sayer sees philosophy as a species of literature, the naysayer sees philosophy as failed science. He urges us to look at the history of science and its triumphant expansions, which is simultaneously the history of the embarrassing shrinkage of philosophy. Yes, philosophy was the first academic field, but only because the sciences had not yet developed. Questions of physics, cosmology, biology, psychology, cognitive and affective neuroscience, linguistics, mathematical logic: Philosophy once claimed them all. But as the methodologies of those other disciplines progressed—being empirical, in the case of all but logic—questions over which philosophy had futilely sputtered and speculated were converted into testable hypotheses, and philosophy was rendered forevermore irrelevant.

Is there any doubt, demand the naysayers, about the terminus of this continuing process? Given enough time, talent, and funding, there will be nothing left for philosophers to consider. To quote one naysayer, the physicist Lawrence Krauss, "Philosophy used to be a field that had content, but then ‘natural philosophy’ became physics, and physics has only continued to make inroads. Every time there’s a leap in physics, it encroaches on these areas that philosophers have carefully sequestered away to themselves." Krauss tends to merge philosophy not with literature, as Wieseltier does, but rather with theology, since both, by his lights, are futile attempts to describe the nature of reality. One could imagine such a naysayer conceding that philosophers should be credited with laying the intellectual eggs, so to speak, in the form of questions, and sitting on them to keep them warm. But no life, in the form of discoveries, ever hatches until science takes over.

There’s some truth in the naysayer’s story. As far as our knowledge of the nature of physical reality is concerned—four-dimensional space-time and genes and neurons and neurotransmitters and the Higgs boson and quantum fields and black holes and maybe even the multiverse—it’s science that has racked up the results. Science is the ingenious practice of prodding reality into answering us back when we’re getting it wrong (although that itself is a heady philosophical claim, substantiated by concerted philosophical work).

And, of course, we have a marked tendency to get reality wrong. If you think of the kind of problems our brains evolved to solve in the Pleistocene epoch, it’s a wonder we’ve managed to figure out a technique to get so much right, one that is capable of getting reality itself to debunk some of our deepest intuitions about it—for example, relativity theory playing havoc with our ideas of space and time and quantum mechanics playing similarly with our notions of causality. In contrast, philosophical arguments, lacking that important pushback from the world, don’t have a comparable track record in establishing what Hume called matters of fact and existence.

The naysayer’s view of philosophy as failed or immature science denies it the possibility of progress, as does the yea-sayer’s view of philosophy as a species of literature. But neither conforms to what philosophy is really about, which is to render our human points of view ever more coherent. It’s in terms of our increased coherence that the measure of progress has to be taken, not in terms suitable for evaluating science or literature. We lead conceptually compartmentalized lives, our points of view balkanized so that we can live happily with our internal tensions and contradictions, many of the borders fortified by unexamined presumptions. It’s the job of philosophy to undermine that happiness, and it’s been at it ever since the Athenians showed their gratitude to Socrates for services rendered by offering him a cupful of hemlock.

One troubled conceptual border to which philosophers attend concerns science itself. In his essay "Philosophy and the Scientific Image of Man," the philosopher Wilfrid Sellars agrees that the proper agenda of philosophy lies in mediating among simultaneously held points of view with the aim of integrating them into a coherent whole. But for Sellars the action is focused on the border between what he calls the "scientific image" of us-in-the-world and the "manifest image" of us-in-the-world. (His actual language is "man-in-the-world." Sellars’s paper was published in 1962, based on two talks he gave in 1960. Certain incoherencies in points of view, reflected in linguistic standards, were yet to come to light.)

"For the philosopher is confronted not by one complex many-dimensional picture, the unity of which, such as it is, he must come to appreciate; but by two pictures of essentially the same order of complexity, each of which purports to be a complete picture of man-in-the-world, and which, after separate scrutiny, he must fuse into one vision." The "manifest image" Sellars explained as the conceptual framework "in terms of which man came to be aware of himself as man-in-the-world. It is the framework in terms of which, to use an existentialist turn of phrase, man first encountered himself—which is, of course, when he came to be man. For it is no merely incidental feature of man that he has a conception of himself as man-in-the-world, just as it is obvious, on reflection, that if man had a radically different conception of himself, he would be a radically different kind of man."

In other words, the manifest image is so central to the way in which we think of ourselves that it is constitutive of those very selves. We wouldn’t be the things that we are without it—the very things who progressively elaborate the scientific image, bringing to the task our manifest image of ourselves as rational beings, "able to measure ... [our] thoughts by standards of correctness, of relevance, of evidence." Our having an ever-expanding scientific image of ourselves is itself an aspect of our manifest image, the sense that we have of ourselves as creatures who not only believe but offer reasons for our beliefs (and for our actions as well, but we’ll get to that). We can’t give up on either of the two images of us-in-the-world without destroying the other. They are codependent even when there are issues between them—which is beginning to make philosophy sound like a couples therapist.

Consider, for example, that relativity theory seems to tell us that time doesn’t flow, that all of space-time is laid out in a frozen all-at-once-ness, with the distinctions among past, present, and future "an illusion," in the words of Einstein, "albeit a persistent one." How can such a view of time be reconciled with perhaps the most conspicuous aspect of our manifest image, implicated in almost every emotion we have—our regret and nostalgia for the past, our hopes and terrors for the future? One can’t revamp our notion of physical time without disturbing our conception of the very things we are.

And there is the scientific image of us-in-the-world elaborated by neuroscience, one in which I am a brain consisting of a hundred billion neurons, connected by a hundred trillion synapses, and this brain itself hasn’t a clue as to what’s going on among those synapses. How can this be reconciled with the manifest image of me as me, pursuing my life, remembering it and planning for it, singularly committed to its persistence and flourishing? How can the neuron-level view be reconciled with the manifest truth that at some level our brains undeniably think about things? Where’s the aboutness to be found among those neurons and synapses? And is the scientific image even coherent if we can’t assert that we think about that scientific image, and that in thinking about it, we are thinking about the world?

Once again we come up against the codependence of the scientific and manifest images, even as they sit on the couch with arms folded self-protectively across their chests and resentful ungivingness in their glares, while philosophy, charged with bringing them together, recognizes their mutual needs. As science progresses, philosophy’s work of increasing our overall coherence progresses in tandem. In fact, the scientific image couldn’t even coherently claim for itself its expansionist triumphs without helping itself to philosophers’ work—to explicate what is essential to scientific methodology and why it is uniquely effective, to argue why it offers an image of reality and not just one more social construction.

Sellars is right that philosophy is best viewed neither as inward-expressing literature (in which case give me poetry over philosophy) nor as failed science (in which case give me physics over philosophy), but as the systematic attempt to increase our overall coherence. Still, his conception is too narrow. Philosophy does indeed always involve our manifest image, but it needn’t always involve the scientific image. In particular, some of philosophy’s most significant progress has proceeded independently of science, and here the work of increasing our moral coherence is particularly important. And this is philosophical work that hasn’t kept itself locked away in the Academy, which was where Plato chose to pursue philosophy, but has made itself felt in the agora, where a barefoot Socrates wandered among his fellow citizens, trying to get them to feel the point of his questions so that they might begin to make moral progress.

As living organisms we are primed, unthinkingly, to do all we can to thrive; to be more precise, we are primed, unthinkingly, to do all we can to increase the probability that copies of our genes will survive. But our manifest image of us-in-the-world compels us to give reasons for our actions, and this activity, though undoubtedly compromised by the unthinking processes that science has recently brought to light, proceeds on its own terms. Indeed, the fact that it proceeds on its own terms is part of the manifest image of us-in-the-world. The reasons we are prepared to give to ourselves and one another in accounting for our behavior make no mention of the machinations of the selfish gene. Such reasons would never wash, not even if you’re Richard Dawkins. On the contrary, coherence work of the moral kind pushes in the direction of less influence by those unthinking processes and the presumptions they spawn—all variations on "me and my kind are worth more than you and your kind."

Gregarious creatures that we are, our framework of making ourselves coherent to ourselves commits us to making ourselves coherent to others. Having reasons means being prepared to share them—though not necessarily with everyone. The progress in our moral reasoning has worked to widen both the kinds of reasons we offer and the group to whom we offer them. There can’t be a widening of the reasons we give in justifying our actions without a corresponding widening of the audience to which we’re prepared to give our reasons. Plato gave arguments for why Greeks, under the pressures of war, couldn’t treat other Greeks in abominable ways, pillaging and razing their cities and taking the vanquished as slaves. But his reasons didn’t, in principle, generalize to non-Greeks, which is tantamount to denying that non-Greeks were owed any reasons. Every increase in our moral coherence—recognizing the rights of the enslaved, the colonized, the impoverished, the imprisoned, women, children, LGBTs, the handicapped ...—is simultaneously an expansion of those to whom we are prepared to offer reasons accounting for our behavior. The reasons by which we make our behavior coherent to ourselves change together with our view of who has reasons coming to them.

And this is progress, progress in increasing our coherence, which is philosophy’s special domain. In the case of manumission, women’s rights, children’s rights, gay rights, criminals’ rights, animal rights, the abolition of cruel and unusual punishment, the conduct of war—in fact, almost every progressive movement one can name—it was reasoned argument that first laid out the incoherence, demonstrating that the same logic underlying reasons to which we were already committed applied in a wider context. The project of rendering ourselves less inconsistent, initiated by the ancient Greeks, has left those ancient Greeks, even the best and brightest of them, far behind, just as our science has left their scientists far behind.

This kind of progress, unlike scientific progress, tends to erase its own tracks as it is integrated into our manifest image and so becomes subsumed in the framework by which we conceive of ourselves. We no longer see the argumentative work it took for this advance in morality to be achieved. Its invisibility takes the measure of the achievement.

I’ve imagined Plato shuddering at a certain conception of the field he helped to shape. Would he likewise shudder at having been left so far behind by that field? I think not. If he was committed to any philosophical position, he was committed to the assertion that philosophy is progress-making. Think of the Myth of the Cave, which could be subtitled "A Philosophical Pilgrim’s Progress." Even excluding the science-directed philosophy of Sellars’s analysis, which couldn’t be accomplished in advance of scientific progress, still the task of rendering us more coherently integrated was too much for any man, for any generation, for any millennium. Our conceptual schemes are fragmented for reasons that run deep in our psyches, having nothing to do with the reasons that function in our manifest image of us-in-the-world, powered instead by unthinking strategies (strategies that science is beginning to illuminate).

No wonder progress, though real, is laborious. But given enough time and talent—and maybe even a bit of funding—our descendants will look back at us and wonder why we stopped short of the greater coherence they will have achieved.