Friday, April 30, 2010

Should have "outsourced"

"AERB to probe presence of radioactive material on campus"


Urvashi Sarkar

May 1st, 2010

The Hindu

Two days after the source of radioactive material that killed one and left several others ill was traced to Delhi University's Chemistry Department, Vice-Chancellor Deepak Pental on Friday said the Atomic Energy Regulatory Board team had been told to investigate all claims pertaining to the presence of radioactive material on the campus.

Chemistry Department professor Ramesh Chandra has claimed that over 20 kg of radioactive material was dumped in the Department of Physics in 1986 without following the regulations of the AERB and the Bhabha Atomic Research Centre. He also claimed that in the Botany Department, too, radioactive material was thrown into the garbage.

Admitting to complacency by the university, the Vice-Chancellor said the report “may or may not be true.”

“We must get rid of this complacency. Where were these people who are now making these claims before the latest incident? If they are so conscious about the environment, public health and welfare of the university, why did they not come forward earlier?” the Vice-Chancellor asked.

Another Delhi University professor has also claimed that some three years ago a great quantity of old material and equipment from the Physics Department was disposed of without following safety procedures.
3-member team

The university on Thursday announced the constitution of a three-member team to find out how radioactive material was allowed to leave the Chemistry Department. Asked when the committee would submit its findings, Professor Pental said: “The AERB, which is also conducting a full-fledged enquiry, will move much faster and our committee might take some time.”

Regarding certain cobalt pencils which have reportedly not been traced, the Vice-Chancellor said: “The AERB is in charge of it and will look into it. If experts want to come in again to conduct further investigations, we will cooperate with them.”

The AERB had issued a show-cause notice to Delhi University, asking it to explain the violations within two weeks. It also directed the university to suspend all activities involving the use of radiation sources.

Addressing concerns that students enrolled in courses involving use of radioactive material in experiments could be affected in terms of course schedule and examinations, Professor Pental said: “I do not think so. In case there are problems, the agencies concerned would be spoken to.”

B.C. Chaudhary of M.Sc. Previous Nuclear Physics said: “The M.Sc. Previous Nuclear Physics first floor lab has not been affected. It is a laboratory for continuous evaluation. None of the students of the 2009-2010 batch would be affected. If matters are resolved by July 21 when the university reopens after the summer vacations, there will not be any problem.”

Practical laboratory examinations of M.Sc. Final Nuclear Physics and M. Tech. in Nuclear Science and Technology too have finished.

Various departments such as Physics and Chemistry have reportedly been asked to submit details of radioactive sources as part of a report for the AERB.

Not movie star autographs but just as valuable

"logbook: nobel meeting"


Andrea Mustain




Toward the end of June 1962, a virtual pantheon of modern physics descended on a tiny island just off the shores of Lake Constance, in Germany’s rolling Bavarian countryside. They’d come from around the globe for the 12th annual Nobel Laureates Meeting at Lindau. Laureates and select students gathered for a collegial week of lectures, informal meetings, and socializing.

Friedrich Katscher, a Viennese physicist and journalist, covered the conference for Austrian newspapers and radio. He says it was a giddy experience for a science lover: “These were the really great people of the world, and many of them were meeting each other for the first time.”

Halfway through the week, the students rallied for a torch-lit procession through the town as a show of appreciation for the great scientists’ leadership and dedication to teaching. (Paul Dirac, overcome, joined right in.) Then physicists and students together sang “Gaudeamus,” the academic hymn: “Let us rejoice therefore, while we are young. After a pleasant youth, after a troubling old age, the earth will have us.”

Katscher seized the opportunity to get an autograph from almost every scientist present. The list could transform even the most stoic physics buff into a swooning groupie: Niels Bohr, James Franck, Lise Meitner, Werner Heisenberg, Max Born, Gustav Hertz, among many others. Katscher says they readily signed for him, and were supremely gracious to press and students alike.

Just months later, Niels Bohr was dead. None of those whose names are scrawled on the yellowing page are still alive. The last to survive, Edwin Salpeter, died in 2008. Katscher says he feels lucky to have been there: “It was one of the last possibilities to see these people together.”

Left column...

Patrick Maynard Stuart Blackett [Wikipedia]

Cecil Frank Powell [Wikipedia]

Edwin Ernest Salpeter [Wikipedia]

Otto Robert Frisch [Wikipedia]

Right column...

James Franck [Wikipedia]

John Douglas Cockcroft [Wikipedia]

Edward Victor Appleton [Wikipedia]

Max Born [Wikipedia]

Paul Adrien Maurice Dirac [Wikipedia]

Otto Hahn [Wikipedia]

Lise Meitner [Wikipedia]

Niels Henrik David Bohr [Wikipedia]

Robert Hofstadter [Wikipedia]

Harold Clayton Urey [Wikipedia]

Emilio Gino Segrè [Wikipedia]

Werner Heisenberg [Wikipedia]

Gustav Ludwig Hertz [Wikipedia]

Glenn Theodore Seaborg [Wikipedia]

Napoleon, Pierre-François-Xavier Bouchard, Rosetta stone

Pierre-François-Xavier Bouchard
April 29th, 1771 to 1832

Yesterday's birthday.

Bill Ashworth in the Linda Hall Library Newsletter wrote...

Pierre-François-Xavier Bouchard, a lieutenant and engineer in the French army, was born Apr. 29, 1771. Bouchard was one of many scientifically-trained officers that Napoleon took along when he invaded Egypt in the summer of 1798, but Bouchard distinguished himself on July 19, 1799, when he found part of an ancient Greek stele built into a wall in the Egyptian port city of Rosetta. The "stone from Rosetta" was one of the great archeological finds of the expedition, with its Greek, demotic, and hieroglyphic inscriptions, and it was immediately recognized as a possible key to the decipherment of hieroglyphic writing. The inscriptions were carefully copied by the members of the Institute of Egypt, primarily by taking prints directly from the stone, and casts were made as well, which was a good thing, because when the French surrendered to the British in 1801, the Rosetta stone, and many other antiquities, were surrendered as well. Which is why, if you want to see the Rosetta stone, you must go to the British Museum, and not the Louvre. Ironically, it was not an Englishman, but a Frenchman, Jean-Francois Champollion, working from the copies, who finally deciphered hieroglyphics in 1822.

Francis Bacon, Edward de Vere, Shakespeare--whom?

Francis Bacon

Edward de Vere

William Shakespeare

These scholars did not earn degrees in HVAC or paralegal studies, and one has to admire their scholarship...constantly changing. Who was Shakespeare? Did he really write those plays? Maybe it was Francis Bacon, or Edward de Vere, the 17th Earl of Oxford.

"Shakespeare: The Question of Authorship"


Jeremy McCarter

April 26th, 2010

The New York Times

Shakespeare is not only peculiar in himself, but the cause of peculiarity in others. The surviving traces of his life, which the Shakespearean scholar Stephen Greenblatt describes as “abundant but thin,” depict a man whose parts aren’t entirely in sync: a provincial who grew wealthy but sued for paltry sums, a literary genius who seems never to have written a letter — or owned a book. But the alternate histories offered by people who reject Shakespeare’s authorship are far stranger, abounding in secret ciphers, baroque conspiracies and readings of the plays as fantastical as what’s in them. Barring the discovery of a doorstop-size autobiography or the invention of a time machine, we’ll never get a really satisfying explanation of how “Hamlet” and “Henry V” and all the rest were written, only varying degrees of improbability.

Five years ago, James Shapiro wrote “A Year in the Life of William Shakespeare: 1599,” a meticulous study that rendered a slice of the standard history less implausible. Now, in “Contested Will,” he addresses the authorship question itself. His refreshing method is to zoom all the way out, taking an interest “not in what people think — which has been stated again and again in unambiguous terms — so much as why they think it.” Working its way back to the earliest doubters, Shapiro’s book offers both history and historiography, a mix that yields insights even for those who don’t know their “Othello” from their “Pericles.”

Shapiro, a professor of English and comparative literature at Columbia University, uses the fight over Shakespeare’s identity to show how our views of the past are shaped by the contingencies of the evidence that reaches us, and how we’re swayed by the changing spiritual weather of our own time. Though dozens of alternate authors have been proposed over the years — four more while he worked on the book, he writes — he concentrates here on what he calls the two “best-documented and most consequential” candidacies: those of the philosopher and courtier Francis Bacon and Edward de Vere, the 17th Earl of Oxford. The shifts in their reputations over the last 150 years have been sufficiently extreme to think of them as the reverse of Ben Jonson’s famous praise of Shakespeare: they were not for all time, but of an age.

In Shapiro’s account, the mischief of the authorship controversy began with a kind of scholarly original sin. For a new edition of Shakespeare’s writing in 1790, Edmond Malone tried to put the plays in chronological order. He ransacked the texts for any fleeting corollary to Shakespeare’s life and times, an approach that Shapiro equates with having “carelessly left open a fire door.” Once you assume that Shakespeare could write only about things he experienced firsthand, the absence of certain pursuits from his spotty biographical record — falconry and seamanship, for instance — seems to disqualify him as the author.

Two changes in the 19th century brought a mob swarming through that door. New scholarship dared to challenge the sacrosanct authenticity of Homer and the Gospels. Soon afterward, a spate of popular biographies conveyed to a wide audience the scant facts of Shakespeare’s life — largely derived from surviving financial records and legal proceedings — without making clear that it would be strange to see much else survive from the 16th century. Among those bothered by the gap between the extraordinary plays and the rather ordinary life was a brilliant, troubled American, Delia Bacon. She refused to accept that a “stupid, illiterate, third-rate play-actor” could have written works of such “superhuman genius.” Her innovation was to seek the author’s identity in the plays themselves, sketching the “Wanted” poster that skeptics use to this day: a set of qualities that Shapiro distills to “pure motives, good breeding, foreign travel, the best of educations and the scent of the court.” For her, this described the polymorphous genius of the English Renaissance: Francis Bacon (no apparent relation to her), who she claimed led a group of politicians-turned-writers who worked jointly on the plays.

To figure out why people bought into this dubious theory, Shapiro uses a technique that could, in the hands of someone less committed to treating all sides fairly, be an instrument of vicious satire: he turns the skeptics’ arguments against them. When he applies to Delia Bacon’s work the kind of close reading that led her to conclude that frustration and lack of prospects forced Francis Bacon to take up his quill, he finds her own frustration and lack of prospects talking. (She had been thwarted in a playwriting career and shamed by a love affair gone wrong; she would spend her final years in an asylum.) Shapiro finds a similar form of self-revelation in the Baconian advocacy of Mark Twain, who was, after all, a writer with a taste for pseudonymous autobiographical fiction.

By the end of the 19th century, Shapiro writes, the Baconian case had begun to fade, partly because its claims weren’t holding up, but also because of the changing zeitgeist: “Philosophy and politics were out, Oedipal desires and mourning for dead fathers in.” No more cerebral, distant Prospero; the 20th century wanted a conflicted, disenfranchised Hamlet. In 1920, J. T. Looney, a schoolmaster from England, gave it one: Edward de Vere. Born in 1550 (14 years before Shakespeare), the Earl of Oxford spent his life around the court, wrote poems of his own, supported other artists and traveled extensively in Italy, where many of the plays were set. In fact, on a return trip from the Continent, his ship was attacked by pirates, prefiguring the oddest incident in “Hamlet.”

In its most aggressive form, the Oxford theory doesn’t just reassign custody of the plays, Shapiro says; it attempts “to rewrite . . . both the political and literary histories of England.” Charles Beauclerk’s new book, “Shakespeare’s Lost Kingdom,” takes the theory to its extreme. It maintains that Oxford wasn’t just the author of the plays, he was the secret son of Elizabeth I, the product of an incestuous encounter between the 14-year-old future queen and her uncle. Furthermore, when Oxford grew up, he slept with his mother, who bore him a son/grandson, Henry Wriothesley, the third Earl of Southampton — Shakespeare’s patron and the possible “Fair Youth” of the sonnets. (A curious little frisson: Beauclerk, who has also written a biography of Nell Gwyn, cites Oxford as a direct ancestor. Claiming descent from the Tudors, as they are depicted here, is like asking to be adopted into the House of Atreus.)

To those who aren’t related to him, the appeal of Oxford is that he offers what people think they should be looking for when they go looking for Shakespeare. According to Shapiro, Looney was a Positivist who despised modernity and saw in Oxford an aristocratic champion of feudal virtues. Looney’s most famous supporter, Sigmund Freud, thought that the early death of Oxford’s father and the dismaying remarriage of his mother (assuming she wasn’t the queen, of course) offered the Oedipal basis he needed for “Hamlet.” The hitch in these interpretations is that Edward de Vere didn’t write “Hamlet”; a glover’s son from Stratford did.

Shapiro makes a strong case for William Shakespeare, but he’s more interesting when he argues that we should move beyond this controversy altogether. “Perhaps it’s time to shift our attention from debating who wrote Shakespeare’s works to whether it’s possible to discover the author’s emotional, sexual and religious life through them,” he writes. Blame doesn’t fall only on the skeptics: Shapiro makes the provocative charge that mainstream historians are also guilty of conflating what the characters say and do and what their creator said and did. He cops to doing it himself, and says Greenblatt did it in “Will in the World.” This foraging for autobiography may be popular, but it does violence to Shakespeare. It diminishes what Shapiro calls “the very thing that makes him so exceptional: his imagination.”

I put down Shapiro’s book thinking he sounded awfully quixotic; it’s hard to believe that any study of Shakespeare written in a memoir-crazy era like ours could be as scrupulously modest as he desires and still be worth reading. Then I noticed that such a book had just arrived.

“Prefaces to Shakespeare” is a collection of the essays that the Cambridge professor Tony Tanner wrote to accompany the plays for the Everyman’s Library series. Tanner, who died in 1998, maintains an easy, book-club tone, at once gentle and generous. Though some essays probe more deeply than others (he’s sharpest on the comedies), he’s always sensitive to how the themes of change and regeneration recur. And at almost every juncture, he resists the temptation to speculate out of hand. The origins of Falstaff? “Unknowable.” Shakespeare’s feelings on female sexuality? “Manifestly imponderable.” His motivation for writing “Henry VIII”? “Simply beyond the reach of informed conjecture.” This approach leads him astray now and then — he decides to treat “Pericles” as entirely Shakespeare’s, though we now know the play was a collaboration — but it pays a sweet dividend.

Consider “A Midsummer Night’s Dream.” For a stalwart Oxfordian like Beauclerk, this play, like many others, is an elaborate allegory about de Vere’s frustrations and Elizabeth’s schemes. Bottom’s vision after his enchanted night with Titania “is nothing as banal as the dreamlike memory of an ass’s head,” Beauclerk writes; “it is the specter of the crown which the fairy queen’s love for him seemed to portend.” Tanner, by contrast, says that what happens between Titania and Bottom when they leave the stage is “a vital blank which we never can fill in.” Such mysteries are one reason that he felt no “more magical play has ever been written,” and that so many of us go on feeling the same. Sometimes an ass’s head is just an ass’s head.

[Jeremy McCarter writes for Newsweek and is the editor of “Bite the Hand That Feeds You: Essays and Provocations,” by Henry Fairlie.]

The "Catalyst"...scifi story by John Gilbey



John Gilbey

April 10th



You've got to understand that all this happened a long time ago, and I reckon that with the monitoring we have in place now we'd have picked up on the event much sooner. But even if it recurred today, would we have any idea what was causing it? Well, I'll leave that for others to judge.

The Bay Area was still trying to tidy itself up after the tsunami of 2018, so a lot of folk were in temporary housing and doing jobs they wouldn't normally do— which is why a senior project director was delegated to pick up the new postdoc from the Caltrain station. Oakland was still covered by mud and debris, San Francisco International remained under military jurisdiction—running relief and logistics flights—so if you wanted to fly in from Europe it was San Jose or nothing, then take the short hop by rail.

Nobody else was there when they met, but something obviously clicked straight away. Bob wasn't known as a great talker, and his sense of humor was widely thought to have been amputated in some freak accident, but even Tony at the Guard House noticed the warm glow he was exuding as he waved his pass on the way into the site. Tony reckoned he looked like someone who'd "wandered into the beam"—a comment that scandalized the entire safety group.

Bob and Fiona were about the same age, mid-thirties at a guess, and both reasonably good looking in an undemanding kind of way. Just regular research folk then, not the kind of people to set the world alight—or so we thought at first. They worked on different projects, but people started noticing that they were ending up co-located a lot of the time, especially on the long evening shifts. Even I, widely regarded as having the social radar of a lump of rock, could tell something was up when I walked into the control room and found them in rapt conversation—eyes locked together, oblivious to the world around them.

If their work had been suffering as a result it might have been a problem, but it was quite the reverse. Alarm bells should have started ringing as soon as I overheard people in the Linear Café talking about the two of them being "lucky" with the experiments they were working on, and looking back at the project records you can see a pattern was clearly emerging.

Things came to a head just after Thanksgiving. As usual, a lot of people were traveling to be with family, so things were fairly quiet. Fiona, being a postdoc, had been kind-of-volunteered to babysit one of the routine engineering jobs. I won't bore you with the details—it was just running a test pattern at low energies, ramping the signal up and down to test some new components—but suffice it to say it didn't challenge her mental abilities.

Nobody noticed the problem in the output files until the next Monday meeting. One of the data-wranglers pointed out what she called an "engineering glitch" and what the engineering chief pointedly referred to as a "typical IT foul-up." As head of operations I kept my mouth shut, knowing it was my problem in either case. After banging heads for a while, we agreed that they were probably both wrong—and I started looking for the real cause.

As the potentially guilty party I settled sourly in one of the seminar rooms, which while not too comfortable at least had a decent imaging system. I dialed up the project and maintenance data, flowing it into a time-stream for the day in question. After the run started, three complete cycles ran smoothly and well within the confidence limits overlain in red by the system. The fourth run was different.

At 22:18:06 UTC-8 the reactive efficiency suddenly jumped from a satisfactory, but not breathtaking, 27 percent up to an unheard-of 79 percent—something that the theorists had pointed out should be impossible without divine intervention. Panicking slightly, I superimposed my systems logs on the traces. Nothing: no parameter changes, barely a mouse click, had happened over the whole period; the control streams were stable and not even the climate system showed any variation from nominal.

Running out of ideas, I punched up the video feed from the control room surveillance system and spooled it along to the time of the glitch. At 22:16:12 Bob entered the room and sat down next to Fiona at the console. She took off her headset and turned to him. They talked, slowly moving closer together. Finally, even inevitably, at 22:18:06 they kissed, and the traces headed skywards.

OK, so what happened? What magic ingredient was at work? Was it the first time they had kissed? I mean, they didn't look that good at it. When I spoke to Bob later that day he was glumly reticent about the whole business—because by that time Fiona had sworn never to speak to him again. I didn't ask him for the details.

This left me in kind of a fix. After trying to forget about it for years I've finally had a stab at writing it up for Nature Physics. I've even got a title I'm almost happy with, "Observations on the impact of interpersonal adult infatuation on quasi-stable high energy systems," but there is no way I'm sending it in yet. You see, I know exactly what the referees will say: Replication.

If this effect is real, then I'm going to need a whole batch of new couples who are all set to fall in love. Not just that, they've got to fall in love in the stark gray confines of the SLAC Main Control Center—which is asking a lot, I think you'll agree. Then there are the other possibilities to consider: Does heredity have an impact? Is it relevant that Fiona is English and has red hair, while Bob is a Texan and bald? My list of potential variables is depressingly long, which is why—after all these years—I'm asking you for help.

Take a look around you in the office, the lab, the coffee room. If you see a couple of people who look like they are about to fall in love, could you drop me an email? I'd love to know if the Bob/Fiona Effect is real, but please—please— don't tell the Secretary what I'm doing.

Copyright: John Gilbey 2009


"John Gilbey tweaks the future"


Olga Kuchment

April 10th



John Gilbey is a writer, photographer, educator, and project manager at Aberystwyth University in Wales. For the past two decades, his fiction and non-fiction stories have appeared in the likes of Nature, New Scientist, and the Guardian.

He wrote the preceding science-fiction story "Catalyst" especially for symmetry, loosely drawn from his visits to the San Francisco Bay Area and SLAC National Accelerator Laboratory in July 2009.

He talks with Symmetry about writing, visiting SLAC, experiencing the spirit of Silicon Valley, and remaining an optimist.

John Gilbey's science-fiction stories often take place in the tangible few decades to come, allowing him to play with scientific ideas while working within the framework of a familiar reality. His first-hand experiences with science and the culture of science inform the stories. As a young scientist he studied grasslands and soils in the cold, damp moorlands of England and Wales. The Welsh climate, he says, pushed him toward the "much more attractive" indoor vocation of computer science.

What was the inspiration for "Catalyst"?

I was in California last July, and was doing some articles about the Bay Area. I was lucky enough to get an interview with SLAC Director Persis Drell, and I spent a very, very enjoyable half-day at the center. And as a thank-you to the people who looked after me, I wanted to write a short story.

As to the actual catalyst for "Catalyst": I spend a lot of my time traveling to different research organizations and universities. And when you walk in as an outsider, you always pick up an aura of how the place operates, how the people work together. The simple things—like what's scribbled on the whiteboards in the labs and the offices—give you a quick picture of how the place fits together. There was something about the feeling in the center that made me think, "These are people that might appreciate a slightly off-beat romance."

How did you decide where to take the story from there?

I had to be quite careful because my ignorance of high-energy physics is profound. By setting the story in the future and giving it a set of experiments which were carefully undefined, I felt a bit safer. I tried to plug in my own experiences with lab work and research, pictured against what might happen when you start looking for a reason behind an event. It's the old line from Hamlet: "There are more things in Heaven and Earth, Horatio, than are dreamt of in your philosophy." There may be whole spectra out there that we can't yet see, feel, or measure.

You wrote another story set in the San Francisco Bay Area, published in Nature, and it also mentions the tsunami of 2018.

It's not that I'm predicting a tsunami in 2018; I just want to reassure the city of that. It's just something to upset the balance of what you expect.

What happens to "Catalyst's" Bob and Fiona is kind of sad.

If I had let them live happily ever after, some analysis would have been possible of the ongoing relationship. Maybe there will be a sequel. I would be interested to hear any feedback you get from the story.

What kind of feedback have you gotten on your other stories?

Very varied. Some people like them, some people think they're too flippant. And other people think they can see themselves in the story, which is something I'm always very careful to avoid. The characters are completely synthetic, but the traits they exhibit are not in the least unusual among academics.

What kind of traits?

Ooh, now that's a really awkward question. Let's just say that academics and scientists are prone to the same human weaknesses as everybody else.

You've written about the stereotypes of scientists in popular culture, most notably in the June 2008 Times Higher Education article titled "Fools, I will destroy you all!"

I was looking at the portrayal of scientists in the media, particularly in film. My concern was that as I talked to people who weren't active in the science community, they were more aware of scientists through fictional portrayals than they were of real scientific characters. For example, if you ask someone who's a famous scientist, you get the reaction, "Dr. Strangelove."

It's always an evil stereotype that people seem to pick. If you Google for an image of "scientist" and look at the first 100 hits, a huge number of them are of a lab-coated crazy person with thick glasses, wild hair, doing something very, very dangerous with flasks of mysterious chemicals. It seemed to me something that's getting ingrained in the culture more and more. It's unfortunate.

What I try to expose through the stories is the humanity of the science community. I try to encourage the view of scientists as being just like everybody else.

What do you think is the real value of science to society?

Inestimable. I don't think you can overstate the case for science in digging our way out of the holes we've dug ourselves into as a society.

I'm part of the generation that read Isaac Asimov and Arthur C. Clarke as science-fiction authors. And they were both confident in technological fixes to human situations. That's something that's decreased over time. Doomsday hypotheses have lately been much more about science causing problems rather than science solving them, which I think is a great loss.

What do you say to people who distrust science's ability to dig us out of the holes we've dug for ourselves?

Look at the track record. Look at the developments over the last 20 years that have made huge progress in medicine, in agriculture, in improving the human condition. Whether those improvements have been implemented becomes a society issue, a political issue, but the tools are routinely being generated. And how they are managed is going to be more and more important. That's one reason why we should have a scientifically educated population, so people can make their own judgments of how technologies ought to be applied to society.

As we apply these technologies, how do you think the world is going to change?

I like to think for the better. But I'm an optimist. If you lose faith in humanity's ability to look forward positively, there is little left for society.

I have a dangerous fascination with California, I have to say. I was in Mountain View last July on a Thursday night, and all the restaurants were open late. There were thousands of people strolling around and a band was playing in the middle of the street. And at one little table there were three or four people in their early 20s huddled around a laptop putting a presentation together. And it was obvious that the next day was their big pitch, their one big shot with this development they were planning. They were completely oblivious to everything around them. That sort of energy just permeates the whole environment. That's absolutely fascinating.

In some parts of Europe there is not that culture. If you talk to people in California and set them a problem, they will grab a few people, sit around the table, and say, "How can we do this?" rather than "Why can't we do this?" It's a mindset that I think is hugely beneficial to society in general and science in particular.

How do you think science will change in our lifetimes?

I gave a talk on that with a bunch of science communicators. We were in the lecture theater where Faraday lectured all those years ago. You could feel the ghosts creeping around you.

I was speculating about the process of doing science in 50 years' time. My speculations are that science will become a far more nomadic activity, even more so than it already is. I suspect that more and more, there will be centers like SLAC that can carry out the physical end of research. But more of the research activity will be done virtually. It could be a way for science to get closer to society again. But that's the science-fiction writer in me talking.

How did you become a science-fiction writer?

I am not sure I've got an answer to that. I've always read science fiction, since I was very young. It's a way of talking about science in a way where the people you respect know that you're not actually referring to them, that you're talking conceptually.

And of course you can take huge liberties with your source material. You can make any assumptions you like in the future. You can invent a whole raft of technologies and breakthroughs that may well happen, and nobody can actually say, "That's impossible." Because as soon as someone says, "No, no, that's impossible," it'll spring up.

What ideas do you most like to play with, in your stories?

By setting stories in the future, I'm trying to allow a scrutiny of possible futures. In many cases, they are futures we may wish to avoid. Exposing them to the light will maybe make people think a little harder on how we go about life.

That sounds terribly grand. All my stories are intended to be humorous on the surface, but they all have a dark hidden message.

How does that mesh with being an optimist?

It's much easier to be optimistic if you know that people are thinking about the bad things that could happen. If people are walking off into an unknown future but not thinking about what could go wrong, then it's difficult to be optimistic.

One of my careers has been project management and quality management. These are people who don't like surprises. They want to understand risk and manage it rather than allow things to happen. My science fiction stories are a sort of project management of the future.

College of Staten Island and bucks for names

Okay, a worthy idea, but is it official?

"College of Staten Island astronomy professor who discovered planetoids puts names up for sale"


Clem Richardson

April 30th, 2010


Astronomy Prof. Irving Robbins is offering the astronomical bargain of the decade.

A sizable donation to the College of Staten Island's astronomy program could earn you the right to name not only the school's tiny observatory but also - not one but two - planetoids Robbins was recently credited with discovering.

"Could" is the operative word since Robbins is also considering naming the planetoids - 2009 VH24 and 2009 XNO7 - after his sons, Isen, Davy or Zeav, his girlfriend, Dana, or his ex-wife of 37 years, Freda.

"I found these things, so I get to name them," Robbins said. "Somebody wants to give some money toward a new observatory, we can talk."

Robbins, 69, has taught at CSI for 42 years and started the astrophysics program there.

It's a popular course, attracting about 800 students in the academic year despite the fact that Robbins has "toughened it up a bit lately by including more math."

As director of the college's Astrophysical Observatory, Robbins, his assistants and students have tracked not only the planetoids he will name but also many Potentially Hazardous Asteroids (PHA) and Near Earth Objects (NEO) whose orbits may eventually cross the Earth's celestial path.

Robbins said the hunt is personal.

"I really believe that there is a good possibility that we're going to get hit" by an errant asteroid, Robbins said. "If an object comes near the Earth, the question is always if it will hit us."

In 1988, Robbins noted, when the Harvard University-based Minor Planet Center first began tracking threatening celestial objects, there were 25 known objects that came close to the Earth.

"Now there are over 2,000," he said. "Some of them get very close and are potentially hazardous. This is an important problem."

Great teachers make complicated things simple. Here is how Robbins explained his discoveries.

Using government grants, Robbins buys time at $200 an hour to remotely control ground-based telescopes located in the Arizona mountains.

Using advanced software and digital cameras, Robbins positions the telescopes on the part of space he is interested in and then snaps a series of pictures over a set interval.

"Say we get three pictures," he said. "Then we get the computer to put the pictures together and flash them, or make them blink.

"Stars don't move. Stars [appear to] move because the Earth turns, but we're moving with the stars. But the asteroid is moving amongst the stars.

"So when we flash the images, or blink them, we see the asteroid [a tiny white dot among hundreds of white dots] jumping along."
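Robbins' blink comparison boils down to subtracting aligned exposures: fixed stars land on the same pixels in both frames and cancel out, while a moving asteroid leaves a residue at its old and new positions. Here is a toy sketch of that idea; the `blink_detect` helper and the tiny synthetic star field are invented for illustration and are not the observatory's actual software.

```python
def blink_detect(frame_a, frame_b, threshold=0.5):
    """Return coordinates whose brightness changed between two aligned frames.

    Fixed stars fall on the same pixels in both exposures and cancel in
    the difference; a moving asteroid leaves a bright spot at its old
    position in one frame and its new position in the other.
    """
    moved = []
    for y, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                moved.append((y, x))
    return moved

# Synthetic 8x8 "sky": three fixed stars plus one moving asteroid.
frame1 = [[0.0] * 8 for _ in range(8)]
frame2 = [[0.0] * 8 for _ in range(8)]
for y, x in [(1, 1), (3, 6), (6, 2)]:   # stars: identical in both frames
    frame1[y][x] = frame2[y][x] = 1.0
frame1[4][4] = 1.0                      # asteroid in the first exposure
frame2[4][5] = 1.0                      # ...one pixel over in the second

print(blink_detect(frame1, frame2))     # [(4, 4), (4, 5)]
```

The three stars vanish in the difference; only the asteroid's two positions survive, which is exactly the "jumping dot" Robbins describes.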

Ethical advertising--name a star

John Montagu and ham on rye

John Montagu
November 13th, 1718 to April 30th, 1792

John Montagu who? The 4th Earl of Sandwich...do you get the connection? John Montagu was a "British politician, inventor and explorer, for whom the sandwich is named, which it is said he invented in 1762. A story offered in Grosley's Tour to London holds that Sandwich often spent excessive time gambling and, not wanting to get up from the gambling table, told his servants to bring him meat between two slices of bread. Although this story is often quoted, it appears to be without support. Rodger, Sandwich's biographer, suggests the original form, using salt beef, was more likely invented to eat while working at his desk, where he spent long hours. Captain Cook named the Sandwich Islands (Hawaii) for him. As First Lord of the Admiralty (1771-82) during the American Revolution, he was held responsible for the navy's disastrous unpreparedness for war."

John Montagu [Wikipedia]

Biodiversity...not a rosy picture

International Year of Biodiversity

"Despite Global Action, Biodiversity Is Declining"


Bryan Walsh

April 29th, 2010


In 2002, environment ministers from around the world gathered in The Hague for a major summit on the Convention on Biological Diversity — an international treaty designed to protect the world's plants, forests and wildlife. With rainforests being clear-cut in tropical countries, endangered species nearing extinction around the world, and the seas steadily being fished out, the ministers agreed it was time to take action. In a declaration, they vowed to "strengthen our efforts to put in place measures to halt biodiversity loss, which is taking place at an alarming rate," by the year 2010.

At the summit's conclusion, its Dutch leader, Geke Faber, said it had "helped move us from policy development to implementation, from dialogue to action."

Fast forward to 2010, the International Year of Biodiversity — things aren't getting better. In fact, for wildlife around the world, they're getting much, much worse. In a study published Thursday in Science, a team of scientists and environmentalists from around the world assessed the state of global biodiversity and found that it has been in steady decline. Gauged by the number of endangered species on the International Union for Conservation of Nature's Red List or the waning condition of coral reefs or humans' increasing consumption of the planet's ecological assets, the state of the Earth is worsening. "Our analysis suggests that biodiversity has continued to decline over the past four decades, with most state indicators showing negative trends," the Science authors write. In other words, the promises of 2002 have gone unfulfilled.

The study compiled more than 30 different indicators of biodiversity, including any changes in species' population numbers and the extent of preserved habitat. In nearly every category, the news reveals biological depression: since 1970, the world's animal population has decreased by 30%, mangroves and sea grasses have shrunk in area by 20%, and live-coral coverage has fallen by 40%. Despite our best intentions, we are leaving the planet poorer and less diverse than we found it. "The state of biodiversity is definitely showing a rapid decline," says Matt Foster, director of conservation outcomes for Conservation International and one of the lead authors on the Science paper. "And the pressure just keeps increasing."

Ironically, even while biodiversity has deteriorated, the study indicates that policy responses to endangered species and habitat loss have actually improved. The amount of protected land has steadily increased around the world, as has the area of sustainably managed forests. Meanwhile, a growing number of countries have signed onto global pacts designed to limit the spread of invasive species, and the world is spending more than it ever has on biodiversity aid. And thanks to efforts like the new Science study, researchers are getting an increasingly clear picture of the impact of human activity on Earth.

Yet things are still getting worse — habitats are still being destroyed, and a growing, richer population is taking space and resources away from wildlife. The effect goes beyond endangered animals; human beings rely on a healthy, diverse planet too, and when the Earth suffers, so do we. The Science study found that the populations of vertebrate species used for economic purposes by people had declined 15% since 1970, as have the populations of birds and amphibians used for food and medicine. More than 100 million poor people now live in remote areas with threatened ecosystems, and will be particularly vulnerable to the further degradation of a disturbed planet. "We're trying to send the message that in protecting habitats, we're protecting the well-being of humankind," says Foster. "We all benefit from biodiversity and we all hurt when it's lost."

As human population continues to grow — and global warming takes its own toll — the rest of the planet is likely to suffer. We're living on a crowded, hotter Earth, and unless we take strong, collective actions, someday soon we may not even be able to recognize our own planet. We've already lost our chance to meet the promise of 2010 — and we don't have much time left to turn things around.

Noah's Ark...back in the news again

And another quest to discover religious artifacts.

"Has Noah's Ark Been Discovered in Turkey?"


Ishaan Tharoor

April 29th, 2010


To a score of marching drums and pipes, we see the expedition trudge across a snowy expanse and up the mountain. They camp on a hilly bluff, the sun setting over the Anatolian hinterland below. Moments later, we go inside a dark cave and watch members of the expedition inspect what appears to be a solid wooden wall, entombed within layers of glacial ice and volcanic rock. A gnarled beam runs suspended from one part of the cavern to another. There's straw and bits of old rope on the ground; a structure is taking shape. What is it? According to the explorers, it's Noah's Ark, literally frozen in time.

This is the footage of the alleged discovery of the biblical vessel, perched more than 12,000 ft (4,000 m) high on Mount Ararat in eastern Turkey, that was first shown to journalists on April 25 at a press conference in a fancy boutique hotel in Hong Kong. On hand were members of the team, composed largely of Hong Kong–based Evangelicals, an art historian and a handful of Turkish academics and government officials. They displayed specimens of objects recovered from the supposed ark, which they say they encountered in seven dismembered compartments within the mountain: on show are pieces of petrified wood allegedly carbon-dated at 4,800 years old, a chunk of crystal and a cluster of seed-like pellets. "There is a tremendous amount of evidence that this structure is the ark of Noah," said Gerrit Aalten, a Dutch researcher of ark lore who was enlisted to evaluate the team's findings.

Reported sightings of the ark are almost as old as the biblical story itself. In the Book of Genesis, God set about annihilating a "corrupt" and "violent" world once Noah, whom he seemed to like, built a wooden vessel to hold his family and two of "all living creatures, from all flesh." As the floodwaters abated, the barque came to rest by "the mountains of Ararat." Since then, ancient Roman scholars, medieval travelers and Ottoman soldiers have all supposedly spotted this mythical ship amid the region's peaks. So too have a host of Christian explorers — most recently in 2006, when a Colorado-based Evangelical and amateur archaeologist claimed to have uncovered the ark, petrified within Iran's Alborz mountain range along the Turkish border. Upon examination of the find, geologists declared it to be just an oddly shaped variety of rock.

The Hong Kong team says it has been far more thorough. Their trip, heralded at the press conference as a "99.9%" success, took place last October. In the intervening months, says Yeung Wing-Cheung, one of the group's leaders, they have verified their findings with the insight of a few Turkish archaeologists and geologists. While they have only released a brief, edited video of their trek and discovery to the public, Yeung says they have shown full footage of the ascent to colleagues in the cult field of Arkeology. (They justify their secrecy by saying it's for the integrity of the site, which the local government in the area now intends to prepare as a tourist destination.)

The team has conducted missions to Mount Ararat since 2003 under the auspices of a Hong Kong–based organization dubbed Noah's Ark Ministries International, which is in turn linked with another well-funded Christian group in the city called Media Evangelism. The latter drew attention in 2008 when it helped set up a park that now houses a life-size replica of the ark, accompanied by models of animals on board as well as a vivid film depicting God's wrath at a sinful mankind and the flood he sent to wash it away. Yeung talks of the ark's discovery in almost apocalyptic terms: "At this day, at this moment, it has a very special meaning when we see so many natural disasters and earthquakes," he tells TIME. "People should come and see the ark and think about their place in the world."

Not surprisingly, there are skeptics. Eric Cline, a prominent biblical archaeologist at George Washington University and author of the best-selling From Eden to Exile: Unraveling Mysteries of the Bible, questions why this group made up mostly of amateurs in the field chose to announce their findings at a press conference rather than have them peer-reviewed and then published in a scholarly journal, as is standard archaeological and scientific practice. "You see these sorts of claims almost every other year," he says. "When people of faith go out looking for things, it seems they almost always find them."

Archaeologists on blogs and forums have suggested the structure up on Mount Ararat may well just be a hut or some other form of rudimentary shelter. Cline also wonders why the ark over time would have been left intact at all. Indeed, according to the 1st century A.D. Jewish-Roman historian Flavius Josephus, the ship was already being torn down. "It is said," he wrote, "that a portion of the vessel still survives ... on the mountains ... and that persons carry off pieces of [it], which they use as talismans."

Nevertheless, Yeung and his colleagues are pressing ahead, hoping to gain the support of UNESCO and spend the next few years deepening their analysis of the site. Cline says this sort of work strays from the real purpose of biblical archaeology, which is to bring to light the greater social realities of that ancient time, rather than prove the truth of Christian doctrine with quests for biblical totems.

It also misses a larger point about the history of the myth. The flood has echoes in legends from Central America to South Asia, and it almost certainly predates Judeo-Christian times. Scholars believe it was most likely transmitted to the Israelites from Mesopotamia: in the far older Epic of Gilgamesh, we encounter Utnapishtim, a man chosen by the gods to live alone in a boat full of animals while the world around him ended in a deluge. Just like Noah, as the rains stopped he sent out both a dove and a raven to gauge whether the waters had receded. "That's why I tell my students," says Cline, "that if I am going to look for an ark, it won't be that of Noah. Maybe it would be Utnapishtim's."

71 years ago and promises of the future

"1939’s ‘World of Tomorrow’ Shaped Our Today"


Jon Snyder

April 29th, 2010


The New York World’s Fair of 1939 and 1940 promised visitors they would be looking at the “World of Tomorrow.” Not everything they saw there came true, but plenty was close. One reason for that was the fair’s own lasting influence on American architecture and industrial design.

It was a futuristic city inspired by the pages — and covers — of pulp science fiction: huge geometric shapes, sweeping curves, plenty of glass and chromium, and gleaming white walls. The fair was the last great blossoming of the Streamlined Moderne style of Art Deco. It was also heavily influenced by the still-rising International Style of such architects as Le Corbusier and Ludwig Mies van der Rohe.

What people saw at the fair, they wanted for themselves. And when World War II ended, the American consumer machine began giving them what they wanted, or at least what they thought they wanted, or maybe even what the marketers thought the public thought they wanted.

The fair also influenced sci-fi art, both in print and in the set design of hundreds of movies and TV shows, continuing to shape our collective notions of what tomorrow is all about.

1939 New York World's Fair [Wikipedia]

Thursday, April 29, 2010

Amusing...physicist bumped from conference

"He didn't see that coming, or did he?"


Matthew Reisz

April 29th, 2010

Times Higher Education

An extraordinary spat has broken out after a Nobel prizewinning physicist was "uninvited" from a forthcoming conference because of his interest in the paranormal.

Details of the conference in August for experts in quantum mechanics sounded idyllic. Participants were due to discuss "de Broglie-Bohm theory and beyond" in the Towler Institute, which is housed in a 16th-century monastery in the Tuscan Alps owned by Mike Towler, Royal Society research fellow at Cambridge University's Cavendish Laboratory.

Last week, any veneer of serenity was shattered. Conference organiser Antony Valentini, research associate in the Theoretical Physics Group at Imperial College London, wrote to three participants to say their invitations had been withdrawn.

The physicist and science writer David Peat, biographer of David Bohm (co-founder of de Broglie-Bohm theory), was considered tainted because of his books on "Jungian synchronicity" and "connections between Native American thought and modern physics".

Brian Josephson, head of the Mind-Matter Unification Project at Cambridge, was rejected on the grounds that "one of his principal research interests is the paranormal".

Professor Josephson, who shared the 1973 Nobel Prize for Physics for his work on superconductivity, has long been one of the discipline's more colourful figures.

In 2001, he attracted derision from some of his peers when he discussed telepathy in his contribution to a booklet issued to celebrate the centenary of the Nobel prizes.

Recent developments in quantum theory, theories of information and computation "may lead to an explanation of processes still not understood within conventional science such as telepathy, an area where Britain is at the forefront of research", he wrote.

Speaking this week, Professor Josephson said: "I was keen to attend the conference and would have concentrated on the theoretical ideas and touched on the paranormal as only one aspect. I thought it would be an interesting opportunity for cross-fertilisation."

News of the exclusions led to what Dr Towler described as a "great email storm".

Even spoon-bending psychic Uri Geller joined in, and on 24 April Dr Towler "renewed the invitation" to Dr Peat and Professor Josephson but not to the third rejected participant, American theoretical physicist Jack Sarfatti. Dr Towler claimed Dr Sarfatti had "written something like 100 emails" since his invitation was withdrawn, "many ... suggesting that we are in the pay of the CIA".

Dr Peat agreed to participate while Professor Josephson was considering his position.

It's back...instant Polaroid photos

Polaroid instant film [and camera] are back for consumer appreciation. Actually, it never really went away: the film was still being made for professional purposes in 8 x 10 format. Nevertheless, relive the old days [like the revival of vinyl records] and add to the instant porn of a new generation.

"Polaroid Lives! New Camera Uses Real Instant Film"


Charlie Sorrel

April 29th, 2010


Like a phoenix rising from the flames and gently fading back into view as you pointlessly flap it in the air, Polaroid has returned. And this time, with real instant film, not that awful camera/printer — the Pogo — we saw last year.

The PIC-300 has the familiar snap-and-wait action, spitting a photo from a slot in its top whereupon the internal chemical pack goes to work to develop the image. The camera itself has four exposure settings and an automatic flash built into its ugly, bulbous and toylike exterior, and runs on four AA batteries or a rechargeable li-ion (all included).

The crying shame is that the photos are smaller than the originals, although they do have that classic shape with the fat (chemical-containing) bottom-border. Similar in size to a business card, the print is 2.1 x 3.4 inches (with a 1.8 x 2.4-inch image) versus the old 3.5 x 4.25 (3 x 3.1 image size).

That isn’t a big problem if the colors and feel of the photos are right: The Polaroid print is more of an object in itself than any other kind of photo. The trouble might be the price. The camera is just $90, but the film costs $10 for a 10-exposure pack (ISO 800). A dollar a print was standard for old Polaroids, but this “fun” design camera is clearly aimed at cellphone-toting kids, who get their pictures free. Still, I’m in. I love Polaroid, and I’m sure that the cost-per-print will keep me from wasting too many frames like I do with digital.

Welcome back, Polaroid! Good to see you again, old friend.

Tuesday, April 27, 2010

"The Big Bang", for me, failed

The television show has done well and put money in the bank, but I gave up on the situation comedy, becoming bored with the stereotypes and Leonard's successful capture of Penny. [Remember the electricity between Niles and Daphne in Frasier, where the chase was better than the capture?] The characters became oversexed and inundated in science and comic book jargon. Sorry, Mr. Overbye, but The Big Bang has failed and fizzled.

"Exploring the Complexities of Nerdiness, for Laughs"


Dennis Overbye

April 27th, 2010

The New York Times

Shudders and groans went around the blogs and coffee rooms of the physics world back in the summer of 2007, when CBS announced plans for a new comedy series about a pair of nerdy physicists and their buxom blonde waitress neighbor.

After all, the characters, Sheldon Cooper, a gangly supremely confident theoretical physicist at a place a lot like the California Institute of Technology, who has an IQ of 187 and entered college at 11, and his roommate, Leonard Hofstadter, whose IQ is only slightly less lofty at 173, and who is instantly smitten by the waitress next door, would seem to embody all the stereotypes that scientists have come to hate: physicists are geeky losers, overwhelmingly male and ill at ease outside of the world of Star Trek.

Not to mention their pals Rajesh Koothrappali, who literally cannot speak in the presence of a pretty woman, and Howard Wolowitz, who can’t shut up, and Penny, who works at the Cheesecake Factory and doesn’t seem to know Newton the Isaac from Newton the fig.

Three years later some scientists still say that although the series, “The Big Bang Theory” (Monday nights on CBS), is funny and scientifically accurate, they are put off by it.

“Makes me cringe,” said Bruce Margon, an astrophysicist at the University of California, Santa Cruz, explaining, “The terrible stereotyping of the nerd plus the dumb blond are steps backwards for science literacy.”

But other scientists are lining up for guest slots on the show, which has become one of the highest-rated comedies on television and won many awards. The Nobel laureate George Smoot of the University of California, Berkeley, and the NPR Science Friday host Ira Flatow have appeared on the show.

Lisa Randall, a Harvard particle theorist who has visited the show’s set twice and appeared as an uncredited extra in one scene, said, “I do think the writers are genuinely clever.”

Lawrence Krauss, a cosmologist at Arizona State, and author of “The Physics of Star Trek,” said he had changed his initial dire opinion about the program. “First, because it is funny, and continues to be,” he said. “Second, because the characters have developed softer edges, and one of them has the girl!”

Sheldon and Leonard are not cool, but they have turned out to be lovable.

They were born out of brainstorming sessions Chuck Lorre and Bill Prady, the show’s producers, were having for a new program about a young woman out on her own. Mr. Prady started reminiscing about people he had known during his own days as a computer programmer in New York, like the guy who could do complicated mathematical conversions in his head, but could not figure out the tip in a restaurant, because he did not know how to quantify “service.” Their female character, they decided, should be a bridge into the world of such people, who can speak Klingon but do not know how to ask a woman out on a date. The show would be about the feeling, as Mr. Lorre described it, of “not quite fitting and understanding the rules of the road.”

In the show’s hierarchy of nerddom, Sheldon, played by Jim Parsons, is the king, socially clueless and irritatingly, rigidly rational (his former roommate left the words “Die Sheldon, Die” painted on the walls of his room), while Leonard, played by Johnny Galecki, is more of an everyman, trying to break out of his shell. “Leonard is in the most discomfort, he wants to move through the world,” said Mr. Lorre; Sheldon doesn’t care. Leonard’s efforts to establish and then maintain a romantic relationship with Penny have constituted a major part of the narrative arc of the first three seasons.

How does it feel to be such a freak? During a break from rehearsals recently, Mr. Parsons and Mr. Galecki both said they did not understand any of the scientific dialogue in the show. Mr. Parsons said his last interaction with academic science had been when he flunked a course in meteorology at the University of Houston.

But then again, he added, they did not understand all the pop culture references to comic books and “Star Trek” any better. “What am I trying to say by saying this is the most important thing,” he said.

Mr. Galecki said: “A lot of people thought it would be a show that poked fun at smart people, but it has become a show that defends smart people much more often than that. These guys, as socially inept as they might be, are the type of people that are molding our future as a society.”

Mr. Parsons memorizes his lines by writing them out longhand and says he is astonished when people ask if the actors on the show ever improvise. “To veer away from the scripted dialog is a one-way trip to sudden death,” he said.

Still, problems and questions arise, which is where David Saltzberg, a particle physicist at the University of California, Los Angeles, and the show’s scientific consultant, comes in. Besides supplying the equations that appear on whiteboards in Sheldon and Leonard’s living room, he sometimes advises on the plotting and characters’ scientific predilections.

Dr. Saltzberg, who blogs about his activities on the show, said that many of the people who grouse to him about the show have not seen very much of it. His comments were echoed by Mr. Prady, one of the producers, who rejected the notion that the show stereotypes women. “Far from being a dumb blonde, Penny has demonstrated time and again that she possesses above average intelligence and practical knowledge that often far exceeds that of the guys,” he wrote in an e-mail message.

Indeed a perusal of the first two seasons turns up a wider variety of women in science than you might have thought, including Leslie Winkle, another Caltech physicist who seduces Leonard into a one-night stand and corrects an error on Sheldon’s whiteboard in the process.

The point of the show, Mr. Prady said, is to tell small stories. “We are not doing ‘Lost,’ we’re not doing a complex novel for TV,” he said. “We follow the characters, and let them tell us what they’re going to do next. We’re telling stories about outsiders. We all feel like outsiders. Can you find love? Penny pulls Leonard to the outside world; Sheldon pulls him back.”

Mr. Lorre said that the whole “challenge and joy” of a series like this is character development. “Maybe at the end of the day this will inspire some kids to go into physics,” he added, “just like ‘Cheers’ inspired countless young people to go into bars.”

Monday, April 26, 2010

A soul of similar thoughts

Lobster Man from Mars


I am beginning to read more and more similar thoughts.

"Humans on Mars? Forget it"

It's perhaps time to abandon the goal of sending astronauts to the Red Planet


Simon Ramo

April 26th, 2010

Nearly half a century ago, we sent men to the moon because we had to stop the world from thinking that the Soviet Union, having put a man in orbit, had surpassed the United States in science and technology. When Americans walked on the moon, we were back in first place, with the Russians keeping the lead in ballet, caviar and vodka. So we halted continued moon landings.

On July 20, 1989, President George H.W. Bush announced the Space Exploration Initiative, which called for returning astronauts to the moon, this time to stay, and then on to Mars. The initiative died when Congress decided the cost was too high, but the national goal of putting an American on Mars remained. In 2004, President George W. Bush reiterated that objective.

But is this a worthy goal? It appears increasingly doubtful that an astronaut could accomplish something useful on Mars not already being done by robots at far less cost and with little danger to humans.

The U.S. government and private industry have developed a successful, robust partnership launching unmanned satellites. Satellites are used daily for intelligence and reconnaissance, communications, weather monitoring and many other things. None of these applications is in any way dependent on the humans-in-space program.

Consider the enormity of an effort to send astronauts to Mars. When Mars is closest to Earth, the distance is still about 200 times that between Earth and the moon, which means it would take several months to reach Mars. The amount of food, water, oxygen and other basic supplies necessary for such a journey would require a far larger spacecraft than anything built yet. And it's by no means certain that humans could survive the trip.

The astronauts would be exposed to cosmic radiation and other dangers when in outer space or in the Mars environment for two years. If they could survive, consider the serious psychological ramifications of spending two years in a confined space with little ability to communicate normally with loved ones back home. Even traveling at the speed of light, a radioed comment like "Good morning, how are you?" would not receive a response until many minutes later.
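The delay is easy to put numbers on. A quick back-of-the-envelope calculation, assuming rough round figures of 55 million km for Mars at closest approach and 400 million km at its farthest:

```python
# One-way radio delay to Mars at the speed of light, for rough distances.
C_KM_S = 299_792.458             # speed of light, km/s

def one_way_delay_minutes(distance_km):
    return distance_km / C_KM_S / 60.0

CLOSEST_KM = 55e6                # assumed approximate closest approach
FARTHEST_KM = 400e6              # assumed approximate farthest separation

print(round(one_way_delay_minutes(CLOSEST_KM), 1))   # ~3.1 minutes
print(round(one_way_delay_minutes(FARTHEST_KM), 1))  # ~22.2 minutes
```

So even at best, "Good morning, how are you?" gets its reply roughly six minutes later, and at worst the round trip approaches three quarters of an hour.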

And the physical issues are enormous. Even with vigorous daily exercise, will an astronaut be able to walk on Earth after two years under no gravity? Will the astronaut's digestive system operate properly? What of the heart and other organs? What if there is a medical emergency? Finally, upon arriving on Mars, astronauts would find blood-freezing temperatures (more than 100 degrees below zero Fahrenheit at night, even at the equator) and a suffocating atmosphere of carbon dioxide and no air.

And the logistics are overwhelming, from the massive solar arrays that would be necessary to provide constant electric power to the challenges of resupply and refueling.

The modest International Space Station will have cost about $100 billion by the time it is de-orbited, as planned, in 2016. The price for designing and running the hugely more complicated array of apparatus needed for the Mars mission could easily reach 10 times that figure. When numerous radically new machines must operate together, it is an enormous challenge to attain the failure-free stage. If only one mishap in 100 trips is the acceptable performance, for instance, for a combination of 10 separate machines, then each of those 10 machines must have an even lower failure rate of only one time in 1,000. That would require testing to failure — all the while debugging and redesigning — of a huge amount of apparatus. It is not like finding a fault with an airplane and bringing it back to the engineers to modify it. Americans surely would not tolerate repeated failed trials with loss of lives while we improve designs.
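The failure-rate arithmetic above can be checked directly: if any one of 10 independent machines failing dooms the mission, and each fails once per 1,000 uses, the combined mishap rate comes out to roughly one in 100.

```python
# If each of 10 independent machines fails once per 1,000 uses,
# the chance that at least one of them fails on a given mission is:
per_machine_failure = 1 / 1000
machines = 10

mission_failure = 1 - (1 - per_machine_failure) ** machines
print(mission_failure)   # just under 0.01: about one mishap per 100 trips
```

Put the other way around, holding the whole stack to one mishap in 100 trips forces each subsystem to be about ten times more reliable than that overall target.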

Of course, a Mars landing by an American would create world excitement and admiration for our country, just as our lunar landings did. But if the goal is to raise ourselves up in the world's estimation, there are probably better ways to spend money, such as providing good education for all, speeding up medical research to cure fatal diseases, building plants to desalinate ocean water and boosting clean energy development.

If we don't move forward with manned space exploration, we will always, of course, wonder whether humans might have discovered something phenomenal that robots missed.

Gentry Lee, chief systems engineer at Caltech's Jet Propulsion Laboratory, has put it well: "When there are profound scientific questions that can only be answered by multiple, adaptive interactions with the unknown environment, the intelligence and versatility of a human being might be useful to unravel a very important scientific puzzle. So far, no such cases have emerged in our exploration of the solar system. And even in that special situation, the cost and likelihood of scientific success by sending human beings should be compared to the likely outcome of dispatching a flotilla of robotic spacecraft to the same destination with the same objective."

Some worry that if we cede further exploration of outer space to China or Russia, those countries will become the most respected nations for exploratory initiative and heroism. But should Russia put a cosmonaut on the moon, it will merely have caught up with where America was 40 years ago. And if China tries to send humans to Mars, it is reasonable to guess that it will be bogged down for many years, while our unmanned missions continue to produce valuable research results.

It is conceivable that radical scientific and engineering developments — like the invention of some sort of "safe atom bomb" rocketry to "blow" an astronaut to Mars quickly — might someday alter the possibilities for space travel. But without such scientific revolutions, the costs — both human and economic — are just too great, especially since it is not at all clear that humans can do in space what can be and is already being done by robots.

The U.S. government should consider announcing that placing humans on Mars is no longer our goal. We should be willing even to consider that the entire humans-in-space idea may now be out of date.

[Simon Ramo was the chief scientist and technical director in the creation of the United States' intercontinental ballistic missile system and a co-founder of TRW Inc. He received the National Medal of Science from President Carter and the Presidential Medal of Freedom from President Reagan.]