Thursday, December 31, 2009

Bob Guccione's "Omni"

Remember when Bob Guccione provided this...

and also...


I never cared for either one...Penthouse was an obvious, cheesy imitation of the sophisticated Playboy, and Omni verged on science fiction.

"In 2010, We Will Live on the Moon"

Remembering the giddy futurism of Omni magazine.

by

Paul Collins

December 30th, 2009

Slate

Is a robot shoveling your snow?

I once knew the answer to this question. For anyone who was raised in the '70s and never had a date in the '80s, or who thought the 2000s would look like a cross between a Yes album cover and a Journey concert T-shirt, Omni magazine was essential reading—one with a ready answer to all your robot and rocket questions. And to a 10-year-old getting a subscription for Christmas in 1979, Omni was The Future.

The magazine was a lushly airbrushed, sans-serif, and silver-paged vision dreamed up by Penthouse publisher Bob Guccione and his wife, Kathy Keeton. It split the difference between the consumerist Popular Science—which always seemed to cover hypersonic travel and AMC carburetors on the same page—and the lofty Scientific American, whose rigor was alluring but still impenetrable to me. But with equal parts sci-fi, feature reporting, and meaty interviews with Freeman Dyson and Edward O. Wilson, Omni's arrival every month was a sort of peak nerd experience.

"Omni was different," the erstwhile Penthouse publisher mused in his first editorial for the magazine. "It was a creation of pure joy."

Guccione had plenty to be joyful about: commercially, Omni really did look like the future. Its October 1978 debut had what was then the largest number of ad pages for any newly launched magazine in history. By the following month, the New York Times and the Economist both had their own Science sections. And when Guccione watched his circulation quickly soar to around 850,000—most in the coveted 18-34 demographic—his magazine looked as if it was here to stay.

But the only place you'll find Omni for sale today is in a junk shop or on eBay. To look over old issues of Omni is to experience equal parts amazement (a science mag by Penthouse's founder interviews Richard Feynman?) and amusement (by 2010, robots will—yes!—"clean the rug, iron the clothes, and shovel the snow.") It was in a 1981 Omni piece that William Gibson coined the word "cyberspace," while the provocative lede "For this I spent two thousand dollars? To kill imaginary Martians?" exhorted Omni readers to go online in 1983—where, the magazine predicted, everything from entire libraries to consumer product reviews would soon migrate. A year later, the magazine ran one of the earliest accounts of telecommuting with Doug Garr's "Home Is Where the Work Is," which might have also marked the first appearance of this deathless standby of modern reportage: "I went to work in my pajamas."

Then again, that same issue predicted the first moon colony in 2010; supplied with "water in the shadowed craters of the moon's north pole" (not a bad guess), it might be attacked by "space-based Soviet particle-beam weapons."

Amid these features, a fine "games" column, and lush art essays like the "7 Wonders of the Universe"—which included an orbiting Yonkers Airport—the silver-paged Continuum section ran peppy science items not unlike today's In Brief section of New Scientist. (In fact, the two magazines shared some DNA via editor Bernard Dixon.) Not only did Omni's sci-fi contributors include a veritable hall of fame—Bradbury, Asimov, Dick, Heinlein, Clarke, Gibson, Sterling—but the magazine also featured women writers and editors to a degree that still puts most magazines to shame. Ursula K. Le Guin and Dava Sobel were early contributors, as was Joyce Carol Oates.

That mix of sci-fi and science reporting was telling. Omni's science coverage was built on a sturdy tripod of space exploration, medicine, and computing, but always with a certain fondness for speculative woo-woo. Editor Robert Weil has recalled Guccione's fascination with "stuff on parapsychology and U.F.O.'s," which accounts for the items on alien interference in the Yom Kippur War, a haunted pizza factory, and psychics using tarot cards "to energize their pineal glands." These lived in the fire alarm-red "Antimatter" section, though these Antimatter particles increasingly mingled with the Matter in the rest of the magazine. For a surprisingly long time, this mixture of Matter and Antimatter didn't quite blow up.

There was even a short-lived Omni TV series—still viewable on YouTube with an affably boozy-looking Peter Ustinov hosting—and an Omni Future Almanac published in 1982. The almanac is a retro delight: Its soothsaying ranged from global warming and a wireless "data terminal inside the home, [where] one could have access to all the volumes within the Library of Congress and the New York Public Library" (Google Books!) to predictions of a France-to-Libya water pipeline and 100 mpg averages for cars. "By 2000," it solemnly predicted, "Sweden will have a largely automated robot society." Also, by now we should be having a space crime wave.

Ah, if only.

It's hard to say when Omni lost its momentum: Perhaps it was when the future itself changed and shuttle missions started looking like an expensive way of going nowhere. Perhaps it was Time Inc.'s launch of Discover, or the time when two Omni editors—and then Guccione's own daughter—publicly resigned in 1990 in protest over ads on the magazine's cover. The ads, ironically, were probably the newest thing about Omni's look: It hadn't been updated in two decades, and the far-out future once promised to ELO listeners appeared desperately unhip in a Nevermind era.

Other magazines ate Omni's lunch: Subscribers to the newly launched Wired looked suspiciously like Omni readers who'd moved on and gotten MBAs. By 1995, Omni's thinning page count was stuffed with more marginal stories, and—in the most gloriously desperate move I've ever seen in a collapsing magazine—it ran ads for "a breakthrough in interactive publishing": a 900 line that provided "a direct link to our editorial staff."

That's right: for just 95 cents a minute, you could talk to real, live—and horny, one assumes—"editorial assistants" at Omni.

The print edition folded a year later, though Guccione and Keeton spun this into a startling achievement: Omni became the first major newsstand title to go online-only. Omnimag.com lacked the gloss and heft of old, but the Continuum items read much the same as before, and it continued a strong suit in interviews by expanding them into interactive forums with Omni readers.

And, of course, there were UFOs—lots of UFOs.

Woo-woo science is always fun until someone gets hurt; and, alas, that someone may have been publisher Kathy Keeton. Omni was her baby, and even as it pumped out more Antimatter coverage, Keeton battled the earthly ailment of breast cancer with largely discredited hydrazine-sulfate therapy. Wide-eyed futurism may not have lent itself to judging cancer treatments. Within days of Keeton's death in September 1997, Omni's site fell silent, save for a link directing readers to a hydrazine-sulfate advocacy site. These days the (now NSFW) Omnimag.com domain redirects readers to ... Penthouse.

And yet old copies of Omni can still stop grown nerds in their tracks: "Oh god, I remember this!" Curiously, so does the Guccione family—son Bob Jr. occasionally speaks of reviving the magazine. If he ever does, I'll look forward to watching my sons reading it. Though some kinds of future, I fear, can only be found in the past.


[Paul Collins teaches in the MFA writing program at Portland State University. His latest book is The Book of William: How Shakespeare's First Folio Conquered the World.]

Wednesday, December 30, 2009

Russians and asteroid Apophis


Did I miss the latest issue of the comic book Fools In Space?

"Russians to Save World From Killer Asteroid With ‘Laws of Physics’"

by

Sue

December 30th, 2009

ChattaBox.com

The Russians are coming, the Russians are coming—to save us from impending doom. According to Russia’s space agency chief, the Russians have their eyes firmly planted on the rogue asteroid Apophis, which missed crashing into Earth during its first voyage towards our planet. But now the asteroid has just a four-in-a-million chance of hitting Earth. Still, the Russians believe our planet may be in peril from Apophis in 2030, and they don’t intend to sit idly by and wait to be obliterated by a massive space rock.

Anatoly Perminov, the head of Russia’s space agency, revealed his plans during an interview with the Voice of Russia radio, promising to save the world from Apophis about twenty years in the future. Perminov said that no nuclear bombs would be used to deflect the killer 850-foot asteroid, but didn’t rule out bombs altogether. The space chief said that “no nuclear explosions” would be used and that the deflection would be accomplished “on the basis of the laws of physics.”

Perminov intends to launch a spaceship to deal directly with Apophis, but he does not intend to destroy the asteroid.

Despite the slim-to-none chance of Apophis crashing into Earth, Perminov declared that the asteroid “will surely collide with the Earth in the 2030s.”

Of course saving the world from asteroids using “the laws of physics” doesn’t come cheap, but Perminov believes the cost is worth it. “People’s lives are at stake,” said the Russian space chief. “We should pay several hundred million dollars and build a system that would allow to prevent a collision, rather than sit and wait for it to happen and kill hundreds of thousands of people.”

The man may have a point.

Auld Lang Syne


A bit of traditional history here. "Auld Lang Syne" is uttered by many as the old year ends and the new one begins.

HAPPY NEW YEAR TO ALL

Auld Lang Syne

A new graduate's problems are old and universal


Undergraduate or graduate student...finding the desired employment right away is rare, and one must accept the need to climb the ladder of jobs. Some people understand this easily and some don't, which makes their success even harder. It reminds me of an old friend, now deceased, who aspired to become a film director. He was stubborn and somewhat naive and refused to enter the business by sweeping the studio floor before becoming a director. It was a directorship or nothing...and it ended as "nothing," spiraling into delusions of grandeur. Nevertheless, there is a fragment of a long-lost silent film, I Graduated, But ..., by the famous Japanese film director Yasujiro Ozu, best known for Tokyo Story [1953], that deals with a new graduate. Think about it, graduates...sweeping floors may be mandatory.

Uploader's comments...

It's about a recent college graduate who can't find a job because he considers an entry level office job beneath him. This forces the wife to have to work to support them, which makes the husband jealous and angry. Finally he is humbled and goes back to the same business to ask for work again, he is hired, and the marriage is restored.

This is the only footage that remains.

I Graduated, But ...

directed by

Yasujiro Ozu

1929




Yasujirō Ozu

Women of science in the Renaissance


Renaissance Women in Science contains the fascinating stories of seventeen scientists who unlocked secrets of the universe and whose discoveries helped change the future of the world. Pursuing careers ranging from astronomy and atomic research to chemistry and medicine, a number of these women went on to win Nobel prizes for their work. All were dedicated to learning and discovery and many contributed to a humanitarian legacy in the form of improved worker rights, environmental protection, and better health care for others. The book reveals the motivations of these extraordinary women and explores the circumstances that allowed them to break through the barriers of their time, race, and gender to pursue their dreams. Their stories will inspire us all to reach beyond the ordinary.

[Louise Q. van der Does is an advanced doctoral student in the School of Public Affairs at the American University. Rita J. Simon is a Professor in the School of Public Affairs and the Washington College of Law at American University.]


Renaissance Women in Science


by

Louise Q. van der Does and Rita J. Simon

ISBN-10: 0761814817
ISBN-13: 978-0761814818

Is the chase more pleasurable than the capture?


Maybe the act of pursuit is more pleasurable than the capture...consider Niles and Daphne.

"The Joy of Physics Isn’t in the Results, but in the Search Itself"

by

Dennis Overbye

December 29th, 2009

The New York Times

I was asked recently what the Large Hadron Collider, the giant particle accelerator outside Geneva, is good for. After $10 billion and 15 years, the machine is ready to begin operations early next year, banging together protons in an effort to recreate the conditions of the Big Bang. Sure, there are new particles and abstract symmetries in the offing for those few who speak the language of quantum field theory. But what about the rest of us?

The classic answer was allegedly given long ago by Michael Faraday, who, when asked what good was electricity, told a government minister that he didn’t know but that “one day you will tax it.”

Not being fast enough on my feet, I rattled off the usual suspects. Among the spinoffs from particle physics, besides a robust academic research community, are the Web, which was invented as a tool for physicists to better communicate at CERN — the European Organization for Nuclear Research, builders of the new collider — and many modern medical imaging methods like M.R.I.’s and PET scans.

These tests sound innocuous and even miraculous: noninvasive and mostly painless explorations of personal inner space, but their use does involve an encounter with forces that sound like they came from the twilight zone. When my wife, Nancy, had a scan known as a Spect last fall, for what seems to have been a false alarm, she had to be injected with a radioactive tracer. That meant she had to sleep in another room for a couple of days and was forbidden to hug our daughter.

The “P” in PET scan, after all, stands for positron, as in the particles that are opposites to the friendly workhorse, the electron, which is to say antimatter, the weird stuff of science-fiction dreams.

I don’t know if anyone ever asked Paul Dirac, the British physicist who predicted the existence of antimatter, whether it would ever be good for anything. Some people are now saying that the overuse of scanning devices has helped bankrupt the health care system. Indeed, when I saw the bill for Nancy’s scan, I almost fainted, but when I saw how little of it we ourselves had to pay, I felt like ordering up Champagne.

But better medical devices are not why we build these machines that eat a small city’s worth of electricity to bang together protons and recreate the fires of the Big Bang. Better diagnoses are not why young scientists spend the best years of their lives welding and soldering and pulling cable through underground caverns inside detectors the size of New York apartment buildings to capture and record those holy fires.

They want to know where we all came from, and so do I. In a drawer at home I have a family tree my brother made as a school project long ago tracing our ancestry back several hundred years in Norway, but it’s not enough. Whatever happened in the Big Bang, whatever laws are briefly reincarnated in the unholy proton fires at CERN, not only made galaxies and planets possible, but it also made us possible. How atoms could achieve such a thing is a story mostly untold but worth revering. The Earth’s biosphere is the most complicated manifestation of the laws of nature that we know of.

Like an only child dreaming of lost siblings, we dream of finding other Earths, other creatures and civilizations out in space, or even other universes. We all want to find out that we are cosmic Anastasias and that there is a secret that connects us, that lays bare the essential unity of physical phenomena.

And so we try, sometimes against great odds. The year that is now ending began with some areas of science in ruins. One section of the Large Hadron Collider looked like a train wreck, with several-ton magnets lying about smashed after an electrical connection between them vaporized only nine days after a showy inauguration.

The Hubble Space Telescope was limping about in orbit with only one of its cameras working.

But here is the scorecard at the end of the year: in December, the newly refurbished collider produced a million proton collisions, including 50,000 at the record energy of 1.2 trillion electron volts per proton, before going silent for the holidays. CERN is on track to run it next year at three times that energy.

The Hubble telescope, after one last astronaut servicing visit, reached to within spitting distance of the Big Bang and recorded images of the most distant galaxies yet observed, which existed some 600 million or 700 million years after the putative beginning of time.

Not to mention the rapidly expanding universe of extrasolar planets. In my view from the cosmic bleachers, the pot is bubbling for discovery. We all got a hint of just how crazy that might be in the new age of the Internet on Dec. 17, when physicists around the world found themselves glued to a Webcast of the results from an experiment called the Cryogenic Dark Matter Search. Rumors had swept the blogs and other outposts of scientific commentary that the experimenters were going to announce that they had finally detected the ethereal and mysterious dark matter particles, which, astronomers say, make up a quarter of the universe.

In the end, the result was frustratingly vague and inconclusive.

“We want it to be true — we so want to have a clue about dark matter,” Maria Spiropulu, a Caltech physicist working at CERN, wrote to me the night of the Webcast.

“And it is not easy,” Dr. Spiropulu said. “The experiments are not easy and the analysis is not easy. This is a tough, tough ride over all.”

Although we might well solve part of the dark matter conundrum in the coming years, the larger mystery winds out in front of us like a train snaking into the fog.

We may never know where we came from. We will probably never find that cosmic connection to our lost royalty. Someday I will visit Norway and look up those ancestors. They died not knowing the fate of the universe, and so will I, but maybe that’s all right.

Steven Weinberg, a University of Texas physicist and Nobel Prize winner, once wrote in his 1977 book “The First Three Minutes”: “The more the universe seems comprehensible, the more it also seems pointless.” Dr. Weinberg has been explaining that statement ever since. He went on to say that it is by how we live and love and, yes, do science, that the universe warms up and acquires meaning.

As the dark matter fever was rising a few weeks ago, I called Vera Rubin, the astronomer at the department of terrestrial magnetism of the Carnegie Institution of Washington, who helped make dark matter a cosmic issue by showing that galaxies rotate too fast for the gravity of their luminous components to keep them together.

But Dr. Rubin, who likes to stick to the facts, refused to be excited. “I don’t know if we have dark matter or have to nudge Newton’s Laws or what.

“I’m sorry I know so little; I’m sorry we all know so little. But that’s kind of the fun, isn’t it?”


Knowledge limitations

Σπυροπούλου [Maria Spiropulu]--physicist

Blue blue moon and partial eclipse ...for the Irish


"New Year's Eve to coincide with eclipse"

December 30th, 2009

RTE News

Astronomy Ireland has urged people to consider beginning their New Year celebrations with an 'Eclipse Party' this year.

For the first time since 1656, New Year's Eve will coincide with a lunar eclipse, which will be visible to everyone in Ireland (and most of the world) between 7pm and 8pm tomorrow.

'Most people have never seen anything like this in their lives so we are urging everyone to take a look between 7pm and 8pm to kick off their New Year parties!' said David Moore.

The eclipse will be visible to the naked eye, with the deepest eclipse at 7.22pm.

The eclipse is caused when the Moon moves into the Earth's shadow.

This eclipse of the Moon is partial, so only 8% of the Moon will actually be covered by Earth's shadow. However, this will look very spectacular to the naked eye with half the Moon 'discoloured' due to it being so close to the deep shadow of the Earth.

Full details of the eclipse are on Astronomy Ireland's website.


A "blue moon" on December 31st...literally


Dedicated to the poet




Plastic boxes save energy


This is not a bad idea except for the fact that it does take oil to manufacture them.

"Flat pack"

A more environmentally friendly way to transport goods

December 30th, 2009

Economist.com

OVERHAULING an industry of which you know little is not easy, but neither is it impossible. In 1956 Malcom McLean, a trucker from North Carolina, launched the first “intermodal” shipping container, which could be transferred easily between lorries, trains and ships. It revolutionised the transport of goods by abolishing the traditional (and back-breaking) system of “break bulk” loading, and thus helped oil the wheels of globalisation. Now another outsider to the shipping industry is trying to get a similar change under way.

Rene Giesbers, a heating-systems engineer from the Netherlands, has invented a collapsible plastic shipping container which, he hopes, will replace McLean’s steel design. Because it is made of a fibreglass composite, it weighs only three-quarters as much as a standard container but—more importantly—when it is empty, it can be folded down to a quarter of its size. The composite is more resistant to corrosion than the steel it replaces, is easier to clean and floats. It is also greener to manufacture. Making one of Mr Giesbers’s containers results in just a quarter of the carbon dioxide that would be generated by the manufacture of its steel counterpart.

A collapsible shipping container would be useful for several reasons. Patterns of trade mean that more goods travel from China to America, for example, than the other way around, so ships, trains and lorries inevitably carry some empty containers. If these were folded, there would be more room for full containers and some vessels would be liberated to ply different routes. If collapsed containers were bundled together in groups of four, ships could be loaded far more quickly, cutting the time spent in ports. They would also take up less space on land, allowing depots to operate more efficiently.
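As a rough back-of-the-envelope illustration of that space saving, the short Python sketch below works through the arithmetic; the ship size and the share of containers travelling empty are assumed figures, and only the fold-four-into-one ratio comes from the article.

# Back-of-the-envelope sketch of the slot savings described above.
# The ship size and empty-container share are assumptions, not figures
# from the article; only the four-into-one folding ratio is Cargoshell's.
slots_on_ship = 8000            # container slots on a hypothetical ship
share_travelling_empty = 0.25   # assumed fraction of slots holding empties

empties = slots_on_ship * share_travelling_empty
# Four collapsed Cargoshells bundle into the footprint of one rigid box.
slots_for_folded_empties = empties / 4
slots_freed = empties - slots_for_folded_empties

print(f"{slots_freed:.0f} of {slots_on_ship} slots freed for full containers")
# -> 1500 of 8000 slots freed for full containers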

Mr Giesbers is not the first to invent a collapsible container. Several models were experimented with in the early 1990s but failed to catch on, mainly because of the extra work involved in folding and unfolding them. There were also concerns about their strength. Mr Giesbers says the Cargoshell, as he has dubbed his version, can be collapsed or opened in 30 seconds by a single person using a forklift truck, and that it is now undergoing tests to see whether it is strong enough to meet the requirements set by the International Organisation for Standardisation.

There are currently about 26m containers in the world, and the volume of goods they carry has risen from 13.5m “twenty-foot equivalent units” in 1980 to almost 140m today. It is expected to reach 180m by 2015. Mr Giesbers aims to have a million Cargoshells plying the seas, rails and roads by 2020, equivalent to 4% of the market.

Bart Kuipers, a harbour economist at Erasmus University in Rotterdam, thinks that is a little ambitious, but he reckons the crate could win 2-3% of the market. He thinks it is the container’s lower weight, rather than its collapsibility, that makes it attractive. It will appeal to companies worried about their carbon footprints—and if oil prices rise, that appeal will widen.

Ultimately, the main obstacle to the introduction of the Cargoshell may be institutional rather than technical. As Edgar Blanco, a logistics expert at the Massachusetts Institute of Technology, points out, “Everyone is vested in the current system. Introducing a disruptive technology requires a major player to take a huge risk in adopting it. So the question will always boil down to: who pays for the extra cost, and takes the initial risk?”

Sunday, December 27, 2009

Japanese toilet tissue and astronomy


No comment...

"Wiping with the stars"

by

Tia Jones

December 2009

Symmetry

Every so often, particle physics communicators from labs around the world gather to swap strategies for getting people interested in science. At the group’s April meeting in Japan, the big hit was toilet paper.

Since 2004, more than 40,000 rolls of toilet paper with flushable facts about the life cycles of stars have found their way into Japanese society. References to Astronomical Toilet Paper litter blogs, and Web photos show the toilet paper visiting historical sites, much like the gnome in Travelocity commercials.

“People nowadays are too busy to think about the universe,” says Naohiro Takanashi, a research fellow at the National Astronomical Observatory of Japan. “We hope busy people use their time in the closed rest room to think about it.”

Tsuyuki Shikou, a company known for manufacturing toilet paper adorned with animals, flowers and vegetables, printed this version for the TENPLA Project, which is dedicated to popularizing astronomy. It follows in a whimsical Japanese tradition of inscribing toilet paper with comic strips, crossword puzzles, novels, and pop culture icons.

“We hope people learn that stars have a life. They are born from a molecular cloud, they become adult and finally they die; it’s similar to our lives,” Takanashi says. “We want people to see the similarity and feel connected to the stars and have an interest in astronomy.”

You can even do a bit of astronomy with the cardboard tube at the center of the roll: Take it outside, look at the sky through the cardboard tube, count the stars in the circle and use a formula to calculate the brightness of the night sky.

That is, if you can bring yourself to use all the paper.
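The article doesn't give the formula printed on the roll, but one common way to turn such a tube count into a night-sky estimate is to scale the handful of stars seen in the tube's small cone of sky up to the whole sky and convert that total to a limiting magnitude. The short Python sketch below is only illustrative: the tube dimensions, the example count, and the rough star-count relation are assumptions, not TENPLA's actual method.

import math

def limiting_magnitude(stars_counted, tube_length_cm, tube_diameter_cm):
    # Half-angle of the cone of sky visible through the tube.
    half_angle = math.atan((tube_diameter_cm / 2.0) / tube_length_cm)
    # Solid angle of that cone as a fraction of the whole sky (4*pi steradians).
    sky_fraction = 2.0 * math.pi * (1.0 - math.cos(half_angle)) / (4.0 * math.pi)
    # Scale the count in the tube up to the whole sky.
    stars_whole_sky = stars_counted / sky_fraction
    # Rough empirical relation (assumed): the number of stars brighter than
    # magnitude m over the whole sky grows roughly as 10**(0.7 + 0.5*m).
    return (math.log10(stars_whole_sky) - 0.7) / 0.5

# Example: 4 stars counted through a 10 cm x 4.5 cm cardboard tube.
print(round(limiting_magnitude(4, 10.0, 4.5), 1))   # about magnitude 3.6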

Fusae Miyazoe, spokeswoman for the Institute for the Physics and Mathematics of the Universe at the University of Tokyo, says many of her colleagues tell her it’s a waste to read the roll only once, “so they just leave it on the shelf in the bathroom.”

If the idea catches on, maybe we’ll see toilet paper showing, step by step, how particles zip through a detector. Hmm. What will particle physicists do with the cardboard tube?

Saturday, December 26, 2009

Early brews


Well, it isn't too late. The mistletoe may have to be put away, but there is always an occasion for indulging in some fermented and distilled brews...New Year's Eve, for example.

"Eight ancient drinks uncorked by science"

by

John Roach

msnbc.com

Throughout human history, alcoholic beverages have treated pain, thwarted infections and unleashed a cascade of pleasure in the brain that lubricates the social fabric of life, according to Patrick McGovern, an archaeochemist at the University of Pennsylvania Museum of Archaeology and Anthropology.

For the past several decades, McGovern's research has focused on finding archaeological and chemical evidence for fermented beverages in the ancient world. The details are chronicled in his recently published book, “Uncorking the Past: The Quest for Wine, Beer, and Other Alcoholic Beverages.”

He argues that the mind-altering effects of alcohol and the mysterious process of fermentation may explain why these drinks dominated entire economies, religions and societies. He’s found evidence of fermented beverages everywhere he's looked, which fits his hypothesis that alcohol "had a lot to do with making us what we are in biological and cultural terms."

The author spoke with msnbc.com about his research into eight ancient drinks uncorked by science.

China: First known brew

While the human relationship with alcohol may trace back to our ancestors, the earliest chemical evidence for an alcoholic beverage dates back 9,000 years to the ancient village of Jiahu in China's Henan province.

Based on the analysis of residues extracted from pottery fragments, McGovern and colleagues concluded that the people were drinking a mixed wine-and-beer-like beverage made with grapes, hawthorn fruit, rice and honey. The finding was published in December 2004. The following year, McGovern collaborated with Sam Calagione and his crew at the Dogfish Head Brewery in Delaware to re-create the millennia-old drink. Their creation, called Chateau Jiahu, won a gold medal at the Great American Beer Festival in 2009.

"We worked hard on getting this interpretation right. Since it does represent the oldest alcoholic beverage, it was really gratifying to get that gold tasting award," McGovern said.

Iran: Earliest evidence for barley beer

Which came first: bread or beer? The question remains unresolved, but evidence suggests barley was first cultivated about 10,000 years ago – the same time humans were abandoning the hunter-gatherer lifestyle and sowing the seeds of civilization. What was the catalyst for the transition? A steady supply of barley bread is one possibility. Brewing copious amounts of barley beer is another.

"From a pragmatic standpoint, the question is really a-no brainer," McGovern writes in his book. "If you had to choose today, which would it be? Neolithic people had all the same neural pathways and sensory organs as we have, so their choice would probably not have been much different."

Some of the earliest chemical evidence for beer comes from residues – calcium oxalate, known as beerstone – inside a jar excavated at the Godin Tepe archaeological site in the Zagros Mountains of Iran that is dated to between 3400 and 3100 B.C.

Turkey: Mixed drink for Midas?

In 1957, University of Pennsylvania Museum researchers working at the Gordion archaeological site near Ankara, Turkey, broke through the wall of an elaborate tomb dated to between 740 and 700 B.C. that research suggests was the burial site of the fabled King Midas, or his father and king, Gordius. Among the remains in the tomb were the body of a 60- to 65-year-old male and the largest Iron Age drinking set ever found: 157 bronze vessels that were presumably used during the occupant's farewell feast.

In the late 1990s, McGovern and his colleagues analyzed residues inside the vessels and found evidence for a mixed beverage of grape wine, barley beer and honey mead. In March of 2000, he challenged microbrewers to make a representative concoction – and in the process prove or disprove that such grog was a plausible, enjoyable drink. Sam Calagione of the Dogfish Head brewery came through with what has become his most celebrated beverage: "Midas Touch."

Phoenicia: Active in the wine trade

Analysis of a pottery jar, or amphora, pulled up from a late 8th century B.C. shipwreck in the Mediterranean off the coast of Israel offers a strong hint that the wine trade flourished as a result of Phoenician enterprise originating from the coast of Lebanon and Syria, according to McGovern.

He and his colleagues discovered that the amphora was filled with a tree-resin-infused wine. What's more, the bottle had been sealed with resin to prevent the liquid from leaking out and oxygen getting in and spoiling the wine. Other Phoenician shipwrecks found throughout the Mediterranean dating to between 1000 B.C. and 400 B.C. also contained vast stores of wine.

"Some of the people working on that area say that the wine trade was really what transferred culture from the eastern Mediterranean to the western Mediterranean, because all of these ships are just chock-full of wine-related artifacts," McGovern said.

Chile: New World’s first fermented drink?

The earliest evidence for human occupation in the New World is found at Monte Verde, Chile, an inland archaeological site that dates to about 13,000 years before present. The discovery of the site in 1977 raised the possibility that the first migrants across the land bridge between Siberia and Alaska took a water route to get to South America, not a slower-going overland trek as previously thought.

For McGovern, another intriguing possibility at Monte Verde comes from telling hints that these early Americans were drinking a fermented beverage. Though a drinking vessel or jug for chemical analysis has yet to be found, botanical debris at the site includes several fruits and starchy foods that could have been made into a buzz-giving drink.

"Humans are very innovative when it comes to figuring out how to make a fermented beverage, so if you've got fruits or other starchy materials that could be chewed or made into a sweet food or beverage, they'd discover how to do it. ... We just don't have the hard evidence for it yet," McGovern said.

Honduras: Wine and chocolate

Chocolate, almost anyone will attest, is tasty stuff. But long before humans were turning cacao beans into delicious desserts, they were making a wine from the sweet pulp that fills the cacao pods. "The initial motivation for focusing in on the chocolate tree and domesticating it would have been this fermented beverage," McGovern said.

The earliest evidence for this cacao-based wine comes from chemical analysis of pottery fragments recovered at the Puerto Escondido site in Honduras dating to as early as 1400 B.C. Nearly all the fragments tested had the fingerprint compound for cacao, theobromine. And these vessels clearly were intended to hold a liquid or a beverage, McGovern said.

Cacao-based fermented drinks were popular throughout Mesoamerica, evolving into a mixed beverage during Aztec and Mayan times that may have even included the addition of mind-altering substances such as peyote or hallucinogenic mushrooms. Honey, chilis, scented flowers and spices were the usual additives.

McGovern's research once again led to collaboration with Calagione at Dogfish Head to re-create a representative concoction of this centuries-old tradition. The creation, called Theobroma, is brewed with cocoa powder and nibs from the Aztec region of Soconusco, honey, chilis and fragrant tree seeds called annatto – though it lacks the illicit kick.

Peru: Burning down the house

For some reason or other, a pre-Incan civilization known as the Wari abandoned their outpost atop Cerro Baúl, a mountain about 50 miles from the Pacific Ocean in southern Peru. Before they departed, archaeological evidence indicates that they had a grand bash replete with ceremonial smashing of mugs full of alcoholic beverage and then literally burned down the house.

The drink of choice for the Wari was made from the fruit of the pepper tree Schinus molle. The largest known production facility for making the beverage was found at Cerro Baúl. In addition to vats for making the beverage and thousands of pepper-tree seeds and stems, archaeologists found shawl pins worn by women, an indication that they were responsible for making the beverage.

Egypt: Beer helped build the pyramids

For many a manual laborer, even today, few things are as rewarding after a long day's work as a mug of beer. The ancient Egyptians knew this. The workers who built the Great Pyramids, for example, were paid in a daily allotment of bread and beer, noted McGovern. Just how deep in time the Egyptian beer-making tradition goes is uncertain, but pottery remains from Hierakonpolis, in Upper Egypt, suggest that the craft was under way perhaps as early as 3500 B.C.

Chemical analyses suggest that barley was mashed and beer was made at the site and other sites nearby. If so, they would be the earliest breweries in the world. "They seem to be making beer on a very large scale," McGovern said. "It was probably involved in large-scale architectural projects in which the workers, just like at the pyramids, were paid in bread and beer."


Uncorking the Past: The Quest for Wine, Beer, and Other Alcoholic Beverages

by

Patrick McGovern

ISBN-10: 0520253795
ISBN-13: 978-0520253797

Thursday, December 24, 2009

"Nutcracker Suite"--Christmas Fair


A pleasing and seasonal piece of animation from the Walt Disney studios, set to Pyotr Ilyich Tchaikovsky's Nutcracker Suite, Op. 71a. Pure hand work--no CGI.

"Nutcracker Suite"

Fantasia

1940

Walt Disney



Tom and Jerry Cartoon--Christmas Eve


'Tis Christmas Eve and Tom and Jerry are at it again, but friendship and conscience prevail. It's a seasonal thing. To note: one interesting piece of science is demonstrated. Watch for Jerry attempting to conceal himself by standing in an empty Christmas tree light socket and being harmlessly illuminated, while Tom, reaching for him, is terribly shocked [he is grounded]. [Similar to birds sitting on an uninsulated power line and not being harmed.]
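For the curious, here is a tiny illustrative calculation of that grounding principle in Python; every number below is an assumption chosen only to show the contrast between touching a single conductor and offering a path to ground, not a model of the cartoon's wiring.

# Illustrative numbers only: why bridging a short stretch of one conductor
# is nearly harmless while a grounded body takes the full line voltage.
LINE_VOLTAGE = 120.0              # volts on the lighting circuit (assumed)
WIRE_SEGMENT_RESISTANCE = 0.001   # ohms of wire spanned by the body (assumed)
BODY_RESISTANCE = 10_000.0        # ohms for a small body (assumed)
STRING_CURRENT = 0.5              # amps drawn by the light string (assumed)

# Touching only the live conductor: the body sees just the tiny voltage
# drop across the wire segment it bridges.
v_across_wire_segment = STRING_CURRENT * WIRE_SEGMENT_RESISTANCE
i_ungrounded = v_across_wire_segment / BODY_RESISTANCE

# One paw on the live contact, feet on the grounded floor: the full line
# voltage drives current through the body.
i_grounded = LINE_VOLTAGE / BODY_RESISTANCE

print(f"ungrounded: {i_ungrounded:.1e} A   grounded: {i_grounded:.1e} A")
# -> ungrounded: 5.0e-08 A   grounded: 1.2e-02 A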

"The Night Before Christmas"

Tom and Jerry

1941

William Hanna and Joseph Barbera

Wednesday, December 23, 2009

2009 Nobel Prize in Physics challenged

Wilhelm Röntgen [1845 to 1923] was the first person to receive the Nobel Prize in Physics for his discovery of X-rays.

What's the issue here? Are the challengers concerned about prestige, money, ethical integrity?

"Controversy raised about 2009 Nobel Prize in Physics"

December 23rd, 2009

Wikinews

A controversy has arisen regarding this year's Nobel Prize in Physics. Fellow scientists contest whether Canadian George E. Smith and American Willard Boyle deserved the prize. It was awarded for the invention of the charge-coupled device, which is essential in modern digital photography. The main dispute was raised during an October interview by Eugene Gordon and Mike Tompsett, two now-retired colleagues from Bell Labs. They claim that Smith and Boyle were not the main scientists behind turning the CCD into an image-capturing technology.

Charge-coupled devices move electrical charge around an array to a location where the charge can be manipulated (e.g. read out). Coupled with a device that converts light to digital signals, this enables digital imaging, which gave rise to modern-day digital photography. The Nobel citation indicates that digital imaging was the basis for the award.
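As a rough illustration of that "move the charge to a readout point" idea, here is a minimal Python sketch of a one-dimensional shift register; the pixel values and array length are invented for the example, and this is a conceptual toy rather than the Boyle-Smith or Tompsett design itself.

# Toy model of a CCD readout register: charge packets are clocked one cell
# at a time toward a single output node, where each packet is measured.
def read_out_ccd_row(pixels):
    register = list(pixels)            # charge (electrons) held in each cell
    samples = []
    for _ in range(len(register)):
        samples.append(register[0])    # output node reads the end cell
        register = register[1:] + [0]  # clock pulse: shift every packet
    return samples

# A pretend row of photo-generated charge, brightest in the middle.
print(read_out_ccd_row([120, 4500, 30000, 5200, 90]))
# -> [120, 4500, 30000, 5200, 90]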

Boyle apparently received credit for the invention because of his position as head of the laboratory, and Gordon claims that he actually had very little, if anything, to do with the CCD's use in imaging. Gordon says that he has "documentation that disproves most of what they are saying and the rest of what they are saying is not at all logical". Tompsett says that Boyle and Smith deserved credit for the original concept, but that he himself should be credited for its use in imaging, which they had, he said, not even considered.

Gordon and Tompsett claim that the Royal Swedish Academy of Sciences made a mistake by awarding the prize. Gordon said in an interview at his home in Mountainview, New Jersey, on Monday that "they wouldn't know an imaging device if it stared them in the face". They say that the patent used to justify the award does not mention imaging devices, and that they were the first to find such a use for what had previously been a project in memory and electrical storage. This development led to the invention of digital photography, as it meant that images "could now be captured electronically instead of on film".

Boyle and Smith say that to them, it was obvious that CCDs could be used for imaging, however they handed that particular task to Tompsett. Gordon said of Tompsett: "He invented how to do it. He designed the devices. He built them. He demonstrated on them," and suggests that the prize be awarded to him instead.

Boyle and Smith have previously received a number of awards for inventing the CCD, dating as far back as the Stuart Ballantine Medal from the Franklin Institute in 1973. Other awards followed, from the Institute of Electrical and Electronics Engineers in 1974 and 1999 up to the Charles Stark Draper Prize in 2006.

The Nobel Prize in Physics has been awarded controversially before, dating back to 1909, when there was disagreement over whether the prize for inventing the radio should go to Tesla or Marconi, and including the exclusion of Jocelyn Bell Burnell from the 1974 prize for the discovery of the pulsar, which went to Antony Hewish.

Technology and better killing machines


"War And Technology"

by

Alex Roland

December 23rd, 2009

RSD Reports

Military technology often seems to be the dark side of innovation, the Mr. Hyde roaming the back alleys of civilization for opportunities to work his worst on society. Its foundational figure in Western civilization is the Greek Hephaestus (whose counterpart was the Roman “Vulcan”), the only god to have been lame and misshapen. But countless inventors and innovators, from Alfred Nobel to Robert Boyle, thought of weapons positively. They believed that they could banish the scourge of war, or at least restrain its excesses, if they could only invent the ultimate weapon, the instrument so horrible that no one would dare use it.

More than six decades into the nuclear age, there is growing evidence that the hydrogen bomb may prove to be the long-sought war-stopper.[1] But should that be the case, it will run counter to the sorry record of prior human civilization, when each new instrument of war contributed to the carnage without altering the human nature Thucydides believed to be at the heart of war. Melvin Kranzberg, a co-founder of the Society for the History of Technology and the founding editor of its journal, Technology and Culture, was fond of observing that technology is neither good nor bad, nor is it neutral. Technology in essence is a process of manipulating the material world for human purposes. Whether it does good or ill depends not on the technology itself but on what humans choose to do with it.

Military machines and instruments can nonetheless be understood using the same concepts and categories that scholars apply to technology in general. Below I put forward four propositions about military technology, but the principles at work could be applied as easily in any realm of technological endeavor. They sometimes have a special relevance or poignancy when applied to war, but they say more about the nature of technology than they do about the nature of war.

In addition to their heuristic value, these concepts also have pedagogical utility. They can help demystify the arcane and often secretive world of military research and development and also clarify the impact on society of all complex technological systems. They offer students a set of conceptual tools for thinking about change in warfare over time and the role that technological innovation has played in that process.

My propositions are these: (1) technology, more than any other outside force, shapes warfare; and, conversely, war (not warfare) shapes technology. (2) Military technology is, however, not deterministic. Rather, (3) technology opens doors. And, finally, (4) these characteristics of military technology are easier to see in the modern period than previously, though they have always been at work.

Technology Shapes Warfare

Technology shapes warfare, not war. War is timeless and universal. It has afflicted virtually every state known to human history. Warfare is the conduct of war. It is the clash of arms or the maneuver of armed forces in the field. It entails what military professionals call operations, whether or not the opposing forces actually unleash their organized violence on one another. War is a condition in which a state might find itself; warfare is a physical activity conducted by armed forces in the context of war. Of course, many kinds of group violence, from gang fights to terrorism, might display some or all of the characteristics of warfare without rising to this definition of war, but more often than not these violent conflicts use instruments of war. To understand the technology of warfare is to understand the technology of most public violence.

Wording is also important in articulating exactly what impact technology has on warfare. A number of verbs suggest themselves. Technology defines, governs, or circumscribes warfare. It sets the stage for warfare. It is the instrumentality of warfare.

The most important verb describing the impact of technology on warfare is that it changes warfare. Technology has been the primary source of military innovation throughout history. It drives changes in warfare more than any other factor. Consider a simple thought experiment. Sun Tzu and Alexander the Great are brought back to life and assigned to lead coalition forces in Afghanistan in 2008. These near contemporaries from the fourth century BCE would understand almost everything they would need to know. Alexander actually fought in Afghanistan, and Sun Tzu (if such a person really existed) fought in comparably mountainous terrain in China.[2] Both were masters of strategy and tactics. What came to be called the “principles of war” are simply the tacit knowledge that all successful commanders throughout history have carried around in their bank of experience: an understanding of intelligence, surprise, maneuver, command and control, concentration of force, unity of command, terrain, etc. Even Clausewitz’s seminal contributions to military art and science—chance, violence, the “fog of war,” and “friction”—were concepts that Alexander and Sun Tzu knew by different names.

The only modern tool of command they would not know and could not readily assimilate would be the technology of war. Airplanes, missiles, tanks, drones, satellites, computers, GPS, and all the remaining panoply of the modern high-tech battlefield would be incomprehensible to them. A sergeant from their operations staff could exploit these resources more fully and effectively than either of our great captains. Sun Tzu and Alexander would be incompetent on the modern battlefield.

The point is even more obvious in humankind’s other two fields of battle—the sea and the air—to say nothing of space, perhaps the battlefield of the future. Naval warfare does not occur without ships, which, through most of human history, were the most complex of human technological artifacts. Of course the same is true of planes for air warfare, missiles for strategic warfare, and spacecraft for star wars. In each case, the vehicle defines the warfare. Horatio Nelson, perhaps the greatest naval commander of all time, would have been powerless to understand the strategy and tactics of World War II’s air warfare in the Pacific or submarine warfare in the Atlantic. The cat-and-mouse contest of Soviet and American attack submarines in the Cold War would have been even more incomprehensible to him. He might have gone back in time and intuited the essence of galley warfare, but he could not command in the age of steam, let alone the nuclear age, without a solid grounding in modern science and technology.

The more modern, or postmodern, the warfare becomes, the more the generalization holds true. Technology defines warfare. Air warfare was not even possible before the twentieth century, save for the vulnerable and inefficient reconnaissance balloons that were pioneered in Europe and America in the nineteenth century. In the twenty-first century, air warfare ranges from strategic bombing to close air support of ground troops to dogfights for air superiority to pilotless drones that carry the eyes and ears, and sometimes the ordnance, of operators hundreds, even thousands, of miles away. The U.S. boasts a missile defense installation that can stop the unstoppable, an intercontinental ballistic missile. Space-faring nations flirt with anti-satellite weapons launched from earth and even the prospect of space-based weapons to fight one another and threaten the earth below. Air warfare differs from naval warfare, not because the strategy and tactics of conflict in those realms differ, but because planes differ from ships. And, of course, both differ from tanks and rockets and satellites. Each technology shapes, defines, circumscribes, and governs a new kind of warfare.

Nor is it just the evolution of weaponry that changes warfare. It is the distribution of the weaponry. Throughout history, states have usually fought one another in weapons symmetry.[3] In the first Gulf War, for example, Saddam Hussein attempted to defeat a conventional, industrialized, mechanized American army with a conventional, industrialized, mechanized Iraqi army. The quality and quantity of the American technology prevailed. In the second Gulf War, however, the insurgents resorted to asymmetrical warfare, fighting the high-tech American arsenal with low-tech instruments of assassination, sabotage, and terror. Only when the United States adjusted its technology to meet the new threat did the enemy tactics lose their edge. Of course training, morale, numbers, will, and politics also contributed to the outcome in Iraq, but the nature of the technology set the stage for the struggle.

Technology Does Not Determine Warfare

However much technology may change warfare, it never determines warfare—neither how it will be conducted nor how it will turn out. Technology presides in warfare, but it does not rule.

The whole notion of “technological determinism” is a red herring.[4] Humans can always resist the historical forces surrounding them. To believe in determinism is to believe in inevitability. This begs the question, “Why”? What historical force or law pushes events to some inescapable outcome? In hindsight, events may appear predetermined or inevitable, but nothing in human activity can be predicted with certainty.

Think about the instances in history when technology appeared to determine the nature and even the result of warfare. Chariots were perhaps the most dominant instrument of warfare before nuclear weapons. Indeed, historian William H. McNeill has called them the superweapon of their day.[5] When they appeared in the Levant in the eighteenth century BCE, they swept all before them. From Egypt to Mesopotamia, states either adopted chariots or ceased to compete in interstate war. The chariot craze bred an international chariot aristocracy, the Maryannu, who sold their services to the highest bidder.[6] States built up enormous chariot corps with attendant supply and maintenance trains, culminating in the battle of Kadesh in 1275 BCE, when the contending Egyptian and Hittite forces committed an estimated 5,000 chariots to a cataclysmic but ultimately indecisive day of battle. Western warfare through most of the second millennium BCE was chariot warfare. The chariot defined, drove, governed, circumscribed ground warfare.

And then it was gone. Within a century after the Armageddon at Kadesh, the chariot disappeared as the dominant technology of Levantine warfare. Just as there is no sure evidence of where the chariot came from and why it ruled, so is its fall from dominance a mystery. Robert Drews notes that it lost power in “The Catastrophe,” the wave of wars, raids, and forced migrations that swept the eastern Mediterranean around 1200 BCE.[7] William McNeill believes that the introduction of iron weapons at just this time gave infantry new power to stand up to chariots.[8] Another possible explanation is state bankruptcy brought on by the arms race in chariots and the horses to pull them. Still another is a change in infantry tactics, perhaps coupled with McNeill’s iron weapons. In any case, the apparent determinism of the chariot evaporated.

Countless other examples through history of seemingly irresistible weapons leading to inevitable triumph have similarly risen and fallen in their turn, from gunpowder through the “Dreadnought revolution” and strategic bombing to the recent enthusiasm for the “revolution in military affairs,” a technological superiority that was to have given the U.S. unassailable military prowess.

The Open Door

A better conceptual model for the technology of war is “the open door.” This metaphor was introduced by medieval historian Lynn White, Jr., in his classic study Medieval Technology and Social Change (Oxford, 1962). Seeking to demonstrate that medieval society spawned its share of technological innovation, White presented a series of interrelated case studies. One revisited and refined the discredited claim by Heinrich Brunner that the appearance of the heavily armed and armored mounted knight on the battlefields of eighth-century Europe had bred feudalism. Brunner had imagined that Charles Martel first conceived the scheme of a feudal array and the social, political, and economic system to sustain it at or immediately following the battle of Poitiers in 732, when his posse of mounted warriors drove off Muslim raiders spilling into southern France from the Iberian Peninsula. But White showed that Martel had begun confiscating church property for distribution to a new class of mounted warriors before the battle of Poitiers. What, then, asked White, might have inspired Martel, if not his victory over the lightly armed and armored Muslim mounted warriors? White’s answer was the stirrup, an Asian innovation just making its first appearance in Europe about the time of Poitiers. White imagined that this technology allowed the heavily armed and armored mounted knight to lean into his lance and overwhelm mounted and unmounted warriors alike with irresistible force.

White’s argument was widely and roundly attacked, especially by Marxist historians. Most of his critics accused him of technological determinism, of arguing that the stirrup produced feudalism. But White had gone out of his way to avoid any such claim. He called the stirrup a “catalyst.” It did not create feudalism out of whole cloth. Rather, when it was added to the complex soup of medieval society, feudalism precipitated out. Other societies with different ingredients and different chemistries would produce different residues. Technology does not determine outcomes, said White, it opens doors. People must decide if they want to pass through. The availability of the stirrup in Europe did not mean that Martel would adopt it to make the heavily armed and armored mounted knight the mainstay of an emergent military system.

The “open door” is a powerful conceptual tool for thinking about all technology, especially military technology. It adds what most accounts of technological innovation lack: human agency. Humans must decide if they are going to, or can, take up a given military innovation. And they must adapt it to their circumstances. Technology is a possibility, not an imperative. The varieties of technology—think, for example, of the different models of automobiles in the world—testify to the countless contexts in which people will apply the same fundamental innovation with differing results.[9] In the years between the world wars, for example, the U.S. and Britain, geographically isolated from continental Europe, developed strategic bombers with which to project their military power, while the major continental powers concentrated on fighter aircraft to contend with each other for air superiority over the battlefields in their back yards.

What White did not discern was that human agency intervenes not once, but twice, at the moment of innovation. When discovering who chooses to walk through a door opened by technological innovation, it is equally important to understand who opened it for them. People invent and innovate. People open the door. Often there is a relationship between those who open the door and those who pass through it. In modern military experience we often think of that relationship as the military-industrial complex. This byproduct of the Cold War reminds us that the door can be swinging, even revolving. People who are anxious to pass through may hire agents to open the door for them. Likewise, those who figure out how to open doors may entice others to pass through. None of this makes the technology itself deterministic, but it can make the social system attending the door self-replicating and self-sustaining. In time, the participants may pass into realms that they would not otherwise have chosen to enter and that do not serve their interests.[10] The reasons for their decisions, even the bad ones, however, lie not in the technology but in the personal, political, economic—in sum, contextual—forces already at work. The catalyst precipitates the consequences dictated by the ingredients and the chemistry.

This raises my fourth point.

Modern Military Technology Is Different

Modern military technology is not different in kind, but in degree. World War II was the first war in history in which the weapons in use at the end of the war differed significantly from those employed at the outset. The atomic bomb is the most obvious example, but the list of military technologies introduced between 1939 and 1945 includes as well jet aircraft, guided missiles, microwave radar, and the proximity fuse, to name just a few. Some military leaders concluded from this experience that industrial production had won the world wars but military innovation would win the next war. Especially in the U.S., the military establishment began to institutionalize research and development, adopting from industry a kind of planned obsolescence that would keep American armed forces a generation ahead of their potential foes. They created what President Dwight Eisenhower called in his farewell address a “military-industrial complex,” a perpetual arms race, not necessarily with any particular enemy, but with the status quo.

The introduction of systematic, institutionalized innovation makes modern military technology seem radically different from all that went before.[11] That difference is simultaneously real and illusory. The reality stems from the accelerated pace of technological change in the modern world and an unprecedented mastery of energy and materials ranging across a dimensional scale from nanotechnology to floating cities like the modern aircraft carrier. The illusion arises from our growing inability to think of war in non-material terms. Modern commanders can hardly imagine how their predecessors thought about science and technology. A career officer in today’s armed forces expects the arsenal at his or her disposal to change constantly over the course of a career. Before the second half of the twentieth century, however, commanders fully expected to retire with the same instruments they took up in their apprenticeship. Personal arms might even pass from father to son. Some innovation intruded on this static picture in the late nineteenth and early twentieth century, but nothing like the sustained hothouse environment of today’s military arsenals.

Even “science” and “technology,” as we now understand them, are modern categories that took shape in the nineteenth century. Before these conceptual categories took hold of the modern consciousness, premodern commanders thought of their armies and navies in terms of men (human capital) and matériel (arms and armor, forts and roads, food and ammunition). Improvements in any of these areas were made not by scientists and engineers, but by craftsmen with little formal schooling. “Engineers” in the premodern world were individuals who built and operated “engines” of war, i.e., ballistae and catapults. There was high-quality steel long before its composition was revealed by crystallography in the 1920s, but it was produced by artisans who passed their techniques from generation to generation through apprenticeship, not by industrialists whose staffs of scientists prescribed formulas that would produce steel of requisite characteristics. A handful of premodern geniuses, such as Michelangelo and Leonardo da Vinci, mastered art and engineering to imagine weapons that were centuries ahead of their time. Mechanics operated machines of war. Sailors were always mechanics operating the most complex machines of their age, be it the galley of classical Mediterranean warfare or the fully rigged, side-gunned sailing ship of the line in the early modern world. Architects designed and erected fortifications, probably the most influential military technology before gunpowder. These marvels of what we would now call civil engineering—the Great Wall of China, the walls of Constantinople, the Roman limes—shaped countless conflicts throughout history and sometimes ensured that conflicts never happened. Other premodern builders oversaw civil works with equally important military implications, such as the Roman road network, or Caesar’s bridge across the Rhine, or the earthen ramp at Masada.

The material resources that these premodern materialists delivered to their armies and navies were not seen as the result of abstract enterprises like science and technology. The producers of these wonders were simply practicing what Lewis Mumford called “technics,” from the Greek techne.[12] Techne stood in roughly the same relation to episteme as modern technology stands to science. One was about doing, the other about knowing. One was learned in apprenticeship, the other was gained by the study of knowledge accumulated in a canon. But just as modern technology is more complex and more independent than mere “applied science,” so technics is subtler than simple craft knowledge. It is perhaps more helpful to think of premodern producers of military instruments as “improvers,” the generic term that Robert Friedel applies to all those people in the last millennium of Western history who have manipulated the material world in search of better ways to do whatever it is that people choose to do.[13]

In short, the tools of war have been evolving slowly throughout the course of human history, but only in the modern world has there been an institutionalized and rationalized mechanism for continuously and systematically innovating military technology. Some tantalizing hints from the ancient and classical world place our modern world in bold relief. One anonymous author from classical Greece offered the opinion that the only real utility for third-order equations was to compute the trajectory of ballistae.[14] In this instance, at least, military technology really was applied science. Dionysius I, tyrant of Syracuse in the early fourth century BCE, recruited knowers and doers from around the Mediterranean to work in his arsenal to develop new machines of war, perhaps the first instance of a research and development laboratory.[15] A Syrian engineer by the name of Kallinikos delivered “Greek fire” to the Byzantines in hopes that they would use it to defeat the Muslims.[16] The Byzantines made of this the only truly secret weapon of the ancient and medieval world. “Secret weapons” as we now think of them were an invention of early modern Europe.[17]

Kallinikos provides a fitting ending for this account. When he is mentioned at all in history books, he usually appears as a kind of “deus ex machina,” an unknown and unknowable historical actor who delivers a war-changing weapon that in turn changes the course of history. Such representations give rise to the belief that “technological determinism” is at work. But in all such instances, it is well to think of a door. Who opened it and who passed through? Who was Kallinikos? How did he come by this formula, and why did he offer it to the Byzantines? Why did the Byzantines take up his new weapon? And why did they treat it as a state secret? “Technological determinism” is a distraction, a rhetorical device that diverts attention from the real historical questions that surround the relationship between war and technology. That relationship is defining, subtle, and evolving. Now, more than ever, it drives innovation in warfare.

Notes

1. John Lewis Gaddis, The Long Peace: Inquiries into the History of the Cold War (New York: Oxford University Press, 1987).
2. On the historicity of Sun Tzu, see Sunzi, Ralph Sawyer, and Mei-chün Sawyer, The Art of War (Boulder: Westview Press, 1994).
3. Robert O’Connell, Of Arms and Men: A History of War, Weapons, and Aggression (New York: Oxford University Press, 1989), 8–9 et passim.
4. Alex Roland, “Was the Nuclear Arms Race Deterministic?” paper prepared for the 50th Anniversary Workshop of the Society for the History of Technology, Washington, DC, 18 Oct. 2007.
5. William H. McNeill, The Rise of the West: A History of the Human Community (Chicago: University of Chicago Press, 1991), 104–106.
6. See Arthur Cotterell, Chariot: The Astounding Rise and Fall of the World’s First War Machine (London: Pimlico, 2004), 67–68, 86; R. T. O’Callaghan, “New Light on the Maryannu as ‘Chariot Warrior,’” Jahrbuch für Kleinasiatische Forschung 1 (1950–51): 309–24.
7. Robert Drews, The End of the Bronze Age: Changes in Warfare and the Catastrophe ca. 1200 B.C. (Princeton: Princeton University Press, 1993), 209–14 and passim.
8. McNeill, Rise of the West, 117–18.
9. George Basalla, The Evolution of Technology (New York: Cambridge University Press, 1988).
10. Donald A. MacKenzie, Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance (Cambridge, MA: MIT Press, 1990).
11. And not just military technology. As Alfred North Whitehead observed many years ago, “the greatest invention of the nineteenth century was the invention of the method of invention.” Alfred North Whitehead, Science and the Modern World (New York: Free Press, [1925] 1953), 96.
12. Lewis Mumford, Technics and Civilization (New York: Harcourt, Brace, 1934).
13. Robert Friedel, A Culture of Improvement: Technology and the Western Millennium (Cambridge, MA: MIT Press, 2007).
14. J. G. Landels, Engineering in the Ancient World (Berkeley: University of California Press, 1978), 99–132.
15. Brian Caven, Dionysius I: War-lord of Sicily (New Haven: Yale University Press, 1990), 90–97.
16. J. Haldon and M. Byrne, “A Possible Solution to the Problem of Greek Fire,” Byzantinische Zeitschrift 70 (1977): 91–99.
17. Pamela Long and Alex Roland, “Military Secrecy in Antiquity and Early Medieval Europe: A Critical Reassessment,” History and Technology 11 (1994): 259–90.

[Alex Roland is professor of history at Duke University. This essay is based on his presentation at “Teaching the History of Innovation,” a two-day history institute for teachers held October 18-19, 2008.]