Thursday, January 31, 2013

Deceased--Patty Andrews

 Patty Andrews
February 16th, 1918 to January 30th, 2013


Maxene, Patty, LaVerne

Participants of the World War II era are slowly fading.



"Patty Andrews Dead: Last Surviving Member Of The Andrews Sisters Dies At 94"

by

Bob Thomas

January 30th, 2013

The Huffington Post

Patty Andrews, the last surviving member of the singing Andrews Sisters trio whose hits such as the rollicking "Boogie Woogie Bugle Boy of Company B" and the poignant "I Can Dream, Can't I?" captured the home-front spirit of World War II, died Wednesday. She was 94.

Andrews died of natural causes at her home in the Los Angeles suburb of Northridge, said family spokesman Alan Eichler in a statement.

Patty was the Andrews in the middle, the lead singer and chief clown, whose raucous jitterbugging delighted American servicemen abroad and audiences at home.

She could also deliver sentimental ballads like "I'll Be with You in Apple Blossom Time" with a sincerity that caused hardened GIs far from home to weep.

"When I was a kid, I only had two records and one of them was the Andrews Sisters. They were remarkable. Their sound, so pure," said Bette Midler, who had a hit cover of "Bugle Boy" in 1973. "Everything they did for our nation was more than we could have asked for. This is the last of the trio, and I hope the trumpets ushering (Patty) into heaven with her sisters are playing "Boogie Woogie Bugle Boy."

From the late 1930s through the 1940s, the Andrews Sisters produced one hit record after another, beginning with "Bei Mir Bist Du Schoen" in 1937 and continuing with "Beat Me Daddy, Eight to the Bar," "Rum and Coca-Cola" and more. They recorded more than 400 songs and sold over 80 million records, several of them going gold.

Other sisters, notably the Boswells, had become famous as singing acts, but mostly they huddled before a microphone in close harmony. The Andrews Sisters – LaVerne, Maxene and Patty – added a new dimension. During breaks in their singing, they cavorted about the stage in rhythm to the music.

Their voices combined with perfect synergy. As Patty remarked in 1971: "There were just three girls in the family. LaVerne had a very low voice. Maxene's was kind of high, and I was between. It was like God had given us voices to fit our parts."

Kathy Daris of the singing Lennon Sisters recalled on Facebook late Wednesday that the Andrews Sisters "were the first singing sister act that we tried to copy. We loved their rendition of songs, their high spirit, their fabulous harmony."

The Andrews Sisters' rise coincided with the advent of swing music, and their style fit perfectly into the new craze. They aimed at reproducing the sound of three harmonizing trumpets.

"I was listening to Benny Goodman and to all the bands,"
Patty once remarked. "I was into the feel, so that would go into my own musical ability. I was into swing. I loved the brass section."

Unlike other singing acts, the sisters recorded with popular bands of the '40s, fitting neatly into the styles of Benny Goodman, Glenn Miller, Jimmy Dorsey, Bob Crosby, Woody Herman, Guy Lombardo, Desi Arnaz and Russ Morgan. They sang dozens of songs on records with Bing Crosby, including the million-seller "Don't Fence Me In." They also recorded with Dick Haymes, Carmen Miranda, Danny Kaye, Al Jolson, Jimmy Durante and Red Foley.

The Andrews' popularity led to a contract with Universal Pictures, where they made a dozen low-budget musical comedies between 1940 and 1944. In 1947, they appeared in "The Road to Rio" with Bing Crosby, Bob Hope and Dorothy Lamour.

The trio continued until LaVerne's death in 1967. By that time the close harmony had turned to discord, and the sisters had been openly feuding.

Midler's cover of "Bugle Boy" revived interest in the trio. The two survivors joined in 1974 for a Broadway show, "Over Here!" It ran for more than a year, but disputes with the producers led to the cancellation of the national tour of the show, and the sisters did not perform together again.

Patty continued on her own, finding success in Las Vegas and on TV variety shows. Her sister Maxene also toured solo until her death in 1995.

Her father, Peter Andrews, was a Greek immigrant who anglicized his name of Andreus when he arrived in America; his wife, Olga, was a Norwegian with a love of music. LaVerne was born in 1911, Maxine (later Maxene) in 1916, Patricia (later Patty, sometimes Patti) in 1918.

All three sisters were born and raised in the Minneapolis area, spending summers in Mound, Minn., on the western shores of Lake Minnetonka, about 20 miles west of Minneapolis.

Listening to the Boswell Sisters on radio, LaVerne played the piano and taught her sisters to sing in harmony; neither Maxene nor Patty ever learned to read music. All three studied the singers at the vaudeville house near their father's restaurant. As their skills developed, they moved from amateur shows to vaudeville and singing with bands.

After Peter Andrews moved the family to New York in 1937, his wife, Olga, sought singing dates for the girls. They were often turned down with comments such as: "They sing too loud and they move too much." Olga persisted, and the sisters sang on radio with a hotel band at $15 a week. The broadcasts landed them a contract with Decca Records.

They recorded a few songs, and then came "Bei Mir Bist Du Schoen," an old Yiddish song for which Sammy Cahn and Saul Chaplin wrote English lyrics. (The title means, "To Me You Are Beautiful.") It was a smash hit, and the Andrews Sisters were launched into the big time.

Their only disappointment was the movies. Universal was a penny-pinching studio that ground out product to fit the lower half of a double bill. The sisters were seldom involved in the plots, being used for musical interludes in films with titles such as "Private Buckaroo," "Swingtime Johnny" and "Moonlight and Cactus."

Their only hit was "Buck Privates," which made stars of Abbott and Costello and included the trio's blockbuster "Boogie Woogie Bugle Boy of Company B."

In 1947, Patty married Martin Melcher, an agent who represented the sisters as well as Doris Day, then at the beginning of her film career. Patty divorced Melcher in 1949 and soon he became Day's husband, manager and producer.

Patty married Walter Weschler, pianist for the sisters, in 1952. He became their manager and demanded more pay for himself and for Patty. The two other sisters rebelled, and their differences with Patty became public. Lawsuits were filed between the two camps.

"We had been together nearly all our lives,"
Patty explained in 1971. "Then in one year our dream world ended. Our mother died and then our father. All three of us were upset, and we were at each other's throats all the time."

Patty Andrews is survived by her foster daughter, Pam DuBois, a niece and several cousins. Weschler died in 2010.


"Patty Andrews, last surviving member of The Andrews Sisters, dead at 94"

Dominant 'girl group' had more than 90 chart hits themselves and two dozen more with Bing Crosby.

by

David Hinckley

January 31st, 2013

NEW YORK DAILY NEWS

Patty Andrews, the last surviving member of the greatest "girl group" of all time, died Wednesday at her home in Los Angeles. She was 94.

The Andrews Sisters, who included Patty, Maxene and Laverne, were the dominant female vocal group of the mid-20th century, scoring more than 90 chart hits themselves and two dozen more with their frequent singing partner, Bing Crosby.

Their close harmony style influenced dozens of subsequent groups and singers, from the McGuire Sisters and the Pointer Sisters to En Vogue, Bette Midler and Christina Aguilera.

Their songs became standards of the era, often associated with World War II.

"Boogie Woogie Bugle Boy," later recorded by Midler and others, was a wartime classic. Their other most popular songs included "Bie Mir Bist Du Schon," "Rum and Coca Cola," "Ferryboat Serenade,"

"Shoo Shoo Baby," "I Can Dream, Can't I?" and "I Wanna Be Loved."

Their No. 1 hits with Crosby included "Don't Fence Me In" and "Hot Time In the Old Town of Berlin."

They sold an estimated 100 million records over their career, and were noted for their versatility.

They could sing close harmony ballads, but also country-style tunes, jazz and swing. "Boogie Woogie Bugle Boy" is widely considered one of the first songs in the style that after World War II would be known as rhythm and blues.

 The sisters also appeared in 17 movies, many of them low-budget musicals, but also including "The Road to Rio."

All three sisters were born to an immigrant family in Mound, Minn. They formed a singing group when Laverne was 14, Maxene was 9 and Patty 7, with Patty as the lead singer.

Their early models included the popular Boswell Sisters, and their successful shows at local theaters eventually propelled them to greater success on the road.

They became radio, stage and touring fixtures during the War, and continued performing together until 1951, when Patty left for a solo career without notifying Maxene or Laverne.

This led to a bitter two-year battle that also involved their parents' estate, and reflected the fact the sisters had not always gotten along as well off-stage as they did while performing.

They reunited in 1956 and performed together for the next decade. Their final date together came on the "Dean Martin Show," Sept. 27, 1966. Laverne died of cancer the following year.

Maxene and Patty performed together for a while after Laverne's death, but then split for good in the early 1970s. Maxene continued as a solo artist until her death in 1995 and Patty also performed as a soloist for many years.

They only had a few brief reunions over the years, and Maxene said shortly before her death that her estrangement from Patty was one of her few regrets.

Patty Andrews was married to Marty Melcher for two years, 1947-1949. She married the group's pianist, Walter Weschler, in 1951, and they remained married until his death in 2010.


"Patty Andrews, Singer With Her Sisters, Is Dead at 94"

by

Robert Berkvist

January 30th, 2013

The New York Times

Patty Andrews, the last of the Andrews Sisters, the jaunty vocal trio whose immensely popular music became part of the patriotic fabric of World War II America, died on Wednesday at her home in Los Angeles. She was 94.

Lynda Wells, a niece, confirmed the death.

With their jazzy renditions of songs like “Boogie Woogie Bugle Boy (of Company B),” “Rum and Coca-Cola” and “Don’t Sit Under the Apple Tree (With Anyone Else but Me),” Patty, Maxene and LaVerne Andrews sold war bonds, boosted morale on the home front, performed with Bing Crosby and with the Glenn Miller Orchestra, made movies and entertained thousands of American troops overseas, for whom the women represented the loves and the land the troops had left behind.

Patty, the youngest, was a soprano and sang lead; Maxene handled the high harmony; and LaVerne, the oldest, took the low notes. They began singing together as children; by the time they were teenagers they made up an accomplished vocal group. Modeling their act on the commercially successful Boswell Sisters, they joined a traveling revue and sang at county fairs and in vaudeville shows. Their big break came in 1937 when they were signed by Decca Records, but their first recording went nowhere.

Their second effort featured the popular standard “Nice Work If You Can Get It,” but it was the flip side that turned out to be pure gold. The song was a Yiddish show tune, “Bei Mir Bist Du Schön (Means That You’re Grand),” with new English lyrics by Sammy Cahn, and the Andrews Sisters’ version, recorded in 1937, became the top-selling record in the country.

Other hits followed, and in 1940 they were signed by Universal Pictures. They appeared in more than a dozen films during the next seven years — sometimes just singing, sometimes also acting. They made their film debut in “Argentine Nights,” a 1940 comedy that starred the Ritz Brothers, and the next year appeared in three films with Bud Abbott and Lou Costello: “Buck Privates,” “In the Navy” and “Hold That Ghost.” Their film credits also include “Swingtime Johnny” (1943), “Hollywood Canteen” (1944) and the Bob Hope-Bing Crosby comedy “Road to Rio” (1947).

After selling more than 75 million records, the Andrews Sisters broke up in 1953 when Patty decided to go solo. By 1956 they were together again, but musical tastes were changing and they found it hard to adapt. When LaVerne Andrews died of cancer in 1967, no suitable replacement could be found, and Patty and Maxene soon went their separate ways. Patty continued to perform solo, and Maxene joined the staff of a private college in South Lake Tahoe, Calif.

Patricia Marie Andrews was born on Feb. 16, 1918, in Minneapolis. Her father, Peter, was a Greek immigrant who changed his name from Andreos to Andrews when he came to America. Her mother, Olga, was Norwegian.

Like her older sisters, Patty learned to love music as a child (she also became a good tap dancer), and she did not have to be persuaded when Maxene suggested that the sisters form a trio in 1932. She was 14 when they began to perform in public.

As their fame and fortune grew, the sisters came to realize that the public saw them as an entity, not as individuals. In a 1974 interview with The New York Times, Patty explained what that was like: “When our fans used to see one of us, they’d always ask, ‘Where are your sisters?’ Every time we got an award, it was just one award for the three of us.” This could be irritating, she said with a touch of exasperation: “We’re not glued together.”

The Andrews Sisters re-entered the limelight in the early 1970s when Bette Midler released her own recording of “Boogie Woogie Bugle Boy,” modeled closely on theirs. It reached the Top 10, and its success led to several new compilations of the Andrews Sisters’ own hits.

The previous year, Patty Andrews had appeared in a West Coast musical called “Victory Canteen,” set during World War II. When the show was rewritten for Broadway and renamed “Over Here!,” the producers decided that the Andrews Sisters were the only logical choice for the leads. They hired Patty and lured Maxene back into show business as well. The show opened in March 1974 and was the sisters’ belated Broadway debut. It was also the last time they sang together.

The sisters got into a bitter money dispute with the producers and with each other, leading to the show’s closing in January 1975 and the cancellation of plans for a national tour. After that, the sisters pursued solo careers into the 1990s. They never reconciled and were still estranged when Maxene Andrews died in 1995.

Patty Andrews’s first marriage, to the movie producer Marty Melcher, lasted two years and ended in divorce in 1949. (Mr. Melcher later married Doris Day.) In 1951 she married Wally Weschler, who had been the sisters’ pianist and conductor and who later became her manager. They had no children. Mr. Weschler died in 2010. Ms. Andrews is survived by her foster daughter, Pam DuBois.

A final salute to the Andrews Sisters came in 1991 in the form of “Company B,” a ballet by the choreographer Paul Taylor subtitled “Songs Sung by the Andrews Sisters.” The work, which featured nine of the trio’s most popular songs, including “Rum and Coca-Cola” and, of course, “Boogie Woogie Bugle Boy,” underscored the enduring appeal of the three sisters from Minneapolis. 


 Boogie Woogie Bugle Boy Of Company B


 I'll Be With You In Apple Blossom Time


In The Mood


Rum and Coca Cola 


The Andrews Sisters [Wikipedia]
 
Boogie Woogie Bugle Boy [Wikipedia]

Stolen petroglyphs from Eastern Sierra recovered



"Petroglyphs stolen from sacred eastern Sierra site recovered"

January 31st, 2013

Los Angeles Times

Petroglyph panels cut and chiseled off an eastern Sierra rock art site sacred to Native Americans have been recovered by federal investigators, U.S. Bureau of Land Management officials announced Thursday.

The suspected thieves have not been identified and the investigation is continuing into one of the worst acts of vandalism ever committed on the 750,000 acres of public land managed by the BLM field office in Bishop.

“Now, the healing can begin,” BLM Field Office Manager Bernadette Lovato said in an interview. “Recovery was a priority for me, and the public outrage intensified the need for them to be returned.”

Lovato declined to disclose details about the discovery, except to say, “We found all five panels by following an anonymous tip sent to us in a letter.”

“The panels are currently being held as evidence,” she said. “After a prosecution, perhaps they may eventually be put on public display somehow, but that will be up to Paiute-Shoshone tribal leaders.”

“It feels real good to have them come back home,” Paiute tribal historic preservation officer Raymond Andrews said in an interview.

Investigators believe the vandals used ladders, chisels, electric generators and power saws to remove the panels from cliffs in an arid high-desert region known as Volcanic Tablelands, about 15 miles north of Bishop. The thieves gouged holes in the rock and sheared off slabs that were up to 15 feet above ground and two feet high and wide.

The desecration was reported to the BLM on Oct. 31 by visitors to the area held sacred by Native Americans whose ancestors carved hundreds of lava boulders and cliffs with spiritual renderings: concentric circles, deer, rattlesnakes, bighorn sheep and hunters with bows and arrows.

The site, which is still used by the local Paiute for ceremonies, is protected under the Archaeological Resources Protection Act and is listed in the National Register of Historic Places. Authorities said the petroglyphs were not worth a great deal on the illicit market, probably $500 to $1,500 each.

But they are priceless to Native Americans, who regard the massive tableaux as a window into the souls of their ancestors.

There is a $9,000 reward for information leading to the arrest and conviction of the thieves. Damaging or removing the petroglyphs is a felony. First-time offenders can be imprisoned for up to one year and fined as much as $20,000, authorities said.

Second-time offenders can be fined up to $100,000 and imprisoned up to five years.

Ban double majors...is this really an issue?


"Should Colleges Ban Double Majors?"

by

Kayla Webley

January 31st, 2013

Time

Tucked in a list of suggested reforms issued last week for how U.S. colleges could increase graduation rates is a recommendation that schools “narrow student choice” in order to promote completion. It’s an interesting idea — one that seems to go against the notion of college as a place to explore options and experiment with courses in divergent fields — that is all the more curious since it is included in an open letter from the nation’s six leading higher-education organizations.

“Sometimes we create a culture of dancing for more years than you have to, rather than getting out the door,” said Gordon Gee, president of Ohio State University and chairman of the National Commission on Higher Education Attainment, which issued the Jan. 24 letter. “I think institutions have a responsibility to reset that balance.”

The recommendation, along with the commission’s less surprising exhortations to create more flexible schedules and make it easier for students to transfer credits from one institution to another, is being put forth in an effort to improve the country’s dismally low college completion rate: just 58% of students who enroll in bachelor’s degree programs at four-year institutions graduate within six years and only 30% of students who enroll in certificate or associate’s degree programs at two-year institutions complete their degree within three years. But are double majors or an overabundance of academic options a big reason why so many students are getting off track?

The commission’s letter highlights a recommendation made by a task force at the University of Texas at Austin to bar students there from majoring in two subjects unless they can complete all the coursework requirements in four years. A second example cited in the letter details how Tennessee’s state technical schools are giving students fewer choices about which classes they can take to get a particular degree. The technical schools also mandated that students finish their degree within a fixed period of time. With these changes in place, many more students ended up completing degrees and in much less time, which is why the state board of regents has decided to bring the model to Tennessee’s 19 community colleges as well.

“We are hoping to achieve a sea change in our student culture (from ‘more is more’ to ‘on time’),” the UT-Austin task force said in a report issued last February. “Students who do not complete a bachelor’s degree by the beginning of their ninth long semester will be ineligible for dual degree, double major, or certificate programs: Only one degree (and one major) will be awarded — even if the student successfully completes the requirements for additional degrees, majors, programs.”

UT-Austin has not yet restricted double majors, but the mere suggestion is noteworthy at a time when it is increasingly common for students to pursue multiple majors. According to the U.S. Department of Education, the percentage of students who double-major jumped 96% between 2000 and 2008, the most recent year for which data is available. And although the total number of double majors is still small, accounting for only 5.5% of all undergrads in 2008, as many as 40% of students at some colleges are pursuing more than one major, according to a forthcoming report titled Double Majors: Influences, Identities & Impacts from the Curb Center for Art, Enterprise and Public Policy at Vanderbilt University. The trend is particularly evident at selective schools. “Students are seeking a competitive advantage in the job market,” said Steven Tepper, associate director of the Curb Center and co-author of the study. “Many double-major students feel it is not enough to have a college degree — they need to further distinguish and differentiate themselves.”

Whether a second major actually makes a student more attractive to an employer is unclear — little data exists on the subject — but either way, some in the higher-education community are beginning to question whether schools ought to push students in a different direction. “I don’t know if I would go as far as to say we should restrict students, but I think there has to be some work at the institutional level to make sure students understand why they plan on double-majoring and whether it’s really a smart idea,” said Michelle Asha Cooper, president of the Institute for Higher Education Policy, which was not one of the organizations that signed the open letter. “They might not understand that it may delay graduation, may cost additional money and may give them an extra heavy course load. We need to make sure someone in the institution has had a conversation with them to make sure they understand the pros and cons.”

Even Tepper, who says double-majoring has a “huge positive impact” for some students, thinks college administrators need to be more involved. “Universities allow double majors to happen and have stood by while this trend has,” he said. “Institutions should help their graduates be more intentional about their choices.”

At the same time, however, Tepper says his research does not suggest that students who double-major are more likely to drop out of college. He also found that having an additional major increases the time it takes to earn a degree only slightly, if at all. Of the 1,700 students surveyed for the report, 14% of double majors who were currently in their fourth year of college said they planned to be enrolled the following year compared to 12% of single majors. “[Many students] have been packing their resumes since eighth grade and so they know how to balance multiple commitments,” Tepper said. At UT-Austin, where the suggestion was made to eliminate double majors, students with an additional major are actually more likely to graduate. The task force report showed 69% of students with double majors graduate in four years as opposed to 60% of single majors. (To be sure, this does not suggest colleges should push more students into a second major as a way to increase college completion, but rather that those students who choose to double-major may be more motivated to begin with.) Either way, the task force noted, “Taking a double major does not slow time-to-degree.”

So why, then, are prominent figures in the higher-education community promoting the idea of narrowing student choice?

“I’m not sure that the word ‘narrow’ is quite the right word, it’s clarity that we’re really trying to achieve,” said Gee after embarking on a media tour to promote the letter. “I believe very strongly in the liberal arts education. We don’t want to take away those options. We want to provide clarity to students for how they can get through the system much faster — that would be the way that I would put it.”

Destroying your own cultural heritage?...Timbuktu's collection of Islamic manuscripts


"Did Timbuktu's Priceless Archives Really Just Get 'Wiped Out'?"

by

Frud Bezhan

January 30th, 2013

The Atlantic

For centuries, Timbuktu has been home to one of the largest collections of ancient Islamic manuscripts in the world. Now there are fears that those priceless treasures may have been destroyed by fleeing Islamist militants. Below is an interview with Mauro Nobili, a researcher at the University of Cape Town's Timbuktu Manuscripts Project, about the significance of the collection and the possible loss.

What is the latest news coming out from Mali about the reported destruction of Timbuktu's prized collection of Islamic manuscripts?

The news has not been confirmed. We have seen images coming from Timbuktu, but for eight days we have not been able to reach our contacts in Timbuktu. Even a group of researchers from the Ahmed Baba Institute, which has allegedly been destroyed, are in Bamako. We have been able to talk to them, but they have no news about what happened in Timbuktu. We've seen the images that, of course, show some documents have been destroyed. But nobody -- in terms of people who have worked there -- [has] been able to go inside, have a look at the damage, and report anything. Even in the SkyNews video of a journalist entering the center, the man presented as somebody who works at the Ahmed Baba Institute is actually only a tourist guide.

There is contrasting information. Yesterday, we read in the news that manuscripts had been burned, while this morning local news says that the bulk of the manuscripts have been taken out of the center. The situation is very confusing.

Many have described the reported destruction of the ancient collection as a tragedy. What did the sites in Timbuktu contain and how significant would their loss be?

If we have confirmation of the destruction of the manuscripts, it means that a huge fragment of West African history would have been wiped out. The Ahmed Baba Institute hosts -- or at least it did host -- at least 20,000 ancient manuscripts that, according to the most recent estimations, account for one-fifth of all the documents in the Timbuktu area. So, of course, if confirmed it would be a disaster.

Only a few manuscripts have precise dates. Since many of them haven't been studied properly in terms of paper and ink analysis, it's not easy to date them correctly. But people claim that some of these manuscripts go back to the 14th and 15th centuries, even the 13th century. But those that are dated rarely go back further than the 17th [or] 18th century. Every kind of topic is preserved at the Ahmed Baba Institute -- from local histories, global histories, [and] masterpieces of Islamic literature to documents in terms of legal documents, trade documents, and also private correspondences between rulers. The documentation is very diverse. Every kind of documentation you can imagine is represented there.

From the information that you have, do you know whether any Koranic texts have been destroyed by the rebels, as has been reported?

It would honestly be absurd. There might be some documents that for the topics, like Sufism and so on, might be not very [religiously significant] from the point of view of the rebels. But I think it's very, very unlikely to see them burning Korans or even most of the literature that is hosted in the Ahmed Baba Institute. As far as I know, yes [this is the first time militants have targeted ancient sites at Timbuktu]. And even the Ansar Dine [a breakaway faction of Mali's Islamists], who occupied Timbuktu, they have shown pictures of them preserving the manuscripts, showing that in some way they are -- at least they wanted to appear as concerned about -- the preservation of the manuscripts. So they wanted to portray themselves as even the custodians of the manuscripts themselves. At least until one week ago.

The manuscripts survived for centuries in Timbuktu. What measures have the government or the local residents in Timbuktu taken to protect the ancient manuscripts? There have been reports that some collectors have hidden manuscripts in wooden trunks [or] buried them in boxes in the desert or in caves.

[That is why] the institute was built, after cooperation with the South African governmental institutions. The new Ahmed Baba Institute was built according to international standards of preservation, and there were workshops for preserving the manuscripts. And people from Timbuktu used to come to South Africa to be trained in order to improve their skills in preservation. But some of the manuscripts, especially those in private collections, are preserved in a very, very bad condition. In terms of manuscripts, or manuscripts' heritage, that is a phenomenon that only started during the 1990s. There wasn't much attention on West African literary heritage, especially because there was this stereotype that Africa has no written culture, Africa is made up of countries with an oral culture. But since the 1990s a lot of money, projects, and a lot of people from all around the world started to pay attention to Timbuktu.

What other places of cultural heritage have the Malian rebels targeted besides the ancient collections of Timbuktu? Are there other cases in African countries where Islamic cultural sites have been destroyed?

They have destroyed the [Sufi] tombs in Timbuktu -- [in] many of the sacred places of, let's say, saints in Timbuktu. [Religious extremists have also presented] problems in Somalia in a similar phenomenon; [and] also in Libya, after the fall of Colonel [Muammar] Qaddafi; attempts to destroy the tombs in Tunisia. So it seems like everywhere you have these kind of Salafi-oriented movements -- let's say fundamentalist movements -- especially tombs are actually under threat of being destroyed.

The tombs of very important saints -- even if the term is not very appropriate in an Islamic context -- often become sites for something like local pilgrimages by people who venerate these saints. And this is one of the problems with, let's say, the fundamentalists -- this idea that venerating the saints will distract people from venerating God.

Monday, January 28, 2013

Mining other worlds




The RASSOR robot climbs a hill during recent testing at NASA's Kennedy Space Center in Florida.

"Engineers Building Hard-working Mining Robot"

by

Steven Siceloff

January 29th, 2013

Kennedy Space Center

After decades of designing and operating robots full of scientific gear to study other worlds, NASA is working on a prototype that leaves the delicate instruments at home in exchange for a sturdy pair of diggers and the reliability and strength to work all day, every day for years.

Think of it as a blue collar robot.

Dubbed RASSOR, for Regolith Advanced Surface Systems Operations Robot and pronounced "razor," the autonomous machine is far from space-ready, but the earliest design has shown engineers the broad strokes of what their lunar soil excavator needs in order to operate reliably.

"We were surprised at what we could do with it," said Rachel Cox, a Kennedy Space Center engineer on the RASSOR team.

The primary challenge for any digging robot operating off Earth is that it has to be light and small enough to fly on a rocket, but heavy enough to operate in gravity lower than that of Earth.

"The lighter you make your robot, the more difficult it is to do this excavating," said A.J. Nick, an engineer on the RASSOR team.

RASSOR tackles this problem by using digging bucket drums at each end of the robot's body that rotate in opposite directions, giving enough traction on one end to let the opposite side dig into the soil.

The team built a weight off-loading harness that simulated working the rover in the moon's 1/6th gravity field.

"We proved that if you engage one bucket, it pulls itself but when you lower the other bucket and rotate it, once they both catch in, it starts digging," Nick said.

Another secret of the drum, inspired by a previous Lockheed Martin design, is the staggered shallow scoops that shave the soil a bit at a time rather than scoop large chunks of it all at once, the way bulldozers do on Earth.

A concept mission for RASSOR would have a 2000 pound payload in addition to the lander, which would be about the size of the Phoenix lander NASA sent to Mars. The RASSOR is expected to weigh about 100 pounds. The remaining payload would be used to process the lunar soil delivered by RASSOR.

The RASSOR looks like a small tank chassis with a drum at either end, each attached with arms. The drums are perhaps the robot's most innovative feature. Because they are mounted on moving arms, they can act almost as legs letting the robot step and climb over obstacles.

The team calls such moves "acrobatics." They point out that the robot can safely drive off the lander and right itself, flip itself over to get unstuck from fine soil and lift the whole body off the ground to let its treads run smoothly to remove built up soil. RASSOR is designed to easily make itself into a Z-shaped position to drop its soil collection into the hopper.

With the drums positioned above the main body of the robot, it stands about 2 1/2 feet tall.

The robot is designed to skim lunar soil and dump it into a device that would pull water and ice out of the dirt and turn their chemicals into rocket fuel or breathing air for astronauts working on the surface. The device would be part of the lander that carries the RASSOR to the moon's surface. So the robot would be the feeder for a lunar resource processing plant, a level of industry never before tried anywhere besides Earth.

Producing water and fuel from the lunar soil would save the tremendous expense of launching the supplies from Earth, since 90 percent of a rocket's mass normally consists of propellant, which can be made on the moon.

"This has been kind of the dream, the mission they gear this around," Nick said.

The concept could work on Mars, too, since its soil also is suspected of holding large amounts of water ice.

"There are some areas at the poles where they think there's a lot of ice, so you'd be digging in ice," Nick said. "There's other areas where the water is actually 30 centimeters down so you actually have to dig down 30 centimeters and take off the top and that depth is really where you want to start collecting water ice."

But in order to provide enough material to the production platform to create usable amounts of resources, the RASSOR would need to operate about 16 hours a day for five years.

It would drive five times faster than the Mars Curiosity rover's top speed of 4 centimeters per second, then shave the moon's surface with a pair of rotating drums and return to the resource processing plant with some 40 pounds of lunar soil for processing.
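As a rough sanity check on those requirements, here is a back-of-envelope sketch that uses only the figures quoted in this article (the number of trips per day is not stated, so the total amount of soil delivered is deliberately left out):

```python
# Back-of-envelope check using only figures quoted in the article above.
# Any quantity not stated in the article (e.g., trips per day) is omitted.

CURIOSITY_TOP_SPEED_CM_S = 4                         # Curiosity's top speed, per the article
RASSOR_SPEED_CM_S = 5 * CURIOSITY_TOP_SPEED_CM_S     # "five times faster" -> 20 cm/s

HOURS_PER_DAY = 16                                   # required daily operation
MISSION_YEARS = 5                                    # required mission length
total_hours = HOURS_PER_DAY * 365 * MISSION_YEARS    # roughly 29,200 hours of work

print(f"Implied RASSOR driving speed: {RASSOR_SPEED_CM_S} cm/s")
print(f"Cumulative operating time over the mission: {total_hours:,} hours")
```

Those two numbers, about 20 centimeters per second and nearly 30,000 hours of operation, are what drive the article's emphasis on ruggedness over scientific instrumentation.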

"Right now, we just want to make sure nothing in our design precludes it from doing that," said Jason Schuler, one of the engineers on the RASSOR project.

Devising a robot for such demands called for numerous innovations, and the team says it has at least one major decision to make before it begins construction of the second generation RASSOR prototype: keep going with tracks like those that tanks use, or switch to wheels.

The tracks showed some flaws in recent testing, mostly relating to the pebbles and sand particles clogging the gears and making the track slip off. The group tried out RASSOR on several surfaces at Kennedy, including the crushed river rock dug up from the crawlerway.

The rock, even though pulverized by the gigantic crawlers, is not a great substitute for lunar soil, the engineers said, but as long as the robot handles that matter well, they say they know it will manage whatever the moon soil offers.

"The mobility was definitely a challenge," Schuler said. "You can't take for granted that it's going to work like it does on grass or concrete or even on sand."

Part of the problem, the engineers said, might be the rubber material the tracks are made of, but a lunar version of the robot would use a different material, possibly metallic. For example, the lunar rover the astronauts drove on the surface used wheels made of stainless steel springs rather than rubber.

"We are studying if we want to invest the time and make a more robust track system or if we want to go to wheels," Cox said.

A 25-foot-square area has been cleared in part of the engineers' workshop to make room for a large area of imitation lunar soil that will allow the robot to be tested in material close to what it will face on the moon.

The team already is designing RASSOR 2, a prototype that would be much closer to what NASA could launch in the future. It's expected to begin testing in early 2014.

Still debating E=mc2


"More than one brain behind E=mc2"

January 28th, 2013

SPACEDAILY

Two American physicists outline the role played by Austrian physicist Friedrich Hasenohrl in establishing the proportionality between the energy (E) of a quantity of matter and its mass (m) in a cavity filled with radiation.

In a paper about to be published in EPJ H, Stephen Boughn from Haverford College in Pennsylvania and Tony Rothman from Princeton University in New Jersey argue that Hasenohrl's work, for which he now receives little credit, may have contributed to the famous equation E=mc2.

According to science philosopher Thomas Kuhn, scientific progress occurs through paradigm shifts, which depend on the cultural and historical circumstances of groups of scientists. Concurring with this idea, the authors believe the notion that mass and energy should be related did not originate solely with Hasenohrl.

Nor did it suddenly emerge in 1905, when Einstein published his paper, as popular mythology would have it. Given the lack of recognition for Hasenohrl's contribution, the authors examined the Austrian physicist's original work on blackbody radiation in a cavity with perfectly reflective walls.

This study seeks to identify how the blackbody's mass changes when the cavity is moving relative to the observer.

They then explored the reason why the Austrian physicist arrived at an energy/mass correlation with the wrong factor, namely at the equation: E = (3/8) mc2.

Hasenohrl's error, they believe, stems from failing to account for the mass lost by the blackbody while radiating.

Before Hasenohrl focused on cavity radiation, other physicists, including French mathematician Henri Poincare and German physicist Max Abraham, showed the existence of an inertial mass associated with electromagnetic energy.

In 1905, Einstein gave the correct relationship between inertial mass and electromagnetic energy, E=mc2. Nevertheless, it was not until 1911 that German physicist Max von Laue generalised it to include all forms of energy.
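For readers who want the two results side by side, here is a minimal restatement in LaTeX of the relations quoted above; it adds no new physics, only Hasenohrl's version rearranged to show the apparent mass it assigns to radiation of energy E:

```latex
% Hasenohrl's cavity-radiation result (the "wrong factor" discussed above),
% and the same relation solved for the apparent mass:
\[
  E = \frac{3}{8} m c^{2}
  \quad\Longleftrightarrow\quad
  m = \frac{8}{3} \frac{E}{c^{2}}
\]

% Einstein's 1905 relation, later generalised by von Laue to all forms of energy:
\[
  E = m c^{2}
\]
```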


Friedrich Hasenohrl [Wikipedia]


E=mc2...Einstein's idea?

"Einstein's Mistakes"--Ohanian's book on the market

Sunday, January 27, 2013

Bill Speare had a pox on his penis?--EGADS!



"Did Shakespeare Have Syphilis?"
by

Jeffrey Brown and Jason Kane

January 25th, 2013

PBS NewsHour

Writing a book is "a horrible, exhausting struggle, like a long bout of some painful illness," George Orwell once said. The literary giant behind "1984" and "Animal Farm" was comparing his life's work to the many illnesses that plagued him from childhood to death.

And though William Shakespeare, Herman Melville and Emily Bronte may not have tied their creativity to poor health quite so explicitly, there's plenty of evidence that disease -- everything from tuberculosis to syphilis and mercury poisoning -- profoundly impacted works like "Moby Dick," "Wuthering Heights" and "Hamlet."

In a new book, "Shakespeare's Tremor and Orwell's Cough: The Medical Lives of Famous Writers," Dr. John J. Ross of Boston's Brigham and Women's Hospital looks at how disease and mood disorder may have infected the lives, creativity and words of some of the world's most beloved authors.

Ross recently sat down with Jeffrey Brown to discuss the book, including what Shakespeare's handwriting might reveal about syphilis, how the Brontë family's fatal collision with a Victorian plague made an appearance in "Jane Eyre," and old medical practices like the use of "mummy," or ground-up human flesh and bone. They also explore why readers today should even care about these old illnesses -- especially when many of them could be treated so much more effectively today. Watch the full interview above.

To get a little more of the book's flavor, keep reading. Ross summarizes some of the more intriguing cases below.

Eight Ways Disease May Have Infected Your Favorite Books, According to Dr. John J. Ross

1. William Shakespeare

The only medical fact known about Shakespeare with certainty is that his final signatures show a marked tremor. Compared to other Elizabethans, Shakespeare had an unhealthy obsession with syphilis. D. H. Lawrence wrote, "I am convinced that some of Shakespeare's horror and despair, in his tragedies, arose from the shock of his consciousness of syphilis."

According to contemporary gossip, Shakespeare was not only notoriously promiscuous, but was also part of a love triangle in which all three parties contracted venereal disease. The standard Elizabethan treatment for syphilis was mercury; as the saying goes, "a night with Venus, a lifetime with Mercury." Mercury's more alarming adverse effects include drooling, gum disease, personality changes, and tremor. (In the eighteenth century, mercury was used in the manufacture of felt hats, leading to the expressions "hatter's shakes" and "mad as a hatter"). Did Shakespeare's writing career end prematurely due to side effects of mercury treatment?

2. John Milton


Milton's greatest poem, "Paradise Lost," was written, or rather dictated, after he had become completely blind. His blindness was probably the result of retinal detachment from severe myopia. Milton is known to have "dabbled in physic," or taken popular medicines of the day in a failed attempt to save his eyesight. These may have included "mummy" (ground-up human bones or flesh), human sweat, cat-ointment, oil of puppies, and sugar of lead, the last of which may have led to Milton's gout and kidney failure.

3. The Brontës

Biographers have blamed the deaths of the Brontë sisters on everything from anorexia nervosa to the work of a fiendish serial killer. In reality, all six of the Brontë siblings died of tuberculosis, a Victorian plague that killed off 1 percent of the English population per year. TB entered the Brontë household after the older girls were infected at the Clergy Daughter's School. This was the place made infamous by Charlotte as the brutal Lowood School in Jane Eyre, where the girls were beaten, starved, and terrorized by tales of hellfire and damnation.

Although the consumptive artist is a tired cliché, there may be some truth in it. The immune system is weakened by emotional turmoil, of which the Brontës had plenty. Charlotte, Emily, and Anne suffered from major depression. Brother Branwell had bipolar disorder and alcoholism. Emily, brainy and strange, probably also had Asperger's syndrome and social anxiety disorder.

4. Herman Melville


Melville probably suffered from bipolar disorder, an illness that affected many of his works. More youthful works, such as "Moby Dick," were written in a near-manic state of exhilaration. He became prone to depression and alcoholism as he aged, perhaps explaining some of the gloom of his later works, such as "The Confidence Man." Melville also had prominent physical symptoms, including disabling attacks of eye pain and low back pain, a rigid spine, and loss of height. Biographers have generally assumed that he suffered from psychosomatic illness, but his symptoms are more consistent with the autoimmune disease ankylosing spondylitis.

5. Jack London


Jack London's manic self-confidence made him the first millionaire author in history, but it also proved to be his undoing. Jack indulged in reckless adventures while in the grip of bipolar disorder, surviving scurvy in the Yukon, alcoholic binges, and a suicide attempt in the waters of San Francisco Bay. When he developed leg ulcers from the tropical disease yaws on a disastrous sailing cruise in the Pacific, he ruined his kidneys by self-medicating with mercury. After returning home, he continued his amateur doctoring, using a silver hypodermic needle to inject himself with opium, heroin, belladonna, and strychnine. He died of a morphine overdose at 40, probably by accident rather than design.

6. William Butler Yeats

Yeats probably had Asperger's syndrome. His awkwardness and eccentricity were legendary. He was fascinated with the occult, lost his virginity in middle age, and according to his wife, "had no interest in people as such, only in what they said or did." Like many persons with Asperger's, he had prosopagnosia, the inability to recognize faces. He could not identify his own daughter, and once inadvertently lectured T. S. Eliot on the many defects of Eliot's poetry.

Yeats nearly died of brucellosis in 1929, but recovered thanks to shots of arsenic and horse serum. His brush with mortality inspired a quest to revive his sex drive, leading to a weird "rejuvenation" surgery. Gossips called him the "gland old man" and "a Cadillac engine in a Ford car."

7. James Joyce

Joyce described himself to Carl Jung as "A man of small virtue, inclined to extravagance and alcoholism." In 1904, young Joyce's fondness for Dublin streetwalkers led to a case of "gleet," or gonorrhea. His "frenemy," Dr. Oliver Gogarty, directed him to a local specialist, who would have given Joyce state-of-the-art therapy: penile irrigation with a purple solution of potassium permanganate. Joyce's gonorrhea was gone, but he soon developed terrible attacks of eye pain and arthritis. This may have been reactive arthritis, an autoimmune illness triggered by genital Chlamydia. Joyce suffered through eleven grueling eye surgeries. Afterwards, leeches were applied to drain the blood and reduce the swelling. Not surprisingly, he developed a terror of surgery.

8. George Orwell

As described above, Orwell once wrote, "Writing a book is a horrible, exhausting struggle, like a long bout of some painful illness." This was literally true for Orwell. His health collapsed after writing Homage to Catalonia, and the heroic effort of writing and revising Nineteen Eighty-Four would ultimately kill him. Orwell was a "chesty" child. He had damaged bronchial tubes (bronchiectasis) from a bacterial infection in infancy. As an adult, he survived four episodes of pneumonia and a bullet through the neck, but succumbed to the tuberculosis acquired during his years of tramping, poverty, and vagabondage. Orwell underwent injections of air into his peritoneal cavity in a failed attempt to collapse the tuberculous lung. This ordeal is reflected in the tortures of Winston Smith in 1984.




Shakespeare's Tremor and Orwell's Cough: The Medical Lives of Famous Writers

by

Dr. John J. Ross

ISBN-10: 0312600763
ISBN-13: 978-0312600761

Those revealing home movies...Eva Braun's contribution


"The Hitler home movies: how Eva Braun documented the dictator's private life"

Eva Braun was the most intimate chronicler of the Nazi regime, capturing Hitler's private life with her cine-camera. But it was only the obsession of artist Lutz Becker that brought her films to light. Robert McCrum and Taylor Downing uncover the story of the footage that shocked the world

by

Robert McCrum and Taylor Downing

January 26th, 2013

The Observer

Lutz Becker was born in Berlin, he says, "during the anno diabolo, 1941. Mine was the generation that was sent into a dark pit." Meeting this survivor of the Third Reich, now in his 70s and living in Bayswater, London, it's hard to suppress the thought that Becker, a distinguished artist and film historian, has conducted most of his life in a circle of hell.

Becker's childhood passed in the fetid, terrifying atmosphere of Berlin's air-raid shelters as the Allied raids intensified and the city was reduced to burning rubble. He recalls the radio announcements – "Achtung, achtung, ende ende, über Deutschland sinfe bender. Achtung, achtung" – followed by the helter-skelter rush downstairs. When the bombs fell – even far off – "the change in the air pressure was enormous, and extraordinary," he says. "People used to bleed from the ears, the nose and the eyes. I came out deaf, with tinnitus." Today, Becker adds, "I envy children who grow up without fear."

When the war ended in 1945, Becker and his family found "a world in ruins. The bodies of soldiers lay in the streets. When you passed a bombed-out building you could hear the buzzing of bluebottles in the darkness. Death was still underneath the ruins," he remembers. The devastated, malodorous aftermath of the Third Reich left a deep psychological scar. "As a child I had been forbidden to use dirty words. Now I would stand in front of the mirror in my mother's bedroom and repeat 'shit' and 'arsehole'." He laughs at the memory. "But I was thinking of Hitler."

In some ways, Becker has been thinking about Hitler ever since, and what the Führer did to the German people. "I was raised in a world of lies," he declares. As the Second World War morphed into the Cold War, the terrible truth about one of the most evil regimes in history began to leak out. Poignantly, the first Germans to come to terms with the reality of the Third Reich were those children who had somehow survived the fall of Berlin – young men like Lutz Becker.

A gifted German abstract artist and film-maker, Becker discovered his vocation as an artist in the 1950s, when he also acquired a passion for film. In 1965, he won the Gropius prize for art and chose to spend it by transferring to the Slade, first coming to London in 1966 to study under William Coldstream. His contemporaries included the artist and filmmaker Derek Jarman. While researching his thesis, his troubled relationship with his childhood under the Third Reich found a new outlet. "It was in the Bundesarchiv," Becker recalls, "that I first unearthed a photograph of Eva Braun holding a 16mm Siemens cine-camera."

Eva Braun still exerts a strange fascination. Today, 80 years after Hitler became chancellor, Braun is both a symbol of Nordic simplicity, and also a tragic figure whose ordinariness provides a window on to the banality of evil. Postwar fascination with the Nazis means that Eva Braun still has a remarkable grip on our imagination – the little girl in the fairytale who takes us to the horror in the woods.

The woman who holds the key to the domestic face of Adolf Hitler was 17 when she was first introduced to the Führer, who was only identified as "Herr Wolff". This blind date had been set up by Hitler's personal photographer Heinrich Hoffman, for whom Eva Braun worked as an assistant.

Hoffman, who ran a photographic studio in Munich, had been instrumental in the making of Hitler's image. He ensured that Hitler was always seen as a determined, defiant and heroic figure, a man of iron. From the 1920s, Hoffman's photographs were duplicated by the million in the German press, and sold as postcards to the party faithful. When Hitler's mistress, Geli Raubal, committed suicide on 18 September 1931 in the apartment they shared in Munich, there was an urgent need to hush up a potential scandal, and give the Führer's private life the semblance of normality. Hoffman stepped in. Eva Braun bore a striking similarity to the dead woman, and Hitler took comfort in her company after Raubal's suicide. By the end of 1932, they had become lovers.

Braun continued to work for Hoffman, a position that enabled her to travel with Hitler's entourage, as a photographer for the NSDAP (Nazi Party). Her relationship with the Führer was troubled. Twice, in August 1932 and May 1935, she attempted suicide. But by 1936 she was fully established as the Führer's companion. Hitler was ambivalent about her. He wanted to present himself as a chaste hero. In Nazi ideology, men were leaders and warriors, women were housewives. So Adolf and Eva never appeared as a couple in public, and the German people were unaware of their relationship until after the war. According to Albert Speer's memoirs, Fräulein Braun never slept in the same room as Hitler, and always had her own quarters. Speer later said, "Eva Braun will prove a great disappointment to historians." But Speer was wrong. He had overlooked Eva's gifts as a photographer.

Once he found the photograph of Eva with her cine-camera, Becker began to speculate about the possibility of Braun's home movies. If there was a camera there must have been some film, and if there was film, it must have been stored somewhere. The Nazis were nothing if not meticulous record keepers. In the late 1940s there had been reports circulating of a collection of home movies. Becker had heard these stories, but had never pursued them. No one had ever confirmed where such films might be hidden, or even if they existed at all.

Now in London, Becker began to make inquiries. He searched the records of the Imperial War Museum and the National Film Archive. "In those days," he recalls, "there was no great interest in film as historical evidence. Most historians believed that newspapers were more important than film, as testimony. But I had a very sharp need to sort out my own past." Becker would look at anything that helped with decrypting the terrible conundrum of Nazism.

Perhaps only a child of Nazi Berlin could have felt both the need and the determination to do this. It's hard, now, to appreciate how little was known of Hitler's mistress in the 1950s and 60s. It was Becker's research that would change the world's perception of the Führer and the Aryan wife (Braun married Hitler the day before their suicide) who died at his side in the bunker.

Becker's quest took him to the heart of a strange, postwar – predominantly American – society of Nazi obsessives: former veterans, trophy hunters, amateur cineastes and right-wing Aryan fantasists. In April 1970, Becker found himself in Phoenix, Arizona, at a gathering of film buffs, when he was introduced to a retired member of the US army unit responsible for the liberation of Hitler's chalet at Obersalzberg in April 1945. This veteran marine told Becker that, so far as he could recall, he had indeed noticed piles of film canisters in Hitler's mountain lair, but had not understood their significance. This material, he remembered, had been taken away by the US Signal Corps, the division of the American army responsible for the films and photographs retrieved from the ruins of the Third Reich.

Becker's curiosity was roused. Assuming they existed, these film canisters, he reasoned, must eventually have been taken to the National Archives and Records Administration in Washington DC. This was the home of such treasures as, for example, the original Declaration of Independence. With some anticipation, Becker trawled through the National Archive's catalogue, but in vain. He could find nothing that answered to the description of Eva Braun's home movies. For a while, the trail went cold but, he says, "I still had this instinct that there would be some films."

Becker continued to pursue his career as an artist in London, but he could not shake off his reputation as the film historian of the Third Reich. In 1971, he was approached by the producer David Puttnam and Sandy Lieberson, co-founders of the documentary unit Visual Programme Systems. They asked him to act as a consultant on a documentary series about the nazification of Germany in the 1920s and 30s. With some misgivings, Becker signed on, not least because "as a private person, I could not finance my research into Eva Braun's films". Working for Puttnam and Lieberson, Becker now had full responsibility for researching the US National Archives in depth. He could still find no trace of Eva Braun's fabled home movies, but at least he was in conversation with the curators who might be able to help.

Part of Becker's problem in these early days was that his search was for 16mm footage. To the world's film archives, 16mm film was inferior to 35mm, the regular film stock used for official propaganda. The curatorial priority for most film archives at that time was to preserve nitrate footage shot on 35mm film before it disintegrated or disappeared; 16mm film was a lesser priority. Nonetheless, on his visits to Washington, Becker did turn up new information about a National Archives vault of uncatalogued 16mm film held in an old aircraft hangar in a forgotten part of Maryland, just outside Washington DC.

One fine day, in the spring of 1972, Becker drove out of DC to this vault and began searching through a rusting and discarded heap of old film canisters. At first, it seemed a fruitless quest: most of the material was Japanese, and none of it was what he was looking for. But then, as he turned over these uncatalogued cans, he spotted something no one had noticed before – a set of cans with German labels. With rising excitement, he opened the first can and drew out a few frames of film to hold them up to the light.

Amazingly, it was colour film, and – even more astounding – there was Adolf Hitler with several senior Nazis (Albert Speer, Joseph Goebbels, Joachim von Ribbentrop), relaxing in the sunshine on the terrace of the Obersalzberg. These were indeed Eva Braun's home movies. Here, finally, were the overlords of the Third Reich at home, and at play.

Braun's home movies, mostly shot in Hitler's fortified chalet in Berchtesgaden, in the Bavarian Alps, have a naive innocence. She captures in the private life of the Nazi high command what Hannah Arendt called "the banality of evil". In Braun's footage, we see Hitler and his cronies relaxing on the terrace of his chalet. They drink coffee and take cakes; they joke and pose for the camera. Hitler talks to the children of his associates, or caresses his Alsatian, Blondi. The camera (in Eva Braun's hands) approaches Hitler in rare and intimate close-up. Occasionally, when a visitor from outside the party elite appears, the camera retreats to a more respectful distance. Mostly, however, Braun's cine-camera is among the party circle, at Hitler's side, and at his table. Most of the footage is in colour, with an extraordinary immediacy. Braun's films offer a remarkably unmediated view of the Nazi leadership and of Hitler himself. This was not the image presented by his propaganda team, or by Leni Riefenstahl, "Hitler's favourite film-maker", but the man as he actually was.

Braun's films chart the Führer's career up to the zenith of Nazi success, the summer of 1941. At that moment, with the eastern divisions of the Wehrmacht racing into the heart of the Soviet Union, it was reasonable to conclude, as many did, that Germany would win the war. But then came Pearl Harbor in December 1941, followed by Stalingrad and the defeat of Rommel in North Africa. Once Russia was fighting back, undefeated, and once America was committed to the Allied cause, the Third Reich was doomed, and Eva Braun ceased filming.

In the apocalyptic chaos of Hitler's downfall, the final days in the bunker and the dramatic suicides of Adolf and Eva, Braun's home movies, never widely known, were all but forgotten. Until Becker came on the scene.

"I asked for a Steenbeck [editing machine]," he recalls, "and began to watch. In my excitement, it was as if my life had a sense of purpose. I had been very angry about those Nazis. Now I could channel that anger in a positive way."

In film-history terms, the moment Becker opened those first canisters was the equivalent of peering into the tomb of Tutankhamun. He had finally identified the treasure that many had spoken about but none had found. Adolf Hitler's image would never be the same again.

By chance, Becker's discovery – soon after viewed at the National Archives in Washington with great excitement – coincided with the making of one of television's greatest documentary series, The World At War, a project produced and masterminded by Jeremy Isaacs at Thames TV in London. In keeping with the spirit of the age, the TV history of the Second World War would not just be a military history, featuring admirals, generals and air marshals. It was to include the common man and woman: Berlin housewives, London Blitz survivors, Russian peasants and Japanese civilians. Isaacs wanted not only to describe the victory of the west, but also to tell the story of how the whole planet had become engulfed in conflict.

Becker, meanwhile, was discovering the limits to the public's appetite for the home life of Adolf Hitler. The documentary he worked on for Puttnam, entitled Swastika, which drew on the best of the Eva Braun footage, premiered at the Cannes Film Festival in May 1973. The audience was outraged, booing and whistling at the screen, with cries of "Assassins!" The presentation of the Führer as a friendly uncle, a petit bourgeois figure in a suit and tie, popping in and out of a family gathering, was intolerable. The iron-clad image of Hitler so carefully shaped by Heinrich Hoffmann still exerted a fierce grip on the public imagination.

The production team for The World At War soon heard about Becker's material, and wove it into the series in a manner less contentious than in Swastika. Now British and American television audiences could have a new perspective on the Third Reich and its leaders. Initial outrage softened into a more mature understanding. It became easier to come to terms with the horrors of the past if its demonic protagonists were seen not as monsters but as ordinary – sinister emissaries from humanity's dark side, but recognisably human.

Becker is still tormented by the first reactions to Eva Braun's films. "I was punished for puncturing a negative myth. People saw something that was banal in action, and banal in its colour." He believes that many had become comfortable with the carefully composed, black-and-white propaganda images of the Nazis. "People hate it when you tinker with their mythologies," he says. Over a generation, however, perceptions have changed.

Today, Becker's research, inspired by the need to make peace with the past, has, paradoxically, had the effect of historicising it. There were many equally evil 20th-century regimes – Stalin's, Mao's, Idi Amin's, Pol Pot's – but none of these exerts quite the same cultural and psychological charge as Nazism. Becker himself finds it painful to review Braun's home movies. He says, looking back, he has learned "to develop a sense of responsibility, and to see that [my research] could not be a howling triumph, but at best an armistice. I was able to see the ghosts of the past put into the history books. The Nazis were no longer spooking my psyche. My journey was over."


The home movies in four parts...





Deceased--Donald Hornig

 Donald Hornig
March 17th, 1920 to January 21st, 2013

"Donald Hornig, Last to See First A-Bomb, Dies at 92"

by

Douglas Martin

January 26th, 2013

The New York Times

In a small shed at the top of a 100-foot-tall steel tower deep in the New Mexico desert, Donald Hornig sat next to the world’s first atomic bomb in the late evening of July 15, 1945, reading a book of humorous essays. A storm raged, and he shuddered at each lightning flash.

It was his second trip to the tower that day as part of the Manhattan Project, the secret American effort to build an atomic bomb. He had earlier armed the device for the test, code-named Trinity, connecting switches he had designed to the detonators.

But J. Robert Oppenheimer, the scientific director of the project, had grown nervous about leaving the bomb alone. He told Dr. Hornig to return to the tower and baby-sit the bomb.

A little after midnight, the weather had improved, and Dr. Hornig was ordered down from the tower. He was the last man to leave and the last to see the weapon before it changed human history.

A little more than five miles away, Dr. Oppenheimer and others waited in a bunker to see if the device they called “the gadget” would actually go off. After Dr. Hornig joined them, he took his position for his next task: placing his finger on a console switch that when pressed would abort the blast, should anything appear awry. The countdown began, his finger at the ready.

The bomb was detonated at 5:29:45 a.m. on July 16 as Dr. Hornig and the others watched from the bunker. He later remembered the swirling orange fireball filling the sky as “one of the most aesthetically beautiful things I have ever seen.”

Three weeks later, an atomic bomb was dropped on Hiroshima. Three days after that, another fell on Nagasaki.

It was the dawn of the nuclear age and also of a career that took Dr. Hornig to the White House as science adviser to President Lyndon B. Johnson and to academic eminence as the president of Brown University in Providence, R.I., where he died on Monday at 92, his family said.

Dr. Hornig worked under Johnson from 1964 to 1969, conferring with him on space missions and atom smashers as well as on more practical matters, like providing sufficient hospital beds for Medicare patients and desalting water for drinking.

He had actually been President John F. Kennedy’s choice for science adviser. Kennedy had asked him to take the job shortly before his assassination in 1963, and Johnson followed through with the appointment.

Working for Johnson was reportedly not easy. The president was said to disdain scientists and academics after so many of them had voiced opposition to the Vietnam War, which made it difficult for his science adviser to lobby for them.

But when a power blackout hit the Northeast in 1965, the president turned to Dr. Hornig for guidance, as he did when earthquakes hit Denver. After Senator Robert F. Kennedy was assassinated in 1968, Johnson sought Dr. Hornig’s advice on ways to detect concealed weapons.

Under Johnson, Dr. Hornig doubled the budget of what is now the Office of Science and Technology Policy, which he led, and pushed for federal research in housing and transportation. He also helped kill a proposal to put giant mirrors into orbit over Vietnam to spotlight the enemy at night.

As the president of Brown from 1970 to 1976, Dr. Hornig established a four-year medical school. He oversaw the merger of Pembroke College, Brown University’s women’s school, with Brown College, the men’s undergraduate school. He faced student protests, including a 40-hour sit-in at Brown’s administrative building, over cost cutting, minority admissions and other matters.

He met some student demands but later declared that the university would never again negotiate with students occupying a building. He described his presidency as “bittersweet.”

Donald Frederick Hornig was born on March 17, 1920, in Milwaukee and attended Harvard, earning his undergraduate degree there in 1940 and his Ph.D. in 1943, both in chemistry. His dissertation was titled “An Investigation of the Shock Wave Produced by an Explosion,” and he went to work at the Underwater Explosives Laboratory of the Woods Hole Oceanographic Institution in Massachusetts.

He joined the Manhattan Project after his boss at Woods Hole passed along a mysterious invitation asking him to take an unspecified job at an unspecified location. No explanations were offered, and Dr. Hornig declined. James B. Conant, the president of Harvard, helped persuade him to change his mind.

Dr. Hornig and his new wife, the former Lilli Schwenk, bought an old Ford with frayed tires and puttered to New Mexico. His wife, who also had a Ph.D. in chemistry, worked for the project as a typist, and then as a scientist.

Dr. Hornig is survived by his wife, as well as two daughters, Joanna Hornig Fox and Ellen Hornig; a son, Christopher; a brother, Arthur; a sister, Arlene Westfahl; nine grandchildren; and 10 great-grandchildren. His daughter Leslie Elizabeth Hornig died last year.

After World War II, Dr. Hornig was a professor and a dean at Brown and then moved to Princeton as the chairman of the chemistry department. While at Princeton, he was on President Dwight D. Eisenhower’s scientific advisory committee.

Dr. Hornig was briefly a vice president at the Eastman Kodak Company before accepting Brown’s presidency. After leaving Brown, he taught at Harvard’s public health school, retiring in 1990. He was one of the youngest scientists ever elected to the National Academy of Sciences.

 
"Donald F. Hornig, scientist who helped develop the atomic bomb, dies at 92"

by

Matt Schudel

January 23rd, 2013

The Washington Post

Donald F. Hornig, who as a young scientist once “babysat” the world’s first atomic bomb and who later became Brown University president and the top science adviser to President Lyndon B. Johnson, died Jan. 21 at a nursing home in Providence, R.I. He was 92.

He had Alzheimer’s disease, his son, Christopher Hornig, said.

Only a year out of graduate school, Dr. Hornig was recruited in 1944 for the top-secret Manhattan Project in Los Alamos, N.M.

The World War II project, directed by J. Robert Oppenheimer, was designed to produce an atomic bomb. Dr. Hornig led a team that developed a device called the “X unit,” the firing mechanism for the bomb.

The first nuclear bomb — called “the gadget” by scientists — was scheduled to be detonated near Alamogordo, N.M., on Monday, July 16, 1945.

“On the Sunday before the test, shortly before 9 p.m.,” Dr. Hornig recalled in a 1995 article in the Christian Science Monitor, “Oppenheimer decided someone should be in the tower to baby-sit the bomb because of the possibility of sabotage. Maybe because I was the youngest, I got the job. In the darkness, amid heavy rain, lightning, and strong winds, I climbed the ladder to the top of the 100-foot tower.”

To take his mind off the bomb beside him, Dr. Hornig attempted to read a paperback novel under a 60-watt bulb.

“I stopped frequently to count the seconds between the sound of a thunder clap and the lightning flash,” he told the Monitor, “and tried not to think of what might happen if the tower got a direct hit and the gadget went off. At least I would never know about it.”

Early the next morning, Dr. Hornig climbed down from the tower and took his place beside Oppenheimer in a control room more than five miles away.

“My right hand was poised over a control switch that would stop the test if something went wrong,” he said in the interview. The bomb exploded at 5:29:45 a.m.

“My first reaction, having not slept for 48 hours, was ‘Boy, am I tired,’ ” Dr. Hornig recalled. “My second was, ‘We sure opened a can of worms.’ Nobody knew where this would lead, but I had no regrets.”

Donald Frederick Hornig was born March 17, 1920, in Milwaukee. He was a 1940 graduate of Harvard University, where he also received a doctorate in physical chemistry in 1943.

After working at the Woods Hole Oceanographic Institution in Massachusetts, he joined the Manhattan Project because of his expertise on shock waves produced by large explosions.

Dr. Hornig joined the faculty at Brown University in 1946, then moved to Princeton University in 1957. He served on science advisory panels of four presidents, beginning with Dwight D. Eisenhower.

In November 1963, Dr. Hornig was tapped as the top White House science adviser by President John F. Kennedy, who was assassinated two weeks later. Dr. Hornig became head of the White House Office of Science and Technology in January 1964, after Johnson had assumed the presidency.

Business Week reported that the job “is expected to be one of the hottest seats in the Johnson administration,” but Dr. Hornig often found himself having to defend federal research programs from budget cutters in Congress.

At the other end of Pennsylvania Avenue, he said he sometimes had a hard time getting Johnson’s attention. He left at the end of Johnson’s term, in January 1969.

“I was never on easy personal terms with the president,”
Dr. Hornig told Science magazine at the time. “There’s always been a certain gap in attitude and approach between a Texas rancher and an Ivy League professor. I was on much easier terms with Kennedy, who asked me to serve in the first place.”

Dr. Hornig was named president of Brown in 1970. He merged an affiliated women’s college, Pembroke, with the all-male Brown and helped establish a medical school at the university.

He also wrestled with a $4 million deficit and a confrontational student body that went on strike in 1975 to protest cuts in popular classes and programs. Dr. Hornig ordered a 15 percent cut in spending that included the dismissal of many faculty members. By the time he resigned in 1976, he was unpopular with students and professors, but he was eventually credited with steering the Ivy League university toward financial stability.

Dr. Hornig finished his career at the Harvard School of Public Health, where he established an interdisciplinary program in public health. He retired in 1990.

Survivors include his wife of 69 years, Lilli Schwenk Hornig of Providence; three children, Joanna Fox and Christopher Hornig, both of Washington, and Ellen Hornig of Shrewsbury, Mass.; a brother; a sister; nine grandchildren; and 10 great-grandchildren.

A daughter, Leslie Hornig, died in 2012.

In a 1964 article examining the state of science in the Johnson administration, a writer for Time magazine questioned whether Dr. Hornig could accomplish much because he “is a virtual stranger on the Washington scene.”

In a letter to the editor, Dr. Hornig’s 10-year-old son took exception. “My father has served for three Presidents,” Christopher Hornig noted, “and is in Washington so much that by now he is a virtual stranger to me.”

 
"Donald Hornig: Babysitter for the A-Bomb"

by

Patrick Kiger

January 24th, 2013

AARP

Donald F. Hornig was a top science adviser to three Commanders-in-Chief — Dwight Eisenhower, John F. Kennedy and Lyndon Johnson — and taught at Princeton and Harvard, in addition to serving a six-year stint as president of Brown University. But he achieved his greatest measure of fame as a young chemist a year out of graduate school, when he joined the Manhattan Project, the U.S. government’s top-secret effort during World War II to develop an atomic bomb.

Hornig, who died on Jan. 21 at age 92 in Providence, R.I., was such a gifted young scientist that, despite his inexperience, he rose to the level of team leader, supervising a group that developed the “X unit,” the mechanism that actually triggered the nuclear bomb. Here are five fascinating facts about the scientist who helped to end World War II and launch the nuclear age.

In a 1995 Christian Science Monitor article [below], Hornig recalled that he was working as a physical chemist at the Woods Hole Oceanographic Institution in Massachusetts when he got a call from his old Harvard professor George Kistiakowsky, who said he needed Hornig’s help at a lab where he was working. When Hornig agreed, Kistiakowsky told him to pack his bag immediately, and said no more. It wasn’t until Hornig got to Los Alamos, N.M., in May 1944, that he was told that he would be working on the effort to build an atomic bomb.

Hornig made his most important contribution to the bomb after just a month on the job. In June 1944, J. Robert Oppenheimer, the project’s chief scientist, held a meeting to discuss a critical technical problem: how to trigger an implosion of the outer shell of the bomb’s plutonium, which would drive the fissionable material together and create the critical mass that would lead to a nuclear blast. Using explosives would destroy the firing mechanism, which would make testing difficult. Hornig suggested setting off electrical spark gaps, which would act as very precise switches that would trigger the process in a fraction of a microsecond. Oppenheimer was sufficiently impressed that he told Hornig to get some people together to work on the idea, and by October, the decision was made to use Hornig’s spark-gap switches in the firing unit.

The first full-scale test of the bomb — scheduled for Monday, July 16, 1945, the day before President Truman’s crucial meeting with Winston Churchill and Josef Stalin at Potsdam, Germany — had high stakes, but it was no sure thing. “Many involved in the project doubted that the complicated implosion-bomb system would work effectively and with a large enough explosive yield,” Hornig recalled. The scientists’ nerves were rattled all the more when heavy clouds over the desert test area created a powerful electrical surge that might have set off the bomb prototype, had its trigger been set.

On the evening before the test, Oppenheimer, who was worried about sabotage, assigned Hornig to go up in the bomb tower and keep watch over the nuclear device. Hornig climbed a 100-foot ladder to the top of the tower, with a paperback book to keep him occupied. It proved to be a hair-raising experience. The tower was battered by heavy rains from a thunderstorm, and Hornig anxiously counted the seconds between the thunder claps and lightning flashes. He recalled that he “tried not to think of what might happen if the tower got a direct hit and the gadget went off.” As a consolation, it occurred to him that if the bomb went off, “at least I would never know about it.”

Just before daybreak, Hornig sat in a concrete bunker about five and a half miles away from the bomb tower, watching control panels. Just before 5:30 a.m., he saw a brilliant flash, and then — ignoring the possible risks — ran outside to get a better view. He saw “a great fireball rapidly rising, with peach, green, and rosy red colors, gradually transforming into a mushroom cloud.”

Hornig, who was exhausted from going without sleep for 48 hours, had a subdued reaction to the event. “Boy, am I tired,” was his first thought, he later recalled. It took a moment for the significance to sink in. “Nobody knew where this would lead,” he later recalled. “But I had no regrets. If it ended the war without the tremendous casualties an invasion of Japan would cause, it was worth it.”

Here’s declassified film footage shot by the government of the historic bomb test: 





"Atom-Bomb Scientist Tells His Story"

Chemist Don Hornig kept his hand on the switch to stop the test

by

Donald Hornig and Robert Cahn

July 11th, 1995

The Christian Science Monitor

In the spring of 1944, my life became a mystery novel. I was working at the High Explosives Research Laboratory at the Woods Hole (Mass.) Oceanographic Institution when I got a call from Harvard Prof. George Kistiakowsky, a scientist whom I greatly admired. He was working at a new laboratory, he said, and needed me badly.

That was all I needed to say yes. I was advised to leave as soon as possible, without telling anyone, and report to 109 East Palace Ave., Santa Fe, N.M., where I would be given further instructions.

At Santa Fe, I was directed to Los Alamos, a former boys' school on a mesa at the base of the Jemez Mountains. I found that I was to join Kistiakowsky and others under J. Robert Oppenheimer working on the Manhattan Project, the crash program to build an atomic bomb. I was 23 years old, a physical chemist of limited experience with a newly acquired PhD who had done research on shock waves produced by large explosions.

To maintain secrecy, the project's overall director, Maj. Gen. Leslie Groves, insisted on strict compartmentalization: Scientists would have access to information only on a need-to-know basis.

Oppenheimer persuaded Groves that the only hope of resolving the enormous scientific and technical problems facing his group was to let scientists discuss these challenges with one another.

When I arrived in May 1944, two ways to create a critical mass were being considered. The first used a high-velocity gun to shoot two sub-critical chunks of enriched uranium (U-235) together. This would not work with plutonium, though, which requires much faster assembly to avoid a premature explosion. The solution proposed was "implosion": Surround a subcritical shell of plutonium with explosive charges (called "lenses") shaped to focus the detonation waves toward the center of the plutonium shell and drive all segments together.

Only enough U-235 would be available for one bomb. But if the implosion concept worked, several such bombs might be ready by the end of 1945.

Enough was known about the gun method to justify going ahead without a full test. But the implosion method was still in a preliminary stage; among its problems was that of a firing system that could initiate all the explosive lenses at once. The only idea for doing so was an explosive switch, but that could not be tested without destroying the firing mechanism. Oppenheimer raised the problem in June 1944 at one of his seminars.

After listening to the discussion, I suggested developing triggered spark gaps to act as very rapid switches. They might make it possible to initiate the explosive lenses within a fraction of a microsecond. Oppenheimer told me to get a few people together and work on the idea. By the end of the summer, we had accumulated enough evidence to say that this method was feasible. After much debate and many misgivings, we were given a go-ahead in October to incorporate triggered spark gaps into a firing unit for "the gadget," as the bomb was called.

Many months of work remained before the firing mechanism would be ready. The "X-unit" had to initiate at 32 points with astonishing simultaneity. If it did not, the explosive yield would be low.

The full-scale test of the bomb was planned for pre-dawn Mon., July 16, the day before President Harry Truman was to meet with Joseph Stalin and Winston Churchill at Potsdam, Germany. By then, we felt we were ready, but many involved in the project doubted that the complicated implosion-bomb system would work effectively and with a large enough explosive yield. Several days before the firing date, the doubts were exacerbated when heavy clouds passed over the desert test area near Alamogordo, N.M., and an electrical surge set off the X-unit. It would have set off a live bomb.

On the Sunday before the test, shortly before 9 p.m., Oppenheimer decided someone should be in the tower to baby-sit the bomb because of the possibility of sabotage. Maybe because I was the youngest, I got the job. In the darkness, amid heavy rain, lightning, and strong winds, I climbed the ladder to the top of the 100-foot tower. Pulling out a paperback and sitting under a 60-watt bulb, I read to keep my mind off the lightning and the bomb. I stopped frequently to count the seconds between the sound of a thunder clap and the lightning flash, and tried not to think of what might happen if the tower got a direct hit and the gadget went off. At least I would never know about it. I was never happier than when the phone rang just before midnight, and Kistiakowsky told me to come down.

Shortly before daybreak at a concrete bunker 10,000 yards south of the tower, my final task was to watch the control panels to assure that all the condenser banks reached the 15,000 volts needed to initiate the explosion. My right hand was poised over a control switch that would stop the test if something went wrong. At 5:29:45 a.m. the gadget went off. Through the open door, looking away from the tower at the mountains, I saw a brilliant flash, then ran outside to see a great fireball rapidly rising, with peach, green, and rosy red colors, gradually transforming into a mushroom cloud.

My first reaction, having not slept for 48 hours, was "Boy, am I tired." My second was "We sure opened a can of worms." Nobody knew where this would lead, but I had no regrets. If it ended the war without the tremendous casualties an invasion of Japan would cause, it was worth it.

* After a career as a chemistry professor at Brown and Princeton universities, Donald F. Hornig became a science adviser to Presidents Eisenhower, Kennedy, and Johnson. Later he was president of Brown and concluded his career at Harvard.


Donald Hornig [Wikipedia]