Two days and counting into infinity... DTV is here, and it appears the transition was fairly seamless, even though many were left out in the cold and, for them, the whole process was a nightmare. Stephen Bates [who teaches at the Hank Greenspun School of Journalism and Media Studies at the University of Nevada, Las Vegas] wrote the following in the Spring 2008 issue of The Wilson Quarterly.
"The Day the TV Died"
"'DTV' IS COMING (AND SOONER THAN YOU THINK!)" proclaims the Federal Communications Commission website. With, perhaps, a touch of panic?
by
Stephen Bates
Spring 2008
The Wilson Quarterly
"'DTV' IS COMING (AND SOONER THAN YOU THINK!)" proclaims the Federal Communications Commission website. With, perhaps, a touch of panic?
by
Stephen Bates
Spring 2008
The Wilson Quarterly
Digital television—DTV—will replace the venerable analog format on February 17, 2009, at midnight, time zone by time zone. The changeover won't affect most Americans who get TV by cable or satellite, or those who already own digital TVs or converters, or the handful of iconoclasts (less than two percent of households) who don't own TVs at all. But that leaves millions of people at risk of severe entertainment deficit.
By various estimates, between 10 and 15 percent of American households watch over-the-air programming exclusively, relying on rabbit ears, rooftop contraptions, and other gear from the I Love Lucy era. A third of these People of the Airwaves don't know about the digital shift, according to a poll by the Consumer Reports National Research Center. If DTV arrived tomorrow, some 20 million Americans would turn on American Idol and find Randy, Paula, and Simon replaced by snow.
People will be able to continue watching over-the-air broadcasts on analog TVs if they get converter boxes, which currently cost around $60. The Commerce Department's National Telecommunications and Information Administration is offering every household two coupons, each worth $40 toward a converter. According to the poll, though, three-fourths of Americans haven't heard about the coupons.
That's not the only complication.
After DTV Day, some households—about one in 20, according to FCC chairman Kevin Martin—will have to spring for new antennas to pick up the same channels.
The February 17 deadline doesn't apply to the nation's 2,100 low-powered TV stations, which include many rural and university-run outlets. If you want to watch those channels with a converter, you'll need a special one with "analog pass-through," or an antenna switch, or the patience to disconnect your antenna from the converter and connect it to the TV.
The digital converter replaces your TV's tuner, so it comes with its own remote. Unless you buy and program a universal remote, you'll use the converter's remote to change channels and the TV's remote to adjust volume—alongside, perhaps, the armamentarium of DVD and VCR remotes. Speaking of VCRs, if you want to record one program while watching another, you'll need two converters. If each of the converters happens to be, say, a Zenith model DTT900, then using the remote to change channels on the TV converter will also cause the channels to change on the VCR converter. "Yeah, that's a problem," a clerk at Circuit City told me.
The DTV disruption comes by federal decree, and Congress has its reasons. Because the digital format is far more efficient, it can produce sharper images, enable broadcasters to offer additional channels or data services, and free up spectrum space for wireless broadband, public-safety communications, and other uses. The United States isn't alone in this process. In Great Britain, the transition is already under way, region by region, to be completed in 2012. Finland went all-digital on September 1, 2007. In most of the world, digital will soon be the television standard.
These transitions can be ticklish, as we’re likely to discover next February. But standards themselves mostly make life easier. Standardization is "the liberator that relegates the problems that have been already solved to their proper place," insurance expert Albert Whitney wrote in 1928. Back then, standardization had a long way to go. Imagine buying sheets when beds came in 78 sizes. Washers in household faucets, The New York Times said in 1927, were "almost impossible to replace, even in supply stores in large cities."
At the time, an engineer was striving to tame the anarchy. Shortly after President Warren G. Harding made him secretary of commerce, in 1921, Herbert Hoover established the Division of Simplified Practice. ("Simplified" was chosen, a Commerce official explained, because "standardized" sounded "Prussian.") The agency helped establish national standards for everything from pickaxes to grape baskets.
The federal simplifiers aimed to facilitate rather than dictate. In 1923, for instance, they assembled some 200 representatives of the lumber industry, who decided that the standard construction board would be 25/32 of an inch thick. Hoover called it "a splendid example of industry solving its own problems."
But the solution wasn't universally acclaimed. Companies that manufactured thinner boards, 24/32 of an inch, wanted their measure to be the norm. In "the battle of the 32nd"—Hoover's jocular phrase—they lost.
That's the common pattern. When standards are set, somebody wins and somebody loses.
In the major skirmishes over broadcast standards, David Sarnoff was the winner. Born in Russia in 1891, Sarnoff arrived in the United States at the age of nine, unable to speak any English. In his teens, he studied Morse code and landed a job at American Marconi Wireless Telegraph Company, first as an office boy and then as a telegrapher. When Guglielmo Marconi visited his company's U.S. branch, Sarnoff connived to meet him and soon became his protégé. In 1919, the federal government voiced misgivings about British-owned Marconi controlling vital American technology. General Electric bought American Marconi and shifted its key assets to a new firm, Radio Corporation of America. Among those key assets was Sarnoff. He soared through the company's ranks, reaching the presidency a decade later.
Well before most executives and engineers, Sarnoff saw the future of TV—some reporters called him a "televisionary"—and he plotted to put RCA at the forefront, whatever it took. "Competition brings out the best in products and the worst in men," he once remarked. Within the company, broadcast historian Alex McKenzie writes, Sarnoff's rages "could strike like a thunderbolt." Sarnoff sought not just to make history but to ensure his place in it. Horatio Alger comparisons irked him, because he considered his ascent unparalleled. At RCA, he had copies of his own memos bound in leather.
In 1930, when Sarnoff became head of RCA, a handful of American television stations operated on an experimental, noncommercial basis. But would-be viewers, mostly shortwave hobbyists, had a hard time tuning in. "Various frequencies were used, some in the standard broadcast band and some in the shortwave bands," historians John Ryder and Donald Fink write in their encomium to electrical engineering, Engineers and Electrons (1984). "There were no standards for the number of lines or number of pictures per second used by these stations, and no attention was paid to the bandwidth required to carry the transmissions."
Commercial TV, Sarnoff often said, was "just around the corner." But the corner kept receding. In 1936, at one of Sarnoff's demonstrations, E. B. White watched the TV image jitter and undulate. "President Roosevelt's face not only came and went," he observed, "it came and went under water." The picture traveled from the studio by wire to RCA's transmitter, then by ether to the screen in front of White. "The magical unlikelihood of this occasion," he wrote, "was not lessened any by the fact that a stranger wearing a telephone around his neck was crawling about on all fours in the darkness at our feet. This didn’t make television seem any too practical for the living room of one’s own home, although of course homes are changing."
Regulators agreed: TV wasn't yet living-room ready. Members of the Federal Radio Commission and its successor, the FCC, hesitated to set broadcast standards. "Rigid adoption of standards at this state of the art," the FCC said in 1939, "may either 'freeze' the television industry, and thus retard future development, or may result in a high rate of obsolescence of equipment purchased by the public." A year later, the commission warned that Americans "should not be inflicted with a hodgepodge of different television broadcasting and receiving sets."
But without standards, a hodgepodge already existed. In 1938, Communicating Systems Inc. marketed TVs for $150 (three-inch screen) and $250 (five-inch screen), with one-year guarantees—not that the sets wouldn't break, but that they wouldn't become wholly obsolete. By the end of 1939, American Television, DuMont, Andrea Radio Corporation, General Electric, and RCA were selling TVs too. Everything from the number of channels to the hue of the images (black and white or black and green) varied from set to set. Gutsy early adopters bought TVs, but dueling standards and scant programming scared off most consumers.
In 1939, the FCC authorized television to take a small step forward. Experimental stations could adopt "limited commercialization"—that is, sell ads to pay for creating programs but not for broadcasting them. Commercial TV was to have a soft rollout. "No interests should be permitted to raise public hopes falsely," the FCC stressed. In particular, "nothing should be done which will encourage a large public investment in receivers which . . . may become obsolete in a relatively short time."
But self-restraint wasn't Sarnoff’s style. He bought full-page ads touting discounted RCA sets, which could receive RCA's shows but not those of some other broadcasters. The era of home TV wasn't just around the corner, he said—it was here. This was precisely the sort of hype the FCC had tried to prevent. The commission suspended the limited commercial broadcasting and chastised RCA.
Sarnoff was unapologetic. In his view, consumers would willingly risk obsolescence in order to enjoy television. A TV buyer, he said, "is paying for the unique privilege of seeing what is important or interesting today in a program of news, information, entertainment, education, and sports events which he cannot witness tomorrow or next year, however great the technical improvements. . . . The miracle of sight transmitted through the air should not be treated on the [same] basis of obsolescence as a spring hat."
A television costing hundreds of dollars is no spring hat. But under pressure from RCA, the FCC reversed course and decided to establish TV standards, even if they might later have to be changed. The commission sought recommendations from broadcasters. The understanding was that if broadcasters could achieve consensus, the FCC, like Herbert Hoover's Division of Simplified Practice, would go along.
Broadcasters formed the National Television Systems Committee. The NTSC had barely gotten started when CBS muddied the picture by announcing the development of color TV. Created by engineer Peter Goldmark, the CBS camera captured images through a rapidly spinning disc with red, blue, and green filters. This disc was synchronized with a disc spinning inside the TV. When the camera transmitted its red signal, the red filter on the receiver’s disc was positioned in front of the picture tube's electron beam.
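To make the mechanism concrete, here is a minimal Python sketch (an illustration, not part of Bates's article) of the field-sequential idea: the camera sends one monochrome field per filter, and a receiver whose disc stays in sync reassembles the fields into a color frame. The filter order and the toy "scene" are assumptions of the sketch.

```python
import numpy as np

# Illustrative sketch of CBS-style field-sequential color (not the real timing
# or resolution). The camera captures one monochrome field per filter on the
# spinning disc; a receiver whose own disc stays in sync shows each field
# through the matching filter, and the eye fuses them into a color picture.

FILTER_ORDER = ("red", "blue", "green")        # order assumed for the sketch
CHANNEL = {"red": 0, "green": 1, "blue": 2}    # RGB channel index per filter

def camera_fields(scene_rgb):
    """Return three monochrome fields, one per color filter."""
    return [scene_rgb[..., CHANNEL[f]] for f in FILTER_ORDER]

def receiver_frame(fields):
    """Recombine the sequential fields into one color frame (perfect sync assumed)."""
    frame = np.zeros(fields[0].shape + (3,))
    for f, field in zip(FILTER_ORDER, fields):
        frame[..., CHANNEL[f]] = field
    return frame

scene = np.random.rand(4, 4, 3)                # a tiny stand-in "scene"
assert np.allclose(receiver_frame(camera_fields(scene)), scene)
```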
In NTSC sessions, Zenith and Stromberg-Carlson urged the adoption of CBS color, whereas RCA, General Electric, and other companies opposed it. Technical matters were secondary to "rival corporate interests," Joseph Udelson writes in The Great Television Race (1982). As newcomers to TV manufacturing, Zenith and Stromberg-Carlson could gear up to make color TVs with no added costs. The other firms had spent heavily on equipment to produce black-and-white TVs. They didn't want to have to start over. In the end, the majority ruled: Color was, for the time being, kaput.
Of the other issues confronting the NTSC, the most contentious was the number of lines per TV image. The more lines, the greater the clarity. RCA wanted 441, Philco wanted 800, and DuMont argued that with the technology still evolving, flexibility would be ideal—televisions should be capable of picking up broadcasts of anywhere between 400 and 800 lines. (Some of the first TVs had used 60 lines, producing an effect somewhat like watching actors through half-closed venetian blinds.) The NTSC rejected the wide range proposed by DuMont, saying it would make TVs pricier and images fuzzier, and set a standard of 525 lines. According to The Tube (1996), by David Fisher and Marshall Jon Fisher, the number had no technical advantage. It was simply a compromise.
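The stakes in that fight are easier to see with a little arithmetic. The sketch below (an editorial illustration, not part of the article) applies a standard rule of thumb relating line count to required video bandwidth; the frame rate, aspect ratio, and Kell factor are assumed values.

```python
# Back-of-the-envelope arithmetic for the line-count fight: more scanning
# lines means roughly quadratically more video bandwidth. The rule of thumb
# and the parameter values are assumptions of the sketch, not figures from
# the article.
def video_bandwidth_mhz(lines, aspect=4 / 3, frame_rate=30, kell=0.7):
    # Horizontal resolution is scaled to match the vertical line count,
    # so required bandwidth grows with the square of the line count.
    return 0.5 * lines ** 2 * aspect * frame_rate * kell / 1e6

for lines in (441, 525, 800):
    print(f"{lines} lines -> roughly {video_bandwidth_mhz(lines):.1f} MHz of video")
# 441 lines -> roughly 2.7 MHz; 525 -> roughly 3.9 MHz, which fits inside a
# 6 MHz broadcast channel; 800 -> roughly 9.0 MHz, which does not.
```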
Over DuMont's protests, the FCC adopted the NTSC standards—the basic rules that have governed analog TV ever since—and authorized stations to go fully commercial. RCA opened 10 service centers around New York where consumers could take their TVs to be retrofitted, without charge, to conform to the new standards. On July 1, 1941, RCA station WNBT (later WNBC) went on the air. Its first ad, which was sold for $4, showed a Bulova clock as the second hand circled the face, to the tune of the Minute Waltz. The ticking seconds marked the beginning of the television era.
While the fight over television standards was raging, Sarnoff was also boosting TV through what Fortune later called "the biggest and bitterest behind-the-scenes fight" in radio's history—against a friend.
As an undergraduate at Columbia University, Edwin Howard Armstrong made commercial radio feasible: He found a way to amplify the output so that broadcasts could play through speakers instead of stethoscope-like earphones. The Armstrong method also allowed receivers to pick up transmissions from previously inconceivable distances, across the Atlantic and the Pacific. During World War I, Armstrong developed the "superheterodyne," with which radios and later televisions could be tuned precisely to broadcasts.
One of the people who saw Armstrong demonstrate his amplifying invention, in a basement lab at Columbia, was Sarnoff. The two became friends. Armstrong frequently visited Sarnoff's house for morning coffee; Sarnoff attended Armstrong's wedding—to Sarnoff's former secretary. RCA bought licenses for some of Armstrong's patents and disputed the validity of others, which made for an odd friendship. When a decision in Armstrong's favor was handed down in one lawsuit, Sarnoff congratulated him on the ruling even as RCA denounced it.
"I wish that someone would come up with a little black box to eliminate static," Sarnoff remarked to Armstrong at one point in the early 1920s. In Empire of the Air (1991), Tom Lewis speculates that Sarnoff envisioned a filter between a radio's receiver and its speaker. Instead, Armstrong found that he could eliminate static by modulating the frequency of a broadcast instead of its amplitude—that is, by using what we know as FM instead of AM. Like a 441-line screen and an 800-line one, FM and AM were incompatible. FM sets couldn't get AM, and vice versa.
In 1933, Armstrong demonstrated FM to Sarnoff, who recognized the discovery as both ingenious and perilous. RCA sold AM sets and owned two AM networks. With its clarity, FM could kill AM. (Armstrong thought it would.) In addition, Sarnoff was pushing television. FM and TV would inevitably compete for spectrum space and for consumer dollars.
Armstrong fine-tuned the technology, and experimental FM broadcasts began. Time rhapsodized, "The enthusiasts say that they hear music faithful to the topmost tweet, the bottommost woof; that speech seems to come from the next chair, instead of the next telephone booth; that if an announcer should scratch a match, listeners would hear it burst into flame; that between numbers there is no hum, no crackle, just black, velvety nothing." The FCC assigned FM to the spectrum just below TV and authorized commercial operation in 1940. The commission also ruled that television broadcasts would use FM for sound. For the duration of Armstrong’s patent, TV manufacturers would have to pay him royalties.
FM took off. Hundreds of thousands of people bought receivers. But after Pearl Harbor, the government halted production of FM radios. And, like the Soviet Union, the FCC switched sides during the war. Having earlier championed FM, the commission now skewered it.
On June 27, 1945, the FCC announced that sunspots and atmospheric conditions were interfering with FM broadcasts. An expected sunspot flare-up in 1949 and 1950 could prove disastrous. The commission had a point, according to Dale Hatfield, a telecommunications professor at the University of Colorado, Boulder. Sunspots have caused static on channel 2 of analog TVs, and FM was lower on the spectrum, where interference was likelier.
In response, the commission took three steps. First, it transferred FM up to its current bandwidth and assigned most of the old spectrum to television. Second, it ordered all FM stations to change to the new frequencies by the end of 1946, a lightning-fast transition. (Digital TV, by contrast, has been in the works for more than a decade.) Finally, and inexcusably, the commission refused to let stations broadcast on both old and new frequencies during the transition. (Currently, most TV stations are broadcasting in digital as well as analog.)
Although RCA was on record opposing the change, many people discerned the hand of Sarnoff. Build a better mousetrap, he once remarked, and somebody will develop "a virulent poison which is death on mice and there will be no longer any demand for mousetraps."
The FCC said it wanted to boot FM up-spectrum "before a considerable investment is made by the listening public in receiving sets and by the broadcasters in transmitting equipment." In truth, a considerable investment had already been made. As of mid-1945, there were 53 FM stations, and they were broadcasting to a half-million receivers. The change of technical standards meant that broadcasters’ and consumers' equipment would become obsolete soon and suddenly.
Perhaps coincidentally, perhaps not, the chair of the FCC resigned to become general counsel at NBC, a division of RCA, after the spectrum decision. It later emerged, too, that RCA had sent FCC commissioners free televisions during the FM hearings.
Armstrong and others fought the FCC’s decisions to no avail. In order to receive FM programming without interruption, a consumer needed two receivers—one for the old bandwidth and one for the new—or a receiver designed to get broadcasts on both bandwidths. Broadcast stations had to retool their equipment, at an average cost of more than $1 million apiece. Starting from scratch, most FM stations in the postwar years simply simulcast AM broadcasts, a move that diminished FM’s appeal to consumers and advertisers alike. "Despite its later resurgence," Hans Fantel wrote in The New York Times in 1981, "FM never fully recovered from this blow. Neither did Armstrong."
The prediction that sunspots would devastate FM had come from Kenneth Alva Norton, a War Department engineer. When the sunspots flared, the interference on the old FM spectrum was minor. Armstrong asked Norton if he'd made a mistake. "Oh, certainly," Norton replied. "I think that can happen frequently to people who make predictions on the basis of partial information. It happens every day."
Armstrong's troubles weren’t over. RCA next took the position that his FM patents were invalid, so he wasn't entitled to royalties from the sale of televisions that used FM for sound. Armstrong filed suit in 1948. During his deposition, David Sarnoff said of Armstrong, "We were close friends. I hope we still are," then insisted that RCA engineers had invented FM.
On January 31, 1954, with the legal struggle dragging on, Armstrong wrote a note to his wife, Marion: "God keep you and may the Lord have mercy on my soul." Wearing a suit, scarf, and gloves, he jumped out the window of his 13th-floor apartment.
Shaken upon hearing the news, Sarnoff unthinkingly told a friend, "I did not kill Armstrong." A few months later, RCA settled the lawsuit and agreed to pay Marion Armstrong $1 million. RCA had licensed other companies to manufacture FM equipment, and those companies continued to fight, but one judge after another upheld the validity of the Armstrong patents. In the end, Marion Armstrong collected another $10 million.
In the 1950s, another change in standards loomed. Though not at first, Sarnoff ultimately won. No fatalities were recorded.
The FCC in 1940 had cited "promising experiments with color television"—namely, the color disc developed by Peter Goldmark—but, heeding the NTSC’s recommendation, declined to establish color standards. CBS tried again in 1946, but regulators adopted the same position they had in the 1930s concerning black-and-white TV: "The Commission must be satisfied not only that the system proposed will work but also that it is as good as can be expected within a reasonable time to come."
CBS color technology was incompatible with existing black-and-white TVs. If the FCC adopted CBS standards, older sets would continue to receive black-and-white programs but not color ones. To see those programs, owners would have to buy either a color TV or a converter projected to cost at least $100.
When CBS again asked the FCC to adopt its system, in 1949, RCA announced that it had almost perfected a compatible, all-electronic version, without any "horse and buggy" spinning discs. The FCC demanded a head-to-head comparison, to Sarnoff's consternation. On RCA prototypes at that point, colors were produced separately and combined by a system of mirrors. The mirrors were easily jostled, throwing the colors askew. On test day, Sarnoff later recalled, "the monkeys were green, the bananas were blue, and everyone had a good laugh."
The prospect of obsolescence hadn't troubled Sarnoff in the past. With black-and-white TV, he had argued that consumers deserved to enjoy the technology right away, even if the equipment might soon be obsolete. The spectrum shift for FM, which seemingly had Sarnoff's backstage blessing, had rendered a half-million receivers outdated. Now Sarnoff found himself in a different position. RCA made TVs, and NBC broadcast to them. Compatibility was essential. "A compatible system in television," Sarnoff told U.S. News and World Report, "is more or less the same as a compatible marriage, where the husband and wife see the same thing at the same time and don’t get into a lot of wavy motions."
Like Sarnoff, the FCC flip-flopped. "Obviously, it is essential that all receivers be capable of receiving all transmissions," the commission had said in 1941, concerning black-and-white TV. Now it contended that, though compatibility would be optimal, consumers wanted color and CBS had the better technology. Goldmark's disc became the official standard. Sarnoff fought the FCC decision all the way to the Supreme Court, unsuccessfully.
With color TV, as with FM a decade earlier, Sarnoff profited from war. When the Korean War escalated in 1951, the government barred nonmilitary uses of cobalt, a component in CBS color TVs. Manufacture of the new TVs had to be postponed.
Meanwhile, RCA engineers continued working. By the end of the war in 1953, RCA color was about the same as CBS color, and it was compatible with black-and-white TVs. RCA achieved compatibility by transmitting brightness and color separately. Black-and-white sets got brightness; color sets got both.
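RCA's trick corresponds to what became the NTSC luminance/chrominance split. The sketch below uses the standard published NTSC weights to show the principle: a black-and-white set needs only the Y (brightness) component, while a color set also uses the two chrominance components. It illustrates the math, not RCA's actual circuitry.

```python
import numpy as np

def rgb_to_yiq(rgb):
    """Split RGB into NTSC luminance (Y) and chrominance (I, Q).

    A black-and-white receiver uses only Y; a color receiver combines
    Y with I and Q to recover the full picture. The weights are the
    standard NTSC coefficients.
    """
    m = np.array([[0.299,  0.587,  0.114],    # Y: brightness
                  [0.596, -0.274, -0.322],    # I: orange/cyan axis
                  [0.211, -0.523,  0.312]])   # Q: purple/green axis
    return rgb @ m.T

# One orange-ish pixel, RGB values in [0, 1]:
y, i, q = rgb_to_yiq(np.array([0.9, 0.5, 0.2]))
print(f"black-and-white set shows Y = {y:.3f}; color set adds I = {i:.3f}, Q = {q:.3f}")
```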
While developing compatible color, Sarnoff had also slashed prices and sold black-and-white TVs as fast as he could, in order to raise the political costs of an incompatible system. "Every set we get out there makes it that much tougher on CBS," he said. The number of black-and-white TVs soared, from nine million in 1950 to 23 million in 1953.
The NTSC reconvened, compared the two systems, and recommended RCA color. In a gracious gesture, Peter Goldmark seconded the motion to kill his invention. Late in 1953, the FCC adopted the RCA standard. David Sarnoff had won again.
For the typical technology, death comes slowly. Consumers chose VHS over Betamax two decades ago, but individual Betamax players continued to work. Though Toshiba, maker of HD DVDs, surrendered to Sony and its Blu-ray technology this winter in the high-definition DVD war, early adopters who guessed wrong still have functioning machines. Prerecorded HD DVDs may be few, and blank discs, like Betamax tapes, may disappear from stores, but the life span of the players won’t be affected.
Consumer products—video recording devices, vacuums, answering machines—die every day, but not by the thousands or millions, all at once. Mass obsolescence can occur when devices exchange information or value, and a central authority, usually the government, alters the standards of exchange. On many subway systems today, you can't go anywhere without a fare card. Tokens, like prewar FM receivers, have quit working. The old standards of exchange no longer apply.
Decisions about standards frequently reflect the clout of their proponents. "New machines are not accepted because they are, in some abstract sense, 'better,'" Steven Lubar, a professor of American civilization at Brown University, writes in InfoCulture (1993). "They're accepted because they fill the needs of some individual or group; and they are fought by people who feel that their economic or intellectual interests are at stake."
By some accounts, CBS's system of the early 1950s produced sharper color than RCA's but broke down more frequently. "It's very hard to say what’s a superior technology," Lubar observes. "What we think is superior in retrospect is often what we’re used to, after a lot of money has been invested in it." Following this pattern, the superiority of digital TV won't be apparent to a lot of Americans next February—quite the contrary—but most of them will come around.
"We’re the pipes," Sarnoff once said of broadcasting. He helped set the specs for the analog pipes that have served American TV for nearly 70 years. And no less important, he helped decide what flowed through them. During his lifetime (he died in 1971), the choices were limited: Uncle Miltie or Aunt Bea, Car 54 or Agent 99, Captain Kirk or Colonel Klink. Sarnoff thought the future would be different. The TV itself, he wrote, would be "a thin, flat-surface screen that will be hung like a picture on the wall." As for programs, "every form of art and every type of entertainment will be readily accessible in the home. The range and variety . . . will embrace everything created by the human mind."
On this, the televisionary's vision was clear. Today we have CNN, HBO, MTV, VHS, DVD, Blu-ray, MPEG, Netflix, and YouTube. Unwatched TiVo shows pile up like unread New Yorkers, network websites offer full-length programs, and iPhone users peer at 3.5-inch screens. Program choices are virtually limitless. That turns out to be bad news for Sarnoff's NBC, along with CBS and ABC—their audience share has plummeted—but great news for viewers.
Some Americans, who have tuned out the urgency of the FCC in favor of the urgency of CSI, will be startled by static at midnight on February 17. For everybody else, digital TV will be pretty much the same as analog TV, just a bit sharper, with a few more channels.
The digital changeover is revolutionary for broadcasters. But for viewers, the revolution began years ago.
2 comments:
Save that old TV - there's a message in the 'snow'
By Paul Saffo
Digital TV is wonderful, but there is one feature from the age of Analog TV that I will miss - snow. Yes, that warm white hiss that appeared whenever reception was bad, or a station went off the air. Back in the rabbit ear days before cable, snow was an irritating distraction or downright annoyance when it interfered with the Super Bowl or an episode of Gilligan's Island, but even then, snow was misunderstood.
Consider the cause of snow. A TV antenna is a sponge for radio energy, collecting lots more than just the desired signal. Snow is the result of the TV attempting to turn stray signals into an image: signals from radio stations, emissions from power lines, transformers, or appliances, or even the electrical noise of the circuits in the TV itself. The result is the strangely calming ant-dance of black on white that we call snow.
But snow has another source, a source far from this planet in both time and space. Mixed in with the noise of Earthling civilization are radio echoes of the Big Bang, the moment of the Universe's creation 13 billion years ago. The universe started out very small and very hot, and has been expanding and cooling ever since. As it cools, the Big Bang's fossil radiation sheds radio energy in the same way a cake on a cooling rack gives up heat. And when those indescribably ancient radio waves run down the rabbit ears and into your analog TV, the TV's circuitry interprets them as an image, and voila! - Snow.
Full article here:
http://www.sfgate.com/cgi-bin/blogs/saffo/index?
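Saffo's account of snow can be illustrated with a toy Python sketch: with no station to lock onto, the set turns whatever noise the antenna gathers, terrestrial and cosmic alike, into luminance. The cosmic mixing weight in the code is a made-up stand-in, not a measured figure.

```python
import numpy as np

# Toy illustration of Saffo's point: with no coherent signal, an analog set
# renders whatever noise the antenna gathers as luminance. The cosmic mixing
# weight below is a hypothetical stand-in, not a measured value.
rng = np.random.default_rng(0)

terrestrial_noise = rng.normal(size=(480, 640))   # power lines, appliances, the set itself
cosmic_noise = rng.normal(size=(480, 640))        # relic radiation from the Big Bang

cmb_fraction = 0.01                               # hypothetical mixing weight
mixed = (1 - cmb_fraction) * terrestrial_noise + cmb_fraction * cosmic_noise

# Map the noise onto a displayable luminance range: the familiar ant-dance.
snow_frame = np.clip(0.5 + 0.25 * mixed, 0.0, 1.0)
```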
Mesmerizing? Sure, and so is a "dial tone". "Snow"..."dial tones"...government plot for mind control--hehe.