Routledge, Taylor & Francis Group
Industry: Printing & publishing
Company Profile:
Routledge is a global publisher of academic books, journals and online resources in the humanities and social sciences.
The first modern Olympics (Athens, 1896) were envisioned by their founder, Baron Pierre de Coubertin, as a way to transcend international divisions of politics and commerce through the celebration of amateur sport, based loosely on the games of Ancient Greece. Despite these ideals, however, the modern Olympics have, from the outset, been beset by shifting political rivalries and economic circumstances. For Americans as hosts and participants, the Olympics crystallized the major global issues and struggles of the twentieth century. For their first three decades, the modern Olympics were linked to expositions, world’s fairs and other large-scale organized spectacles that emphasized the march of humankind’s progress, as well as the qualitative differences between nations. Indeed, the second and third Olympics took place in coordination with the 1900 Paris Universal Exhibition and the 1904 St. Louis, MO World’s Fair respectively. Like these expositions, the Olympics celebrated the divide between the so-called civilized societies of Europe and North America, which fielded teams, and the subject peoples of the colonial world, which did not. (Loosely organized “popular” Olympics tried to overcome this division as well as that which separated elite sports from more inclusive pastimes.) By contemporary standards, the first few Olympics were relatively modest, although each Olympiad saw the inclusion of new sporting events and new national teams. While winter sports had appeared in the Games as early as 1908, separate Winter Olympic Games were inaugurated in 1924. These events have been held in the US three times: in Lake Placid (New York) in 1932 and 1980 and in Squaw Valley (California) in 1960. The scandal-ridden 2002 games are scheduled for Salt Lake City, UT. Americans have not won as many medals in the Winter Games as in the Summer Games, nor do the Winter Games promote tourism and national imagery in the same way. 
Nonetheless, interest has been generated by political and personal dramas, symbolized by the US men’s hockey team’s 1980 gold medal or by the Nancy Kerrigan and Tonya Harding affair in 1994. Still, the Summer Olympics remain larger, more inclusive and more prestigious than the Winter Games. The 1932 Los Angeles Olympics were the first to link the Games to local economic development, as well as civic pride on the part of the host city and nation. They were also marked by an unprecedented amount of international mass media coverage that served to heighten the political importance of the Games. The Berlin Olympics of 1936 were particularly noteworthy in this regard, as the Nazi regime envisioned them as a vehicle for promoting the image not only of a resurgent Germany, but also of Aryan superiority. These latter dreams were effectively dashed, however, by the stellar performance of Jesse Owens and other African American athletes on the US team. The Second World War led to the cancellation of both the 1940 Olympics (Tokyo) and the 1944 Olympics (London, which eventually hosted the Games in 1948). Following the entry of the first team from the Soviet Union in 1952, the Games became a surrogate arena for the Cold War between the “Free West,” led by the United States, and the communist bloc, led by the Soviet Union. In the 1960s, the changing scale and nature of the Games challenged some basic Olympic principles. Extensive state subsidies provided to athletes from Soviet-bloc nations were seen to undermine the basic principles of amateurism. Meanwhile, the advent of television coverage in 1960 intensified prospects for commercialization. For much of the next two decades, the International Olympic Committee (IOC), under the leadership of Avery Brundage, fought a successful rearguard action against both professionalism and commercialization. Still, the scale and international visibility of the Olympic Games intensified their importance as symbolic arenas. 
Awarding of an Olympics bestowed international recognition on former pariah nations (Tokyo, Japan 1964; Munich, Germany 1972; and perhaps Barcelona, Spain 1992) and on developing nations such as Mexico (1968) and South Korea (1988). The Mexico City Games of 1968, however, became noteworthy for the military’s brutal repression of student protestors. Meanwhile, at their medal ceremony American sprinters Tommie Smith and John Carlos raised gloved fists in the Black Power salute, for which they were suspended from the US Olympic team and kicked out of the Olympic Village. Global politics intensified thereafter. The 1972 Munich Games were marred by the murder of members of the Israeli Olympic team by Palestinian guerrillas. Many African nations boycotted the 1976 Montreal Games to protest New Zealand’s participation (their rugby team had played a tournament in South Africa). In 1980 the United States boycotted the Moscow Games in protest of the Soviet Union’s invasion of Afghanistan, which in turn provoked a Soviet-bloc boycott of the 1984 Los Angeles Games. While the increasingly manifest politicization of the Games tarnished the luster of the Olympic movement, their growing scale also placed extreme financial burdens on the host city and nation, notably Montreal’s more than $1 billion debt in 1976. Thus, Los Angeles was the only city to bid on the 1984 Games. The Los Angeles Olympics proved noteworthy not only for the triumph of US teams in the absence of the Soviet bloc, but also for the manner in which they were staged. Prior to 1984, public entities mounted the Olympics. The Los Angeles Games were the first to use only private funds. Organizers minimized costs, eschewing new construction in favor of temporary venues and rehabilitated facilities from the 1932 Games. Raising capital through corporate sponsorships and unprecedentedly expensive television rights, the Los Angeles Olympic Organizing Committee generated a profit in excess of $200 million. 
The potential profit to be realized from the Olympics undermined the IOC’s historic resistance to commercialism; ensuing Olympiads have been marked by ever-increasing commercialism. Both the Seoul Olympics of 1988 and the Barcelona Games of 1992 were accompanied by massive, state-financed urban redevelopment projects in their host cities. The Atlanta Olympics of 1996, however, manifested a return to extreme entrepreneurialism. Unlike Los Angeles, Atlanta organizers planned for significant new construction of Olympic venues, funded by private sources including the sale of television rights, commercial sponsorships, tickets and merchandise. Lacking a publicly subsidized safety net, Atlanta’s preparations were often on precarious financial footing. Atlanta’s problems with fundraising, along with criticisms of management and overcommercialization, led the IOC to declare that it would never again award the Olympics to a city without a significant public-sector financial commitment. With the end of the Cold War in the early 1990s, that particular political dimension of the Olympics faded into history, although American media coverage continues to stress national triumphs (and American stories of family, individual dedication and community support). In many ways, the commercialism of the Barcelona and Atlanta Olympics already celebrated the triumph of capitalism over socialism. Beginning in the 1980s, the IOC also began to retreat from its insistence on amateurism and allowed professional athletes to compete—critical to American dominance in basketball, for example. Hence, by the end of the twentieth century, the spirit of Olympism was radically revised to embrace both professionalism and commercialism, creating a vast hypermediated spectacle exemplifying the global economy.
Industry:Culture
The Barbie doll is famous the world over as the leggy, blonde teenage model toy for girls. The doll was first released by the Mattel Corporation in the spring of 1959, ultimately becoming not merely a toy but a cultural phenomenon. A clear departure from the babydoll toys which were popular during that period, she was one of the first dolls to offer a window into an “adult lifestyle,” replete with such fashion outfits as “Dinner at Eight,” “Enchanted Evening” and “Picnic Set.” She spawned the creation of her steady boyfriend Ken, along with countless additional product tie-ins, including the Barbie game, the Barbie van and numerous doll outfits, fashion accessories and doll sidekicks (younger sisters and African American and ethnic clones). As a powerful cultural icon, she helped shape the way generations of children conceptualize gender, class, race and ethnicity in contemporary society. Many feminists have argued that Barbie reinforces gender stereotypes, but consumers continue to support the ponytailed doll wholeheartedly. Although Barbie was in effect “born” in 1959, the Mattel Corporation was founded in 1945 in Southern California by art student Elliot Handler, his wife Ruth and their onetime foreman Harold Matson (hence the name “Mattel,” formed by fusing the names Matson and Elliot—though Matson sold out in 1946). Barbie was essentially invented by a working woman—Mattel co-founder Ruth Handler. Both Barbie and Ken have real-life counterparts, since they are named after Ruth and Elliot’s actual children. Although Barbie is in many ways synonymous with rigid, unrealistic, Eurocentric ideals of feminine beauty, it is important to note that Barbie was from the start a “career girl” (complete with requisite fashion accessories of course). Although she eventually found Ken, she has never been a wife or a mother—though she did have a bridal outfit. Barbie is truly all-American, for she has roots in foreign lands. 
She was modeled after “Lilli,” a plastic German sex toy for men (in turn modeled after “Lilli,” a character in a popular comic strip in a German newspaper) that Ruth discovered while travelling in Europe. This may account for her impossible dimensions if projected life-size. Ruth’s then teenaged daughter Barbara spotted the doll in a store and wanted one, and Ruth realized that this was the doll she had been imagining. While observing her daughter play games as a young girl in which she and her friends imagined their “grown-up” lives, Ruth knew that an adult woman doll for young girls would be a huge hit. Lilli’s sultriness was toned down to make way for Barbie’s Californian wholesomeness, and Barbie doll was born. Since that time, there have been numerous efforts to create ethnically diverse and global Barbie counterparts, but Barbie remains an enduring symbol of the deeply entrenched “tall, thin and blonde” ideal of female beauty in American culture.
Industry:Culture
Term denoting a person who merely watches rather than participates in sports, sits on the sofa surrounded by pizza, potato chips and beer, and phones in his or her bet to out-of-state gambling brokers. Often stereotyped as males between the ages of eighteen and fifty, the couch potato constituency is targeted by advertisers with commercials during Monday Night Football, Sunday afternoon football games, or televised college basketball games. The commercials, often for beer, feature men sneaking off to watch sports (while wives complain bitterly).
Industry:Culture
Since Thomas Edison first spoke the words “Mary had a little lamb” into his phonograph in 1877, the recording industry has been driven by a mixture of technological innovations and the pursuit of profit—each feeding off the other. While Edison failed originally to see the popular entertainment uses of his phonograph (preferring the device as an office dictation machine), by the 1920s he was locked into battle with Columbia and Victor Records for talent and audience. In the 1920s, also, radio broadcasting of live and recorded music spread across the nation. Publishing rights to songs began to be licensed through ASCAP in 1914 and BMI in 1939. A dazzling succession of technological breakthroughs, especially the introduction of electrical recording, allowed for a greater warmth and personality in recorded music, spawning modern popular music. Now, performers like Bing Crosby seemed to be singing directly to each individual listener. While the Great Depression marked a low point for the recording industry (with record sales plummeting from a height of 100 million discs in 1927 to only 6 million in 1932), the onset of the Second World War brought two major changes to the business. On the technological side, wartime research in electronics led to improvements in sound quality through magnetic tape recording, the “unbreakable” long-playing (33⅓ rpm) and 45 rpm discs and “high-fidelity” playback equipment; after the war, the invention of transistors by Bell Labs in 1948 revolutionized the radio. The other key to the transformation of the recording industry was the expansion of the consumer base with the baby boom. Teenagers in the 1950s spent an estimated 10 billion dollars annually, much of that on records and radios. The technological and market growth helped transform the nature of the music industry as well. Since the 1920s, the business had been largely controlled by a few dominant corporate media conglomerates. 
While that is the case to this day, cheaper recording costs and an expanded marketplace allowed more independent record labels to flourish. Especially with the advent of rock ’n’ roll as a mass-market phenomenon in the 1950s, smaller labels like Chess, Sun and Atlantic were able to achieve a degree of success by introducing to varied audiences popular music that the major labels ignored. While the majors caught on to the new musical genres and new markets (RCA, for example, bought out Elvis Presley’s contract from Sun in 1955 for the then astronomical sum of $35,000), independents have continued to fill the niches ignored by the majors. Technological improvements continued to transform recorded music, from multi-track recording and Dolby noise reduction to the advent of digital recording in 1978 (though audiophiles might quarrel over whether digitization is an improvement). The compact disc and DAT in the 1980s and MP3 technology in the 1990s transformed recorded music into patterns of sound waves encoded as a sequence of numbers. The promise is of superior sound quality, but the real revolutionary possibility rests with the new methods of distribution. While the recording industry is still controlled by a handful of multinational corporations (which seemingly change ownership weekly), the transmission of recorded sound over the Internet threatens to break their dominance.
Industry:Culture
The Port of Houston (1998 population estimate 1,630,864) dominates oil and gas shipping on Texas’ Gulf of Mexico coast and has underpinned the modern growth of this Sunbelt metropolis. Houston began in 1836 as a speculative real-estate venture; today the city sprawls across highways, far-flung suburbs, edge cities and malls (metropolitan population nearly 4 million). Houston proves more cosmopolitan than many other Sunbelt cities, with an important historic black population and many Latinos (old Tejano families, recent Mexican immigrants and Central American refugees), as well as global migrations of Asians and Africans. Citizens also include Sunbelt white-collar migrants, national and international, associated with shipping, oil and other activities. Houston boasts major medical centers and universities (Rice, University of Houston), as well as the Menil Museum. Houston’s enclosed stadium, the Astrodome, hosted professional baseball (the Astros); it figured in Robert Altman’s Brewster McCloud (1970), but is now obsolete. The city also has football (the Oilers), basketball (the Rockets), hockey and a major rodeo. The National Aeronautics and Space Administration (NASA) stands close by. Houston hosted the 1992 Republican presidential convention and is one of the homes of the presidential Bush family.
Industry:Culture
That film, television and video can record and transmit “real events” created their initial popularity. Despite the triumph of Hollywood and other narrative media, this realization also underpins a long history of American documentary production, criticism and distribution. Yet, documentary is hardly a simple or unchanging category. The “public” looks for “real” events in news, “reality television” and classroom movies, while documentarians debate more abstract truth in both form and criticism. A documentary is also a story, even if one claims a higher purpose: treating important social issues, forgotten, exotic or famous people, or great historical moments. Yet, documentarians manipulate all these while sharing technical and narrative frames and distribution with other media. Documentaries also have evolved with media technologies and institutional support. Among the most important ancestors of American documentary are early ethnographic filmmakers like Edward S. Curtis, recording Native Americans, and Robert Flaherty (1884–1951), whose well-known Nanook of the North (1922) was complemented by more “American” films like Louisiana Story (1948). Both documentarians relied on “acted-out” sequences as they recorded “real life.” Margaret Mead and others continued this ethnographic tradition, aimed primarily at academic markets. Documentarians could also draw on decades of social photography, including Jacob Riis, Walker Evans and WPA photographers, as well as wartime newsreels and propaganda. Major changes came by the 1960s with readily portable cameras and synchronized sound recording. These gave the illusion of “real life” in action (devices now imitated in fiction through moving cameras and jump cuts). This technology facilitated direct cinema, which stressed unmediated observation. Here, important documentarians include the Maysles brothers (Albert, 1926–; David, 1932–87), D.A. 
Pennebaker (1926–) and Richard Leacock (1921–), while major works include Leacock’s Primary (1960), Craig Gilbert’s An American Family (1972) and films by Frederick Wiseman (1930–), such as Titicut Follies (1967) and High School (1968). The intimacy and pervasiveness of the documentary eye and the use/reading of these films evoked questions about intrusion into private life: the Loud family responded angrily to their depiction on Public Television’s An American Family (reprised in interesting ways with the 1999 Public Broadcasting System (PBS) serialization of An American Love Story), and Titicut Follies was barred from public showing for decades. Television also changed documentary distribution and audiences. While documentaries (especially exotic or “nature” films) had occasionally played in theaters, their distribution more generally was limited to schools, museums, or other specialized settings. Television broadcast documentaries to a large audience via PBS, including controversial films, such as Marlon Riggs’ Tongues Untied (1989), which graphically treated gay sexuality across racial lines. Yet, in the 1990s, its independence faced pressure from government and conservative social lobbies; Ken Burns’ Civil War (1990) exemplifies alternative public television documentaries with high production values, popular audiences, commercial tie-ins and a very safe subject. Commercial and cable networks also offer news, news magazines, star documentaries, biography (the title of a popular Arts & Entertainment channel series) and MTV shows like Real Lives. Commercial documentary, however, must sell to audiences and sponsors—hence it tends to avoid controversy as well as formal complexity. Nonetheless, classic television documentaries, especially via network broadcasting, have been in decline since the mid-1970s because of shrinking ratings. By the end of the twentieth century, news magazines, talk shows and “reality” shows had replaced earlier, more sober television documentaries. 
Cheap and fast to produce, they focus on emotional and sensational subject matters while concealing fundamental mediations. With new distribution and materials, documentaries flourished as both a practice and a theoretical field in the 1980s and 1990s. Film schools teach through documentary exercises, while disciplines such as anthropology, history and the sciences drew in documentaries. Industrial films, news television and political genres played with the form and implications of truth associated with documentary sobriety, while documentary makers and critics explored reflections on the claims and form of the genre, epitomized by Bill Nichols, Michael Renov, Trinh T. Minh-ha and others. Documentaries today incorporate different genres, styles and relations to subjects and audiences. Bill Nichols, for example, elaborates four modes of documentary representation—expository, observational, interactive and reflexive. The expository mode teaches through direct address, exemplified in many educational products. Truth is obviously controlled by the film-maker, with heavy narration and silent subjects. Observational genres try to observe the subject without interference, seeking “unmediated” truth. As in direct cinema, film-makers sought to be “flies on the wall,” presuming that the subject would become accustomed to the camera. Wiseman, however, reinterpreted objectivity by claiming his films are his visions, while asserting that what he saw actually did happen. Interactive documentaries involve cooperation and questioning, even within the film; hence they may be linked to film-maker reflexivity as well. Theorist-film-maker Trinh T. Minh-ha, for example, questions documentary practices, like sync-sound and real-time (as in long takes), that promote authenticity. Hence, in Surname Viet Given Name Nam (1989), Trinh took pains to reconstruct interviews that viewers could perceive as staged, complicating any reading of being Vietnamese in Vietnam and America. 
Documentarians choose among these modes, depending on purpose, subject and audience—high-school films on butterflies or messages of environmental concern tend to be expository while politicized explorations of identity seek radically reflexive tones. Another recognition that emerged at the end of the twentieth century was that the subjects of documentaries have lives outside the film. Even in documentaries on distant historical subjects, people are connected to the subject by geography and shared national, ethnic, gender or class backgrounds. Subjects in documentaries and filmmakers face consequences beyond the text. People of power generally control their images and can challenge unfavorable representations. For social documentaries (and news), subjects often occupy lower socio-economic positions. While this may record the forgotten and effect change, it may also categorize victims. Here, grassroots or community videography represents an alternative appropriation of truth by groups who affirm their own truth by their limited and highly contextual documentaries. While less significant in the market (or in academic criticism), these videos affirm the genre’s flexibility amid demands made by claims of truth and power.
Industry:Culture
The US catastrophe in Vietnam gave rise to the popular belief in the 1970s that presidents would hesitate to commit American troops abroad again, fearing the indignation of electorates if forces became embroiled in wars and body bags carrying American GIs began returning home. This syndrome, if it ever contributed to military inaction, was certainly short-lived. Within a decade of the end of the Vietnam War, American troops were being sent into action in Grenada, and all presidents after Carter would commit troops abroad. A more realistic syndrome reflects an inducement to throw caution to the wind in international affairs. George Bush endeavored to counter the “wimp factor” by using force on several occasions, while Bill Clinton, who faced charges of draft dodging during the 1992 election, proved more willing to commit American forces than any other president. After trouncing Saddam Hussein in 1991, Bush announced that Americans had “kicked the Vietnam syndrome once and for all.” But the syndrome (as with much of the war itself) concerns credibility, something that cannot be kicked.
Industry:Culture
When the words “under God” were added to the Pledge of Allegiance in 1954, it highlighted a profound contradiction in the culture of America as a nation built on the separation of church and state. In a period of change, religion could nonetheless be added to civic duties rather than stripped from them in the name of democracy, science or modernity. The pluralism of religions in the US, despite controversies over doctrine, status and conversions, also creates a fundamental division between believers and nonbelievers. These general views were intensified in the Cold War by the identification of atheism with “godless communism.” Even those rebelling against American civic religion often have turned to alternate spiritualities based in Asian religion, “nature,” feminism or reinterpretations of a Judaeo-Christian deity in terms of science, cult personalities or imagery. This “prescriptive relationship between religion and the everyday lives of the populace” (see religion) fosters a widespread intolerance towards atheism. Meanwhile, atheists are forced to deny shared civic practice, political invocations of deity, social rituals at tables and holidays and constant minutiae of American piety on a daily basis. Atheists, in general, have been silent nonbelievers, lacking the institutional support or public symbols adopted and even flaunted by American believers. Even dividing lines can be unclear. One may see the fish symbol labeling Christian cars mimicked and altered into a four-legged beast tagged Darwin, but this may indicate opposition to scientific creationism rather than a declaration of atheism. Moreover, Catholics, Jews, Muslims and Protestants have objected to Protestant pieties in schools or manipulative affirmations in sports and politics without denying religion per se. 
American flexibility within belief systems also makes atheism more difficult to delineate—are Buddhists, Taoists, Unitarians (not to mention polytheists) somewhere beyond the pale of American belief in a monotheistic god? Atheism may be presumed in discourses of science or social reform on the left (as more than one horror movie has said, “but you are a man of science—surely you don’t believe in all this superstitious mumbo-jumbo!”). Yet, even here, silence covers a range of beliefs and compromises without creating public declarations of alternatives. In political life, meanwhile, invoking some god and showing up at some ritual events are normative, whether or not politicians confess to any beliefs or responsibilities associated with them. “Angels” are often normalized as jewelry and bric-a-brac without religious significance, except to those who would choose to actively object, which would seem to many to be petulance rather than belief. Among the most vilified are those who have crusaded for constitutional guarantees, such as Madalyn Murray O’Hair, who successfully argued for the removal of prayer from schools in the early 1960s. Similarly, the American Civil Liberties Union, which deals with a range of constitutional issues, has been branded atheist for countering strategies of the Christian Right to introduce “prayerful silences” into schools. Electronic communication has opened a new space of expression and communion among Internet infidels. Societies like the Rocky Mountain Skeptics, the Philadelphia Association for Critical Thinking, the Skeptic Society (with its journal Skeptic) and the Freedom from Religion Foundation have flourished on the web. While the editor of The Skeptical Review argues for dramatic changes since the religious establishment can no longer control and “suppress” information, the pervasiveness of civic religion makes the future of atheism difficult to read.
Industry:Culture
While Florida has marketed sun and beaches for generations, no other area has distilled this into the jazzy cosmopolitan styles of Miami, Miami Beach, Fort Lauderdale and other nearby communities constituting a metropolitan population of 3.5 million. These cities’ many roles and images as capitals of the new Caribbean, trendsetters for youth, retirement havens for the elderly and glamorous celebrity resorts have nevertheless clashed with ethnic and racial divisions as well as the fragility of the Everglades ecosystem, which development continually threatens. Miami’s 1920s boom was crushed by a hurricane and the Depression, but the city rebounded after the Second World War as a resort for the urban Northeast. Among the diverse communities which expanded its Southern African American and white populations were Jews from New York City, NY and underworld figures with links to Cuba. In the winter season, entertainers as well as poorer “sun-birds” enlivened the beaches and palm-lined streets of the city, while elite enclaves took shape on exclusive islands and northwards in Palm Beach. Castro’s triumph in Cuba transformed Miami with the arrival of large numbers of Cuban Americans in the 1960s and again in the 1980s. These Cubans and their descendants have richly contributed to culture, music and politics while galvanizing other Latin American communities. Haitians also arrived in great numbers, complicating the racial panorama that trapped many native African Americans in ghettos and dead-end jobs beside the booming service-sector city. Miami style—combining the art-deco buildings of South Miami Beach, the luxuriant foliage and tropical colors of homes and the cultural diversity of old and new populations—was distilled in movies like Scarface (1983), television’s Miami Vice (1984–9) and crime fiction from John D. MacDonald to Carl Hiaasen. 
Despite an emphasis on problems of drugs, crime and ethnic tension, the fast-paced, transnational style these showcased underscored Miami’s continuing attractions for celebrities, tourists and new migrants. With the popularity of Fort Lauderdale for college vacations and the continuing elite renovation of areas like Palm Beach, Lake Worth and Miami’s Coral Gables, this maps social and cultural complexity onto the still straining ecology of South Florida development.
Industry:Culture
The mobile home (trailer or recreational vehicle) would seem successfully to synthesize two great American twentieth-century obsessions: freedom/mobility and automobiles. Certainly, the emergence of automotive campers and trailers before the Second World War, as well as accommodations for these compact homesteads, offered this promise. Yet, during wartime shortages and their aftermath, a second vision came to dominate the industry—that of mobile homes as pre-manufactured alternative housing. By the 1980s, 90 percent of mobile homes remained stationary after their initial move; moreover, they represented one out of every three new single-family homes sold. Rather than mobility, ironically, they are identified with the lack of it, socially as well as literally, in retirement complexes, marginal housing or, at best, temporary or secondary dwellings. “Trailer parks,” communities built to accommodate these homes, have gained especially negative imagery, reinforced by urban restrictions and limitations on financing. The American origins of these homes preceded automobiles—in the covered wagons of westward migrations (commemorated in a 1929 commercial trailer), in elite train cars or even gypsy caravans (the “motorized Gypsy van” was another prototype name). Generally, early mobile homes were individual modifications of existing transportation. Commercial production expanded in the 1930s, including the classic Airstream with its sleek modern lines. Interiors were cramped, although aerodynamic fittings provided basic house services in ingenious ways. Some were residential, but others roamed expanding highways and tourist sites—the 1939 New York World’s Fair offered 1,200 trailer spaces in the Bronx. The Second World War restricted gas and travel. Yet population displacements for industry and the military meant new uses for the trailer home as “temporary housing.” Truly temporary usage continued through the 1990s on construction sites, film locations, dorms and classrooms. 
Yet permanence also changed production and use. From the 1950s onwards, production moved towards larger units with private rooms and more extensive facilities. These included expandable structures and homes based on multiple sections. Once moved by trucks to a site, they became permanent—sunk into the ground, built onto (carports, additional rooms) and landscaped. By the late 1990s, this manufactured housing cost from $10,000 to more than $100,000; even the latter remains “affordable” compared to new home construction. In the 1990s, nearly half of all mobile homes resided in 24,000 “parks” scattered from expanding Sunbelt and metropolitan fringes to New York City (Staten Island). Parks have grown from 40–60 lots in the postwar era to an average of 200 sites. These developments often show concern with fostering community through decoration, streetscapes or recreational facilities. Half of mobile homes occupy private lots, including second homes for resorts or retirement. Settlements have not eliminated movement, although mobile travel faced further setbacks with the fuel crisis of the 1970s. Compact recreational vehicles like the Winnebago, campers and modified pickup trucks have restated the mobile-home ideal in terms of access to the outdoors through automotive freedom. Yet the social and media connotations of these homes remain clear—although they foster privacy and domesticity, they are seen as non-standard, cheaper, impersonal and associated with transience. News coverage emphasizes weather damage and frailty (the American Association of Retired Persons, in 1999, said 75 percent of owners had complaints). Fictional media use them as settings for immigrants, the poor or outcasts; the epithet “trailer trash” conveys a sneer that might not be voiced politely in ethnic or class terms.
Industry:Culture
© 2024 CSOFT International, Ltd.