- Industry: Printing & publishing
- Number of terms: 1330
- Number of blossaries: 0
- Company Profile:
Routledge is a global publisher of academic books, journals and online resources in the humanities and social sciences.
Post-war growth overwhelmed small towns but never erased their symbolic appeal. Small towns began losing residents in the 1920s, but postwar educational opportunities lured still more, and younger, townspeople away. Sprawling suburban developments also transformed independent towns into mere bedroom communities for nearby cities, as new highways enabled routine commuting from residential neighborhoods created virtually overnight. Suburban tracts offered privacy and autonomy hitherto unattainable outside cities, while the relocation policies of corporate employers discouraged ambitious employees from forming any sentimental attachments to place. This postwar culture of education, mobility, growth and anonymity sapped the civic energy, inter-generational connection and community feeling of small-town life.
Pre-war townspeople purchased a wide range of personal services from familiar merchants, shopkeepers and mechanics, many of whom began to close up in the 1960s and 1970s. New interstate highways bypassed thousands of towns, stranding “mom-and-pop stores” that had traded with motorists traveling the local roads. New mass-market retailers offered large inventories of cheaper goods to local shoppers and tourists alike.
Unable to compete, local stores closed, diminishing the uniqueness that was each town’s pride and treasured inheritance.
Small-town services, neighborliness and civic unity had always involved uneven quality, intrusiveness and unchallenged prejudice. But by the 1970s and 1980s, new suburban communities, with names like “Hometown,” “Pleasant Valley” and “Littleton,” traded on rosy nostalgia for small-town life. Developers of “new towns” featuring artificial centers with the same retailers that lined the interstates advertised a perfect mix of town-style community and suburban prestige, themes also seen in New Urbanism.
Surviving small towns began in the 1990s to assert their own version of the past, reaching out for tourist dollars by rediscovering local history and refurbishing fine old buildings. Nineteenth-century town boosters had eagerly exaggerated local get-up-and-go and promised that their growing town would be a new Colossus within the year. In the 1980s, town leaders, seeking new corporate and light industrial employers, sold their communities as quiet, traffic-free, family-friendly and homogeneous.
The poorest towns, lacking political strength, acquired the most problematic new economic resources. In the 1970s and 1980s, Appalachian towns, along with southern African American, southwestern Hispanic and Native American reservation towns, began spending local economic development funds to build facilities for imported hazardous waste. In the 1990s, as mandatory sentencing swelled the prison population, job-starved towns also reached out for new federal prison-building contracts.
Visible reverse migration from cities to towns began in the 1980s, notably among African American families. Parents who had left southern towns for northern cities before 1940 saw their adult children return to look after elders or family land, or simply to escape what had not, after all, been a promised land up north.
Old towns, dead towns, ersatz towns and new towns cover the late-twentieth-century landscape. Though nothing has restored the pre-war American town, small-town life remains a compelling ideal.
Industry:Culture
Postwar women can expect to live as much as one-third of their lives after menopause—a longevity that has stimulated reconceptualizations of “the change.” Once a source of shame, the menstrual cycle from beginning to end has become both a medically treatable condition and a source of female dignity and empowerment.
Medical research linked the declining estrogen levels incident to menopause with osteoporosis, reproductive-organ cancers and heart disease in older women. Hence, physicians advocated hysterectomy to shorten menopause, relieve pain and pre-empt cancer. Doctors also prescribed artificial estrogen to prevent osteoporosis and to moderate hot flashes, headaches, irregular bleeding and declining libido, considered the most distressing symptoms of menopause.
Feminists, patients and some doctors resisted this model of menopause as a “disease,” viewing rising rates of elective hysterectomy as evidence of doctors’ continuing disrespect for women’s bodies. Linkages between artificial estrogen and cancer spurred women to inform and treat themselves, increasingly sharing information through the Internet. Though, for many women, surgery remained a welcome option, better hormone replacements and herbal supplements, exercise and nutritional therapies gained in popularity. Women also affirm menopause as a positive liberation from the physical and social requirements of fertility. Aging baby-boom women and their daughters offer a large market for these new ideas. Representations of menopausal women now range from images of vital, sexually active independence to re-affirmations of ancient wise women and powerful crones.
Industry:Culture
Potato chips, candy, popcorn and other snacks are recognized by most Americans as high-calorie foods based on fat, salt and sugar. Yet obsessive snacking is part of American daily mobility. Use of food as individual escape, as well as at parties, sporting events and movies or on the street, contributes to widespread problems of obesity and associated diseases. Manufacturers and marketers nonetheless make sales alluring through market placement and advertising; concessions are also important moneymakers at cinemas and sporting events. Junk food, like comfort food, can be taken as a guilty secret, yet its costs are both obvious and dangerous.
Industry:Culture
Preplanned, post-Second World War communities of low-cost, tract, mass-manufactured, single-family homes built by Levitt and Sons, Incorporated, which became worldwide construction models for suburbs and edge cities. The first Long Island Levittown had 17,447 houses. Their purchase, largely by veterans using federally insured mortgages, marked the beginning of white flight from New York City.
Restrictive racial covenants and discriminatory sales practices barred African Americans. For purchasers, Levitt homes fulfilled lifelong dreams. Social critics viewed Levittowns as harbingers of the quintessential suburban lifestyle, places of monotony, conformity, social isolation and bigotry, the epitome of Malvina Reynolds’ “Little Boxes.”
Industry:Culture
Presbyterians constitute approximately 2 percent of the American population. The Presbyterian denomination of the Protestant Church traces its roots to the sixteenth century when John Calvin in Geneva and John Knox in Scotland shaped its basic theology and governance. The Westminster Confession of Faith, the creed of Presbyterianism for more than 300 years, was drafted in the mid-seventeenth century in England.
Scottish immigrants who espoused the Presbyterian way came to the Americas in the seventeenth century. They started churches in the Middle Colonies, and in 1706 the first presbytery was organized. A presbytery is a representative body formed of lay leaders, or elders, and pastors. Collectively they govern the church in a given area. Throughout the rest of the century and into the next, Presbyterians played significant roles in the Evangelical Awakening in Colonial America, in the Revolutionary War and in the beginning of the world missionary movement that grew out of the second Evangelical Awakening at the turn of the nineteenth century.
The 1967 Confession of Faith and the 1990 Brief Statement of Faith affirm the continuity of the Reformed Confessional tradition and are found in the Book of Confessions.
Presbyterians have historically taken their faith so seriously that disagreements over theology or the application of faith in society have led to church splits in each of the last three centuries. Equally, Presbyterians have been so strongly committed to the basic unity of the church universal that they have worked for reunification within the Presbyterian family, on the one hand, and have been at the forefront of such ecumenical movements as the World Council of Churches and the World Alliance of Reformed Churches, on the other.
Currently there are nine Presbyterian denominations in the United States with an aggregate membership of 3.2 million. The largest body is the Presbyterian Church (USA), with slightly less than 2.6 million members. It was formed in 1983 as a merger of the United Presbyterian Church (USA) and the Presbyterian Church (US), two bodies that had been separate since the American Civil War.
Other significant Presbyterian denominations include the Cumberland Presbyterian Church and the mostly African American Cumberland Presbyterian Church in America.
The former, with about 90,000 members, arose in the early 1800s out of disagreements over the doctrines of election and reprobation as taught in the Westminster Confession of Faith. The latter, with about 15,000 members, was established by African Americans to give themselves a place for leadership in Presbyterian churches.
The Presbyterian Church in America, founded in 1973 as a split-off movement from the Presbyterian Church (US), was joined in 1982 by the Reformed Presbyterian Church, Evangelical Synod. The church’s membership today is approximately 300,000. The Evangelical Presbyterian Church formed out of churches dissatisfied with the United Presbyterian Church (USA) in 1978. Its membership is about 60,000.
While most Presbyterians in the United States are white, increasing numbers of Korean immigrants are changing the face of the various Presbyterian denominations. Having received the gospel from (mostly Presbyterian) missionaries, Koreans are doing mission work of their own in the United States. The Presbyterian Church (USA), for example, has over 300 Korean American congregations, and a separate Korean Presbyterian Church in America has several hundred congregations.
Industry:Culture
President Bill Clinton’s 1998 visit to Africa, while taken by some as a distraction from domestic scandals, marked a potential breakthrough in American relations with Africa.
The US, without the colonial entanglements of Europe (except for Liberian resettlement schemes), became involved in Africa through missionary work there by both whites and blacks.
After the Second World War, foreign aid and development projects, including the Peace Corps, increased US presence, as did political intervention, from boycotts of the apartheid regime in South Africa to disputes with individual regimes. This rarely extended to military action like the Somalia intervention or the bombing of Sudan, although Africa was a constant market for US arms, as well as a site for CIA activity motivated by imagined Cold War exigencies (as in the 1961 assassination of the Congolese leader, Patrice Lumumba, followed by support for Mobutu).
Yet all this intervention has often been based on a continuing sense of distance tinged with superiority, even if African and Afrocentric studies, from ethnography to politics, have begun to bring home the rich history and cultures of a continent. Hence, while Bosnia received daily media attention, the horrors of Rwanda or Sierra Leone evoked no active intervention or adoption of refugees. These relations with Africa are complicated by millions of descendants of those torn from Africa by slavery, for whom the continent may be a distant albeit unfamiliar homeland. W.E.B. Du Bois chose to end his distinguished life in Ghana, and Afrocentric scholarship and cultural revivals have made often-generalized clothing and food more mainstream. African music has probably been the area of deepest crossover. Other African Americans have found that profound cultural, religious and social gaps make Africa a deeply unfamiliar place, in which they are outsiders or even considered “white.” Modern African migration to the US has been extremely small, with sub-Saharan immigrants accounting for only 2 percent of all immigrants in 1985, long after the watershed of immigration reforms. These were often students and professionals, a “brain drain” from African nations, as well as intellectual and political exiles like Wole Soyinka.
Illegal immigrants have become associated in the 1990s with peddling and ethnic resources. The documentary In and Out of Africa (1995) reveals the dialectic of African and American goals and attitudes in the arts trade, while the Amadou Diallo shooting in New York City, NY underscored the racial settings into which African migrants fit.
Industry:Culture
President from 1933 until his death in 1945, Roosevelt was the architect of the New Deal and the leader who guided the United States to victory over Japan and Germany in the Second World War. His legacy remained great through much of the second half of the twentieth century.
Elected to the presidency in 1932, Roosevelt and the Democratic Party benefited from Herbert Hoover’s failure to respond to the plight of many Americans suffering the economic consequences of the Great Depression. In fact, Roosevelt’s own response to the Depression was neither radical nor systematic. Many of his policy initiatives were ones that Hoover had tried unsuccessfully to implement. Roosevelt was not an ideologue committed to the idea of a welfare state. Instead, he was a pragmatist who moved with the times and carried out different policies as conditions warranted. Unlike Hoover, he realized that Americans wanted action, if only to give the appearance that something was being done. As he said during the 1932 campaign, “the country demands bold, persistent experimentation.”
In March 1933, therefore, when inaugurated, Roosevelt immediately set about establishing what he called a New Deal for the American people, including a plethora of acts establishing different bodies to administer the economy (e.g. the National Recovery, Social Security, Agricultural, and Works Progress Administrations). While both liberals and conservatives at the time proclaimed its revolutionary character, more revisionist views have recognized there was very little that was systematic about the New Deal. It included many ad hoc initiatives to deal with particular problems, some of which cancelled out or undermined others. Marked by the pragmatic and reactive politics of which Roosevelt was the master, the reforms were essentially conservative in nature, endeavoring to re-establish stability for corporate capitalism in the United States.
This lack of an ideological justification for welfare was both a strength and a weakness in the reforms. The New Deal failed to accomplish its major goal of re-establishing economic health for the country; that was achieved instead by wartime prosperity. However, its apparent success and Roosevelt’s obvious popularity gave some of the programs a longevity that they might not have had, had they been more ideologically grounded.
Consequently, following the Second World War, presidents from Truman to Carter remained committed to the New Deal. By the same token, when the prosperity of the postwar years was threatened during the 1970s, there was no ideological commitment to welfare that might safeguard it from the onslaught of a Reagan, who argued that it was a millstone weighing down American capitalism.
As the president who guided the United States through the Second World War, Roosevelt was able to enshrine Wilsonian internationalism at the heart of American foreign policy. However, while he was able to secure American support for the United Nations, his internationalism was also more pragmatic than ideological. Bringing together a wartime coalition to defeat Hitler, he was unable to guide this alliance towards a new relationship with the Soviet Union that could survive their postwar competition.
Consequently, his repudiation of isolationism fed into a greater commitment to engagement abroad to counteract the Soviets rather than to a philosophical commitment to internationalism.
Industry:Culture
Prestigious women’s colleges in the Northeast, often paired socially with the older and richer male Ivy League schools. The group includes Radcliffe, Wellesley, Smith and Mt Holyoke in Massachusetts, Vassar (Poughkeepsie, New York), Barnard (New York City) and Bryn Mawr (outside Philadelphia, PA). Women’s education and environment in these private schools have ranged from a high academic focus including graduate programs (Bryn Mawr) to the urban partnerships of Radcliffe/Harvard and Barnard/Columbia to the rural vistas of Mt Holyoke. In the 1960s, Ivy League coeducation put pressure on both admissions and mission. Vassar went co-ed, after unsuccessful merger talks with Yale, while others entered wider consortia; Radcliffe merged with Harvard in 1999. These colleges have become focal points in rethinking women’s education and civic roles in subsequent decades—both Barbara Bush and Hillary Clinton attended Wellesley.
Industry:Culture
Prior to the late 1960s, African Americans rarely had a voice in how they were represented in Hollywood films. With several years of declining box-office profits, along with the rise of the Black Power movement, Hollywood began to court black audiences with a series of inexpensive urban crime dramas, which “exploited” the audience’s desire to see black heroes and heroines with nearly superhuman physical powers. White men were depicted as sniveling weaklings or corrupt businessmen. Black directors Melvin Van Peebles, Gordon Parks and Ossie Davis made, respectively, Sweet Sweetback’s Baadasssss Song (1971), Shaft (1971) and Gordon’s War (1973).
Industry:Culture
Prior to the Second World War, Paris remained the most important center of modern art, while American art labored under a sense of relative cultural impoverishment. By the mid-1950s, however, with the emergence on the international scene of abstract expressionism, a specifically American form of avant-garde painting (also known as the New York School), dominance moved from Paris to New York City, NY. The central individual figure in this shift was the painter Jackson Pollock, whose canvases covered in “allover” compositions of dense skeins and swirls of dripped and thrown paint remain the most familiar icons of abstract expressionism. His stylized machismo and early death in a car accident provided a powerful image of the American artist as a hard-drinking, hard-living individualist driven by existential angst.
The abstract expressionists and a subsequent generation of American abstract artists were championed by the critic Clement Greenberg, whose essay “Modernist Painting” (1961) became the most influential account of a version of modernism that seemed largely devoid of social content—its criticality lay in the degree of autonomy it could achieve through the rigor of its reflection upon the essential qualities of a specific medium.
Pollock himself had been associated with the Works Progress Administration’s New Deal public art projects, and with leftist Mexican muralists like David Siqueiros, as well as the conservative American regionalist painter, Thomas Hart Benton. But both Pollock’s rugged individualism and Greenberg’s formalism were able to be used, in the political context of the Cold War, to promote the dynamism of American liberalism and to generate cultural capital to match America’s military and industrial dominance. Hence the interest of the CIA in such exhibitions as the survey of modern art in the United States that toured Europe in 1955–6 (see Guilbaut 1985).
The case of abstract expressionism and the ideological service into which it was pressed set the tone for the role of the visual arts in American culture after 1945. The visual arts have frequently provided stakes in contests, ostensibly to do with questions of artistic form or social engagement, which were as much political as aesthetic.
The political content of American visual arts has not always been explicit. Younger American artists in the mid-1950s, among them Robert Rauschenberg and Jasper Johns (see neo-Dada), and later in the 1960s Andy Warhol, unburdened themselves of the portentousness of abstract expressionist heroism. They opened their work up to the contingency of the everyday, contesting the separation of high and mass cultures. In the mid-1960s, the minimalists’ simple, geometrical objects deflected viewers’ attention away from themselves towards the immediate physical conditions of aesthetic perception.
This facilitated subsequent investigations of the institutional production of aesthetic value, in opposition to traditional, ideologically weighty notions of universal or transcendent values.
In other instances, however, it has been very clear what was at stake in battles over the value of particular forms of representation. Artists provided imagery in support of the Civil-Rights movement and anti-Vietnam War protests. The Feminist Art movement of the 1970s, including artists such as Judy Chicago and Miriam Schapiro, directly challenged assumptions about who could make art, and what constituted appropriate subject matter. This introduced such taboo or undervalued aspects of women’s experience as menstruation and domestic labor. They were especially critical of the ways in which women were commonly represented, both in art and mass culture, and concerned with asserting a right to self-representation. Some artists staged confrontational performances in public to dramatize and criticize conventional representations of women, while others, such as Mary Kelly, made theoretically sophisticated work examining the psychological mechanisms of patriarchal domination.
Ronald Reagan’s presidency saw the elaboration of various forms of critical visual arts practice, grounded in developments of the 1960s and 1970s. Individually or in collaboration with activist organizations like ACT UP (AIDS Coalition To Unleash Power), artists attempted to turn mass-media techniques to different, politically oppositional purposes. But the 1980s also saw a resurgence of traditional styles of painting (neo-expressionism), and the art market boomed along with the stock market, as if to re-emphasize art’s historical relation to privilege.
The stock market fell and the boom subsided. In the late 1980s, a body of work emerged that was grounded in “identity politics,” that is, in the specific experiences of artists who saw themselves as members of ethnic and other minorities. This pointed to a history of limited access to the institutions of art, but it had the unintended consequence of specifying and fixing identities in categories.
The election of Republican George Bush, after two terms of Reagan, encouraged Democratic politicians to begin to move towards centrist positions on economic issues.
This would eventually contribute to the election of Democratic President Bill Clinton.
Meanwhile, riding an ideological tide propelled by a conservative Christian minority and claiming to be disgusted by artistic representations of anti-normative identities, Republican leaders including Senator Jesse Helms seized upon “traditional values” and “morality” issues for political leverage. One highly visible result of this was the “culture war” of the late 1980s and early 1990s (though it might be observed that few Democratic politicians defended culture very strongly). Sexually explicit art by openly gay artists such as Robert Mapplethorpe and David Wojnarowicz, which had appeared in institutions or exhibitions supported in part by federal funds, became a political football as conservatives attacked the funding organization, the National Endowment for the Arts. They succeeded in cutting the NEA budget dramatically and in changing its procedures for making awards.
Federal support for the arts in America was already relatively low among developed nations (American artists and institutions rely heavily on foundations and the philanthropic support of corporations and the wealthy). So, in the light of the electoral success of congressional Republicans in the mid-1990s, perhaps what was most significant about the “culture wars” was that they demonstrated again the relationship in the American context between the desire to control representation (who represents whom and what, and how), and the desire to control, influence or maintain social values. In the 1990s, when American arts institutions must compete globally for tourist revenues and are confronted by increasingly diverse local audiences, such contests seem bound to become increasingly complex.
Industry:Culture