- Industry: Printing & publishing
- Number of terms: 1330
- Number of blossaries: 0
- Company Profile:
Routledge is a global publisher of academic books, journals and online resources in the humanities and social sciences.
Since 1917, relations between Russia (or the Soviet Union) and the United States have been marked by extraordinary hostility occasionally tempered by periods of cooperation.
The US refused to recognize the new Bolshevik government when it assumed power in 1917, and from 1918 to 1920 even contributed several thousand troops to an international military force intended to dislodge the new regime from power. These actions left a legacy of hostility on the part of the Soviets, who treated the invasion as proof of the determination of capitalist countries to subvert the Soviet Republic. The United States withheld diplomatic recognition of the USSR until 1933.
Relations between the two countries remained tense until the United States entered the Second World War, at which time they became allied against the Axis powers. In the US, sympathy and affection for “Mother Russia” and “Uncle Joe” Stalin developed. From the Soviet perspective, however, American unwillingness to open a second front in continental Europe until 1944, after the Red Army had been combating Nazi armies for three years at the cost of nearly 20 million Soviet lives, left the impression that the US was willing to let Soviet citizens fight Hitler alone in order to spare American casualties.
Resentment helped to fuel Soviet suspicion of American postwar intentions at the outset of the Cold War.
Any lingering benevolent wartime feelings between the two countries evaporated during the Cold War, which periodically flared into “hot” wars, on the Korean peninsula and in Vietnam, and nearly over the Cuban missile crisis (1962) and the Berlin Crisis (1958–62). American suspicion of an expansionist, aggressive and Marxist Soviet Union exporting revolution and instability to capitalist nations friendly to the United States was a foundation of American foreign and domestic policy until the late 1980s.
The 1970s were largely a period of détente, a warming of relations highlighted by increased cultural and student exchanges and tourism. In light of an essential parity of nuclear arsenals, arms-control treaties limited, but by no means stopped, the arms race. Nevertheless, much of the American media continued to depict the Soviet Union as a ruthless and cunning enemy, as did much of the Senate, which in 1979 declined to ratify the SALT II arms-control treaty negotiated by President Carter.
The first half of the Reagan presidency (1981–9) brought renewed rhetorical battles and a reheated Cold War. Reagan coined the term “Evil Empire” to describe the Soviet Union, and effectively ended the 1970s détente. American spending on the development of new missile systems (and anti-missile systems) increased dramatically, and the Soviets attempted, unsuccessfully, to match American capabilities.
Only the unexpected ascension of the reformer Mikhail Gorbachev to the post of General Secretary of the Communist Party in 1985 created the conditions for a new thaw in Soviet–American relations. Gorbachev, with his commitment to reducing Soviet defense expenditures, which were severely damaging an already weakening economy, his promotion of increased “openness” (glasnost) in the press, and his rhetoric of liberalizing the socialist system, eventually became enormously popular among Washington elites and the American public, softening anti-Soviet attitudes. His efforts to introduce “socialism with a human face” to the USSR met with an extraordinary enthusiasm in the US not matched in his own country. His refusal to crush the Eastern European revolutions of 1989–91 by force also met with approval and raised him, and the Soviet Union, in the eyes of many Americans.
Gorbachev’s downfall, made inevitable by the failed right-wing putsch of August 1991, and the subsequent dissolution of the Communist Party by Russian President Boris Yeltsin marked the end of Soviet–American relations.
After the collapse of the Soviet Union, Russian–American relations temporarily improved but have since cooled significantly. Many Russians blamed American economists for imposing upon Russia “shock-therapy” measures. These measures, including the privatization of state-owned property, the end of price subsidies and the opening of the economy to foreign investment, have been blamed for the sharp decline in living standards for the majority of Russians, while a small minority has quickly accumulated enormous wealth. More recently, America’s support for the expansion of the NATO military alliance into the countries of the former Eastern bloc, followed by the 1999 war with Serbia, provoked the worst American–Russian relations since the end of the Cold War. Many Russian leaders and citizens have accused America and its European allies of “aggression” in an attempt to isolate and humiliate Russia during this period of internal turmoil and economic collapse.
Industry: Culture
Since 1945, new medical techniques have extended active life for many Americans, while the proportion of older people in the population has increased and earlier systems of elder care have eroded. These changes have given older Americans new choices and powers, but a continuing cultural privileging of youth still marginalizes and endangers the aging.
The 1950s and 1960s saw efforts to develop ethical and affordable ways of housing and caring for seniors outside of families. With Social Security providing reliable retirement income, elders began moving away from established family homes into new Sunbelt apartment and retirement communities, some of which actively discouraged younger families from locating nearby. Even without relocating, many elders still had fewer family members to care for them. Family size was shrinking, either by design as safe, reliable birth control became available, or, especially among poorer or newer Americans, through the exigencies of mobility, immigration and high mortality. Moreover, more frequent divorce and remarriage could multiply the size of the kin group, making young families responsible for more elders than ever before.
Elders have increasingly used new political and cultural powers based on their numbers in the population to claim care from the whole society. Since the 1970s, activist organizations, like the American Association of Retired Persons (AARP), the American Society on Aging, and the Gray Panthers, eventually supported by the large and aging postwar baby-boom generation, have lobbied successfully for Social Security benefits, medical rights and legal protections for older Americans.
Widespread anxiety, including feminist concern, about intrusive and expensive healthcare, the disproportionate number of poor women among the elderly, unethical nursing-home practices and the indignities surrounding death in hospitals has spurred movements for public regulation of care facilities, greater physical independence for elders and respect for age itself, and hospice and homecare programs for the terminally ill. The 1990s especially saw a revitalized Right-to-Die movement, dramatized by the assisted-suicide campaign of Dr Jack Kevorkian.
Wealth and family background strongly shape the expectations of aging. Affluent, especially white, elders seek “independence,” and plan their finances, activities and choice of retirement home so as to avoid “burdening” their children. Their wealth and leisure time have supported new consumer markets in special safety equipment, exercise plans and machines, medical innovations and plastic surgery. By contrast, less affluent seniors, including many non-whites, more often seek to avoid sundering community ties, and tend to rely on their children, neighbors and church community to assist and protect them. This preference varies less with wealth among non-white seniors than among whites; Asians in particular are shocked by the isolation of the elderly.
Neither approach has offered a perfect solution. While independent seniors can suffer from isolation or emptiness, seniors who stay rooted in their communities also sometimes endure neglect, danger, or financial loss in securing needed care. The new opportunities for American seniors have not erased the society’s persistent discomfort and impatience with the late stages of human life.
Industry: Culture
The American income maintenance system is in reality a patchwork of programs aimed at different populations and financed and administered by federal, state and local governments. Further, it is divided into “insurance-like” programs, for which eligibility is based on a history of wage earning, and “welfare” programs, for which eligibility is based on “means-testing,” the determination of need according to levels of income and wealth.
“Welfare reform” denotes sweeping policy changes in the means-tested programs, mainly in terms of initial eligibility and continuing assistance.
Until passage of the Social Security Act in 1935, welfare was a state and local matter.
The Social Security Act, and many subsequent amendments, made federal and state governments partners in the administration and financing of welfare benefits for families and the elderly, blind and disabled. Federal rule-making, buttressed by US Supreme Court decisions, attempted to ensure a rough equity on a national basis. In 1974 benefits for the elderly, blind and disabled were consolidated in one federally administered and financed program. The states continued to share administration and financing of the program for families. After 1988, and gaining momentum with the Republican ascendancy in Congress after 1994, administrative control of the family aid program began to shift towards the states, but remained tethered to overarching federal rules. The most dramatic change of this sort was accomplished by the Personal Responsibility and Work Opportunity Reconciliation Act of 1996 (PRWO).
The PRWO eliminated the entitlement to benefits in the family program. Previously any eligible family had a legal right to benefits, regardless of budgetary considerations.
Now, states receive “block grants,” fixed sums of federal dollars calculated by formula.
When these funds are exhausted, aid may be denied to eligible families by states unwilling to spend their own funds. The PRWO also limits federal benefits to five years in a lifetime for most family aid recipients and allows states to adopt shorter terms.
Further, it mandates the states to enroll increasingly large proportions of adult family aid recipients in work or training programs. Failure to meet scheduled goals results in the loss of a percentage of block grant funds.
Historically, welfare programs incorporated a suspicion of poverty and were overlaid with “morals testing.” Until the 1970s, putatively dissipated or licentious poor folk frequently were denied aid, restricted to institutional care, or had their benefits carefully supervised. The PRWO reinvigorated this tradition by imposing on recipients of family aid “behavioral requirements” for initial eligibility and continuing assistance. (The PRWO also allows the states to impose others.) The most significant of these make ineligible, or impose penalties on, those who commit drug felonies or use illicit drugs, fail to have their children immunized or attend school regularly, or refuse work or training opportunities. In a similar spirit, though by different legislation, substance abuse was eliminated as an eligible impairment in both the insurance-like and welfare versions of the federal disability program.
In sum, in its present cast, welfare reform is intended to limit federal financial liability, force welfare parents into the labor market and discipline the behavior of poor people. Its results remain to be seen.
Industry: Culture
Sociology as a field of study focuses on the interactions of individuals, groups and organizations within a social structure. The discipline of sociology has always been shaped by the political and social climate of the times and can be traced back to the works of Karl Marx (1818–83), Emile Durkheim (1858–1917), Max Weber (1864–1920) and W.E.B. Du Bois (1868–1963), whose political and social philosophies on labor, capital, religion, culture, morality and modern life continue to influence the field of sociology as well as other disciplines.
During the early twentieth century, the University of Chicago was known as the hub of American sociology. At Harvard University around mid-century, Talcott Parsons (1902–79), considered the founder of American sociological theory, worked on general theories of social action, helping make that institution a center of sociological research. Likewise, Robert Merton’s work at Columbia University helped to forge sociology into an intellectual science with its own terminology of concepts and standards of application, making Columbia the other recognized center for sociology.
Up until the mid-twentieth century the field of sociology was dedicated mostly to structural analysis. Social changes occurring in the 1950s inspired the emergence of new sociological concepts. With the Civil Rights movement and other social movements, the political and social climates of the United States were altering quickly, and sociologists responded by focusing their research less on structural analysis and more on the political and moral concerns prevalent during the late 1950s and 1960s. Younger sociologists began to put into practice the theories they had learned by researching less advantaged groups in society, like women, African Americans and Latinos. As sociologists embarked on this new direction, their influence increased, with summons coming from the White House and Congress to advise on the plethora of social problems facing the country.

In the 1990s, most colleges and universities housed a sociology department, but the prestige of these departments had diminished significantly from the heyday of the 1950s and 1960s. Partly this was a result of the backlash against many of the policies and programs that sociologists had recommended to politicians, many of them associated with President Lyndon Johnson’s “Great Society.” Backing away from public policy, sociologists have in a way synthesized the earlier impulses, studying individuals and the social structures within which they operate. Areas of research vary from feminism to education to ethnic identity to the HIV/AIDS epidemic. With a broader, more complex field of study, most sociologists still utilize the classic theories to understand the intricacy of contemporary problems.
Industry: Culture
The American common law system contains a complex array of federal, state and local courts with exclusive and overlapping jurisdictional powers. Each state has a separate, comprehensive judiciary organized in a trial and appellate court hierarchy that parallels the structure of the federal courts.
In the process of interpreting and applying the law’s generalities to decide specific cases, judges exercise a major responsibility for shaping public policies. Common-law jurisprudence enables judges to create law. The breadth of this power is demonstrated by the different constitutional approaches, especially in matters of individual rights and federalism, of the Supreme Court under Earl Warren, in a period of progressive judicial activism, and under William Rehnquist. State judges influence society equally, for example by expanding legal theories of tort and products liability to encompass the health risks of tobacco.
Administrative duties also emphasize judicial autonomy. Renowned federal judges and chief judges of the states’ highest courts are powerful voices for court reform. The Supreme Court, subject to congressional approval, prescribes rules of procedure and practice for the federal courts, generating changes followed by many state courts. The ethical conduct and performance of judges are highly self-regulated; questions are resolved by judicial panels. Public confidence in the soundness of the rule of law is strengthened by the political separation and independence of the judiciary from other government branches. Judicial insulation from short-term political repercussions tempts politically vulnerable executive and legislative bodies to leave resolution of compelling and contentious cases to receptive judges.
The litigious nature of American society pressures the judiciary. In the past twenty years, despite the growth of alternative forms of dispute resolution, federal cases have increased by 60 percent, and approximately 100 million state court cases are filed each year. This crowds dockets, delays trials, fosters rudeness and escalates controversies involving the election and appointment of judges.
Federal judges are nominated by the president and confirmed by a majority of the Senate. Supreme Court choices can have the most lasting impact of any presidential action: appointments are made for life and justices have served for over thirty years. State and local judges are elected or appointed. Election campaigns for the highest courts in large states require millions of dollars, exposing winners to accusations that their rulings are influenced by the large campaign contributions of lawyers, powerful law firms and business groups. State judicial appointments are usually made by elected officials on a partisan basis, constrained somewhat by the recommendations of local bar associations that evaluate aspirants’ qualifications.
Most of the public gleans its knowledge and understanding of the judiciary from trials regularly televised on cable stations and through extensive mass-media coverage of celebrity cases. Popular “judge” shows, with mock trials of consenting parties, also shape perceptions. Still, the judiciary, especially the Supreme Court, remains America’s most highly respected and trusted profession.
Industry: Culture
The anti-communist movement associated with Joseph McCarthy was underway well before his election to the Senate in 1946. Though the American government had long worked to obstruct the presence of communists and socialists in American politics, this campaign took on unprecedented urgency after the Second World War.
The search for internal subversion was carried out by Congress through its House Committee on Un-American Activities (HUAC), organized in 1938 and not abolished until 1975. Though relatively ineffective during the New Deal, after the war the committee stepped up its work, in part a response to President Harry Truman’s own increasingly strident anti-communist rhetoric and new internal security program. In fact, many scholars have argued that McCarthyism owed as much energy to the rivalry between the Democratic and Republican parties as it did to any “real” threat of communist influence over governmental policy.
Nationwide attention was given to HUAC in 1948 when, in front of a committee chaired by freshman California member of Congress Richard Nixon, Whittaker Chambers, a journalist and former member of the Communist Party, accused Alger Hiss of passing classified documents to the Soviets during the 1930s. Hiss, a highly respected Democrat who held influential positions in the federal government during the New Deal, and who accompanied President Roosevelt to the Yalta conference in 1945, could not be tried for espionage; the statute of limitations had expired. But Hiss was convicted of perjury, and to many the possibility of his guilt legitimated the search for subversives within the government. To those who doubted his guilt, however, the hearings resembled a witch-hunt devoid of any hard evidence linking Hiss to the accusations.
The search for communists was not limited to the federal government, but also included the entertainment industry, higher education and the nation’s literary community. Other targets of investigation and intimidation were American homosexuals, considered by some to be a poisonous presence by virtue of their sexual identity, just as communists were suspect for their political identity. In all, hundreds of lives were ruined by the careless and savage attacks made by a few overzealous men wielding substantial power.
The anti-communist crusade was linked to Joseph McCarthy in early 1950, after he claimed in a speech given in Wheeling, West Virginia, to have a list of 205 known communists working in the State Department. Though the number of “communists” McCarthy claimed to have identified changed over the next few years, what proved relevant for most Americans were the accusations themselves rather than their validity.
McCarthy’s influence ended suddenly in 1954 when he made the mistake of investigating the presence of communists in the United States Army. A more absurd accusation could not have been made. The fact that a Republican was now in the White House signaled to many Republicans that McCarthy was now more of a liability than an asset to their party. Televised hearings revealed McCarthy to be a vulgar and disrespectful man, and his base of support quickly evaporated. Censured by the Senate in 1954, McCarthy receded into obscurity and died an alcoholic in 1957.
Industry: Culture
The Oscars, 13½ inch statuettes awarded by the Academy of Motion Picture Arts and Sciences, were first presented in 1929, when Wings captured Best Picture for 1927–8 while German-born Emil Jannings won Best Actor. Oscar nominations are decided upon within specialist branches before the final vote of all academy members. As the ceremony has moved from a hotel auditorium to wider audiences, professional roles have been taken by comedian hosts (Bob Hope, Johnny Carson, Billy Crystal, Whoopi Goldberg), and sets and production numbers have become increasingly elaborate—as have jewelry, hair and costumes for stars. While moments of political intrusion are often remembered (Marlon Brando’s Native American substitute, George C. Scott’s refusal, Richard Gere’s pleas for Tibet), the Oscars tend to reaffirm the priority of Hollywood as the entertainment capital.
Indeed, workers in technical fields and “lesser-interest” awards are relegated to earlier, less publicized ceremonies so that the narrative of the Oscars focuses on the final naming of the highest categories—Best Picture and Best Director. Foreign films were added as a category in 1947.
Industry: Culture
The United States has sent troops into Lebanon on two occasions. Eisenhower’s plan to resist any communist aggression in the Middle East (known as the “Eisenhower Doctrine”) led in 1958 to the sending of 14,000 troops there at the request of the Lebanese government, even though the threat came not from communists but from internal opposition. During Reagan’s administration, following conflict in Lebanon between Israel and the Syrian-backed Palestine Liberation Organization, Secretary of State George Shultz brought about a peace agreement, which stipulated that the US would commit 1,500 troops as part of a multinational peacekeeping force. In late 1983, however, a Muslim terrorist drove a truck full of explosives into the unprotected US Marine barracks, killing 241 servicemen. Reagan at first kept the US force in Lebanon, not wanting to surrender to terrorism, but because of the unpopularity of this decision changed his mind. Eisenhower’s involvement in Lebanon highlighted one aspect of the Cold War: the way in which the mask of anti-communism embroiled Americans in conflicts of a nationalist nature. The more recent case shaped later engagements of troops, especially as peacekeepers, requiring that the US role be clearly defined, that peace actually be established and that American forces be sufficiently well protected.
Industry: Culture
The use of drawings and photographs of three-dimensional objects to tell stories preceded the technologies of film, television and computer with which they now coexist. Winsor McCay’s pioneering Little Nemo (1911) and Gertie the Dinosaur (1914) and, later, New York-produced characters, such as Felix the Cat, excited both audiences and distributors. Yet Walt Disney’s studio, style and sales have set the model for American animation. Through Disney studios, animation has become a specialty for children and families, relying on color and music, as well as personality and narrative. Many competitors, in fact, have emerged from divisions within the Disney studio rather than from alternative traditions. Technological innovations, from the use of celluloid to live-action modeling to computer-assisted design, have changed the production and quality of the medium itself, yet it remains within this general paradigm.
Disney set himself apart from early competitors by his awareness of the values of character, narrative, sound and color, as well as the possibilities of linking on-screen features to off-screen commercialism. With the introduction of sound in 1927, Disney characters like Mickey Mouse and Donald Duck, and works like Silly Symphonies, commanded attention. In 1937, Disney gambled successfully on the lush musical feature Snow White and the Seven Dwarfs, which set the standard for the genre thereafter. By Pinocchio (1940) the level of animation, multi-planing and thematics had become even more complex, while Fantasia (1940), a box-office failure at the time, wed classical music with creative animation.
Other studios competed with Disney in short films that accompanied theatrical features. In the 1940s, disgruntled Disney employees founded United Productions of America, whose style in cartoons like John Hubley’s Mr Magoo offered a more “modern” feel that would have an international impact. Tom and Jerry was created by William Hanna and Joseph Barbera. Other competitors included Warner Brothers’ Looney Tunes stable of aggressive, even violent cartoons, including Bugs Bunny (Chuck Jones) or Tweety Pie and Sylvester (Friz Freleng). These characters, and others, found new life on Saturday morning children’s television.
American animation, even at Disney, declined in the 1960s and 1970s because of rising costs and other production decisions. Computer animation and new Disney initiatives sparked a 1980s renaissance with features like The Little Mermaid (1989), Beauty and the Beast (1991) and especially The Lion King (1994), which grossed over $300 million.
These not only took on attributes of musicals—stories, songs, even “star voices”—but reappeared as live productions on Broadway. Other competitors for this reborn market include the Spielberg collaborators of DreamWorks, who produced The Prince of Egypt (1998) and The Road to El Dorado (2000), and Warner Brothers, which produced The Iron Giant (1999). The combination of animation and live action (as in Mary Poppins, 1964) has again filled the screen since Who Framed Roger Rabbit (1988), including the synergy of sports celebrity and cartoons in Space Jam (1996). By the end of the decade, however, FOX closed its studios and Disney remained champion.
Both in creative features (using television showcases as well as theaters) and as components of live-action films, animation continues to provide an alternative imagination of reality, from The Simpsons or South Park to animated political cartooning.
Ralph Bakshi’s Fritz the Cat (1972), for example, raised issues of sexual adventure far from Disney. While the possibilities of animation continue to excite independent producers, the sheer marketing and cultural permeation of the Disney feature and its imitators continue to dominate the primary meanings and readings of this art form.
Animation techniques also continue to develop, especially through the use of computer animation that has already created a new look in hits like Toy Story (1995) and Toy Story 2 (1999). These techniques can also be used to add vivid imagery to live-action movies like Titanic (1997) or Gladiator (2000). At the same time, live backgrounds have also been incorporated into animated features, as in Disney’s Dinosaur (2000). These hybrid creations suggest changing boundaries of genre, as well as reminding us that animation is not “just for kids.”
Industry: Culture
The first Earth Day, held on April 22, 1970, was a national day of rallies and teach-ins expressing support for environmental protection. Events were held in major cities as well as on most college campuses. In the end, some 20 million people took part, calling attention to the swelling support for environmentalism across the country. Later annual celebrations still revolve around this concept, with local clean-up efforts and political speeches. They have had limited results; yet, without being decisive in any specific issue, each Earth Day underscores the extent to which environmental issues have become a national priority. See also: deep ecology/Earth First!
Industry: Culture