- Industry: Printing & publishing
- Number of terms: 1330
- Number of blossaries: 0
- Company Profile:
Routledge is a global publisher of academic books, journals and online resources in the humanities and social sciences.
When Lance Armstrong won the Tour de France with the US Postal Service team in 1999 and 2000, American media focused on the personal odds he had overcome in beating cancer. For most Americans, competitive bicycle racing remains a foreign sport, even though American Greg LeMond won the Tour in 1986, 1989 and 1990 (overcoming his own hardships); such triumphs probably annoy Europeans more than they elate Americans. In Breaking Away (1979), in fact, the best American cycling movie, the Midwestern hero pretends to be an Italian exchange student to explain his affiliation. The narrative of suffering also dominates Olympic bicycling coverage, where human interest stories deal with America’s failure to win medals. Bikes, then, form part of American life rather than a specialized sport. As such they are both ubiquitous and, at times, dangerously invisible to drivers and policy-makers.
Automobiles ended the bicycle’s turn-of-the-twentieth-century golden age as a primary vehicle. In the postwar period, though, bikes remain fundamental features of growing up, as well as of adult recreation. While sales peaked in 1973 at a postwar high of 15 million, they have remained steadily above 10 million per year. Tricycles, training wheels (and their removal) and multi-speed bikes track maturing independence for many American children. Schwinn’s banana-seat Sting-Ray dominated suburban childhoods in the 1960s and 1970s, later giving way to the sportier BMX, with motocross features. Adult tricycles have also been promoted for exercise and independence in old age.
For teenagers, bikes compete with cars in enlarging social worlds or as a convenience on a college campus. For them, as for adults, increasingly expensive bicycles offer recreation alternatives and, occasionally, a commuter choice. This popularity has been shaped by innovations that include the rise of ten-speed touring bikes in the 1960s, followed by trail bikes with balloon tires and stronger frames (pioneered in Northern California in the late 1970s). Sophisticated multi-gear hybrid bikes came to dominate the market, along with mountain bikes, in the 1990s. Meanwhile, bicycles have entered professional worlds via bike messengers who specialize in artful movements through dense, congested cities; these messengers became the heroes of the 1995 sitcom Double Rush.
For many years, these bicyclists would have ridden American bikes like Schwinn (founded 1895) and Huffy. While Americans design racing and innovative bicycles, production often concentrates overseas, sometimes with American assembly. Americans have also been innovators in recumbent bikes since the 1980s. Both cheap and prestigious foreign models absorb 30 percent of the American market.
In many areas, advocates have lobbied for bicycle lanes on urban streets and suburban roads in an effort to decrease automobile congestion and pollution while protecting bicyclists from collisions. Meanwhile, many parks and beaches are transformed at weekends into cyclists’ worlds, while off-trail areas may sustain serious damage from the growth of mountain biking. Workplaces, schools and homes also accommodate security concerns, while twenty magazines emerged between the 1970s and 1990s dealing with bike interests. Nonetheless, bikes account for only 10 percent of daily trips, in comparison with 30–40 percent in Europe.
Industry:Culture
The theoretical foundations of modern computing machines were laid in the early twentieth century when mathematical philosophers in Europe and the United States, spurred by the invention of internally consistent, non-Euclidean geometries, explored problems of rationality, provability and logic machines. These explorations culminated in the 1930s with the invention of idealized, hypothetical, general computing machines. The exigencies of the Second World War brought state funding to these mathematicians, and electronic calculating machines were built based on their theoretical designs. In Britain these machines were used to break German codes; in the US research was geared towards atomic-bomb production.
After the Second World War, British development of calculating machines languished, while in the US developers created private corporations and sought markets for their products. However, the secrecy of the previous research, the enormous government funding behind it and the narrow focus of their application had produced machines which were huge, complex, expensive and difficult to adapt or program. Markets for these machines were difficult to find and at first limited to government agencies, including the Department of Defense and the Census Bureau. In an attempt to create market consciousness, Remington Rand lent one of its machines, the UNIVAC, to CBS to assist in predicting the outcome of the 1952 presidential election. When it forecast the landslide results more accurately than human experts, the computer entered popular consciousness as an omniscient “electronic brain.” Its use spread to large corporations in data-intensive industries such as banking and insurance. International Business Machines (IBM), renowned as the epitome of white, male, crew-cut, button-down efficiency, quickly became the dominant manufacturer of computing equipment. Payroll management became one of the earliest data-processing service industries. During this period, the instruction sets which guided the computer’s operations, and the data on which the computer operated, were stored on “punch cards.” These were pieces of cardboard, measuring about 3.25 inches by 7.375 inches, through which small rectangular holes were punched. The pattern of the holes represented a particular instruction or data point. They were fed into the computer by high-speed mechanical devices which frequently jammed. To prevent such jams, punch cards had to be handled carefully and were often imprinted with the phrase “Do not fold, spindle, or mutilate.” These cards became the mediator between millions of people and the world’s largest and most powerful institutions.
They became symbolic of computers themselves—vast storehouses of information—used by people who didn’t really understand them to perform calculations of a complexity far beyond human capabilities, producing inscrutable and incontestable decisions. They were the embodiment of bureaucratic oppression. Bumper stickers and T-shirts proclaimed “I am a human being. Do not fold, spindle, or mutilate.” In the late 1960s and early 1970s, several technological and social changes occurred which altered the popular involvement with, and perceptions of, the computer. The first of these was the development of transistors, integrated circuits and microchips which permitted miniaturization, standardization and mass production of processors. The second was the development of a play culture, rather than a work culture, around computers. This latter development proceeded, in part, from the increased availability of computers to college students on a time-share basis. As these students began to experiment with programming languages, human-machine interfaces and multiple-user machines, they developed very simple two-person games. Hobbyists also began to buy computer kits publicized through popular magazines and to build machines, which, though rudimentary, had an adaptable design and public technical specifications. Thus computer use spread from corporate culture into the middle-class, college-educated, young male culture of the early 1970s.
In 1977 two of these men, Steve Jobs and Steve Wozniak, produced the Apple II in a suburban garage. At first marketed through hobbyist clubs, it became the first mass-market personal computer (PC). Originally useful only for word processing and game playing, the PC did not take off until the invention of business-oriented spreadsheet programs started the “PC revolution.” In 1984 Apple introduced the Macintosh, marketing it to both home and office users. Symbolically positioned against institutionalized, even totalitarian, bureaucratic power, the Macintosh was advertised as “the machine for the rest of us.” This marketing approach was fabulously successful.
Fortunes were made in computers, software and peripherals, and the new money was conspicuously young, male and west coast.
IBM, in a hurried attempt to extend its dominance from mainframe computing into the new realm of PCs, entered into non-exclusive license agreements with Intel (for microprocessors) and Microsoft (for operating-system software). IBM branding provided the assurance necessary to convince millions of users to make the substantial economic investment that a personal computer represented, and the Intel/Microsoft configuration became an industry standard, competing with Apple for the hearts and minds (and dollars) of US personal computer users. By the mid-1990s, IBM had lost its market share of PCs to other manufacturers, even though the technical standard was still referred to as “IBM-compatibility.” Despite much of the hype surrounding the “PC revolution,” the social diffusion of these machines in the early twenty-first century remains predominantly white, middle-class and male.
As PCs replaced mainframes in offices, internal networks linked individual machines to central data servers, reasserting centralized surveillance and control. Bill Gates, as the founder and principal stockholder of Microsoft, became the richest man in the US, his fortune rivaling those of Rockefeller, Carnegie and the Vanderbilts. Thus PCs, originally imagined as machines for freedom and individuality, are again implicated in historically deep-seated reactions against big money and corporate power.
These tensions between centrality of power and diffusion, between freedom and domination, are exacerbated as processors are further miniaturized and incorporated into such amenities as cars and appliances, and as networking technologies and practices increasingly link these processors in various topologies of communication and control.
Industry:Culture
Whether celebrating ethnic heritages, sporting events, political candidates or controversial causes, parades reclaim urban thoroughfares as public spaces for expressions of group identity. Often, these carry memorial/patriotic associations—the bands, flags, military troops and heroes of the 4th of July reverberate through media and political imagery. Others—the spectacles inaugurating Christmas shopping or celebrating bowl games—convey commercial messages through elaborate floats and televisual coverage. Disney, as well, promotes nightly events to sanctify its imagery of small-town America. Yet parades challenge as well as affirm; the legacy of civil-rights marches underpins contemporary Gay Pride parades and new ethnic celebrations demanding space in the American landscape.
Industry:Culture
Since the pre-war glory days of Shirley Temple, children have emerged as some of the biggest stars in Hollywood. Most, like Macaulay Culkin, who became a huge star due to the Home Alone films of the 1990s, embody an endearing cuteness that captured the heart of America—with the aid of much marketing. Common too, as many exposés have revealed, are the hardships many of these young stars endure. These include clashes with parents over money (Gary Coleman of the sitcom Diff’rent Strokes, as well as Culkin), drug abuse as fame fades (David Cassidy, River Phoenix), suicide attempts and difficult adult lives. Elizabeth Taylor started her career as a child actor and went on to become one of the great sex symbols of the 1950s and 1960s, as well as one of Hollywood’s constant tabloid stories.
Nonetheless, child stars may have family in the “business.” For example, Hayley Mills was the daughter of English actor Sir John Mills. Drew Barrymore, who has grown up, gone into recovery and kept her stardom as a young adult, belongs to the Barrymore theatrical dynasty. Others raise the specter of the “stage mother” (Gypsy, 1962).
Other popular child stars who have managed to develop later careers include director Ron Howard, who started as a sincere young child, Opie, in The Andy Griffith Show.
Marie and Donny Osmond, who emerged as Mormon pop singers in the late 1960s, have tried to resurrect their careers with a talk show. Jodie Foster and Christina Ricci, meanwhile, broke the mold of child stars, presenting darker, more complicated versions of childhood (Taxi Driver, 1976; Addams Family Values, 1993). Both have successful adult careers.
Industry:Culture
The construction of nature, wild and tame, speaks eloquently to changing beliefs throughout American history. Folklore abounds with animal helpmates like Paul Bunyan’s giant blue ox Babe or the subversive African American trickster tales of Br’er Rabbit outwitting his captors. Contemporary media build upon these images in popular shows where intelligent, benevolent and witty animals—dolphins, horses, dogs, a rare cat and overactive chimps—assist humans to develop their humanity. These overlap with more documentary depictions that may nonetheless denature the wild. Yet, lurking beyond these friendly figures is savage nature—unleashed, for example, in the furious attacks of giant ants in Them! (1954), or in Jurassic Park (1993) and Wolf (1994).
Rin Tin Tin (1916–32) and Lassie (a helpful female family collie played by a male dog with additional human actor sounds) provided gendered templates for domestic animals in movies (1923–31 and 1943–50, respectively). Lassie’s television debut came in 1947 (ABC). He/she reappeared for the next three decades, including a cartoon version (1973–5). Dog companions also help delineate major characters in movies and shows, including comments on snobbery and affectation in the 1990s hit Frasier, or more heroic sidekick roles (Benji, 1974; Turner and Hooch, 1989). Cats, despite their popularity as pets, prove harder to work with, although they appear in witchcraft representations. Both cats and dogs are frequently used in advertising.
Other media play on curiosity about mammals “closer” to humans (Flipper the dolphin and various chimp shows and films). Some creatures, moreover, crossed the line through animation or tricks, like the wise-cracking, talking horse of CBS’ sitcom Mr Ed (1961–6). In all, one sees valued traits of American relationships and citizenship read across species—loyalty, independence—and an interesting continuing irony about humans.
Disney’s animated features, from Mickey Mouse to Tarzan (1999), have pushed this interlocutor/mimic role even further.
Animal documentary traditions have also blurred relations of nature and culture. Early television programs were especially linked to zoos and zookeepers, creating bridges between the wild and the familiar, including interactions with talk-show hosts. National Geographic, America’s premier explorer of the exotic, has also produced more scientific studies of animals ranging from whales to domestic cats. Meanwhile, Disney’s live-action films and television (Seal Island (1948), Jungle Cat (1960), etc.) have created characters and life narratives in the wild.
This immediacy and humanity of nature reaffirms both the meaning in wilderness and its essential humanity, a charter for appropriation. Yet nature can also convey power and uncertainty, as horror, sci-fi and disaster media suggest, and Cujo (1983) replaces Lassie. In the 1990s, videos of fighting and killing by wild animals are even marketed alongside reality shows like “Cops.” Hence, representations of animals and nature, like their manipulation in pets, parks and food production, provide multiple visions of American identity. See also: nature.
Industry:Culture
The National Football League (NFL) is the most popular sports league in the United States, football surpassing baseball in the 1970s as America’s favorite spectator sport.
The organization has thirty franchised teams organized into the American and National Football Conferences. These teams are separately owned, but share about three-quarters of their revenues with each other, the most significant portion of which derives from 1998 television contracts with CBS, FOX, ABC and ESPN to the tune of nearly $18 billion over eight years.
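To give a rough sense of the scale of that television money, the following back-of-the-envelope calculation (an illustrative sketch only; the perfectly even split across teams is an assumption, not a figure from the source) divides the reported deal across the league's thirty franchises:

```python
# Illustrative arithmetic: splitting the reported ~$18 billion,
# 8-year television deal evenly across thirty NFL franchises.
TV_DEAL_TOTAL = 18_000_000_000  # dollars, approximate figure cited
YEARS = 8
TEAMS = 30

per_year = TV_DEAL_TOTAL / YEARS          # league-wide annual TV revenue
per_team_per_year = per_year / TEAMS      # hypothetical equal share per team

print(f"League-wide per year: ${per_year:,.0f}")           # $2,250,000,000
print(f"Per team per year:    ${per_team_per_year:,.0f}")  # $75,000,000
```

Even under these simplifying assumptions, television alone would guarantee each franchise on the order of $75 million a year, which helps explain how revenue sharing sustains teams in small markets like Green Bay.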
American football developed in the middle of the nineteenth century out of rugby. The American game developed new rules at the beginning of the twentieth century, ostensibly to make it safer (too many college boys were getting injured and killed), but the long-term effect of these reforms was the establishment of a game with few rivals in terms of lethality. Football is a veritable Rollerball (1975), in which part of the attraction is seeing an opposing team’s player speared and taken out of the game.
If baseball is the game that harks back to a preindustrial age of artisans, football embodies industrial society. No other game has achieved football’s time-work discipline and intense division of labor. In football each person is given a few specified tasks to learn by rote. Only one player combines several functions on the field—the foreman quarterback. It is his job to ensure that each player gets his production schedule and performs his task properly. The linemen will hold the opposing line, blockers will block, wide receivers will run down field, turn their heads and bodies to make a catch, run five to ten more yards (if they are lucky), and be brought crashing to the ground by several defenders. The foreman will make adjustments in the huddle, management will call in a new play from the sideline and the workers will set off to perform their tasks once more.
Such a division of labor also shapes the training required to play the sport. A linebacker’s job is to lift weights and eat, bulking up with the ingestion of various body-building drugs. He needs to study film of his opposing linemen to learn their moves, and he needs to learn the play book. Like an industrial worker, he will suffer the hardest knocks and will have the shortest life expectancy. Kickers, meanwhile, will strengthen their kicking legs on an adjacent field.
Football has become the mainstay of sports television, dwarfing all other contenders.
Indeed, it is the success of football in its relationship with TV that accounts for its hold on the public attention. As a game of downs, of precision plays performed with brief intervals for reorganization, football is a very easy sport to present. The camera technician seldom has to chase after a ball in a freeflowing game. Instead, just as players are returned to positions after every play so every camera can be redirected to its preassigned position. Moreover, the time between downs means that each play can be presented several times with commentary covering every minute detail.
The National Football League came into existence in 1920 (taking its present name in 1922) around a handful of East Coast and Midwestern teams; its early franchises came to include the Washington Redskins, Green Bay Packers, New York Giants, Philadelphia Eagles and Chicago Bears. The league did not attract crowds to rival college games until it became truly national, absorbing teams established in San Francisco, CA and Los Angeles, CA in 1950, and until the success of CBS and NBC’s television coverage (see sports media).
In 1959 a competing professional league, the American Football League, was brought into existence as a result of the efforts (supported by the ABC network) of two young Texas multi-millionaires, Lamar Hunt and Bud Adams, whose bids for franchises for their hometowns of Dallas, TX and Houston had been rejected. Although many of the teams in the AFL were not profitable, the league managed to survive long enough to threaten the NFL, until an agreement was brought about that united the leagues in 1967, and created the season finale, Superbowl Sunday. Since then the NFL has been able to fight off any challenges, such as that of the United States Football League in the mid-1980s, using its congressional exemption from antitrust legislation to ensure its continued monopoly.
Racial practices in football have followed a pattern more akin to basketball than to baseball. There are many African American (but few Asian American) athletes in the sport (over 50 percent of the players), but they still face obstacles. Often blacks have been competitive or excellent quarterbacks at the university level, only to be put at wide receiver in the professional game. The unwritten rule against selecting black quarterbacks was broken to some extent when Doug Williams led the Redskins to victory in Superbowl XXII, but Williams was sidelined a season after winning the Superbowl, something that would have been unlikely in the case of a white quarterback. Other black quarterbacks have often faced hostile home crowds, and have been traded earlier than might have been the case with their white counterparts. Likewise, there have been few black coaches in football, and men like Ray Rhodes at the Philadelphia Eagles were not given the freedom to purchase players that might have been accorded a white coach.
Injury is almost inevitable in football, if not from the common playing surface of AstroTurf, then from the fact that players wear protective padding and helmets. Whereas in the game of rugby a tackler generally pulls an opposing player down to avoid injuring him or herself, in football the aim is to knock the player down. Padding makes this possible, and injuries to the knees, which cannot be protected, are one consequence.
Since games are often determined by a quarterback’s efforts, he receives much of the defense’s attention. Many quarterbacks, like the San Francisco 49ers’ Steve Young or the Denver Broncos’ John Elway, have been sidelined with concussions, while others, like the Green Bay Packers’ Brett Favre, have had to fight addiction to the painkillers that make playing possible. There are few quarterbacks like the 49ers’ Joe Montana, who, in the words of a Chicago Bears linebacker, could just “get up, spit out the blood and wink at you, and say that was a great hit.” The plight of the average quarterback is nicely chronicled in North Dallas Forty (1979).
Violence is a common feature of football, one of its objectives being to knock an opposing player out of the game even when he may not have the ball. In recent years, this violence has spread to the bleachers, though it has not reached the proportions witnessed in soccer stadiums around the world. In cities where the fans are most notorious, city administrations have decided to open courts at the stadiums so that a judge can immediately punish any violent fans. Organizations tracking cases of domestic abuse have also made claims that the highest incidence of such violence occurs on Superbowl Sunday.
Industry:Culture
The Endangered Species Act was enacted in 1973 amid growing concern that species were becoming extinct at an accelerating rate due to the increased use of pesticides and other chemicals, the destruction of habitats to make way for suburbs and other development, and excessive hunting and fishing. Under the Act, the federal government lists an animal or plant species as “endangered” if it is in danger of extinction and as “threatened” if it is “likely to become an endangered species within the foreseeable future.” Once listed, no one may “harass, harm, pursue, hunt, shoot, wound, kill, trap, capture or collect” a member of the species or damage its habitat. The Endangered Species Act is one of the strictest federal environmental laws and perhaps the most controversial. The Act’s critics have charged that it has blocked needed economic development to protect “insignificant” species (such as the snail darter and kangaroo rat), and that it interferes with private property rights.
Industry:Culture
US public housing was never the major source of housing for the poor that it was in many European societies. Even at its height in the postwar period, fewer than 5 percent of Americans lived in federally funded subsidized housing projects. The very term popularly used to refer to such housing—“the projects”—carries associations of ghettoization and social pathology, and is often used to stand for the presumed failures of liberalism as manifested in federal anti-poverty programs.
Nonetheless, in the 1930s, public housing was conceived with high hopes that it would be a stepping stone to independent private homeownership for the majority of its tenants.
The story of its failure is also the story of urban social policy more broadly, which was utilized in the service of protecting private investment and was shaped in concert with the market interests of real-estate agents and developers. Moreover, most federal housing policies tended to favor programs that encouraged private home-ownership, like low-interest guaranteed mortgages.
The first public housing projects, located in cities such as Atlanta, GA, New York City, NY and Chicago, IL were low-rise constructions from which the poorest and those believed to be socially “deviant,” like single mothers, were initially barred. In the post-Second World War era, with an increase in land values in inner cities and a massive northern migration of blacks, public housing design moved towards the construction of high-rise “superblocks.” These seemed cost-efficient and politically expedient for the maintenance of racially segregated neighborhoods. Even then, however, the design of high-rise public housing was accompanied by a spirit of utopian optimism, influenced by such modernist architects and urban planners as Le Corbusier, who believed that high-rise buildings, designed by professional planners and managed by a state conceptualized as benign and rational, were the prototypical homes of the future. Such futuristic ideals quickly soured in developments like Robert Taylor Homes in south Chicago which, from the beginning, were poorly maintained and lacked such amenities as communal facilities and safe play spaces for children. They were used by city governments as places where the poorest African Americans could be “contained,” isolated from white working-class and middle-class neighborhoods. Unlike the earliest years of public housing, any pretense of screening tenants fell by the wayside and it quickly became the housing of last resort, now sheltering mostly single mothers and their children living on benefits.
The Pruitt-Igoe housing project in St. Louis, MO came to represent the failure of modernist high-rise public housing. Built in the mid-1950s, it was demolished a mere twenty years later after being deemed an ungovernable tangle of pathology. In his ethnographic study of life in Pruitt-Igoe, sociologist Lee Rainwater referred to it as “a federally built and supported slum.” In 1965 HUD (Department of Housing and Urban Development) was established as part of President Johnson’s War on Poverty and took over public housing. Federal housing programs moved away from supporting new construction to emphasizing subsidized rents for the poor in buildings managed by private landlords (for example, Section 8 certificates). In 1989, when Jack Kemp was appointed HUD secretary by then President George Bush, many city housing authorities had fallen into receivership or had come close to bankruptcy, overwhelmed by inadequate funding from the federal government for maintenance and repair of their aging public-housing projects. Kemp encouraged the expansion of Tenant (or Resident) Management Organizations, the oldest of which dated back to the mid-1970s, to take over the operation of their projects from government agencies deemed incompetent and overly bureaucratic. While a few Tenant Management Organizations became nationally known for the improvements they were able to make in their communities and for their charismatic leaders, this model proved very difficult to put into practice on a large scale. Few of these organizations were actually able to become completely independent of their local housing authorities. In the late 1990s, several of even the most famous tenant-management projects, including Bromley-Heath in Boston and Cochran Gardens in St. Louis, were removed from tenant control and returned to being managed by their local housing authorities amidst charges of financial improprieties and claims that the tenant-management board failed to enforce strict new HUD regulations regarding tenant conduct.
In 1993, HOPE VI was adopted by HUD for the rehabilitation of public housing.
HOPE VI provides some federal funding for capital improvements and encourages city housing authorities to renovate their developments using a combination of public and private sources. Properties rehabilitated under the HOPE VI program are also required to maintain a balance of low-income, working-class and even middle-class tenants.
Although several housing developments are currently undergoing substantive remodeling, it is unclear what the future of these mixed-income communities will be.
Industry:Culture
The “Motor City” across the straits from Canada reached its apogee with Second World War production, when the automotive “Big Three”—Ford, General Motors and Chrysler—and related industries provided a solid industrial base, burgeoning employment and global clout. Five decades later, Detroit symbolizes the rustbelt— hemorrhaging people and jobs and deeply scarred by racial division. The estimated 1998 urban population plunged below 1 million (the metropolitan area exceeds 4 million).
Where Motown music celebrated an exuberant city in the 1960s, the recurrent arson of “Devil’s Night”—when abandoned property blazes on Halloween—provides an eerier emblem for the 1990s.
As Thomas Sugrue argues in The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit (1996), Detroit’s discrimination in employment and housing laid the foundations for later decline, foreshadowed in 1943 riots. While manufacturing drew diverse workers, neither owners nor the powerful United Auto Workers established equality. By the 1950s, automation cost jobs and the city lacked land for updated plants, which scattered around the US. The city’s growing black population was slammed by segregation and diminishing opportunities. Bloody riots in 1967 increased white flight to Grosse Pointe, Oak Park, Dearborn Heights, etc., while the inner city languished.
Capital and production shifts to cheaper assembly areas, as well as the rise of foreign cars, further drained the economy in the 1970s and 1980s, as African American mayor Coleman Young tried to respond with urban patronage.
Detroit retains vestiges of its one-time wealth and power, from the Diego Rivera murals in the Art Institute to its extensive library, Wayne State University and its successful sports teams—Tigers (baseball), Pistons (basketball) and Red Wings (hockey). Yet continuing crises overshadow the city in Ze’ev Chafets’ Devil’s Night (1990) or the chilling future of Robocop (1987).
Industry:Culture
The study of mind and behavior represents links with the life sciences, the social sciences, the humanities and therapeutic domains, including both the medical specialization of psychiatry and practices of clinical and humanistic psychology. With more than fifty subfields recognized by the American Psychological Association for its roughly 100,000 members, psychologists may be characterized by their research methods, their focal interests or the areas in which these interests are applied; they may also combine academic research and teaching with counseling and other roles. More than 250,000 psychologists are employed nationwide. Psychology is also deeply connected to research and theory in other disciplines, including anthropology, biology, education, information theory, linguistics, medicine, neurobiology and sociology. Psychoanalytic interpretations also have currency in the humanities, including film studies.
In addition, psychology permeates popular discourse both in reference to specific terms from the field (including psychoanalytic jargon) and a more general concern with emotions, motivations and individual and social problems in which “psychologistic” explanations have become commonplace. This role is reinforced by the psychologist in mass culture as commentator on events and problems in radio talk shows, as news columnist or television analyst and as lead character (Spellbound, 1945; High Anxiety, 1977; Color of Night, 1994; The Bob Newhart Show, CBS, 1972–8). The highly popular 1990s sitcom Frasier (NBC, 1993–), for example, centers on a radio psychiatrist who dispenses advice to others while struggling with his own emotional and social problems.
American psychology has a strong experimental tradition dating back to the nineteenth century, encompassing modern work in sensory studies, physiological psychology, comparative studies (with animals) and cognitive studies. Cognitive sciences have become linked with innovations in computers and communications. Other formative figures in American psychology include more philosophical functionalists like William James and John Dewey. J.B. Watson was the father of behaviorism, which focused on stimulus-response models.
Psychoanalysis was bolstered in the US by refugees fleeing the Nazis, including Karen Horney, Alfred Adler and others who developed diverse discussions of the Freudian legacy. Psychoanalysis was seized upon by Hollywood as both practice and subject of countless, albeit often comic, expositions. Questions raised about Freud and the limits of his observations and interpretations in the late twentieth century divided the psychoanalytic community in painful, sometimes public ways.
The Second World War and the Cold War became a watershed, as Herman (1995) argues, in bringing clinical and therapeutic aspects to the fore in both professional and public discourses as experts in the field skyrocketed (the APA soared from 2,739 members in 1940 to 30,839 in 1970). Their work in universities and private practice included studies of personality adjustment and social psychology, tackling questions like gender and sexuality, prejudice and individuality, contributing to a broader reformulation of these issues in American life. These studies are also linked to applied psychological investigations and treatments in clinical practice, counseling, education and industry.
Abnormal psychology deals specifically with questions of atypical knowledge of, and action in, the world. Developmental psychology has also become an important field.
American Psychologist, the journal of the APA, is a central journal in the field. Psychology Today, a more popular journal, also has provided information on research and issues in the field for thirty years.
Industry:Culture