- Industry: Printing & publishing
- Number of terms: 1330
- Number of blossaries: 0
- Company Profile:
Routledge is a global publisher of academic books, journals and online resources in the humanities and social sciences.
The history of this multifaceted musical category tracks a new development in American music, as well as the entanglements of a global mass media with local musics. Fusion, as an experiment in musical aesthetics and as a marketable category of music, begins as an era in the history of jazz. In the late 1960s, jazz musicians, led by Miles Davis (for example on his recording “Bitches Brew,” 1970), combined “traditional” jazz forms with those of rock music to create jazz fusion. The guitar work of Jimi Hendrix and the R&B funk-rock of Sly and the Family Stone had a particular influence on jazz musicians.
Throughout the 1970s, many jazz musicians continued to experiment with combinations of acoustic and electric instrumentation, or left the acoustic sound behind altogether for an amplified, electronic one. Rock rhythms and percussion styles replaced the bebop and swing styles common to jazz and were used more frequently as compositional features. Rock musicians embraced fusion as well: a small number of rock bands featured jazz ensemble arrangements with solo improvisation.
Fusion of the 1960s and 1970s heralded not only a blurring of the boundaries between rock and jazz, but also the emergence of a global sound. Jazz had a history of incorporating other musical styles, mostly from Africa and Latin America, and jazz fusion continued this inclusive worldview. Fusion in the 1980s and 1990s, however, came to mean the blending of sounds, instruments, rhythms and compositional styles into a “polyphonic” genre referred to as world music, world beat, global fusion, world fusion, or simply fusion music, a label that loosely encompasses recordings drawing on local musics from around the world to create forms produced for consumption on a world market. The “fusion” of local traditional musics with “Western” music is said to celebrate the recognition of other musical forms and instrumentations, while at the same time symbolizing musical expression in a global world. Paul Simon’s Grammy-winning “Graceland” (1986), a blend of American pop with the music of black South African musicians, is held up as an exemplar of contemporary fusion (although not without controversy). In the late 1990s and early twenty-first century, fusion represents what sometimes appears to be a hodgepodge of musical productions that can include the coming together of so-called “traditional” musics from Africa, Asia, South America and elsewhere with what has been referred to as a “universal pop aesthetic,” that is, a “Western” one (Taylor, 1997). Fusion can also include “new and old world music,” “ancient future world fusion” and “folk fusion music,” which can bring together, for example, the music of the Andes with Appalachian music.
While fusion grew from experimentation in musical aesthetics, it was also the product of changing musical tastes and a record business increasingly oriented towards a world market. Fusion’s “hybridity” marks a fluidity of economic and aesthetic relationships between the music traditions of local worlds and the modern, often Western, tastes of a global culture. Some would argue, however, that hybridity has always been a feature of music and that the term fusion should be reserved for American jazz-rock fusion.
Industry:Culture
The 1995 bust of an El Monte, California factory where seventy-five Thai immigrants worked 18-hour days behind barbed wire producing designer clothes for $1.60 per hour underscored the dark side of American fashion and consumerism. While this case led to new California laws banning such labor conditions and practices, and compensation ranging from $10,000 to $80,000 for the workers, demand for 8 billion items of clothing per year in America has fostered sweatshop conditions in the US and abroad. In the US, sweatshops are especially associated with ethnic enclaves or with manufacturers in the US-Mexico borderlands. More corporations have been accused of relying on underpaid and dangerous labor overseas—in the US-controlled territory of Saipan (hence, “Made in America”), Vietnam, Indonesia, China, Latin America and Africa. Major designers and distributors like Nike, DKNY, Guess and others have been implicated; celebrity television host Kathie Lee Gifford broke down in tears on air after accusations that her clothing line was produced by child labor. College students and labor activists in the 1990s increasingly organized boycotts of these products and lobbied institutions, government and manufacturers for changes—although ethics, here, must struggle against constant demand.
Industry:Culture
The technology for television was developed in the 1920s. Nonetheless, it was not until after the Second World War that television became popular in America. Since then, television has been attacked as a detriment to its watchers’ lives and hailed as an educational tool that opens up people to the world beyond their everyday experiences.
Both charges are true.
Despite differences between the media, American television was modeled after its counterpart in radio. It took on a similar industrial set-up—organized around a few major networks and local franchises—and early programming relied on staples like soap operas and talk shows. Similar governmental regulations—under the jurisdiction of the FCC—were also adopted. Hence, as with most mass media in America, commercial television dominates, with public broadcasting playing a minor role.
Television’s expansion came at a time when America was emerging as a superpower. With soldiers coming home, the rise of suburbia and the new abundance of material goods, TV became a new symbol of American wealth. The number of television receivers rose from 10 million in 1951 to 50 million in 1959. Local stations became points of metropolitan identity (while radio could be very localized, entire states like New Jersey could be left without a commercial television station of their own).
Television was initially viewed by Hollywood as serious competition; cinema attendance fell in the 1950s. Yet the two media differ in their relationships with space and audiences. The box in the house keeps its audience in, while the screen in the movie theater takes people out of their homes. The success of TV in the 1950s was partly embodied in the suburban living arrangements of the strong nuclear family. Over the decades, however, closer relationships grew between the television and film industries; they benefit from each other by sharing resources from production to distribution. The arrival of color TV in the 1960s underscored this symbiosis. American TV is also global—syndicated shows are sold worldwide, creating a dominance not unlike Hollywood’s.
American television relies on advertising revenue even as it presents itself as providing services to the public (publicly funded television, PBS, was a late innovation in the 1960s, targeting high culture and underserved populations). Commercial television must sell its programs to the audience, then sell its audience to the advertisers. Hence, broadcasters turned to the Nielsen rating system to measure viewership at home.
Though inaccurate, these ratings can determine the life and death of TV programs. They have also promoted a culture in which the group with the highest spending power becomes the group that TV aims to please. Broadcasters can also claim that they are making the shows the public demands, while avoiding those with small audiences.
American television is about show business and news. Following closely the successes of radio, major TV genres include game shows, talk shows, sitcoms, various dramas, soap operas, variety shows and sports events. News exemplifies how this has evolved over time. Television news not only provides information, local and national, but also sells a prestige product. Television changed the delivery of news. As in radio, it is instantaneous, but it is also visual. Edward R. Murrow became the father and hero of broadcast journalism, with his famous 1954 exposé of Senator Joseph McCarthy on See It Now. While people debate television’s role in the explosive events of the 1960s, from the civil-rights movement to the Vietnam War, TV news was well respected for its independence and authority. However, the television news department, like other TV production departments, also relies on audience ratings. In the 1990s, with TV stations increasingly owned by ever larger media conglomerates, TV news came under pressure to be more sensational. It is generally believed that this new news is exploitative, spending hours on crime, celebrities’ deaths and fluff features rather than on investigative journalism.
Political and critical debates about TV arise because it is seen as a great force of socialization. Television enters people’s bedrooms, and it connects people from childhood to old age to worlds beyond their immediate environment. Despite its independence from government control, TV content, because it needs to appeal to the widest possible audience, tends to be middle-of-the-road American fare, whether in dramas or in news.
Radicals point out a lack of diversity, while ultra-conservatives find TV to have a liberal bias, citing its slight engagement with issues of homosexuality or its attacks on isolated corporate misdeeds. Yet television, especially with cable diversification, is about audiences rather than ideas or even difference.
In the 1980s, the introduction of cable TV challenged the dominance of the three national networks. New broadcast networks—FOX, Warner Brothers, UPN and Pax—added further competition to the traditional three. Increasingly, the mass audience became a targeted, niche audience in which the industry seeks the most disposable income. Murder, She Wrote (CBS, 1984–96), for example, had impressive ratings, but since its demographics skewed older, the show was cancelled. In the 1990s, shows geared towards a young audience dominated. At the same time, one of the top shows, Seinfeld, never attracted a minority audience. At the end of the century, both the NAACP and La Raza challenged the major networks to include minority characters in their sitcom line-ups, while worrying that other networks have become new ghettos.
Television has also been agglomerated into ever larger corporations. Of the three surviving early networks, NBC is owned by GE, CBS by Viacom and ABC by Disney. New technological developments like Web/interactive TV and HDTV will probably not so much change television as offer more sophisticated delivery systems. Television will still be contained in the home. The multiple-TV home, VCRs and the ability of new technologies to allow more self-selection may encourage more individualized home entertainment. Yet the development of HDTV would not only mean sharper pictures for consumers; it would also effectively eliminate the development of small networks because of the prohibitive investment involved.
Industry:Culture
The National Film Preservation Act of 1988 was passed in recognition of the monumental quality of American films in their original or restored versions, amid concerns over emerging techniques for altering films (colorization) and evidence of decay in the film stocks and archives preserving older silent films and experimental stocks.
The Act established the National Film Preservation Board, which manages the National Film Registry. Twenty-five “culturally, historically or aesthetically significant” films are chosen each year for inclusion in the registry, which now encompasses 250 films that define a diverse American canon across genres, eras and styles. In 1996 the National Film Preservation Foundation was founded to further the work of preserving orphan films (films that have no owners).
Industry:Culture
The Roman Catholic Church, the largest Christian denomination in the world, is also the largest single church in the United States, with approximately one-quarter of the population as members. Adherents tend to be more numerous in the Northeast, the Great Lakes/Midwest and the Southwest/California, with fewer members in the South and in the Plains states, though shifting economic growth has altered this pattern somewhat. Additionally, large influxes of Catholic Hispanics have immigrated from Mexico and areas of Central and South America into the Southwest and other areas of the country, and the total population of adherents in the United States has grown over the last decades.
Early Catholic immigrants in the US faced religious persecution from the Protestant and nativist populations, though, with time, Catholic numbers afforded them a measure of security against prejudice. They built schools and initiated strong mutual support systems, including parochial schools and religious colleges. As they worked towards assimilation into US culture, tensions with Rome arose over the extent of interaction the Church might have with this rapidly modernizing culture. The papal condemnation of this assimilation in 1899 suppressed attempts at change for nearly sixty years.
Ultimately, however, the ghetto mentality of the Church in the US was fractured by the election of John XXIII as Pope in 1958 and his calling of the Second Vatican Council (1962–5), a watershed in the life of the Roman Catholic Church.
The Second Vatican Council had a dramatic effect on the lives of the faithful in the United States. Among the changes instigated by the council, three stand out. The first is the emphasis on the importance of the laity in the Church. As an increasingly affluent and educated group of believers, Catholic laity in America desired a larger role in influencing Church governance and in determining how their donations might be spent.
The second is the change in the language of the liturgy from Latin to English.
Previously, many congregants were unable to follow the Latin Tridentine liturgy and would often be occupied in practices such as praying the Rosary during the Mass. With the use of English, and with the priest now facing the people rather than with his back to them, the laity were expected to participate actively in the Mass. This change resulted in the expectation of many of the laity that they might take larger roles in the liturgy. These expectations have been fulfilled to some extent, with lay people assisting in the proclamation of the scripture and in the distribution of communion during the Mass as well as, in many parishes, acting in advisory capacities in the decision-making of the parish.
A third result of the council is the affirmation of the goodness in other faith traditions, Christian and otherwise. In the time before the council, the Church taught that salvation was available only through the Catholic Church and emphasized this point through the discipline of the parish setting with its close-knit communities. Other Christians and people of other faiths could not receive the salvation of Jesus Christ, according to Church teaching. In relation to other Christians, the council stated that the ideal for Christianity was the restoration of Christian unity rather than the return of non-Catholic Christians to the Roman Catholic Church. In relation to people of other faiths, the council also effectively conceded the possibility of revelation outside the Christian faith, a concession unique among Christian denominations. Thus, the Roman Catholic Church no longer considers itself the unique path to salvation, only the pre-eminent one. This position is a remarkable turnaround for the Church and one that has led to continued tension and confusion among some of the faithful, as it has called into question the centrality of the Church for community and for the faith of the laity.
In reaction to these changes in the self-understanding of the Church, groups espousing the older teachings of the Church have arisen since the council. Some groups have lobbied for a return to the Latin Tridentine Mass, others have emphasized more traditional pieties, such as the Rosary and novenas, and still others have emphasized a more rigorous following of the magisterium, or teaching authority, of the Church. This authority is often equated with the pronouncements of the headquarters of the Church, the Roman Curia, though traditionally it has also rested in the work of theologians.
Other movements in the Church include an increased emphasis on its social teaching, especially in regard to the rights of the poor and homeless, to the environment, to nuclear armaments and to the economy. The United States bishops have released pastoral letters addressing these issues and suggesting an appropriate, faith-based response to them. These teachings have not been without controversy, as socially and politically conservative Catholics have taken issue with most of them and publicly attacked the bishops’ positions. At the same time, Pope John Paul II, with his socially progressive and theologically conservative agenda, has appointed an increasing percentage of theologically conservative but socially progressive bishops in US dioceses.
Developing from the Second Vatican Council and its emphasis on personal moral decision-making, the heightened role of the laity has called into question the central role of the clergy and of religious men and women. Far fewer men enter seminaries and religious orders than before the council, while most women’s orders have only a fraction of their previous numbers. Those seminaries that do attract larger numbers of candidates often espouse a more traditional religious sensibility. Where priests, brothers and sisters once worked, lay people now do much the same work of teaching and administering parishes. Other issues with which the Church has struggled are charges of sexual abuse by the clergy, the role of women and the possibility of married priests.
Industry:Culture
The term “lobbyist” refers to any person or group that attempts to sway the actions of a political representative through a sustained effort at persuasion. In the British Parliament of the seventeenth century, lobbyists would try to influence members personally as they walked through the lobby of the legislature to cast their votes, hence the origin of the term. In more recent times, lobbying is often accomplished through informal meetings between lobbyists (or their clients) and politicians, through financial contributions, or, more publicly, through media campaigns and other forms of organized publicity. Although all political pressure groups try to “lobby” politicians in most modern liberal democracies, the word “lobbyist” is generally restricted to those whose professional livelihoods depend on their skill in directly influencing public policy on behalf of interested parties—such as trade unions, private companies, or industry organizations—who pay them to do so.
Lobbyists have been a part of America’s public life for some time, and probably have always affected political decisions to some extent. In the 1800s they were often referred to as “lobbiers,” and were already under attack. The American poet Walt Whitman called them “lousy combings and born freedom sellers of the Earth.” Then, as now, critics feared the representative political process—whereby legislators try to consider and reflect primarily the interests of the people who elected them—was being subverted through the influence of paid “insiders” like professional lobbyists. In more recent years, opponents have accused powerful and rich lobbies, such as that of the National Rifle Association, of protecting the interests of their clients to the detriment of society. Such worries have only escalated in recent years as numerous highly placed political figures—men such as Michael Deaver, the former deputy chief of staff for President Ronald Reagan, or former Secretary of the Treasury Lloyd Bentsen—left their formal political posts and moved directly into the more lucrative sphere of the private lobbying industry, such moves seemingly dissolving the barrier between private interest and public good. In 1998 journalists credited lobbyists working for the tobacco industry with sabotaging anti-tobacco legislation that most of the American public supported.
Despite criticisms of professional lobbyists and of insider lobbying, however, attempts to reform the political system so as to restrict their influence have been largely unsuccessful. In the 1990s, politicians of all ideological stripes found it worthwhile to decry the nefarious influence of money in politics, but little practical progress was made in solving the problem. Constitutionally, it is difficult to align any restriction on political lobbying with the commitment to freedom of speech. It is also not clear that all lobbying is necessarily destructive of the democratic process; much lobbying, for example, is undertaken by public-interest groups of all ideological stripes. More practically, any restriction that might pass constitutional muster would have to be voted upon by the very politicians who now profit so handsomely from the largesse of lobbyists and their clients.
In the summer of 1997, a Senate investigation of lobbyists’ influence in the federal political process provoked relatively little public interest. This had changed by the spring of 2000. The decision of Republican Senator John McCain of Arizona to put campaign finance reform at the center of his appeal during the spring’s presidential primaries sparked national concern over the influence of big money in politics.
Industry:Culture
Small retail establishments serving urban neighborhoods, these “corner” stores offer groceries and sundries, services and sociability, as well as occasional credit, for those beyond downtown commercialism. The nickname underscores the convergence of family and business in ownership, residence and self-exploitation, as spouses and children work long hours for limited profits. Nonetheless, these stores have been a respected step in economic mobility (as in Dobie Gillis, CBS, 1959–63).
As white flight altered urban neighborhoods, these establishments were trapped. Those who did not close sometimes found themselves isolated by race or language. While new neighbors saw their higher prices as indications of exploitation rather than of commercial marginalization, aging owners worried about crime and hatred. This tension has remained as new immigrants have bought out these family stakeholds (e.g. Korean shopkeepers and African Americans in Los Angeles, CA). Spike Lee portrays both the old and new stores in Do the Right Thing (1989).
Industry:Culture
The president of the United States is one of the most powerful political figures in the nation and the world. As the leader of the United States, he or she is expected to create and implement new legislation and involve the United States in the affairs of other countries. Yet little power is vested in the office of the president itself. A president cannot enact legislation on his or her own (though he or she can sometimes affect how it is implemented). The president’s official power is limited to the very bare outlines of the Constitution; furthermore, the informal power that usually comes from being head of a party is unavailable to him or her because Congress is often controlled by members of another party and, even if the presidency and Congress are held by the same party, discipline in American political parties is weak. The president, then, must rely on other sources of power in carrying out the tasks he or she is expected to handle.
The official powers of the president are enumerated in Article II of the US Constitution. They include the power to pardon criminals, the power to make treaties (with the approval of two-thirds of the Senate) and the office of Commander in Chief of the armed forces. The president is also responsible for reporting to Congress on the state of the union and making recommendations to Congress. The president’s legislative power is limited to the ability to veto legislation approved by Congress.
It is clear that the president’s role extends beyond the completion of the duties enumerated in the Constitution. The president has become a prominent and powerful political figure. Domestically, he or she presents a budget to Congress every year, creates and introduces legislation to Congress and tries to direct the economic affairs of the nation; yet the official powers of the president have not changed since the Constitution was created more than 200 years ago. There is a great deal of debate over how it is that the president manages to acquire this power. Some argue that the president’s power is highly personal: powerful presidents have the ability to persuade Congress to go along with what they want. Others claim that presidents can get Congress to implement particular policies by increasing public support for those policies, raising fears among members of Congress that they will not be re-elected if they do not vote for policies the public wants. Still others argue that the amount of power a president wields is determined by the president’s place in time; personal qualities and actions have little to do with what a president can accomplish. There are also those who believe that a president’s power depends entirely on whether Congress and the president are of the same party.
Efforts have been made in recent years to increase the president’s legislative power.
The creation of a line-item veto, which would allow the president to veto certain parts of a bill without vetoing it in its entirety, was heavily debated in the early 1990s. Congress passed legislation giving the president the power of line-item veto over budget bills; this legislation came into effect on January 1, 1997. However, a year later the Supreme Court declared the line-item veto unconstitutional.
In foreign affairs, the president has considerably more constitutional power. The president’s position as Commander in Chief of the armed forces has allowed him to take actions that were not endorsed by Congress, frequently using this power to involve the country in military conflicts. The Gulf of Tonkin Resolution, passed in August of 1964 at the request of President Johnson, granted the president the power to use military force to “repel any armed attack against the forces of the United States and to prevent further aggression.” This allowed presidents Johnson and Nixon to begin and continue the Vietnam War without any declaration of war. This large grant of power was limited by Congress when it passed the War Powers Resolution of 1973; however, many presidents have refused to abide by the resolution. For instance, the initial deployment of troops to the Persian Gulf by President Bush was not endorsed by Congress (although Congress later voted to support these actions and allow the president to take any others he felt necessary), and Bush continued to insist that he did not need congressional approval at all.
The only way to remove a president from office is through the process of impeachment. A president can be impeached if he or she commits bribery, treason, or “high crimes and misdemeanors.” While the definitions of bribery and treason are clear, the definition of “high crimes and misdemeanors” is not; it is under this category of crimes that claims of impeachable offenses are usually made. The impeachment procedure has two stages: in the first, the House of Representatives votes on articles of impeachment. If the House decides that formal charges are relevant and significant, an official trial is held in the Senate. Two presidents have been impeached: in 1868 Andrew Johnson narrowly escaped being stripped of his office by one vote in the Senate; in 1999 Bill Clinton was saved by a party-line vote that also fell short of the necessary two-thirds majority. But, in 1974, a House committee recommended that members of the House of Representatives vote “yes” on articles of impeachment for Richard Nixon, leading the president to resign before the issue was voted on.
Industry:Culture
The most popular recreation for Americans. Upper middle-class Americans build pools in their yards or belong to expensive country clubs; middle-class Americans belong to YMCAs or other health clubs and/or go to public community pools, which are also frequented, especially in the cities, by members of the lower classes. Swimming is also an important aspect of vacationing at lakes and ponds, as well as at the shore.
Competitive swimming was popular in nineteenth-century Britain, mainly involving a gentlemanly breaststroke. In 1844, two American Indians demonstrated a stroke akin to what would become the crawl at a London pool, but observers considered it “un-European.” The stroke was later introduced to California by an Australian who had learned it in the South Sea Islands, and it quickly became established as the major stroke for Olympic speed races. The sport’s growing popularity was very much connected with the career of Johnny Weissmuller, who won five gold medals at the 1924 and 1928 Olympic Games before going on to star in twelve Hollywood movies as Tarzan.
More recently, several Americans have dominated the sport: Donna de Varona, the “Queen of Swimming” in the early 1960s; Mark Spitz, winner of seven gold medals at the 1972 Olympics; Shirley Babashoff, winner of eight Olympic medals altogether in 1972 and 1976; Tracy Caulkins, generally considered one of the greatest all-round swimmers of all time and awarded the title “Swimmer of the Decade” by USA Today in 1990; Matt Biondi, winner of gold medals at three Olympiads (1984–92); and Greg Louganis, acknowledged as the greatest diver of all time.
Recently, news reports have suggested that boys have been giving up the sport in large numbers as they reach puberty, partly because they find the newer streamlined swimming trunks too revealing. With so many sports options available at school and at home, young boys are selecting those that they think will project the best image. As a result, swim teams have become increasingly populated by girl athletes, with some programs becoming 70 percent female. The likelihood of any more Spitzes or Louganises emerging in the United States is slim, but the chances that an American woman may produce similar feats may be increasing with the growing number of athletic scholarships at colleges going to women (following Title IX), and in the aftermath of the collapse of support for the military-style training programs of the former Soviet Union and East Germany.
Synchronized swimming, largely associated with women athletes, experienced tremendous growth in the 1980s and 1990s. With its origins at the 1933 World’s Fair in Chicago, IL, synchronized swimming was popularized during the 1940s in Hollywood’s “water ballets” or “aqua musicals,” identified with “America’s mermaid,” Esther Williams (e.g. Bathing Beauty, 1944; Million Dollar Mermaid, 1952). It continued to develop in Midwestern collegiate programs as an alternative to speed swimming, though with the appeal of its stunts and physically demanding routines it was not exclusively associated with women.
First included as a non-medal demonstration sport at the 1952 Helsinki Olympics, synchronized swimming was finally introduced as a medal sport at the 1984 Los Angeles games (though some medal events have been dropped since), as Americans wanted to showcase American talent and to meet the demand for more women participants in the Olympics.
Since pools were already in place for the other swimming events, synchronized swimming was considered a low-cost, high-entertainment addition to the program.
In the process, synchronized swimming has disrupted gender associations in sports, in ways similar to figure skating. While the sport plays on notions of femininity, witnessed also in the feminizing of cheerleading, it also demands great athleticism, and so is further breaking down the gendered linkages of male with “athletic” and female with “grace.” See also: sports and gender.
Industry:Culture
Those between thirteen and twenty-something represent both the “downfall” of American society and its trend-setters and hopes. Teens have also become icons of modernity and symbols in a culture deeply oriented towards youth in the postwar era.
Nonetheless, social dependency on the ability of teenagers to replicate the social roles of adults produces a type of anxiety that consistently revolves around the activities of youth (Cohen, 1972).
The teenage years are viewed as the definitive time of identity formation. This becomes the focus of school and extra-curricular activities like sports, as well as of family anxiety. While religious rites, social celebrations (coming out, commencement) and civic landmarks—especially acquisition of a driver’s license at sixteen—mark increasing adulthood, it remains partial. Even as many teenagers work in addition to school and family obligations, they are often treated as potential victims. Hence, laws censor adult information and shield them from vice (alcohol and tobacco sales are legally discouraged). Teens are also viewed as vulnerable to problematic peer relations—whether status groups or gangs. These prohibitions, in turn, become foci of rebellion.
Hence, the activities of American teenagers have become the focus of countless social-science studies, which tend to focus largely on the notion of deviance. These studies were concentrated during the 1960s, which remains an iconic decade of teen rebellion, from Vietnam to Woodstock. Drug use, teenage suicide and sex remain consistent concerns among analysts (Gaines, 1990) and policy-makers. Moreover, both underscore the special stress on those who are marked as different in race, class, gender or ability from the larger society. Teenagers can be cruel, yet this is often only a refraction of cultural values and socio-economic opportunity. Hence, the socially constructed category of adolescence becomes a series of trials, tribulations and experiments that one will survive, it is hoped, to become an adult.
Yet teenagers, individually and en masse, have also become important agents in their own right. Since the Second World War, teenagers have become a major consumer force with a large amount of disposable income. An entire cultural industry has been created in the United States to appeal to this age group in the form of clothing stores, music trends and entertainment, incorporating a rapid obsolescence despite the endurance of sex, jeans and rock ’n’ roll as primary motifs. In the 1990s, the survival of many corporate entities depended on teenage consumer power.
While often viewed as frivolous, teenagers have also been at the center of important social changes. They have been involved in landmark political events and moments of social upheaval, including the demonstrations of the 1960s, especially the antiwar and civil-rights movements. The draft beginning at eighteen, and the lowering of the voting age to eighteen nationwide in 1971, have made them real as well as potential actors in major issues, even if politicians still speak of them in terms of tutelage (on issues such as abortion).
Representations of teenagers in popular culture often reflect their position as consumers of fashion, music and mass media. Historically, these representations have focused on deviance and rebellion, with some interesting changes. Films such as Blackboard Jungle (1955), Rebel Without a Cause (1955) and West Side Story (1961) portrayed rebellious teenagers of the 1950s, coupled with subtexts of class, race and social hierarchies. The tension between teens on the beach or at the drive-in and those in the gang or the “hood” continues to shape teen flicks. Teenage anxiety permeated films such as The Breakfast Club and Sixteen Candles in the 1980s, which also reflected on dysfunctional families. In the 1990s, the sobering Kids (1995) and The Basketball Diaries (1995) treated drug use and sex, while being criticized as causes, rather than effects, of youth violence. Representations of teenagers on television have generally dealt with less political teenage dilemmas or have focused on family interrelations in shows such as Happy Days (1974–84) and Beverly Hills 90210 (1990–9), although news media often play up stories of gangs and victimization.
Industry:Culture