U.S. Culture

Professional ice hockey teams play in the National Hockey League (NHL). NHL teams play a regular schedule that culminates in the championship series. The winner is awarded the Stanley Cup, the league's top prize.

Television transformed sports in the second half of the 20th century. As

more Americans watched sports on television, the sports industry grew into

an enormous business, and sports events became widely viewed among

Americans as cultural experiences. Many Americans shared televised moments

of exaltation and triumph throughout the year: baseball during the spring

and summer and its World Series in the early fall, football throughout the

fall crowned by the Super Bowl in January, and the National Basketball

Association (NBA) championships in the spring. The Olympic Games, watched

by millions of people worldwide, similarly rivet Americans to their

televisions as they watch outstanding athletes compete on behalf of their

nations. Commercial sports are part of practically every home in America

and have allowed sports heroes to gain prominence in the national

imagination and to become fixtures of the consumer culture. As well-known

faces and bodies, sports celebrities such as basketball player Michael

Jordan and baseball player Mark McGwire are hired to endorse products.

Although televised games remove the viewing public from direct contact

with events, they have neither diminished the fervor of team

identification nor dampened the enthusiasm for athletic participation.

Americans watch more sports on television than ever, and they personally

participate in more varied sporting activities and athletic clubs.

Millions of young girls and boys across the country play soccer, baseball,

tennis, and field hockey.

At the end of the 20th century, Americans were taking part in individual

sports of all kinds—jogging, bicycling, swimming, skiing, rock climbing,

playing tennis, as well as more unusual sports such as bungee jumping,

hang gliding, and wind surfing. As Americans enjoy more leisure time, and

as Hollywood and advertising emphasize trim, well-developed bodies, sports

have become a significant component of many people's lives. Many Americans

now invest substantial sums of money in sports equipment, clothing, and

gym memberships. As a result, more people are dressing in sporty styles of

clothing. Sports logos and athletic fashions have become common aspects of

people’s wardrobes, as people need to look as though they participate in

sports to be in style. Sports have even influenced the cars Americans

drive, as sport utility vehicles accommodate the rugged terrain, elaborate

equipment, and sporty lifestyles of their owners.

Probably the most significant long-term development in 20th-century sports

has been the increased participation of minorities and women. Throughout

the early 20th century, African Americans made outstanding contributions

to sports, despite being excluded from organized white teams. The

exclusion of black players from white baseball led to the creation of a

separate Negro National League in 1920. On the world stage, track-and-field star Jesse Owens became a national hero when he won four gold medals and set world and Olympic records at the Berlin Olympics in 1936.

The racial segregation that prevented African Americans from playing

baseball in the National League until 1947 has been replaced by the

enormous successes of African Americans in all fields of sport.

Before the 20th century women could not play in most organized sports.

Soon, however, they began to enter the sports arena. Helen Wills Moody, a

tennis champion during the 1920s, and Babe Didrikson Zaharias, one of the

20th century’s greatest women athletes, were examples of physical grace

and agility. In 1972 Title IX of the Education Amendments Act outlawed

discrimination based on gender in education, including school sports.

Schools then spent additional funding on women's athletics, which provided

an enormous boost to women’s sports of all kinds, especially basketball,

which became very popular. Women's college basketball, part of the

National Collegiate Athletic Association (NCAA), is a popular focus of

interest. By the end of the 20th century, this enthusiasm led to the

creation of a major professional women’s basketball league. Women have

become a large part of athletics, making their mark in a wide range of

sports.

Sports have become one of the most visible expressions of the vast

extension of democracy in 20th-century America. They have become more

inclusive, with many Americans both personally participating and enjoying

sports as spectators. Once readily available only to the well-to-do,

sports and recreation attract many people, aided by the mass media, the

schools and colleges, the federal and state highway and park systems, and

increased leisure time.

Celebrations and Holidays

Americans celebrate an enormous variety of festivals and holidays because

they come from around the globe and practice many religions. They also

celebrate holidays specific to the United States that commemorate

historical events or encourage a common national memory. Holidays in

America are often family or community events. Many Americans travel long

distances for family gatherings or take vacations during holidays. In

fact, by the end of the 20th century, many national holidays in the United

States had become three-day weekends, which many people used as mini

vacations. Except for the Fourth of July and Veterans Day, most

commemorative federal holidays, including Memorial Day, Labor Day,

Columbus Day, and Presidents’ Day, are celebrated on Mondays so that

Americans can enjoy a long weekend. Because many Americans tend to create

vacations out of these holiday weekends rather than celebrate a particular

event, some people believe the original significance of many of these

occasions has been eroded.

Because the United States is a secular society founded on the separation

of church and state, many of the most meaningful religiously based

festivals and rituals, such as Easter, Rosh Hashanah, and Ramadan, are not

enshrined as national events, with one major exception. Christmas, and the

holiday season surrounding it, is an enormous commercial enterprise, a

fixture of the American social calendar, and deeply embedded in the

popular imagination. Not until the 19th century did Christmas in the

United States begin to take on aspects of the modern holiday celebration,

such as exchanging gifts, cooking and eating traditional foods, and

putting up often-elaborate Christmas decorations. The holiday has grown in

popularity and significance ever since. Santa Claus; brightly decorated

Christmas trees; and plenty of wreaths, holly, and ribbons help define

the season for most children. Indeed, because some religious faiths do not

celebrate Christmas, the Christmas season has expanded in recent years to

become the “holiday season,” embracing Hanukkah, the Jewish Festival of

Lights, and Kwanzaa, a celebration of African heritage. Thus, the

Christmas season has become the closest thing to a true national festival

in the United States.

The expansion of Christmas has even begun to encroach on the most

indigenous of American festivals, Thanksgiving. Celebrated on the fourth Thursday in November, Thanksgiving has largely shed its original religious

meaning (as a feast of giving thanks to God) to become a celebration of

the bounty of food and the warmth of family life in America. American

children usually commemorate the holiday’s origins at school, where they

re-create the original event: Pilgrims sharing a harvest feast with Native

Americans. Both the historical and the religious origins of the event have

largely given way to a secular celebration centered on the traditional

Thanksgiving meal: turkey—an indigenous American bird—accompanied by foods

common in early New England settlements, such as pumpkins, squashes, and

cranberries. Since many Americans enjoy a four-day holiday at

Thanksgiving, the occasion encourages family reunions and travel. Some

Americans also contribute time and food to the needy and the homeless

during the Thanksgiving holiday.

Another holiday that has lost its older, religious meaning in the United

States is Halloween, the eve of All Saints’ Day. Halloween has become a

celebration of witches, ghosts, goblins, and candy that is especially

attractive to children. On this day and night, October 31, many homes are

decorated and lit by jack-o'-lanterns, pumpkins that have been hollowed

out and carved. Children dress up and go trick-or-treating, during which

they receive treats from neighbors. An array of orange-colored candies has

evolved from this event, and most trick-or-treat bags usually brim with

chocolate bars and other confections.

The Fourth of July, or Independence Day, is the premier American national

celebration because it commemorates the day the United States proclaimed

its freedom from Britain with the Declaration of Independence. Very early

in its development, the holiday was an occasion for fanfare, parades, and

speeches celebrating American freedom and the uniqueness of American life.

Since at least the 19th century, Americans have commemorated their

independence with fireworks and patriotic music. Because the holiday marks

the founding of the republic in 1776, flying the flag of the United States

(sometimes with the original 13 stars) is common, as are festive

barbecues, picnics, fireworks, and summer outings.

Most other national holidays have become less significant over time and

receded in importance as ways in which Americans define themselves and

their history. For example, Columbus Day was formerly celebrated on

October 12, the day explorer Christopher Columbus first landed in the West

Indies, but it is now celebrated on the second Monday of October to allow

for a three-day weekend. The holiday originally served as a traditional

reminder of the "discovery" of America in 1492, but as Americans became

more sensitive to their multicultural population, celebrating the conquest

of Native Americans became more controversial.

Holidays honoring wars have also lost much of their original significance.

Memorial Day, first called Decoration Day and celebrated on May 30, was

established to honor those who died during the American Civil War (1861-

1865), then subsequently those who died in all American wars. Similarly,

Veterans Day was first named Armistice Day and marked the end of World War

I (1914-1918). During the 1950s the name of the holiday was changed in the

United States, and its significance expanded to honor armed forces

personnel who served in any American war.

The memory of America's first president, George Washington, was once

celebrated on his birthday, February 22nd. The date was changed to the

third Monday in February to create a three-day weekend, as well as to

incorporate the birthday of another president, Abraham Lincoln, who was

born on February 12th. The holiday is now popularly called Presidents’ Day

and is less likely to be remembered as honoring the first and 16th

American presidents than as a school and work holiday. Americans also

memorialize Martin Luther King, Jr., the great African American civil

rights leader who was assassinated in 1968. King’s birthday is celebrated

as a national holiday in mid-January. The celebration of King's birthday

has become a sign of greater inclusiveness in 20th-century American

society.

EDUCATION

Role of Education

The United States has one of the most extensive and diverse educational

systems in the world. Educational institutions exist at all learning

levels, from nursery schools for the very young to higher education for

older youths and adults of all ages. Education in the United States is

notable for the many goals it aspires to accomplish—promoting democracy,

assimilation, nationalism, equality of opportunity, and personal

development. Because Americans have historically insisted that their

schools work toward these sometimes conflicting goals, education has often

been the focus of social conflict.

While schools are expected to achieve many social objectives, education in

America is neither centrally administered nor supported directly by the

federal government, unlike education in other industrialized countries. In

the United States, each state is responsible for providing schooling,

which is funded through local taxes and governed by local school boards.

In addition to these government-funded public schools, the United States

has many schools that are privately financed and maintained. More than 10

percent of all elementary and secondary students in the United States

attend private schools. Religious groups, especially the Roman Catholic

Church, run many of these. Many of America's most renowned universities

and colleges are also privately endowed and run. As a result, although

American education is expected to provide equality of opportunity, it is not easily directed toward that goal. This complex enterprise, once one

of the proudest achievements of American democracy because of its

diversity and inclusiveness, became the subject of intense debate and

criticism during the second half of the 20th century. People debated the

goals of schools as well as whether schools were educating students well

enough.

History of Education in America

Until the 1830s, most American children attended school irregularly, and

most schools were either run privately or by charities. This irregular

system was replaced in the Northeast and Midwest by publicly financed

elementary schools, known as common schools. Common schools provided

rudimentary instruction in literacy and trained students in citizenship.

This democratic ideal expanded after the Civil War to all parts of the

nation. By the 1880s and 1890s, schools began to expand attendance

requirements so that more children and older children attended school

regularly. These more rigorous requirements were intended to ensure that

all students, including those whose families had immigrated from

elsewhere, were integrated into society. In addition, the schools tried to

equip children with the more complex skills required in an industrialized

urban society.

Education became increasingly important during the 20th century, as

America’s sophisticated industrial society demanded a more literate and

skilled workforce. In addition, school degrees provided a sought-after

means to obtain better-paying and higher-status jobs. Schools were the one

American institution that could provide the literate skills and work

habits necessary for Americans of all backgrounds to compete in

industries. As a result, education expanded rapidly. In the first decades

of the 20th century, mandatory education laws required children to

complete grade school. By the end of the 20th century, many states

required children to attend school until they were at least 16. In 1960,

45 percent of high school graduates enrolled in college; by 1996 that

enrollment rate had risen to 65 percent. By the late 20th century, an

advanced education was necessary for success in the globally competitive

and technologically advanced modern economy. According to the U.S. Census

Bureau, workers with a bachelor’s degree in 1997 earned an average of

$40,000 annually, while those with a high school degree earned about

$23,000. Those who did not complete high school earned about $16,000.

In the United States, higher education is widely available and obtainable

through thousands of private, religious, and state-run institutions, which

offer advanced professional, scientific, and other training programs that

enable students to become proficient in diverse subjects. Colleges vary in

cost and level of prestige. Many of the oldest and most famous colleges on

the East Coast are expensive and set extremely high admissions standards.

Large state universities are less difficult to enter, and their fees are

substantially lower. Other types of institutions include state

universities that provide engineering, teaching, and agriculture degrees;

private universities and small privately endowed colleges; religious

colleges and universities; and community and junior colleges that offer

part-time and two-year degree programs. This complex and diverse range of

schools has made American higher education the envy of other countries and

one of the nation’s greatest assets in creating and maintaining a

technologically advanced society.

When more people began to attend college, there were a number of

repercussions. Going to college delayed maturity and independence for many

Americans, extending many of the stresses of adolescence into a person’s

20s and postponing the rites of adulthood, such as marriage and

childbearing. As society paid more attention to education, it also devoted

a greater proportion of its resources to it. Local communities were

required to spend more money on schools and teachers, while colleges and

universities were driven to expand their facilities and course offerings

to accommodate an ever-growing student body. Parents were also expected to

support their children longer and to forgo their children's contribution

to the household.

Funding

Education is an enormous investment that requires contributions from many

sources. American higher education is especially expensive, with its heavy

investment in laboratory space and research equipment. It receives funding

from private individuals, foundations, and corporations. Many private

universities have large endowments, or funds, that sustain the

institutions beyond what students pay in tuition and fees. Many, such as

Harvard University in Massachusetts and Stanford University in California,

raise large sums of money through fund drives. Even many state-funded

universities seek funds from private sources to augment their budgets.

Most major state universities, such as those in Michigan and California,

now rely on a mixture of state and private resources.

Before World War II, the federal government generally played a minor role

in financing education, with the exception of the Morrill Acts of 1862 and

1890. These acts granted the states public lands that could be sold for

the purpose of establishing and maintaining institutions of higher

education. Many so-called land-grant state universities were founded

during the 19th century as a result of this funding. Today, land-grant

colleges include some of the nation’s premier state universities. The

government also provided some funding for basic research at universities.

The American experience in World War II (especially the success of the

Manhattan Project, which created the atomic bomb) made clear that

scientific and technical advances, as well as human resources, were

essential to national security. As a result, the federal government became

increasingly involved in education at all levels and substantially

expanded funding for universities. The federal government began to provide

substantial amounts of money for university research programs through

agencies such as the National Science Foundation, and later through the

National Institutes of Health and the departments of Energy and Defense.

At the same time, the government began to focus on providing equal

educational opportunities for all Americans. Beginning with the GI Bill,

which financed educational programs for veterans, and later in the form of

fellowships and direct student loans in the 1960s, more and more Americans

were able to attend colleges and universities.

During the 1960s the federal government also began to play more of a role

in education at lower levels. The Great Society programs of President

Lyndon Johnson developed many new educational initiatives to assist poor

children and to compensate for disadvantage. Federal money was funneled

through educational institutions to establish programs such as Head Start,

which provides early childhood education to disadvantaged children. Some

Americans, however, resisted the federal government’s increased presence

in education, which they believed contradicted the long tradition of state-

sponsored public schooling.

By the 1980s many public schools were receiving federal subsidies for

textbooks, transportation, breakfast and lunch programs, and services for

students with disabilities. This funding enriched schools across the

country, especially inner-city schools, and affected the lives of millions

of schoolchildren. Although federal funding increased, as did federal

supervision, to guarantee an equitable distribution of funds, the

government did not exercise direct control over the academic programs

schools offered or over decisions about academic issues. During the 1990s,

the administration of President Bill Clinton urged the federal government

to move further in exercising leadership by establishing academic

standards for public schools across the country and to evaluate schools

through testing.

Concerns in Elementary Education

The United States has historically contended with the challenges that come

with being a nation of immigrants. Schools are often responsible for

modifying educational offerings to accommodate immigrants. Early schools

reflected many differences among students and their families but were also

a mechanism by which to overcome these differences and to forge a sense of

American commonality. Common schools, or publicly financed elementary

schools, were first introduced in the mid-19th century in the hopes of

creating a common bond among a diverse citizenship. By the early 20th

century, massive immigration from Europe caused schools to restructure and

expand their programs to more effectively incorporate immigrant children

into society. High schools began to include technical, business, and vocational curricula to accommodate the various goals of their more diverse

population. The United States continues to be concerned about how to

incorporate immigrant groups.

The language in which students are taught is one of the most significant

issues for schools. Many Americans have become concerned about how best to

educate students who are new to the English language and to American

culture. As children of all ages and from dozens of language backgrounds

seek an education, most schools have adopted some variety of bilingual

instruction. Students are taught in their native language until their

knowledge of English improves, which is often accomplished through an

English as a Second Language (ESL) program. Some people have criticized

these bilingual programs for not encouraging students to learn English

more quickly, or at all. Some Americans fear that English will no longer

provide a uniform basis for American identity; others worry that immigrant

children will have a hard time finding employment if they do not become

fluent in English. In response to these criticisms, voters in California,

the state that has seen the largest influx of recent immigrants, passed a

law in 1998 requiring that all children attending public schools be taught

in English and prohibiting more than one year of bilingual instruction.

Many Americans, including parents and business leaders, are also alarmed

by what they see as inadequate levels of student achievement in subjects

such as reading, mathematics, and science. On many standardized tests,

American students lag behind their counterparts in Europe and Asia. In

response, some Americans have urged the adoption of national standards by

which individual schools can be evaluated. Some have supported more

rigorous teacher competency standards. Another response that became

popular in the 1990s is the creation of charter schools. These schools are

directly authorized by the state and receive public funding, but they

operate largely outside the control of local school districts. Parents and

teachers enforce self-defined standards for these charter schools.

Schools are also working to incorporate computers into classrooms. The

need for computer literacy in the 21st century has put an additional

strain on school budgets and local resources. Schools have struggled to

catch up by providing computer equipment and instruction and by making

Internet connections available. Some companies, including Apple Computer,

Inc., have provided computer equipment to help schools meet their

students’ computer-education needs.

Concerns in Higher Education

Throughout the 20th century, Americans have attended schools to obtain the

economic and social rewards that come with highly technical or skilled

work and advanced degrees. However, as the United States became more

diverse, people debated how to include different groups, such as women and

minorities, into higher education. Blacks have historically been excluded

from many white institutions, or were made to feel unwelcome. Since the

19th century, a number of black colleges have existed to compensate for

this broad social bias, including federally chartered and funded Howard

University. In the early 20th century, when Jews and other Eastern

Europeans began to apply to universities, some of the most prestigious

colleges imposed quotas limiting their numbers.

Americans tried various means to eliminate the most egregious forms of

discrimination. In the early part of the century, "objective" admissions

tests were introduced to counteract the bias in admissions. Some educators

now view admissions tests such as the Scholastic Aptitude Test (SAT),

originally created to simplify admissions testing for prestigious private

schools, as disadvantageous to women and minorities. Critics of the SAT

believed the test did not adequately account for differences in social and

economic background. Whenever something as subjective as ability or merit

is evaluated, and when the rewards are potentially great, people hotly

debate the best means to fairly evaluate these criteria.

Until the middle of the 20th century, most educational issues in the

United States were handled locally. After World War II, however, the

federal government began to assume a new obligation to assure equality in

educational opportunity, and this issue began to affect college admissions

standards. In the last quarter of the 20th century, the government

increased its role in questions relating to how all Americans could best

secure equal access to education.

Schools had problems providing equal opportunities for all because

quality, costs, and admissions criteria varied greatly. To deal with these

problems, the federal government introduced the policy of affirmative

action in education in the early 1970s. Affirmative action required that

colleges and universities take race, ethnicity, and gender into account in

admissions to provide extra consideration to those who have historically

faced discrimination. It was intended to assure that Americans of all

backgrounds have an opportunity to train for professions in fields such as

medicine, law, education, and business administration.

Affirmative action became a general social commitment during the last

quarter of the 20th century. In education, it meant that universities and

colleges gave extra advantages and opportunities to blacks, Native

Americans, women, and other groups that were generally underrepresented at

the highest levels of business and in other professions. Affirmative

action also included financial assistance to members of minorities who

could not otherwise afford to attend colleges and universities.

Affirmative action has allowed many minority members to achieve new

prominence and success.

At the end of the 20th century, the policy of affirmative action was

criticized as unfair to those who were denied admission in order to admit

those in designated group categories. Some considered affirmative action

policies a form of reverse discrimination, some believed that special

policies were no longer necessary, and others believed that only some

groups should qualify (such as African Americans because of the nation’s

long history of slavery and segregation). The issue became a matter of

serious discussion and is one of the most highly charged topics in

education today. In the 1990s three states—Texas, California, and

Washington—eliminated affirmative action in their state university

admissions policies.

Several other issues have become troubling to higher education. Because

tuition costs have risen to very high levels, many smaller private

colleges and universities are struggling to attract students. Many

students and their parents choose state universities where costs are much

lower. The decline in federal research funds has also caused financial difficulties for many universities. Many well-educated students, including

those with doctoral degrees, have found it difficult to find and keep

permanent academic jobs, as schools seek to lower costs by hiring part-

time and temporary faculty. As a result, despite its great strengths and

its history of great variety, the expense of American higher education may

mean serious changes in the future.

Education is fundamental to American culture in more ways than providing

literacy and job skills. Educational institutions are the setting where

scholars interpret and pass on the meaning of the American experience.

They analyze what America is as a society by interpreting the nation’s

past and defining objectives for the future. That information eventually

forms the basis for what children learn from teachers, textbooks, and

curricula. Thus, the work of educational institutions is far more

important than even job training, although this is usually foremost in

people’s minds.

ARTS AND LETTERS

The arts, more than other features of culture, provide avenues for the

expression of imagination and personal vision. They offer a range of

emotional and intellectual pleasures to consumers of art and are an

important way in which a culture represents itself. There has long been a

Western tradition distinguishing those arts that appeal to the multitude,

such as popular music, from those—such as classical orchestral

music—normally available to the elite of learning and taste. Popular art

forms are usually seen as more representative American products. In the

United States in the recent past, there has been a blending of popular and

elite art forms, as all the arts experienced a period of remarkable cross-

fertilization. Because popular art forms are so widely distributed, arts

of all kinds have prospered.

The arts in the United States express the many faces and the enormous

creative range of the American people. Especially since World War II,

American innovations and the immense energy displayed in literature,

dance, and music have made American cultural works world famous. Arts in

the United States have become internationally prominent in ways that are

unparalleled in history. American art forms during the second half of the

20th century often defined the styles and qualities that the rest of the

world emulated. At the end of the 20th century, American art was

considered equal in quality and vitality to art produced in the rest of

the world.

Throughout the 20th century, American arts grew to incorporate new

visions and voices. Much of this new artistic energy came in the wake of

America’s emergence as a superpower after World War II. But it was also

due to the growth of New York City as an important center for publishing

and the arts, and the immigration of artists and intellectuals fleeing

fascism in Europe before and during the war. An outpouring of talent also

followed the civil rights and protest movements of the 1960s, as cultural

discrimination against blacks, women, and other groups diminished.

American arts flourish in many places and receive support from private

foundations, large corporations, local governments, federal agencies,

museums, galleries, and individuals. What is considered worthy of support

often depends on definitions of quality and of what constitutes art. This

is a tricky subject when the popular arts are increasingly incorporated

into the domain of the fine arts and new forms such as performance art and

conceptual art appear. As a result, defining what is art affects what

students are taught about past traditions (for example, Native American

tent paintings, oral traditions, and slave narratives) and what is

produced in the future. While some practitioners, such as studio artists,

are more vulnerable to these definitions because they depend on financial

support to exercise their talents, others, such as poets and

photographers, are less immediately constrained.

Artists operate in a world where those who theorize and critique their

work have taken on an increasingly important role. Audiences are

influenced by a variety of intermediaries—critics, the schools,

foundations that offer grants, the National Endowment for the Arts,

gallery owners, publishers, and theater producers. In some areas, such as

the performing arts, popular audiences may ultimately define success. In

other arts, such as painting and sculpture, success is far more dependent

on critics and a few, often wealthy, art collectors. Writers depend on

publishers and on the public for their success.

Unlike their predecessors, who relied on formal criteria and appealed to

aesthetic judgments, critics at the end of the 20th century leaned more

toward popular tastes, taking into account groups previously ignored and

valuing the merger of popular and elite forms. These critics often relied

less on aesthetic judgments than on social measures and were eager to

place artistic productions in the context of the time and social

conditions in which they were created. Whereas earlier critics attempted

to create an American tradition of high art, later critics used art as a

means to give power and approval to nonelite groups who were previously

not considered worthy of inclusion in the nation’s artistic heritage.

Not so long ago, culture and the arts were assumed to be an unalterable

inheritance—the accumulated wisdom and highest forms of achievement that

were established in the past. In the 20th century generally, and certainly

since World War II, artists have been boldly destroying older traditions

in sculpture, painting, dance, music, and literature. The arts have

changed rapidly, with one movement replacing another in quick succession.

Visual Arts

The visual arts have traditionally included forms of expression that

appeal to the eyes through painted surfaces, and to the sense of space

through carved or molded materials. In the 19th century, photographs were

added to the paintings, drawings, and sculpture that make up the visual

arts. The visual arts were further augmented in the 20th century by the

addition of other materials, such as found objects. These changes were

accompanied by a profound alteration in tastes, as earlier emphasis on

realistic representation of people, objects, and landscapes made way for a

greater range of imaginative forms.

During the late 19th and early 20th centuries, American art was considered

inferior to European art. Despite noted American painters such as Thomas

Eakins, Winslow Homer, Mary Cassatt, and John Marin, American visual arts

barely had an international presence.

American art began to flourish during the Great Depression of the 1930s as

New Deal government programs provided support to artists along with other

sectors of the population. Artists connected with each other and developed

a sense of common purpose through programs of the Works Progress

Administration, such as the Federal Art Project, as well as programs

sponsored by the Treasury Department. Most of the art of the period,

including painting, photography, and mural work, focused on the plight of

the American people during the Depression, and most artists painted real

people in difficult circumstances. Artists such as Thomas Hart Benton and

Ben Shahn expressed the suffering of ordinary people through their

representations of struggling farmers and workers. While artists such as

Benton and Grant Wood focused on rural life, many painters of the 1930s

and 1940s depicted the multicultural life of the American city. Jacob

Lawrence, for example, re-created the history and lives of African

Americans. Other artists, such as Andrew Wyeth and Edward Hopper, tried to

use human figures to describe emotional states such as loneliness and

despair.

Abstract Expressionism

Shortly after World War II, American art began to garner worldwide

attention and admiration. This change was due to the innovative fervor of

abstract expressionism in the 1950s and to subsequent modern art movements

and artists. The abstract expressionists of the mid-20th century broke

from the realist and figurative tradition set in the 1930s. They

emphasized their connection to international artistic visions rather than

the particularities of people and place, and most abstract expressionists

did not paint human figures (although artist Willem de Kooning did

portrayals of women). Color, shape, and movement dominated the canvases of

abstract expressionists. Some artists broke with the Western art tradition

by adopting innovative painting styles—during the 1950s Jackson Pollock

“painted” by dripping paint on canvases without the use of brushes, while

the paintings of Mark Rothko often consisted of large patches of color

that seem to vibrate.

Abstract expressionists felt alienated from their surrounding culture and

used art to challenge society’s conventions. The work of each artist was

quite individual and distinctive, but all the artists identified with the

radicalism of artistic creativity. The artists were eager to challenge

conventions and limits on expression in order to redefine the nature of

art. Their radicalism came from liberating themselves from the confining

artistic traditions of the past.

The most notable activity took place in New York City, which became one of

the world’s most important art centers during the second half of the 20th

century. The radical fervor and inventiveness of the abstract

expressionists, their frequent association with each other in New York

City’s Greenwich Village, and the support of a group of gallery owners and

dealers turned them into an artistic movement. Also known as the New York

School, the participants included Barnett Newman, Robert Motherwell, Franz

Kline, and Arshile Gorky, in addition to Rothko and Pollock.

The members of the New York School came from diverse backgrounds such as

the American Midwest and Northwest, Armenia, and Russia, bringing an

international flavor to the group and its artistic visions. They hoped to

appeal to art audiences everywhere, regardless of culture, and they felt

connected to the radical innovations introduced earlier in the 20th

century by European artists such as Pablo Picasso and Marcel Duchamp. Some

of the artists—Hans Hofmann, Gorky, Rothko, and de Kooning—were not born

in the United States, but all the artists saw themselves as part of an

international creative movement and an aesthetic rebellion.

As artists felt released from the boundaries and conventions of the past

and free to emphasize expressiveness and innovation, the abstract

expressionists gave way to other innovative styles in American art.

Beginning in the 1930s Joseph Cornell created hundreds of boxed

assemblages, usually from found objects, with each based on a single theme

to create a mood of contemplation and sometimes of reverence. Cornell's

boxes exemplify the modern fascination with individual vision, art that

breaks down boundaries between forms such as painting and sculpture, and

the use of everyday objects toward a new end. Other artists, such as

Robert Rauschenberg, combined disparate objects to create large, collage-

like sculptures known as combines in the 1950s. Jasper Johns, a painter,

sculptor, and printmaker, re-created countless familiar objects, most

memorably the American flag.

The most prominent American artistic style to follow abstract

expressionism was the pop art movement that began in the 1950s. Pop art

attempted to connect traditional art and popular culture by using images

from mass culture. To shake viewers out of their preconceived notions

about art, sculptor Claes Oldenburg used everyday objects such as pillows

and beds to create witty, soft sculptures. Roy Lichtenstein took this a

step further by elevating the techniques of commercial art, notably

cartooning, into fine art worthy of galleries and museums. Lichtenstein's

large, blown-up cartoons fill the surface of his canvases with grainy

black dots and question the existence of a distinct realm of high art.

These artists tried to make their audiences see ordinary objects in a

refreshing new way, thereby breaking down the conventions that formerly

defined what was worthy of artistic representation.

Probably the best-known pop artist, and a leader in the movement, was Andy

Warhol, whose images of a Campbell’s soup can and of the actress Marilyn

Monroe explicitly eroded the boundaries between the art world and mass

culture. Warhol also cultivated his status as a celebrity. He worked in

film as a director and producer to break down the boundaries between

traditional and popular art. Unlike the abstract expressionists, whose

conceptual works were often difficult to understand, Andy Warhol's

pictures, and his own face, were instantly recognizable.

Conceptual art, as it came to be known in the 1960s, like its

predecessors, sought to break free of traditional artistic associations.

In conceptual art, as practiced by Sol LeWitt and Joseph Kosuth, the concept

takes precedence over the actual object: the work aims to stimulate thought

rather than to follow an art tradition based on conventional standards of

beauty and artisanship.

Modern artists changed the meaning of traditional visual arts and brought

a new imaginative dimension to ordinary experience. Art was no longer

viewed as separate and distinct, housed in museums as part of a historical

inheritance, but as a continuous creative process. This emphasis on

constant change, as well as on the ordinary and mundane, reflected a

distinctly American democratizing perspective. Viewing art in this way

removed the emphasis from technique and polished performance, and many

modern artworks and experiences became more about expressing ideas than

about perfecting finished products.

Photography

Photography is probably the most democratic modern art form because it can

be, and is, practiced by most Americans. Since 1888, when George Eastman

developed the Kodak camera that allowed anyone to take pictures,

photography has struggled to be recognized as a fine art form. In the

early part of the 20th century, photographer, editor, and artistic

impresario Alfred Stieglitz established 291, a gallery in New York City,

with fellow photographer Edward Steichen, to showcase the works of
