Audience Research: Overview
The history of media audience studies can be seen as a series of oscillations between perspectives that have stressed the power of the text (or message) over its audiences and perspectives that have stressed the barriers “protecting” the audience from the potential effects of the message. The first position is most obviously represented by the whole tradition of effects studies, mobilizing a “hypodermic” model of media influence, in which the media are seen to have the power to “inject” their audiences with particular “messages,” which will cause those audiences to behave in particular ways. This has involved, from the right, perspectives that see the media as causing the breakdown of “traditional values” and, from the left, perspectives that see the media causing their audiences to remain quiescent in political terms, with the media inculcating consumerist values or causing the audience to inhabit some form of false consciousness.
One of the most influential versions of this kind of “hypodermic” theory of media effects was that advanced by Theodor Adorno and Max Horkheimer, along with other members of the Frankfurt School (the Institute for Social Research). Their “pessimistic mass-society thesis” reflected the authors’ experience of the breakdown of modern Germany into fascism during the 1930s, a breakdown that was attributed, in part, to the loosening of traditional ties and structures, a disintegration regarded as leaving people more “atomized” and exposed to external influences and especially to the pressure of the mass propaganda of powerful leaders, the most effective agency of which was the mass media. This “pessimistic mass-society thesis” stressed the conservative and reconciliatory role of “mass culture” for the audience. Mass culture was seen to suppress “potentialities” and to deny awareness of contradictions in a “one-dimensional world”; only art, in fictional and dramatic form, could preserve the qualities of negation and transcendence. Implicit in this theory was a “hypodermic” model of the media, which were seen as having the power to “inject” a repressive ideology directly into the consciousness of the masses.
However, against this overly pessimistic backdrop, the emigration of the leading members of the Frankfurt School (Adorno, Horkheimer, Herbert Marcuse) to the United States during the 1930s led to the development of a specifically “American” school of research in the 1940s and 1950s. The Frankfurt School’s “pessimistic” thesis proved unacceptable to American researchers. According to these researchers, the “pessimistic” thesis proposed too direct and unmediated an impact by the media on audiences; it took too far the thesis that all intermediary social structures between leaders/media and the masses had broken down; it did not accurately reflect the pluralistic nature of American society; it was sociologically naive. Clearly, the media had social effects; these must be examined and researched, but, equally clearly, these effects were neither all-powerful, simple, nor even necessarily direct. The nature of this complexity and indirectness also needed to be demonstrated and researched. Thus, in reaction to the Frankfurt School’s predilection for critical social theory and qualitative and philosophical analysis, American researchers, such as Herta Herzog, Robert Merton, Paul Lazarsfeld, and, later, Elihu Katz began to develop a quantitative and positivist methodology for empirical audience research into the “sociology of mass persuasion.”
Throughout the 1950s and 1960s, the overall effect of this empirically grounded “sociology of mass persuasion” was to produce a much more qualified notion of “media power,” in which media consumers were increasingly recognized to not be completely passive “victims” of the culture industry.
Among the major landmarks here were Merton’s Mass Persuasion and Katz and Lazarsfeld’s Personal Influence, in which they developed the concept of “two-step flow” communication, where the influence of the media was seen as crucially mediated by “gatekeepers” and “opinion leaders” within the audience community.
Looking back at these developments from the perspective of the 1970s, Counihan noted the increasing significance of a new perspective on media consumption: the “uses-and-gratifications” approach, largely associated in the United States with the work of Katz and, in Britain, with the work of Jay Blumler and James Halloran as well as the studies of the Leicester Centre for Mass Communications Research, during the 1960s. Within that perspective, the viewer came to be credited with an active role, so that it became a question, as Halloran put it, of looking at what people do with the media, rather than what the media do to them (see Halloran). This argument was obviously of great significance in moving the debate forward: to begin to look to the active engagement of the audience with the medium and with the particular television programs that they might be watching. A key advance developed by the uses-and-gratifications perspective was that of the variability of response and interpretation. From this perspective, one can no longer talk about the “effect” of a message on a homogeneous mass audience, whose members are all expected to be affected in the same way. Clearly, uses-and-gratifications did represent a significant advance on effects theory, insofar as it opened up the question of differential interpretations. However, critics argue that the theory is limited because the perspective remains individualistic, insofar as differences of response or interpretation are ultimately attributed solely to individual differences of personality or psychology. From this point of view the approach remains severely limited by its insufficiently sociological or cultural perspective.
It was against this background that Stuart Hall, working at the Centre for Contemporary Cultural Studies at the University of Birmingham, England, developed the “encoding/decoding” model of communication as an attempt to take forward insights that had emerged within each of these other perspectives. In subsequent years, this model has come to be widely influential in audience studies. It took from the effects theorists the notion that mass communication is a structured activity, in which the institutions that produce the messages do have the power to set agendas and to define issues. This model moves away from the idea that the medium has the power to make a person behave in a certain way (as a direct effect, which is caused by a simple stimulus, provided by the medium), but it holds onto a notion of the role of the media in setting agendas (see the work of Bachrach and Baratz on the media’s agenda-setting functions) and providing cultural categories and frameworks within which members of the culture will tend to operate.
Hall’s paradigm also attempts to incorporate from the uses-and-gratifications perspective the idea of the active viewer, making meaning from the signs and symbols that the media provide. However, the model was also designed to take on board concerns with the ways in which responses and interpretations are socially structured and culturally patterned at a level beyond that of individual psychologies. The model was also, critically, informed by semiological perspectives, focusing on the question of how communication works and drawing on Umberto Eco’s early work on the decoding of TV as a form of “semiological guerrilla warfare.” The key focus was on the realization that we are, of course, dealing with signs and symbols, which only have meaning within the terms of reference supplied by codes (of one sort or another) that the audience shares, to some greater or lesser extent, with the producers of messages. In this respect, Hall’s model was also influenced by Roland Barthes’s attempts to update Ferdinand de Saussure’s ideas of semiology—as “a science of signs at the heart of social life”—by developing an analysis of the role of “mythologies” in contemporary cultures.
The premises of Hall’s encoding/decoding model are (1) the same event can be encoded in more than one way; (2) the message always contains more than one potential “reading”—messages propose and “prefer” certain readings over others, but they can never become wholly closed around one reading: they remain polysemic (i.e., capable, in principle, of a variety of interpretations); (3) understanding the message is also a problematic practice, however transparent and “natural” it may seem. Messages encoded one way can always be decoded in a different way.
The television message is treated here as a complex sign, in which a “preferred reading” has been inscribed but retains the potential, if decoded in a manner different from the way in which it has been encoded, of communicating a different meaning. The message is thus a structured polysemy. It is central to Hall’s argument that all meanings do not exist “equally” in the message, which is seen to have been structured in dominance, despite the impossibility of a “total closure” of meaning. Further, the “preferred reading” is itself part of the message and can be identified within its linguistic and communicative structure. Thus, when analysis shifts to the “moment” of the encoded message itself, the communicative form and structure can be analyzed in terms of the mechanisms by which one, dominant reading is preferred over the other readings; that is, the means the encoder uses to try to “win the assent of the audience” to its preferred reading of the message.
Hall assumes that there will be no necessary “fit,” or transparency, between the encoding and decoding ends of the communication chain. It is precisely this lack of transparency, and its consequences for communication, that must be investigated, Hall claims. Having established that there is always a possibility of disjunction between the codes of those sending and those receiving through the circuit of mass communications, the problem of the “effects” of communication could now be reformulated, as that of the extent to which decodings take place within the limits of the preferred (or dominant) manner in which the message has been initially encoded. However, the complementary aspect of this problem is that of the extent to which these interpretations, or decodings, also reflect, and are inflected by, the codes and discourses that different sections of the audience inhabit and the ways in which this reflection or inflection is determined by the socially governed distribution of cultural codes between and across different sections of the audience: that is, the range of different decoding strategies and competencies in the audience. In this connection, the model draws both on Frank Parkin’s work on “meaning systems” and on Pierre Bourdieu’s work on the social distribution of forms of cultural competence.
During the 1970s, at around the same time that Hall was developing the encoding/decoding model, the growing influence of feminism led to a revitalization of interest in psychoanalytic theory, in which concern for issues of gender takes a central place. Within media studies, this interest in psychoanalytic theories of the construction of gendered identities, within the field of language and representation, was one of the informing principles behind the development of a particular approach to the analysis of the media (predominantly the cinema) and its effects on the spectator, developed by the journal Screen (for a time in the late 1970s, heavily influential in this field, particularly in British film studies).
Screen theory emphasized the analysis of the effects of cinema (and especially, the regressive effects of mainstream, commercial Hollywood cinema) in “positioning” the spectator (or subject) of the film, through the way in which the text (by means of camera placement, editing, and other formal characteristics) “fixed” the spectator into a particular kind of “subject-position,” which, it was argued, “guaranteed” the transmission of a certain kind of “bourgeois ideology” of naturalism, realism, and verisimilitude.
Screen theory was largely constituted by a mixing of Jacques Lacan’s rereading of Sigmund Freud, stressing the importance of language in the unconscious, and Louis Althusser’s early formulation of the “media” as an “Ideological State Apparatus” (even if operating in the private sphere), which had the principal function of securing the reproduction of the conditions of production by “interpellating” its subjects (spectators, audiences) within the terms of the “dominant ideology.” Part of the appeal of this approach for media scholars rested in the weight the theory gave to (“relatively autonomous”) language and “texts” (such as films and media products) as having real effects in society. To this extent, the approach was argued to represent a significant advance on previous theories of the media (including traditional Marxism), which had stressed the determination of all superstructural phenomena (such as the media) by the “real” economic “base” of the society, thus allowing no space for the conceptualization of the media themselves as having independent (or at least, in Althusser’s terms, “relatively autonomous”) effects of their own.
Undoubtedly one of screen theory’s great achievements, drawing as it did on psychoanalysis, Marxism, and the formal semiotics of Christian Metz, was to restore an emphasis on the analysis of texts that had been absent in much previous work. In particular, the insights of psychoanalysis were extremely influential in the development of later feminist work on the role of the media in the construction of gendered identities and gendered forms of spectatorship.
Proponents of screen theory argued that previous approaches had neglected the analysis of the textual forms and patterns of media products, concentrating instead on the analysis of patterns of ownership and control—on the assumption, crudely put, that once the capitalist ownership of the industry was demonstrated, there was no real need to examine the texts (programs or films) themselves in detail, as they would only display minor variations within the narrow limits dictated by their capitalist owners. Conversely, screen theory focused precisely on the text and emphasized the need for close analysis of textual/formal patterns: hardly surprisingly, given the background of its major figures in English and literary studies. However, these theorists’ arguments, in effect, merely inverted the terms of the sociological/economic forms of determinist theory that they critiqued. In screen theory, it was the text itself that was the central (if not exclusive) focus of the analysis, on the assumption that, since the text “positioned” the spectator, all that was necessary was the close analysis of texts, from which their “effects” on spectators could be automatically deduced, as spectators were bound to take up the “positions” constructed for them by the text (film).
The textual determinism of screen theory, with its constant emphasis on the “suturing” (see Heath) of the spectator into the predetermined subject position constructed for him or her by the text, thus allocated a central place in media analysis to the analysis of the text. As Moores puts it, “the aim was to uncover the symbolic mechanisms through which cinematic texts confer subjectivity upon readers, sewing them into the film narrative, through the production of subject positions” on the assumption that the spectator (or reading subject) is left with no other option but, as Heath suggests, to “make . . . the meanings the film makes for him/her.”
Although film studies remains influenced by the psychoanalytic model (which has been usefully developed by Valerie Walkerdine in a way that attempts to make the paradigm less universalist/determinist), within communication and media studies it was Hall’s encoding/decoding model that set the basic conceptual framework for the notable boom in studies of media consumption and the media audiences that occurred during the 1980s. To take only the best-known examples, the body of work produced in that period included David Morley’s study of the Nationwide audience, Dorothy Hobson’s study of Crossroads viewers, Tania Modleski’s work on women viewers of soap opera, Janice Radway’s study of readers of romance fiction, Ien Ang’s study of Dallas viewers, John Fiske’s study entitled Television Culture, Greg Philo and Justin Lewis’s studies of the audience for television news, Sut Jhally and Lewis’s study of American audiences for The Cosby Show, and the work of Kim Schrøder and Tamar Liebes and Elihu Katz on the consumption of American television fiction in other cultures. Toward the end of the 1980s, much of the most important new material on media consumption was collected together in the published proceedings of two major conferences on audience studies—Phillip Drummond and Richard Paterson’s collection Television and Its Audience, bringing together work on audiences presented at the International Television Studies Conference in London in 1986, and Ellen Seiter’s collection Remote Control: Television, Audiences, and Cultural Power, based on the influential conference of that name held in Tübingen, Germany, in 1987.
During the late 1980s, a new strand of research developed in audience studies, focusing on the domestic context of television’s reception within the household, often using a broadly ethnographic methodology and characteristically focusing on gender differences in TV viewing habits within the household or family. The major studies in this respect are Morley’s Family Television, James Lull’s Inside Family Viewing, Ann Gray’s Video Playtime, Roger Silverstone’s Television and Everyday Life, and, from a historical perspective, Lynn Spigel’s Make Room for TV.
In recent years, a number of technological and market developments have transformed the terrain of media consumption. The rise of home video, with its capacity for “time shifting,” has meant that viewers are no longer compelled to watch programs when they are broadcast but can integrate the shows more readily into their personal schedules. The remote control enables viewers to “graze” the broadcast schedules without rising from their armchairs, making audience members capable of “cannibalizing” what the broadcasters offer into their own customized/personalized viewing selections. At the same time, the development of cable and satellite services in many countries has led to the growth of multichannel viewing environments, where TV viewers now have a far wider range of choices.
Some observers have argued that these developments lead to the greater empowerment of the viewer/consumer in relation to the broadcasters. In combination with the dominant consumerist ideologies of the 1980s and 1990s, these technological and institutional changes have strengthened the development of what has come to be known as “active audience” theory. However, more recently some critics have urged caution, warning that audience activity should not be conflated with audience power, insofar as the media institutions continue to set the agenda (even if it is now broader and more readily time shifted and cannibalized by the viewer) from which audiences have to make their viewing choices.
One of the most significant issues arising in this new technological and institutional context is that of the cultural consequences of the fragmentation of broadcast audiences. In this new situation, fewer people share the common broadcast experience once provided by a national broadcast channel, and national broadcasting systems can no longer encourage social or cultural integration to the same extent that they did in the past. This trend continues with the rise of satellite broadcasting systems, which often bring together diasporic audiences across wide geographical territories, transcending and cutting across national communities and boundaries.