Video Editing
Television historians have had little to say about postproduction, despite the central role that video-editing practices and technologies have played in the changing look and sound of television. Video editing developed through three historical phases: physical film/tape cutting, electronic linear editing, and digital nonlinear editing. Even before the development of a successful videotape recording format in 1956 (the Ampex VR-1000), time-zone requirements for national broadcasting demanded a means of recording and transporting programs. Kinescopes, filmed recordings of live video shows made for delayed airing in other time zones, served this purpose, and minimal film editing of these kinescopes was an obligatory part of network television.
Once videotape found widespread use, the term "stop-and-go recording" designated those "live" shows that were shot in pieces and later edited together. Physically splicing the two-inch quad videotape proved cumbersome and unforgiving, however, and NBC/Burbank developed a system in 1957 that used 16mm kinescopes not for broadcasting but as "work-prints" to rough-cut a show before physically handling the videotape. Audible cues on the film's optical soundtrack allowed tape editors to match each cut back frame for frame. Essentially, this was the first "offline" system for video. Known as ESG, this system of rough-cutting on film and conforming on tape (a reversal of what would become standard industry practice in the 1990s) reached its zenith in 1968 with Rowan and Martin's Laugh-In: building each episode's edit master required 350 to 400 tape splices and 60 hours of physical splicing.
A cleaner way to manipulate prerecorded video elements had, however, been introduced in 1963 with Ampex's all-electronic Editec. With VTRs (videotape recorders) now controlled by computers, and in- and out-points marked by audible tones, the era of electronic "transfer editing" had begun. Original source recordings were left unaltered, and discrete video shots and sounds were re-recorded in a new sequence onto a second-generation edit master. In 1967 other technologies added options now commonplace in video-editing studios. Ampex introduced the HS-100 videodisc recorder (a prototype for now-requisite slow-motion and freeze-frame effects), which ABC used extensively at the 1968 Olympics. Helical-scan VTRs (which threaded and recorded tape in a spiral pattern around a rotating head) appeared at the same time and ushered in a decade in which technological formats were increasingly miniaturized (enabled in part by the shift to fully transistorized VTRs, like the RCA TR-22, in 1961). New users and markets opened up with the shift to helical: educational, community-activist, and cable cooperatives all began producing on the half-inch EIAJ format that followed; producers of commercials and industrial video made the three-quarter-inch U-matic format pioneered by Sony in 1973 their workhorse platform for nearly two decades; newsrooms jettisoned 16mm news film (along with its labs and unions) for the same videocassette-based format in the late 1970s; even networks and affiliates replaced venerable two-inch quad machines with one-inch helical starting in 1977.
The standardization of "time-code" editing, more than any other development, made this proliferating use viable. Developed by EECO in 1967, time-code was awarded an Emmy in 1971 and standardized by SMPTE shortly thereafter. The process assigned each video frame a unique digital address recorded on an audio track, allowed editors to manage lists of hundreds of shots, and made frame accuracy and rapidly cut sequences the norm. The explosive growth of non-network video in the 1970s was directly tied to these and other refinements in electronic editing.
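Frame accuracy rests on simple arithmetic: a SMPTE timecode is a positional address that maps directly to an absolute frame count. A minimal sketch in Python, covering non-drop-frame counting only (real NTSC drop-frame code periodically skips frame numbers to stay wall-clock accurate); the function names are illustrative:

```python
# Non-drop-frame SMPTE timecode arithmetic (simplified sketch).
def timecode_to_frames(tc: str, fps: int = 30) -> int:
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames: int, fps: int = 30) -> str:
    """Convert an absolute frame count back to 'HH:MM:SS:FF'."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# A one-hour program at 30 fps spans 108,000 frames.
print(timecode_to_frames("01:00:00:00"))   # 108000
print(frames_to_timecode(108015))          # 01:00:00:15
```

This address arithmetic is what let editors manage hundreds of shots as lists of numbers rather than physical splice points.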
Nonlinear digital editing, a third phase, began in the late 1980s both as a response to the shortcomings of electronic transfer editing and as a result of economic and institutional changes (the influence of music video, and the merging of film and television). To creative personnel trained in film, state-of-the-art online video suites had become little more than engineering monoliths that prevented cutting-edge directors from working intuitively. In linear time-code editing, for example, a change made at minute 12 of a program meant that everything after that point had to be re-edited to accommodate the change in program duration. Time-code editing, which made frame accuracy possible, also essentially quantified the process, so that the art of editing became merely managing frame in/out numbers for shots on extensive edit decision lists (EDLs). With more than 80 percent of prime-time television still shot on film at the end of the 1980s, the complicated abstractions and obsolescence that characterized these linear video formats also meant that many Hollywood television producers simply preferred to deliver programs to the networks as film prints, cut on flatbeds and conformed from negatives. The capital-intensive nature of video postproduction also segregated labor in the suites: directors were clients who delegated edit-rendering tasks to house technicians and DVE (digital video effects) artists. Online linear editing was neither spontaneous nor user-friendly.
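The ripple problem that plagued linear editing can be seen in a toy model of an edit decision list. This is a hypothetical sketch, not the actual CMX 3600 EDL format: each event maps source in/out addresses to record in/out addresses on the edit master, and lengthening one event forces every later record address to move, which on tape meant re-recording everything downstream:

```python
# Hypothetical, minimal model of an EDL event (frame numbers for simplicity).
from dataclasses import dataclass

@dataclass
class Event:
    source_in: int    # where the shot starts on the source reel
    source_out: int   # where it ends
    record_in: int    # where it lands on the edit master
    record_out: int

def retime_after(edl, index, delta):
    """Lengthen event `index` by `delta` frames and ripple every later
    record address: the crux of the linear re-edit problem."""
    edl[index].source_out += delta
    edl[index].record_out += delta
    for ev in edl[index + 1:]:
        ev.record_in += delta
        ev.record_out += delta

edl = [Event(0, 100, 0, 100), Event(500, 600, 100, 200)]
retime_after(edl, 0, 30)      # extend the first shot by 30 frames
print(edl[1].record_in)       # 130: every later shot has shifted
```

In a nonlinear system the shift is a trivial list update; on linear tape it meant physically re-laying every downstream shot.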
Nonlinear procedures minimized the use of videotape entirely and attacked the linear straitjacket on several fronts. Beginning in 1983, systems were developed to download or digitize (rather than record) film/video footage onto videodiscs (LaserEdit, Lucasfilm's EditDroid, CMX 6000) or computer hard-drive arrays (Lightworks, the Cube), creating the possibility of random-access retrieval as an "edited" sequence. Yet nonlinear marked an aesthetic and methodological shift as much as a technological breakthrough. Nonlinear technologies desegregated the editing crafts; shrank postproduction to the desktop, the personal-computer scale; allowed users to intervene, rework, and revise edited sequences without re-creating entire programs; and enabled editors to render and recall for clients, at will, numerous stylistic variations of the same show. Directors and producers now commonly did their own editing, in their own offices. When Avid launched its Media Composer in 1989, the trade journals marveled at its "32 levels of undo" and its ability to eliminate changes and restore previously edited sequences. Nothing was locked in stone.
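The underlying idea can be sketched as a data structure: a nonlinear "sequence" is only a list of references into unaltered source media, so an insert shifts pointers rather than re-recording tape, and multi-level undo is a stack of cheap snapshots of that list. A hypothetical Python illustration (the class and method names are invented, not Avid's actual design):

```python
# Sketch of a nonlinear edit sequence with snapshot-based undo.
class Sequence:
    def __init__(self, max_undo=32):        # Avid advertised "32 levels of undo"
        self.shots = []                     # (clip_name, in_frame, out_frame)
        self._undo = []
        self.max_undo = max_undo

    def _snapshot(self):
        """Save a copy of the current shot list before any change."""
        self._undo.append(list(self.shots))
        if len(self._undo) > self.max_undo:
            self._undo.pop(0)               # drop the oldest level

    def insert(self, position, shot):
        self._snapshot()
        self.shots.insert(position, shot)   # later shots shift automatically

    def undo(self):
        if self._undo:
            self.shots = self._undo.pop()

seq = Sequence()
seq.insert(0, ("A", 0, 100))
seq.insert(1, ("B", 0, 50))
seq.insert(1, ("C", 10, 20))    # drop a shot into the middle
seq.undo()                      # and take it right back out
print([name for name, _, _ in seq.shots])   # ['A', 'B']
```

Because the source media is never touched, every edit is reversible; "nothing was locked in stone" is literally a property of the data structure.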
This openness allowed for a kind of experimentation and formal volatility perfectly suited to the stylistic excesses that characterized contemporary television in the late 1980s and 1990s. When systems like the Avid and the Media 100 were upgraded to online mastering systems in the 1990s, complete with on-command digital video effects, the anything-can-go-anywhere premise made televisual embellishment an obligatory user challenge. The geometric growth of hard-disc storage, the pervasive paradigm of desktop publishing, and the pressure to make editing less an engineering accomplishment than a film artist's intuitive statement sold nonlinear procedures and technologies to the industry.
Video editing faces a trajectory far less predictable than that of the 1950s, when an industrial-corporate triumvirate of Ampex/RCA/NBC controlled technology and use. The future is open largely because editing applications have proliferated far beyond those developed for the network oligopoly. Video is everywhere. Nonlinear established its beachhead in the production of commercials and music videos, not in network television. Still, by 1993, the mainstream Academy of Television Arts and Sciences had honored Avid's nonlinear system with an Emmy. By 1995 traditional television equipment manufacturers such as Sony, Panasonic, and Grass Valley were hedging their bets by selling user-friendly nonlinear Avid-clones even as they continued slugging it out over digital tape-based electronic editing systems.
Although prime-time producing factories like Universal/MCA Television continued using a range of film, linear, and nonlinear systems through the mid-1990s, in a few short years digital technology would dominate postproduction. Even as Avid's Media Composer and Symphony systems became standard in high-resolution online work, a range of new technologies and standards undercut Avid's market share. Though marketed initially as "industrial" and "nonbroadcast" technologies, Sony's DVCAM format (built on the DV codec's 4:1:1 compression) and Apple's FireWire transfer protocol provided cost-effective alternatives for image processing and data storage, and edit systems utilizing these formats rapidly spread in popularity. Sensing corporate decline, Avid rebuffed the very partner (Apple) that had made Avid synonymous with nonlinear work, announcing in 1997-98 that it would discontinue Macintosh support and build systems only for Windows NT platforms. Apple got the last laugh, however, with its 1999 launch of Final Cut Pro (FCP), an inexpensive editing program for Macintoshes built internally around the new DV compression and FireWire.
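The storage appeal of 4:1:1 material is easy to see from back-of-envelope data rates. A rough Python sketch (uncompressed component rates only; it ignores audio, blanking, and the DV codec's further DCT compression, and the function name is illustrative):

```python
# Back-of-envelope uncompressed video data rates for different
# chroma-subsampling schemes.
def video_data_rate_mbps(width, height, fps, bits, luma_to_chroma):
    """Rate for one luma plane plus two chroma planes, in megabits/s.
    `luma_to_chroma` = chroma samples per luma sample per plane:
    1.0 for 4:4:4, 0.5 for 4:2:2, 0.25 for 4:1:1 (or 4:2:0)."""
    samples_per_sec = width * height * fps * (1 + 2 * luma_to_chroma)
    return samples_per_sec * bits / 1e6

# NTSC-resolution 720x480 at 30 fps, 8 bits per sample:
print(round(video_data_rate_mbps(720, 480, 30, 8, 0.5)))    # 166 (4:2:2)
print(round(video_data_rate_mbps(720, 480, 30, 8, 0.25)))   # 124 (4:1:1)
```

Quartering the chroma before any codec runs already cuts the raw rate by a quarter relative to 4:2:2 studio sampling, which is part of why DV-based systems were so much cheaper to store and move than broadcast formats.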
Today, FCP makes filmmaking available to any consumer even as it is widely utilized in postproduction, providing a system that online editors can use outside the online suites. Initially denigrated by high-end editors, FCP systems are now available with uncompressed and high-definition boards manufactured by Pinnacle and others, a development that further complicates the institutional appetite for segregating prime-time/high-definition work from industrial/consumer applications. Many of the new prime-time reality shows of the early 2000s, such as Temptation Island, employed FCP, since the thousands of "unplanned" hours of DVCAM footage now had to be waded through by gangs of low-paid production assistants before they could ever be assembled into a show by editors. In some ways this was a throwback to the old Hollywood studio system, which utilized many assistants, and a retreat from the new status nonlinear had achieved by combining the entire editing process into a single multitasking workstation. In this way, aesthetic and generic changes in television drive changes in the uses of technologies.
The need for ever-higher resolutions (and therefore faster processing and increased data storage) continues to challenge editors and manufacturers. Rather than forever increase the storage, speed, and cost of each nonlinear workstation, companies such as Quantel have developed common, large servers that can be accessed by scores of editors working simultaneously at remote workstations. These "wide area networks" represent but one way that postproduction executives and manufacturers juggle the inflationary costs of higher-resolution quality against the economies of data storage. At the same time, audiences at home now edit their own programming flows with personal video recorders such as TiVo and ReplayTV, which utilize the very storage and processing technologies that standardized nonlinear editing in the industry.