By Aditya Savnal. Posted on June 17, 2015
Francis Ford Coppola once said, "The essence of cinema is editing. It's the combination of what can be extraordinary images of people during emotional moments, or images in a general sense, put together in a kind of alchemy."
Editing today has come a long way from the early days of cinema. The history of editing is a fascinating one and this in-depth video by Filmmaker IQ shows that evolution in fine detail.
On November 2, 1936, the BBC began transmitting the world's first regular public television broadcast service. But during the Second World War, it had to go off air. And in 1948, three years after the war ended, the first commercial television broadcasts began in the United States.
Television enabled people to watch shows and news broadcasts in the comfort of their homes. These shows were cut live in a studio where several cameras were hooked up to a video switcher that could cut between them. The signals were sent over the air and through cables to affiliate studios in other parts of the network for broadcasting. But all of this had to be live, as there were no devices that could electronically record a television signal. That created a problem when broadcasters wanted to show the same content in a different time zone.
This led to the invention of the kinescope: a film camera focused on a video monitor that let broadcasters record TV transmissions. The process sounded simple, but issues like ghosting and banding made it hard to reproduce good-quality images.
But the kinescope played an important role in the evolution of TV broadcasting. Demand for TV was enormous, and by 1954 the TV networks were using more raw film stock in their kinescopes than all of the Hollywood film studios combined. The networks clearly needed a more economical alternative.
Though magnetic tape was already being used for audio recording, technological hurdles stood in the way of adapting it to video. Finally, after years of hard work, Ampex released the first commercially available video tape recorder in 1956, using the 2-inch Quadruplex tape format. By 1959, videotape had become a widely accepted format in the television industry.
During this period, tape was used only for archival and distribution purposes. It was possible to edit these early 2-inch Quadruplex tapes with a process similar to editing film, but it was cumbersome: everything had to be done without actually seeing what frame you were on, because Quadruplex tape was incapable of holding a still frame.
NBC then developed a system that used kinescopes to create work prints. Shows were edited using these kinescope film prints, which carried audio cues that the editor could match back when splicing the videotape. Known as ESG, the process was similar to what would later be called offline editing: editing with a lower-quality copy of the original raw material, then assembling the high-quality originals based on that edit.
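The core idea of offline editing can be sketched in a few lines of Python. This is a hypothetical illustration, not any real system: edit decisions are recorded against cheap proxy copies, then the same in/out points are applied to the pristine originals. All names and frame values here are invented for the example.

```python
# Each decision keeps frames [in_point, out_point) of a named shot.
# These were chosen while watching the low-quality proxy copies.
decisions = [
    ("shot_01", 24, 96),
    ("shot_02", 0, 48),
]

def conform(source_frames, decisions):
    """Assemble the final sequence from full-quality sources, replaying
    the decisions that were made on the low-quality proxies."""
    final = []
    for shot, in_point, out_point in decisions:
        final.extend(source_frames[shot][in_point:out_point])
    return final

# Originals and proxies share the same frame numbering, so the edit
# transfers one-to-one. Here each "original" is just 200 dummy frames.
originals = {name: list(range(200)) for name, _, _ in decisions}
print(len(conform(originals, decisions)))  # 120 frames in the final cut
```

The trick that makes this work, then as now, is that the proxy and the original share a common frame numbering, so a decision made on one applies exactly to the other.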
Then another video editing technique emerged that would soon take on a life of its own: linear editing. Linear editing made it possible to assemble the different cuts of a show onto a single tape, unlike the earlier splicing process. It sounds simpler than it was, though. The technique faced several issues, such as how to ensure the signals between the video decks matched up.
Before linear editing became popular, it underwent a few technological advancements. In 1963, Ampex introduced the Editec, an all-electronic videotape editor with a simple electronic controller that could handle in and out points marked by audible tones.
And in 1967, the time code developed by EECO and standardized by the Society of Motion Picture and Television Engineers (SMPTE) enabled a videotape player to locate any frame on the tape, as each frame was assigned an "address" in hours, minutes, seconds, and frames. This made linear editing very popular in the 1970s, as it allowed smaller-market TV affiliates to own their own video editing systems and cut their own shows.
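The frame-addressing scheme is simple enough to sketch. Below is a minimal Python model of non-drop-frame timecode arithmetic; real SMPTE timecode also defines a drop-frame variant for 29.97 fps video, which this sketch deliberately ignores.

```python
def timecode_to_frame(tc: str, fps: int = 30) -> int:
    """Convert an "HH:MM:SS:FF" address to an absolute frame number."""
    hours, minutes, seconds, frames = (int(p) for p in tc.split(":"))
    return (hours * 3600 + minutes * 60 + seconds) * fps + frames

def frame_to_timecode(frame: int, fps: int = 30) -> str:
    """Convert an absolute frame number back to an "HH:MM:SS:FF" address."""
    total_seconds, ff = divmod(frame, fps)
    total_minutes, ss = divmod(total_seconds, 60)
    hh, mm = divmod(total_minutes, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(timecode_to_frame("01:00:00:00"))  # 108000 at 30 fps
print(frame_to_timecode(108015))         # 01:00:00:15
```

Because every frame has a unique numeric address, a deck can seek to an exact edit point, which is precisely what made frame-accurate linear editing practical.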
However, this didn't make linear editing a creatively advanced craft; it remained a technical profession. And because a show was assembled in linear fashion, any change to the beginning meant everything after it had to be reassembled, effectively quashing the rough cut.
With the passage of time, editors and broadcasting corporations began to shun linear editing. A new technique was being developed that gave editors the freedom to edit as they preferred; it eventually came to be known as non-linear editing. As a technique and process, non-linear editing was nondestructive: it enabled editors to assemble a cut in any order and make changes without disturbing the rest of the assembly. It was a much more natural way of editing.
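Why is non-linear editing nondestructive? Because the cut is just an ordered list of references into the source material. The hypothetical Python sketch below illustrates the idea; the clip and reel names are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    reel: str    # which source tape/file the frames come from
    start: int   # in-point, in frames
    end: int     # out-point, in frames (exclusive)

# An assembled cut: nothing is copied or spliced, only referenced.
cut = [
    Clip("reel_A", 0, 120),
    Clip("reel_B", 300, 450),
    Clip("reel_A", 500, 560),
]

# Inserting a new shot at the head of the show is a list operation.
# Unlike linear tape editing, nothing downstream has to be reassembled,
# and the source media is never touched.
cut.insert(0, Clip("reel_C", 10, 70))

total_frames = sum(c.end - c.start for c in cut)
print(total_frames)  # 390
```

Reordering, trimming, or deleting a shot is equally cheap: each change edits the list, not the footage, which is exactly the freedom the paragraph above describes.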
During the 80s, the editing suite EditDroid became popular; however, the company behind it shut down in 1987. Other machines tried using a bank of VCRs for editing, but that made the process slow and cumbersome.
Two years later, the Avid/1 – a Macintosh-based non-linear editor – was released. And the rest, as they say, is history. Soon, physical editing equipment and techniques gave way to digital editing technology.
As computers became more powerful and storage got cheaper, software-based non-linear editors like Adobe Premiere and Media 100 forced Avid to keep lowering the price of its systems.
In the midst of this, a company called Macromedia developed a piece of software named KeyGrip. But licensing and other issues with partners such as Microsoft forced Macromedia to sell the product. The buyer was Steve Jobs' Apple, which would release the software the following year as Final Cut Pro.
The divide between television/video production and film production began to blur with the adoption of high-definition video. Japan and other countries had been working to standardize HD video since the 1970s. The US, however, made a seemingly late breakthrough in HD technology, with its first public HD broadcast taking place on July 23, 1996.
Over the years, filmmaking has transitioned from being shot on film to being shot digitally, with HD video and 2K film scans. Did you know that HD video and 2K film scans share roughly the same resolution, with HD being 1920×1080 and 2K being 2048×1080? So it wasn't long before Hollywood became keen to skip the 35mm film format entirely and make the transition from film to digital. Star Wars Episode II: Attack of the Clones became the first film to be shot digitally.
And by the latter half of the 2000s, the emergence of better cameras and 4K resolution, among other things, made it possible to shoot films straight to digital, while online editing transformed the process of film editing.