Tristan Gardner
Premium Green-Screen Keying at Scale: What It Takes
The Art and Science of Green Screening: Our Journey to Mastery
As a provider of speaker-driven courses, we deal with huge reels of green-screen footage. Achieving cinema-level results while removing green backgrounds (a process known as "keying") from hours of raw footage presents a variety of difficulties that multiply at scale. From building our own computers to trying nearly every keying software on the market, we've been on quite a journey in developing this capability.
Our dedication to transforming this bottleneck into a powerful capability not only ensures that we maintain quick turnarounds, but also positions us as a reliable long-term partner providing consistent results for our clients every time.
The Complexity of Green Screening
Keying is the process of removing a green or blue chroma background so it can be replaced with something else. It's one of the first steps in creating the incredible scenes we see in movies and TV shows today. While some commercial studios shy away from it, we embrace its challenges by investing in specialized computers, adopting industry-standard visual effects software, and developing our own code to streamline the setup of batch renders. We aim to provide unparalleled green screening in digital education, delivered the same way every time, regardless of the production inconsistencies that are bound to happen across different teams, locations, and shooting conditions.
Why is Green Screening Challenging?
The quality of a green-screen key comes down to the computer's ability to differentiate between a pixel that should be retained and one that should be replaced. There are two major methods we use to make these comparisons. Chroma keying involves picking a target hue on your backdrop and comparing every pixel to it: the closer a pixel is to that hue, the more likely it is to be removed. This becomes particularly challenging when the backdrop is unevenly lit, or around tricky areas like hair, motion-blurred movement, and translucent materials such as glasses, to name a few.
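To make that concrete, here is a minimal sketch of distance-based chroma keying in Python with NumPy. It illustrates the idea only; it is not any particular keyer's algorithm, and the tolerance and softness values are made-up knobs for the example.

    import numpy as np

    def chroma_key_alpha(frame_rgb, key_rgb, tolerance=60.0, softness=40.0):
        """Toy chroma key: alpha from each pixel's distance to a sampled key color.

        frame_rgb: float32 array of shape (H, W, 3), values 0-255.
        key_rgb:   the color sampled from the backdrop, e.g. (0, 177, 64).
        Pixels within `tolerance` of the key color go fully transparent;
        pixels farther than tolerance + softness stay fully opaque.
        """
        key = np.asarray(key_rgb, dtype=np.float32)
        distance = np.linalg.norm(frame_rgb - key, axis=-1)
        alpha = np.clip((distance - tolerance) / softness, 0.0, 1.0)
        return alpha  # 0 = background (remove), 1 = subject (keep)

Unevenly lit backdrops push pixels away from the sampled hue, which is exactly where a simple distance test like this starts to fall apart.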
More powerful software allows us to employ image-based keying (IBK), which compares your image to a clean shot of the background with no subject in it, called a clean plate. Making the switch from Keylight in Adobe After Effects (a fantastic chroma keyer) to Nuke's IBK gave us crisp, realistic results with fewer manual edits to individual clips. We're able to pull keys around difficult areas like frizzy hair with better quality than we ever had before, and we can apply the same setup to every clip shot in the same session. There's no chatter around the subject, no missing edges, and no background noise coming through.
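In the same toy style, a difference key replaces the single sampled color with a per-pixel comparison against the clean plate. This is a simplified illustration of the general idea, not Nuke's IBK implementation; the threshold and softness values are again invented for the example.

    import numpy as np

    def difference_key_alpha(frame_rgb, clean_plate_rgb, threshold=20.0, softness=30.0):
        """Toy difference key: compare each pixel against a clean plate.

        Both inputs are float32 arrays of shape (H, W, 3), values 0-255,
        shot on a locked-off camera so the two images line up pixel for pixel.
        Pixels that barely differ from the plate are treated as background;
        larger differences are treated as subject.
        """
        difference = np.linalg.norm(frame_rgb - clean_plate_rgb, axis=-1)
        alpha = np.clip((difference - threshold) / softness, 0.0, 1.0)
        return alpha  # 0 = matches the plate (remove), 1 = subject (keep)

Because the comparison is made against the backdrop as it was actually lit, uneven lighting is largely baked into the plate instead of fighting the key.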
The Technical Hurdles
Keying requires huge compute power when working with cinema camera codecs at 4K and when exporting an alpha channel. Consider this: a computer must decide the fate of over 8 million pixels (4K resolution) in every single frame. A 15-minute take shot at 30 frames per second is 27,000 frames, which puts the number of per-pixel decisions well into the hundreds of billions. This is why the process is so resource-intensive.
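The back-of-the-envelope math, assuming UHD resolution (3840 x 2160; DCI 4K is slightly larger still):

    pixels_per_frame = 3840 * 2160   # about 8.3 million pixels per UHD frame
    frames = 15 * 60 * 30            # 27,000 frames in a 15-minute take at 30 fps
    decisions = pixels_per_frame * frames
    print(f"{decisions:,}")          # 223,948,800,000 per-pixel decisions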
Most prosumer-grade video software relies heavily on the CPU for the bulk of rendering jobs, but for the quickest green-screen renders you want to be maxing out the GPU, which is far better suited to tasks like keying. We benchmarked key quality and CPU and GPU usage on the same file while rendering with Ultra Key, Keylight, Primatte, Ultimatte, and other popular prosumer keyers. None of them gave us the right combination of predominantly GPU-based rendering, quality keys in hard situations, and an efficient setup for batch rendering. This led us to adopt Nuke, the leading compositing and visual effects software, used by most major studios and VFX houses in Hollywood.
Our Evolution
Previously, our studio was equipped with Apple desktops alone. While aesthetically pleasing, they lacked the flexibility to take the upgraded GPUs we needed without shelling out $7,000. So we learned to install our own integrated server system suited for high data rates, built computers purpose-made for heavy render jobs, and, oh, reintroduced ourselves to the Windows OS. With Nuke baking keys through the GPU, we managed to reduce our green-screen rendering time by nearly 75%.
Nuke, while as expensive as a used car, was a game-changer. However, mastering it was no small feat. As one visual effects house states,
“The industry standard for the most realistic compositing possible, Nuke is not for the faint of heart. Nuke is an incredibly advanced node-based program for feature-film quality VFX and compositing. It’s geared towards serious VFX artists and can take a substantial amount of time to learn.”
Credit: Ben Thompson, ActionVFX
We relied on Google, convoluted forum threads, and our own experiments to harness its power.
Our newfound knowledge of image-based (difference) keying in Nuke revolutionized our keying ability. By making minor adjustments to our filming process, we could use a background image, or clean plate, to achieve consistent results across all video takes.
Efficiency at Scale
Quality is paramount, but when churning out keyed chroma footage at scale, efficiency is just as crucial. Many keyers provide great results on a clip-by-clip basis. However, processing many clips on a commercial budget requires software that can both pull a great key and batch that job across many clips.
Although Nuke isn't directly built for the kind of batch rendering we need, it is organized around visual nodes, which allow complex chains of effects and renders to be mapped out in a web of edits. We use this foundation to create our own expandable batch-rendering system. One 45-minute course can have as many as 40 raw interview files, each running five to fifteen minutes. That is a lot of footage.
But we had one more problem: Nuke requires a lot of manual entry, and setting up even one file to render properly leaves plenty of room for human error. After keying a hero clip, it still took over 20 clicks and multiple points of manual data entry to properly set up each file for export, and everything has to be correct before starting a lengthy export. Luckily, Nuke has Python built directly into it, so all we had to do was toil through its API documentation and figure out how to replicate our human process for building this powerful node web in code. Now we run two custom Python scripts and our entire web of nodes is created automatically.
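As a rough illustration of what this kind of automation can look like, here is a stripped-down sketch using Nuke's Python API in terminal mode (nuke -t). The file paths, node wiring, and knob choices are hypothetical placeholders, not our actual scripts.

    import glob
    import os
    import nuke

    CLEAN_PLATE = "/footage/session_01/clean_plate.exr"           # hypothetical path
    CLIPS = sorted(glob.glob("/footage/session_01/clips/*.mov"))  # hypothetical path

    for clip_path in CLIPS:
        plate = nuke.nodes.Read(file=CLEAN_PLATE)
        clip = nuke.nodes.Read(file=clip_path)

        # IBK nodes ship with Nuke; the wiring here is deliberately simplified.
        screen = nuke.nodes.IBKColourV3()
        screen.setInput(0, plate)

        key = nuke.nodes.IBKGizmoV3()
        key.setInput(0, clip)     # footage to key
        key.setInput(1, screen)   # processed screen colour

        out_path = os.path.splitext(clip_path)[0] + "_keyed.mov"
        write = nuke.nodes.Write(file=out_path, channels="rgba")
        write.setInput(0, key)

        # Render every frame of the source clip through the write node.
        nuke.execute(write, int(clip["first"].value()), int(clip["last"].value()))

The value is less in any one node and more in removing those 20-plus clicks of manual setup per file.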
Green screening is intricate when quality is a concern, and its complexity multiplies at scale. While there are numerous tools available, achieving efficiency and quality simultaneously without ballooning client budgets requires innovation. The third-party vendors we solicited for quotes consistently took the clip-by-clip approach and charged exorbitant amounts, onshore and overseas alike. We've ventured beyond conventional commercial workflows, refusing to settle for what merely got the job done. Our journey in mastering green screening wasn't easy, but it has positioned us as leaders in premium digital learning production. I hope our story offers some insight and saves you from endless forum scrolling after midnight.
Whether you need help keying your own footage or want insight into our specific process, please reach out at suorastudios.com/contact.
Check out a featured project case study below.