'Behind-the-Tech' with Alex Coppedge and Sky Ferren

 

09/25/2024

Tell me about Long Live Dick Parker

Long Live Dick Parker is a TV series about an aging outlaw with a lifetime of questionable adventures. He has dodged trouble like a half-wild animal for decades. Known for tall tales and dangerous enemies, Dick possesses a peculiar device that stores and relives memories – something powerful enough to be weaponized.

When his enemies come for it, Dick turns to the only person he trusts, “Professor” – a risk-averse, regret-filled man who always played it safe. Now, “Professor” must confront Dick's chaotic past and his own faded dreams, deciding whether to follow his wild friend or let the world fall into dangerous hands.

In this particular teaser, we witness both the end of the road for Dick and the origin of his discovery of the device.

How did the GhostFrame test element for this project come about?

In 2023, Sky Ferren and I found ourselves deeply engrossed in the making of a film, where we sought to blend the use of LED volume with the trusted advantages of traditional chroma key workflows, essential for set extensions and post-production VFX work. We recognized that, for this approach to truly succeed on set, it would need to serve not only the technical demands of production and post, but also the practical needs of the actors, director, department heads, and every crew member present. To achieve this, we had to confine the disruptive flicker of GhostFrame technology to the inner frustum alone, thereby preserving the comfort and immersion of those on stage.

Upon presenting this challenge to an engineer at Lux Machina, we were met with an innovative solution – one that had the potential to bring our vision to life. With the concept in hand, all that remained was the testing, to see if this newly devised process could meet the rigorous demands of production.

What were you aiming to achieve?

Our ambition was to harness all the advantages of shooting on an LED volume, enhancing the collaborative spirit it fosters during both preproduction and production, while retaining an easy entry into post VFX work. Above all, we sought to offer actors the opportunity to perform within the context of their surrounding environment, rather than against the barren void of a blue screen. For the director, it was imperative to make informed creative decisions in real time, while ensuring the production designer, VFX supervisor, and cinematographer could collaborate with precision during the shoot itself. Questions of vital importance always arise: Where should the sun be placed in this scene, and what practical lighting will we use to heighten it? What practical reflections could be captured on set? Which camera angle best serves both the narrative and the technical requirements for VFX down the line?

To ensure the process functioned seamlessly from a creative and technical perspective – and to expedite this collaboration into post-production – we understood the necessity of having VFX compositing artists present on set. Thus, we developed an on-set pipeline that allowed data from Unreal Engine to be directly transmitted to Nuke artists working beside us, literally as we shot each take. In doing so, the collaborative efforts between production and post-production could begin while the cameras still rolled, and we received near-real-time slap comps, immediately ready for dailies, ensuring a dynamic and integrated workflow from shoot to post-production.

How did you go about getting collaborators/partners involved to make it happen?

As the pipeline began to take shape in theory, we knew it was imperative to consider its full journey – from preproduction to production, and through post-production. This required enlisting the proper partners to ensure its success. The process had to not only serve the narrative of the script but also address technical challenges, particularly those anticipated by the camera department – chief among them being the issue of motion blur when filming at higher frame rates. Moreover, we aimed to minimize chroma spill cleanup during post-production.

To address these challenges, we collaborated with Megapixel and Lux Machina, working to refine a method that would confine the disruptive GhostFrame flicker to the inner frustum only. Our goal was to alternate chroma key frames and in-camera Unreal environment frames on a 48fps timeline, flickering chroma every other frame. Naturally, the question of preserving cinematic motion blur arose, since shooting at a higher frame rate shortens each frame's exposure and can compromise the familiar cinematic motion-blur look. Through testing with multiple cameras, we discovered the optimal shutter settings that would maintain the integrity of motion blur once we divided 48fps back down to 24fps. The stage team and engineers’ expertise enabled us to resolve these issues efficiently, ensuring proper sync and the retention of cinematic motion blur across our frames.
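The shutter arithmetic behind that test can be sketched in a few lines. This is only the standard frame-rate/shutter-angle relationship, not the team's actual camera settings: the cinematic reference of 24fps at a 180-degree shutter gives a 1/48-second exposure, and reproducing that exposure while rolling at 48fps requires opening the shutter all the way up:

```python
def exposure_time(fps: float, shutter_angle: float) -> float:
    """Exposure time in seconds for a given frame rate and shutter angle."""
    return (shutter_angle / 360.0) * (1.0 / fps)


def matching_shutter_angle(target_fps: float, target_angle: float,
                           shoot_fps: float) -> float:
    """Shutter angle at shoot_fps that reproduces the exposure time of
    target_fps at target_angle, capped at a fully open 360-degree shutter."""
    target_exposure = exposure_time(target_fps, target_angle)
    return min(360.0, target_exposure * shoot_fps * 360.0)


# Cinematic reference: 24 fps at a 180-degree shutter -> 1/48 s exposure.
print(exposure_time(24, 180))               # 0.02083... (1/48 s)
# Shooting at 48 fps, only a 360-degree shutter matches that 1/48 s exposure.
print(matching_shutter_angle(24, 180, 48))  # 360.0
```

This is why dropping the 48fps stream back down to 24fps can retain the expected blur per kept frame, provided the shutter is set with that target in mind.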

To further mitigate chroma spill, we sought out Michael McReynolds, who was developing a GhostFrame Extractor node for Nuke. His innovative tool allowed us to take the 48fps footage, extract the chroma frames, pull a clean key, and roll the matte back onto the in-camera frames. This node efficiently provided a key, preserving all lighting and reflections on both talent and set, while eliminating the chroma spill from the inner frustum’s blue chroma frames. A perfect solution for immediate post-vis work.
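The Extractor node itself is proprietary, but the underlying idea described above can be illustrated. In this sketch (an assumption, not the node's actual algorithm), even-numbered frames of the 48fps stream carry the in-camera environment and odd-numbered frames carry the blue chroma, and a simple distance-from-key matte is rolled back onto each neighbouring environment frame:

```python
import numpy as np


def extract_and_key(frames_48fps: np.ndarray,
                    key_rgb=(0.0, 0.0, 1.0),
                    tol: float = 0.25):
    """Illustrative sketch of the GhostFrame-extraction idea.

    frames_48fps: float array of shape (frames, height, width, 3), where
    even indices hold in-camera environment frames and odd indices hold
    the interleaved chroma frames. Returns keyed 24fps plates plus the
    matte pulled from the chroma frames.
    """
    env = frames_48fps[0::2]     # in-camera Unreal environment frames
    chroma = frames_48fps[1::2]  # blue chroma frames
    key = np.asarray(key_rgb, dtype=np.float32)
    # Distance from the key colour: 0.0 on the chroma screen, 1.0 on subject.
    dist = np.linalg.norm(chroma - key, axis=-1)
    matte = np.clip(dist / tol, 0.0, 1.0)
    # Roll the matte back onto the environment frames (premultiplied alpha),
    # keeping the in-camera lighting and reflections on the subject.
    return env * matte[..., None], matte
```

A production keyer does far more (spill suppression, edge refinement, temporal offset between the frame pairs), but the deinterleave-key-reapply structure is the core of the workflow described here.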

Finally, we partnered with CreamSource, whose Image Based Lighting fixtures allowed us to realize the full potential of interactive lighting. This became particularly important after the removal of blue chroma spill using the GhostFrame Extractor, revealing the true benefits of dynamic, in-camera lighting that enhanced the visual impact of our production.

Why/How is this working test important to professionals and the industry in VFX and virtual production workflows?

For this test, we chose to shoot a short teaser, putting our entire pipeline to practical use. This is a critical step in bridging the gap between Virtual Production and VFX within the industry. In filmmaking, the art department begins developing assets early on, and virtual production enables a Virtual Art Department to start previs, offering an invaluable collaborative opportunity for all departments before production even begins.

These early assets can then be further refined for in-camera VFX (ICVFX) on LED volumes, providing crucial groundwork for technical visualization, such as lighting schematics, camera placement, lens choices, movement, set construction, set decoration, blocking, and shot coverage. This foundation ensures that informed, precise creative decisions are made on set, empowering the talent and crew to execute the vision with creative specificity during production, and allows for easy pivoting when that “magical” moment occurs on set among a cohort of creative minds.

Finally, by integrating comp artists directly on stage, we seamlessly extend the journey of those 3D assets and captured data from the LED stage, pushing them to post-production without delay – allowing VFX work to begin immediately, before we even wrap for the day. This workflow not only accelerates the process but ensures greater creative control throughout the entire production pipeline, and pulls talented post-production artists into the decision-making process as it unfolds.

 