INTRODUCTION
A finished film can be seen as an organic system in which every shot and every scene are bound together to serve a story context; yet, given the same screenplay, that context varies from person to person. In most cases, the right of final interpretation falls into the hands of the film editor.
In traditional motion picture production, each shot is usually captured in multiple takes to record a variety of actor performances. Actors make multiple attempts during the production phase, and each attempt varies subtly; these subtle changes carry different meanings in different contexts. The film editor is left to compose the film from this ocean of dailies according to their own interpretation. As a result, a significant amount of performance and emotion is abandoned during postproduction, and boundless possibilities remain hidden behind the scenes as untold stories.
Interactive media provides a new way of storytelling: it can preserve a large pool of performances and dynamically tailor the audio-visual experience to different audiences at different times. The Real-Time Editing System explores this new frontier by composing its story timeline from audiences' real-time inputs, much as a storyteller adapts a tale to suit each audience's interests every time it is told.
ABSTRACT
The Real-Time Editing System was built as part of my thesis project to fulfill the Master of Fine Arts degree requirements at the University of Southern California, School of Cinematic Arts. It is a plug-in engineered on the foundation of the Unity 3D engine, its Timeline system, and the Timeline Events add-on. The system drives the dynamic editing behavior of a film rendered in Unity, and the concept of Real-Time Editing can be transferred to other game engines by following its design logic and blueprints.
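To give a sense of the wiring, the following is a minimal sketch using Unity's standard Timeline notification API, in which a marker on a Timeline track signals a point where an edit decision may occur. The class names and the decisionId field are illustrative assumptions; the actual Timeline Events add-on may expose a different interface.

```csharp
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

// Hypothetical marker placed on a Timeline track at an edit decision point.
public class EditDecisionMarker : Marker, INotification
{
    public string decisionId;                      // which decision point this is
    public PropertyName id => new PropertyName(decisionId);
}

// Receiver on the GameObject bound to the marker's track; Unity routes the
// notification here when playback crosses the marker.
public class EditDecisionReceiver : MonoBehaviour, INotificationReceiver
{
    public void OnNotify(Playable origin, INotification notification, object context)
    {
        if (notification is EditDecisionMarker marker)
        {
            Debug.Log($"Edit decision point reached: {marker.decisionId}");
            // ...query the gaze weights here and branch the timeline accordingly...
        }
    }
}
```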
The Real-Time Editing System allows the same linear story to deliver a different experience to each audience. The variation is reflected in the pacing and the performances of the story. The system can dynamically shrink or expand a scene depending on the audience's preference, whether they focus more on the action or more on the characters. The system is also capable of preserving all of the actors' performances from the virtual production phase and swapping them in during playback.
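One conventional Unity mechanism for this kind of performance swap is an AnimatorOverrideController, sketched below under the assumption that every preserved take of a shot is kept as an AnimationClip. The clip and state names are placeholders, not the project's actual implementation.

```csharp
using UnityEngine;

// Hedged sketch of performance swapping: all recorded takes of a shot are kept
// as AnimationClips, and one is chosen at playback time.
public class PerformanceSwapper : MonoBehaviour
{
    public Animator actor;
    public AnimationClip[] takes;      // every preserved take of this performance

    public void PlayTake(int takeIndex)
    {
        var overrides = new AnimatorOverrideController(actor.runtimeAnimatorController);
        overrides["Performance"] = takes[takeIndex];  // "Performance" is a placeholder clip name
        actor.runtimeAnimatorController = overrides;
    }
}
```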
The diverse experience that the Real-Time Editing System brings to each audience is computed from eye-tracking data. The viewer's gaze behavior is collected by a Tobii eye-tracking camera and converted into weights, and multiple algorithms interpret the viewer's gaze behavior from these weights. Each scene is pre-set with one or more of these algorithms, and the result of the calculation determines the real-time editing behavior.
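The heat-map weighting idea can be sketched as follows: the screen is divided into a coarse grid, and each gaze sample adds dwell time to the cell it lands in. How the actual system bins and decays samples is not specified here; the grid size below is an illustrative assumption, and gaze positions are assumed to arrive already normalized from the Tobii SDK.

```csharp
using UnityEngine;

// Minimal sketch of a gaze heat map: dwell time accumulates per grid cell.
public class GazeHeatMap
{
    const int Cols = 16, Rows = 9;                 // illustrative grid resolution
    readonly float[,] weights = new float[Cols, Rows];

    // gaze is a normalized screen position (0..1, 0..1), e.g. from the Tobii SDK.
    public void AddSample(Vector2 gaze, float deltaTime)
    {
        int x = Mathf.Clamp((int)(gaze.x * Cols), 0, Cols - 1);
        int y = Mathf.Clamp((int)(gaze.y * Rows), 0, Rows - 1);
        weights[x, y] += deltaTime;                // dwell time accumulates as weight
    }

    // Total weight inside a normalized screen rectangle (e.g. around a character).
    public float WeightInRect(Rect area)
    {
        float sum = 0f;
        for (int x = 0; x < Cols; x++)
            for (int y = 0; y < Rows; y++)
            {
                var cellCenter = new Vector2((x + 0.5f) / Cols, (y + 0.5f) / Rows);
                if (area.Contains(cellCenter)) sum += weights[x, y];
            }
        return sum;
    }
}
```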
PROJECT VISION
This project combines real-time rendered graphics with procedurally determined motion picture editing, all running in the Unity engine, to achieve a dynamic story flow and a non-repetitive audio-visual experience. This procedurally determined editing system, the Real-Time Editing System, allows an animated film to edit itself based on audience interaction.
As an interactive system, the Real-Time Editing System mainly aims to explore the influence of the audience's subconscious responses, as Passive Interaction, on the story. The fundamental interaction is the audience's gaze input. These inputs determine editing outcomes through a heat-map-based gaze weight calculation system that tailors the experience, either by varying the camera coverage (Virtual Camera settings) or by varying the actor performance (Animation Clips). We also added a physical controller as a potential interaction device for audiences who seek more active interaction and feedback to enhance the emotional experience.
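One way such a decision rule could vary the camera coverage is sketched below, assuming Cinemachine virtual cameras, where the camera with the highest Priority becomes live. The region rectangles, priority values, and camera roles are illustrative assumptions, not the project's actual tuning.

```csharp
using Cinemachine;
using UnityEngine;

// Hedged sketch of one decision rule: compare accumulated gaze weight over the
// "action" region against the "character" region and raise the priority of the
// matching virtual camera, so coverage follows audience attention.
public class GazeDrivenCoverage : MonoBehaviour
{
    public GazeHeatMap heatMap;                    // from the earlier sketch, fed by the gaze sampler
    public CinemachineVirtualCamera actionCam;     // wide coverage of the action
    public CinemachineVirtualCamera characterCam;  // close-up on the character
    public Rect actionArea = new Rect(0f, 0f, 1f, 0.5f);      // illustrative regions
    public Rect characterArea = new Rect(0f, 0.5f, 1f, 0.5f);

    // Called at a pre-set decision point in the scene's timeline.
    public void Decide()
    {
        bool prefersAction = heatMap.WeightInRect(actionArea)
                           > heatMap.WeightInRect(characterArea);
        actionCam.Priority = prefersAction ? 20 : 10;          // highest priority goes live
        characterCam.Priority = prefersAction ? 10 : 20;
    }
}
```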
CONCLUSION
The Real-Time Editing System in project Heliosphere is an experimental prototype that conserves variations of performances and camera setups, then edits the story in real time based on the audience's gaze inputs. This exploration takes advantage of a real-time rendering game engine in the interdisciplinary field of virtual production, allowing the system to preserve a variety of performances during production and deliver dynamic story experiences to the audience.