Research & Presentation

(Term 1: Week 14)

Advantages of Real-time Rendering in Animation and Visual Effects Design

Introduction

Technology has advanced to the point where real-time rendering can reinvent entertainment production techniques. The industry is undergoing a paradigm shift: the next evolution in content production must embrace more flexible and interactive technology that can produce higher-quality results in dramatically less time.

Since the very beginning, the animation and visual effects production process has been almost linear in nature, consisting of five phases: development, preproduction, production, postproduction and distribution. Each phase had to be completed before the production could continue to the next. The process also suffers from delays in getting immediate results, owing to the time-consuming rendering technique known as offline rendering. In traditional workflows, any significant change to a character or camera angle can send the production back to square one to redo the work and re-render the visuals.

At the same time, artistic demands are increasing. Directors compete to realise previously impossible visions in their masterpieces, from virtual worlds to photorealistic digital humans and beasts. Combined with tight schedules and budgets, the mantra of “fix it in post” puts pressure on visual effects productions around the world. It is common for artists to feel frustrated with long working hours and turnaround times.

What is Real-Time Rendering?

Real-time rendering is a technology designed to quickly process and display images on screen. The term can refer to almost anything related to rendering, but it is most often associated with 3D computer graphics and computer-generated imagery (CGI). Many real-time engines are available, both free and commercial, such as Unreal Engine, Unity, Blender’s Eevee and CryEngine. Video game creators have used this technology for decades to create interactive games that rapidly render 3D visuals while simultaneously accepting user input, allowing players to interact with characters and virtual worlds in real time.

Real-time rendering and offline rendering are the two major rendering types, and the main difference between them is speed. Offline rendering is popular in animation and film production because of its capability to produce realistic renders, at the cost of time. But as real-time engines become more capable of producing realistic imagery in less time, designers are starting to recognise this technology and beginning to implement it in their workflows.

To briefly compare the two methods: offline renderers such as Mental Ray and Blender’s Cycles use a technique called ray tracing, which renders realistic images using a method almost identical to how real lighting works. Rays of light are cast into the scene and bounce around it, being reflected, refracted or absorbed by objects. Since there are many possible paths along the bouncing route, a single ray per pixel is not very accurate, so the renderer casts additional randomised rays from that pixel and averages the results. This whole process takes a great deal of processing power and time to calculate.

Figure 1: Ray Tracing
Figure 2: Rasterisation
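To make the averaging step concrete, here is a minimal, self-contained Python sketch of Monte Carlo ray tracing in grayscale. The scene (a single diffuse sphere under a gradient sky) and all function names are illustrative assumptions for this essay, not code from any of the renderers mentioned above.

```python
import math
import random

SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0  # assumed toy scene: one diffuse sphere

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def hit_sphere(origin, direction):
    # Ray-sphere intersection; returns hit distance t, or None on a miss.
    oc = tuple(o - c for o, c in zip(origin, SPHERE_C))
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - SPHERE_R * SPHERE_R
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def random_direction():
    # Uniform random direction, used to pick a diffuse bounce.
    while True:
        v = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
        if 0.0 < sum(c * c for c in v) <= 1.0:
            return normalize(v)

def trace(origin, direction, depth=0):
    # Grayscale radiance along one ray: a miss samples a gradient "sky"
    # (the light source); a hit bounces in a random direction and recurses.
    if depth > 3:
        return 0.0
    t = hit_sphere(origin, direction)
    if t is None:
        return 0.5 * (direction[1] + 1.0)  # sky brightness
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, SPHERE_C)))
    bounce = random_direction()
    if sum(b * n for b, n in zip(bounce, normal)) < 0:
        bounce = tuple(-b for b in bounce)  # keep the bounce above the surface
    return 0.7 * trace(hit, bounce, depth + 1)  # 0.7 = surface albedo

# One ray per pixel is noisy, so cast many randomised rays and average.
samples = 256
ray_dir = normalize((0.1, 0.1, -1.0))
pixel = sum(trace((0.0, 0.0, 0.0), ray_dir) for _ in range(samples)) / samples
print(f"estimated brightness for this pixel: {pixel:.3f}")
```

Even this toy version shows why offline rendering is slow: the noise only falls away as more and more random samples are averaged for every pixel of every frame.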

The method used by real-time renderers such as Unreal Engine and Blender’s Eevee, on the other hand, is designed to ease the processing burden by using a process called rasterisation. The technique can be described as image manipulation in which the 3D geometry is projected onto a raster of pixels that make up the 2D image. Each pixel’s colour is determined by a shader based on the surface normal and light position. Edges are then anti-aliased, and occlusion is resolved using z-buffer values. To make the output look better, additional trickery is added, such as light maps, shadow maps, blurred shadows, screen-space reflections and ambient occlusion. Real-time rendering uses approximations of the behaviour of light and will not be as accurate as offline rendering. But modern real-time engines are more powerful than ever: they can now perform hybrid rendering, using real-time ray tracing with minimal samples and very few bounces to render only reflections and shadows, then combining the machine-learning-denoised result with rasterised diffuse and other passes to produce a polished final image.
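The pipeline above can likewise be sketched in a few lines. This toy Python rasteriser (an illustrative assumption for this essay, not engine code) projects a triangle onto a character raster, resolves occlusion with a z-buffer, and shades pixels from the surface normal and light direction (Lambertian shading), mirroring the steps just described.

```python
W, H = 24, 12
color = [[" "] * W for _ in range(H)]          # the 2D pixel raster
zbuf = [[float("inf")] * W for _ in range(H)]  # z-buffer for occlusion

def project(v, focal=8.0):
    # Perspective-project a 3D point onto the pixel raster.
    x, y, z = v
    return (int(W / 2 + focal * x / z), int(H / 2 - focal * y / z), z)

def edge(a, b, p):
    # Signed-area test, used both for inside/outside and barycentric weights.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def draw_triangle(tri, normal, light_dir):
    a, b, c = (project(v) for v in tri)
    area = edge(a, b, c)
    if area == 0:
        return
    # Lambertian shading: brightness from the surface normal vs the light.
    brightness = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    shades = ".:-=+*#%@"
    ch = shades[min(int(brightness * len(shades)), len(shades) - 1)]
    for y in range(H):
        for x in range(W):
            w0 = edge(b, c, (x, y))
            w1 = edge(c, a, (x, y))
            w2 = edge(a, b, (x, y))
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                # Interpolate depth barycentrically (a real engine uses 1/z
                # for perspective correctness); the nearer surface wins.
                z = (w0 * a[2] + w1 * b[2] + w2 * c[2]) / area
                if z < zbuf[y][x]:
                    zbuf[y][x] = z
                    color[y][x] = ch

light = (0.0, 0.6, -0.8)  # normalised light direction
draw_triangle([(-1.5, -1, 4), (1.5, -1, 4), (0, 1.2, 4)], (0, 0, -1), light)
print("\n".join("".join(row) for row in color))
```

Note what is missing compared with the ray tracer: no bounces and no random sampling, just one projection and one shading rule per pixel, which is why rasterisation runs in real time.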

Benefits for Production Workflow

In short, it is about interactivity, time and cost. One of the biggest advantages of using a real-time platform is that all departments are able to work immediately and simultaneously. It is now just ‘production’, and the term ‘preproduction’ is a thing of the past. Pre-visualisation work can be changed and updated with refined versions without needing to start from scratch in a new phase of production.

Review cycles are fast: directors can give feedback in real time and no longer remain several days or weeks behind the artists’ work, since the production does not have to wait for painfully slow renders. Creative decisions can be made faster, and artists can progress quickly with up-to-date direction, without delays between design iterations. This prevents the loss of valuable production time and resources.

Because a real-time environment is far less time-intensive, artists can pitch and quickly experiment with their ideas, testing hunches and concepts in a way they cannot in a traditional offline workflow. This also gives the director additional perspectives on ideas. Early development processes such as scriptwriting can also take advantage of the technology, using previs (previsualisation) to look at the sets, the characters and all the assets to craft a better script.

None of this would have been possible without real-time technology, as creating so many exploratory versions of a scene would have taken far too long and been an unaffordable luxury in terms of cost and time. With real-time rendering integrated, directors, designers and clients can instantly see how the end result of a project will look. They have complete control to experiment with ideas and to make changes to things like characters, lighting and camera positions as they work, without having to wait out lengthy render times.

Benefits for Visual Creativity

Real-time rendering is being integrated into visual effects and animation to create virtual productions, digital humans, animated series and commercials. It is clear that this is not a distant dream or a passing trend, but the future of filmmaking.

Epic Games, the creator of Unreal Engine, has shown how its engine enriches production, enabling interactive storytelling and real-time production. Broadcasting companies have achieved a new level of quality and creativity by connecting broadcast technology with real-time engines for their environments. Virtual sets eliminate the high costs associated with physical sets, and complex scenes can be shot live without extensive post-production, further driving down costs.

Figure 3: Real-time virtual set

In weather news, for example, special effects and CG elements such as rain and flooding can be added to a scene instantly and can interact with the weathercaster in real time, allowing greater flexibility in creative decision-making. In 2015, The Weather Channel introduced this kind of immersive mixed-reality experience to better explain the anatomy of a hurricane.

Figure 4: Weather-caster in virtual flood

In 2017, The Future Group used Unreal Engine and Ross Video technology, in combination with its own interactive mixed-reality platform, to produce Lost in Time, an episodic series in which contestants compete, navigate and interact with spectacular virtual environments in real time.

Figure 5: Contestants compete in a digital world of Lost in Time

In the same year, visual effects studio The Mill produced a futuristic short film called ‘The Human Race’, featuring the 2017 Chevrolet Camaro ZL1 in a heated race with an autonomous Chevrolet FNR concept car. For this film, live video feeds and data were fed on set into Unreal Engine together with The Mill’s Cyclops virtual production toolkit. A CG car was then overlaid on top of a proxy vehicle covered in tracking markers, called the Blackbird. The use of real-time technology gives viewers the ability to customise the car in the film as they watch it. This hybridisation of film and gaming opens possibilities for interactive creative storytelling: a film that you can play.

One of the most outstanding uses of real-time rendering in full production is Zafari in 2018, a charming 52-episode animated series produced by Digital Dimension. The series revolves around a cast of quirky critter characters who are born with the skin of other types of animals. Zafari is the first episodic animated series to be fully rendered in Unreal Engine.

The team was looking to create stunning visual effects with global illumination, subsurface scattering, motion blur, convincing water and lush jungle environments, without being overly expensive and time-consuming to render. The animation also features dynamic simulation for character fur, trees and vegetation. Traditionally this is not a simple task, but the studio was able to achieve it with the help of real-time rendering technology. Digital Dimension stated that they could do 20 test renders within half an hour, compared with two in a day using previous iterations of the pipeline. This is the major benefit of using a real-time engine over relying on a render farm: shot iterations can be turned around extremely quickly.

One of the feature films that took advantage of real-time rendering was Rogue One: A Star Wars Story. Industrial Light & Magic’s Advanced Development Group (ADG) used Unreal Engine to make photorealistic renders of the beloved, sarcastic droid K-2SO in real time, bypassing the pre-rendering process. The droid had some great scenes and stole the hearts of millions of Star Wars fans. The group has its own render pipeline built on Unreal, the ADG GPU real-time renderer. ADG developed the tool to expand and enhance the creative process of storytelling through real-time rendering of visuals that approach motion-picture quality; the film’s other scenes were rendered with a standard offline renderer. By comparison, the real-time rendered scenes are nearly identical to the shots rendered offline. Using the technology, the team was able to see the character on screen during shooting, with the added benefit of visualising the data in camera on set through virtual cinematography. The film also used real-time virtual sets via a system called SolidTrack, which allowed the team to build the geometry of whatever was going to replace the blue screen, generating real-time graphics as a representation of what the final set extension would be.

ILM continued pursuing real-time rendering on the production of Solo: A Star Wars Story. The team used StageCraft VR, a new virtual production system powered by Unreal Engine, to design scenes and understand their physical dimensions when previsualising the stunts. Working with real-time assets in a virtual production environment means the team can move things around and play with different aspects of a sequence. ILM stated that the ability to work creatively in real time brings out something that is impossible with cumbersome, slow, pre-rendered assets.

These are just a few examples of real productions taking advantage of real-time rendering, and many more are expected to start embracing the technology and getting creative with it.

Conclusion

A real-time production pipeline has many advantages, saving a tremendous amount of time and cost while still maintaining high-quality results. The technology is now powerful enough to produce the visual fidelity needed to match the aesthetic style of the creative vision.

Real-time rendering allows creators to achieve shots that are otherwise unachievable, and to obtain them quickly. And as the quality gap between offline and real-time rendering narrows to the point of being generally indistinguishable, we may see more major productions around the world invest in real-time engines and standardise on new tools that make it easier to integrate the technology into existing pipelines.

The idea of hitting render and seeing entire shots pop out in seconds may sound too good to be true, but this is the power of real-time filmmaking: the future of content creation in a world where storytellers are not restricted by technology but empowered by it, a world where the only limitations are creativity and imagination.

References

Sloan, K., 2017. Why Real-Time Technology is the Future of Film and Television Production, pp.3-13.

Akenine-Möller, T., Haines, E. and Hoffman, N., 2018. Real-Time Rendering, Fourth Edition, Chapter 2: The Graphics Rendering Pipeline, pp.11-14.

Evanson, N., 2019. How 3D Game Rendering Works, A Deeper Dive: Rasterization and Ray Tracing [Online] Available at: <https://www.techspot.com/article/1888-how-to-3d-rendering-rasterization-ray-tracing/> [Accessed: 10 January 2021]

Unity. Real-Time Rendering in 3D [Online] Available at: <https://unity3d.com/real-time-rendering-3d> [Accessed: 10 January 2021]

Unity. Real-time filmmaking, explained [Online] Available at: <https://unity.com/solutions/real-time-filmmaking-explained> [Accessed: 10 January 2021]

Novotny, J., 2018. How Does Eevee Work [Online] Available at: <https://blender.stackexchange.com/questions/120372/how-does-eevee-work> [Accessed: 10 January 2021]

Mirko, 2020. The Main Advantages of Real Time Engines vs Offline Rendering in Architecture [Online] Available at: <https://oneirosvr.com/real-time-rendering-vs-offline-rendering-in-architecture/> [Accessed: 10 January 2021]

Lampel, J., 2019. Cycles vs. Eevee – 15 Limitations of Real Time Rendering in Blender 2.8 [Online] Available at: <https://cgcookie.com/articles/blender-cycles-vs-eevee-15-limitations-of-real-time-rendering> [Accessed: 10 January 2021]

Failes, I., 2017. How Real-time Rendering Is Changing VFX And Animation Production [Online] Available at: <https://www.cartoonbrew.com/tools/real-time-rendering-changing-vfx-animation-production-153091.html> [Accessed: 10 January 2021]

Pimentel, K., 2018. Animated children’s series ZAFARI springs to life with Unreal Engine [Online] Available at: <https://www.unrealengine.com/en-US/spotlights/animated-children-s-series-zafari-springs-to-life-with-unreal-engine> [Accessed: 16 January 2021]

Failes, I., 2017. Upcoming Animated Series ‘Zafari’ Is Being Rendered Completely With The Unreal Game Engine [Online] Available at: <https://www.cartoonbrew.com/tools/upcoming-animated-series-zafari-rendered-completely-unreal-game-engine-153123.html> [Accessed: 16 January 2021]

The Mill. Blending live-action film and gaming for Chevrolet [Online] Available at: <https://www.themill.com/experience/case-study/chevrolet-the-human-race/> [Accessed: 18 January 2021]

Bishop, B., 2017. Rogue One’s best visual effects happened while the camera was rolling [Online] Available at: <https://www.theverge.com/2017/4/5/15191298/rogue-one-a-star-wars-story-gareth-edwards-john-knoll-interview-visual-effects> [Accessed: 18 January 2021]

Seymour, M., 2017. Gene Splicer From 3lateral & ILM Rogue One on UE4 [Online] Available at: <https://www.fxguide.com/quicktakes/gene-splicer-from-3lateral-ilm-rogue-one-on-ue4/> [Accessed: 18 January 2021]

Morin, D., 2019. Unreal Engine powers ILM’s VR virtual production toolset on “Solo: A Star Wars Story” [Online] Available at: <https://www.unrealengine.com/en-US/spotlights/unreal-engine-powers-ilm-s-vr-virtual-production-toolset-on-solo-a-star-wars-story> [Accessed: 18 January 2021]

Polinchock, D., 2019. The Weather Channel Uses Immersive Mixed Reality to Bring Weather to Life [Online] Available at: <https://www.mediavillage.com/article/the-weather-channel-uses-immersive-mixed-reality-to-bring-weather-to-life/> [Accessed: 18 January 2021]

Lumsden, B., 2019. Virtual Production: The Future Group pushes XR to the limit [Online] Available at: <https://www.unrealengine.com/en-US/spotlights/virtual-production-the-future-group-pushes-xr-limit> [Accessed: 18 January 2021]
