
SIGGRAPH 21: Global Illumination Based on Surfels

This course was presented at ACM SIGGRAPH 2021. https://s2021.siggraph.org/

This SIGGRAPH presentation by Henrik Halen and Andreas Brinck introduces Global Illumination Based on Surfels (GIBS), a solution for calculating indirect diffuse illumination in real time. The solution combines hardware ray tracing with a discretization of scene geometry to cache and amortize lighting calculations across time and space. It requires no pre-computation, no special meshes, and no special UV sets, freeing artists from the tedious and time-consuming processes required by traditional solutions. GIBS enables new possibilities at runtime, allowing for high-fidelity lighting in dynamic environments and for user-created content, while accommodating content of arbitrary scale. The algorithm is part of the suite of tools available to developers and teams throughout EA as part of the Frostbite engine.
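
To make the caching idea concrete, the sketch below shows what a surfel record and its temporal accumulation might look like. The field layout, the names (Surfel, AccumulateIrradiance), and the capped moving-average blend are illustrative assumptions, not the actual GIBS/Frostbite data structures.

```cpp
#include <cstdint>

struct Float3 { float x, y, z; };

// Hypothetical surfel record: a small surface element that caches lighting.
struct Surfel
{
    Float3   position;     // world-space center of the surface element
    Float3   normal;       // orientation used when integrating incoming light
    float    radius;       // world-space footprint covered by this surfel
    Float3   irradiance;   // cached diffuse irradiance, amortized over frames
    uint32_t sampleCount;  // number of per-frame estimates accumulated so far
};

// Temporal amortization: blend this frame's few-ray irradiance estimate into
// the cached value with a capped moving average, so the cache converges over
// many frames instead of paying the full integration cost every frame.
// The cap of 64 samples is a hypothetical choice that keeps the cache
// responsive to lighting changes.
inline void AccumulateIrradiance(Surfel& s, const Float3& frameEstimate)
{
    const uint32_t n     = (s.sampleCount + 1u < 64u) ? s.sampleCount + 1u : 64u;
    const float    alpha = 1.0f / float(n);
    s.irradiance.x += (frameEstimate.x - s.irradiance.x) * alpha;
    s.irradiance.y += (frameEstimate.y - s.irradiance.y) * alpha;
    s.irradiance.z += (frameEstimate.z - s.irradiance.z) * alpha;
    s.sampleCount++;
}
```

Each frame then only needs to trace a handful of rays per surfel, with the cached estimate converging over time, which is one way the cost of lighting calculations can be spread across frames.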

This talk will detail the GIBS algorithm and how surfels are used to enable real-time ray-traced global illumination. We will describe how the scene is discretized into surfels on the fly, and why we think this discretization is a good fit for caching lighting operations. The talk will describe the acceleration structure used to enable efficient access to surfel data, and how this structure allows us to cover environments of arbitrary size while keeping performance and memory footprint predictable. We will detail how the algorithm handles dynamic objects, skinned characters, and transparency. Several techniques have been developed to efficiently integrate irradiance on surfels: we will describe our use of ray guiding, ray binning, and spatial filters, and how we handle scenes with large numbers of lights.
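
The talk itself describes the actual acceleration structure; as a rough illustration of how predictable memory over arbitrarily large worlds can be achieved, the sketch below uses a fixed-size spatial hash of cells with a bounded surfel list per cell. The cell size, table size, hashing scheme, and names (HashPosition, InsertSurfel) are assumptions for illustration, not the shipped implementation.

```cpp
#include <cstdint>
#include <cmath>

constexpr uint32_t kCellCount      = 1u << 20; // fixed table size -> fixed memory
constexpr uint32_t kSurfelsPerCell = 32;       // bounded list per cell
constexpr float    kCellSize       = 1.0f;     // cell edge length in meters (hypothetical)

struct SurfelCell
{
    uint32_t count = 0;
    uint32_t surfelIndex[kSurfelsPerCell]; // indices into a global surfel array
};

// Hash a world-space position to a cell. Because the grid is hashed rather
// than stored densely, memory use does not grow with world size.
inline uint32_t HashPosition(float x, float y, float z)
{
    const int32_t cx = (int32_t)std::floor(x / kCellSize);
    const int32_t cy = (int32_t)std::floor(y / kCellSize);
    const int32_t cz = (int32_t)std::floor(z / kCellSize);
    const uint32_t h = ((uint32_t)cx * 73856093u) ^
                       ((uint32_t)cy * 19349663u) ^
                       ((uint32_t)cz * 83492791u);
    return h % kCellCount;
}

// Insert a surfel index into the cell covering its position; drop it if the
// cell is already full, keeping the memory footprint strictly bounded.
inline bool InsertSurfel(SurfelCell* cells, uint32_t surfelIdx, float x, float y, float z)
{
    SurfelCell& cell = cells[HashPosition(x, y, z)];
    if (cell.count >= kSurfelsPerCell)
        return false;
    cell.surfelIndex[cell.count++] = surfelIdx;
    return true;
}
```

Under this kind of scheme, a lookup at shading time would hash the shaded point the same way and read the bounded surfel list for that cell, so the per-query cost stays roughly constant regardless of scene extent.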

Contributors:

Henrik Halen joined Electronic Arts' SEED research division as a Senior Rendering Engineer in 2017. His work at SEED focuses on real-time graphics algorithms, lighting, and characters. Henrik's experience as a rendering engineer prior to joining SEED includes a decade of contributions to franchises such as Gears of War, Battlefield, Medal of Honor, and Mirror's Edge.

Andreas Brinck has worked as a rendering engineer for more than two decades. He joined Electronic Arts in 2011 to help start Ghost Games and was later the rendering lead on NFS Rivals, NFS 2015, NFS Payback, and NFS Heat. In 2019 he joined DICE LA, where he is currently working on the Battlefield franchise.

Kyle Hayward has worked as a rendering engineer since 2010. He has focused on multiple areas of graphics, from animation compression to global illumination, working on both offline and real-time solutions. He joined EA in 2012 and was the NBA rendering lead from 2014 onwards. In 2019 he joined Frostbite, where he has been working on global illumination and ray tracing.

Xiangshun Bei has been a rendering engineer at DICE LA within EA since 2019, focusing on real-time rendering and ray tracing. He currently works on the Battlefield franchise. Prior to DICE, he contributed to graphics drivers for the Adreno GPU on Snapdragon SoCs at Qualcomm. He received his master’s degree in computer science from the University of Southern California in 2017.

Download the presentation slides as a PDF (50 MB).
