Master thesis rendering

We are Frostbite

Frostbite is a technology and collaboration platform for diverse game franchises such as Battlefield, FIFA, Star Wars Battlefront, Mass Effect, Need for Speed, Dragon Age and Plants vs. Zombies Garden Warfare.

Frostbite is widely seen as an industry leading platform for game development, especially when it comes to high fidelity real-time rendering. We are now looking for a master thesis student to join our Frostbite Rendering team in Stockholm.

Your task will be to investigate, research and/or prototype something that could help us make even better looking games in the future!

We thought you might also want to know

Please note that this is an unpaid internship intended to be part of your ongoing university education. We are not able to provide relocation services for this role, so we will prioritize candidates already residing in Sweden. You may be asked to take a programming test during the recruitment process. Please send your CV, cover letter and a brief description of the thesis subject as part of the application.

Follow this link to send in your application:

Master thesis subjects

Below is a list of suggestions for master thesis subjects. However, we are also open to your own suggestions!

Scene voxelization and applications

Scene voxelization

A point-based scene representation or binary voxelization. Some advanced lighting techniques require an approximate and efficient GPU-based representation of the scene. Investigate existing methods, research possible representations and implement a prototype.
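As a rough illustration of the binary voxelization idea, here is a minimal CPU sketch in Python: surface sample points are quantized into a coarse grid where each cell stores a single occupancy bit. A production version would run on the GPU (for example by rasterizing into a 3D texture); the function name, grid resolution and sample points below are illustrative only.

```python
def voxelize_points(points, grid_min, cell_size, resolution):
    """Return the set of occupied (x, y, z) voxel coordinates."""
    occupied = set()
    for px, py, pz in points:
        # quantize each world-space sample into its containing voxel
        v = tuple(
            int((p - lo) / cell_size)
            for p, lo in zip((px, py, pz), grid_min)
        )
        if all(0 <= c < resolution for c in v):
            occupied.add(v)
    return occupied

# two nearby surface samples land in the same voxel; one sample lies
# outside the grid bounds and is discarded
voxels = voxelize_points(
    points=[(0.1, 0.1, 0.1), (0.15, 0.12, 0.1), (9.0, 0.0, 0.0)],
    grid_min=(0.0, 0.0, 0.0),
    cell_size=0.5,
    resolution=8,
)
```

The same occupancy set can then back the visibility, collision and occlusion queries described in the topics below.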

Many-light shadowing

Many-lights rendering techniques are becoming more and more efficient, but adding lights without shadows tends to flatten the world out. We would like to investigate efficient shadowing techniques for spot and omnidirectional lights that scale easily to many lights. A voxel-based scene representation could be used to make visibility queries more efficient.
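To sketch what a voxel-based visibility query could look like, the toy Python function below marches from a shaded point towards a light in fixed steps and reports occlusion if any sampled voxel is solid. A real implementation would use a proper 3D-DDA traversal on the GPU; the uniform stepping, step count and scene here are simplifying assumptions.

```python
def visible(occupied, p, light, cell_size, steps=32):
    """Return False if a solid voxel lies between p and the light."""
    for i in range(1, steps):
        t = i / steps
        # sample a point along the segment from p to the light
        q = tuple(a + (b - a) * t for a, b in zip(p, light))
        voxel = tuple(int(c // cell_size) for c in q)
        if voxel in occupied:
            return False
    return True

blockers = {(2, 0, 0)}   # one solid voxel between the point and one light
shadowed = visible(blockers, (0.5, 0.5, 0.5), (4.5, 0.5, 0.5), cell_size=1.0)
lit = visible(blockers, (0.5, 0.5, 0.5), (0.5, 4.5, 0.5), cell_size=1.0)
```

Because the query only touches a small occupancy structure rather than a shadow map per light, it can amortize well across many lights.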

Particles collision

Particles could collide with the voxelized scene instead of the depth buffer. This would give more coherent behavior for particles near non-visible surfaces and outside the screen.

Ambient occlusion

Improve current AO techniques by tracing the voxel representation of the world.
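As an illustration of occlusion gathering against a voxel grid, the Python sketch below marches a few axis-aligned hemisphere directions from a surface point and counts how many rays hit solid voxels within a search radius. The direction set, radius and step counts are placeholders; a production version would cone-trace a mipmapped volume on the GPU.

```python
def voxel_ao(occupied, p, normal_axis, cell_size, radius=3.0, steps=6):
    dirs = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    # keep only directions in the hemisphere around the surface normal axis
    dirs = [d for d in dirs if d[normal_axis] >= 0]
    hits = 0
    for d in dirs:
        for i in range(1, steps + 1):
            t = radius * i / steps
            q = tuple(c + dc * t for c, dc in zip(p, d))
            if tuple(int(c // cell_size) for c in q) in occupied:
                hits += 1
                break
    return 1.0 - hits / len(dirs)   # 1.0 = fully open, 0.0 = fully occluded

# an empty scene is fully open; a slab above the point darkens it
open_sky = voxel_ao(set(), (0.5, 0.5, 0.5), normal_axis=2, cell_size=1.0)
blocked = voxel_ao({(0, 0, 2)}, (0.5, 0.5, 0.5), normal_axis=2, cell_size=1.0)
```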

Fluid simulation

Use the voxel representation to run a coarse fluid simulation with improved boundary conditions against the world's opaque surfaces.

Progressive voxelization

Refine the voxelization progressively from the many depth buffers we already render, instead of voxelizing the scene geometry directly.

Rendering techniques

Volumetric Cloud rendering

Rendering convincing skies is not only the result of the sky and atmosphere simulation, but also of beautiful clouds. Frostbite can render beautiful panoramic clouds, but it is a very static solution. Dynamic time of day and weather will require more advanced cloud rendering techniques.
We want to investigate different solutions for rendering a cloud layer or more complex cumulus shapes. They should all render with respect to the sun position, with self-shadowing and a scattering simulation. Several solutions can be investigated (precomputed transfer, slabs, etc.). The clouds will need to cast shadows on the environment, and the rendering should be scalable enough for them to be rendered into different views within a single frame.
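For intuition, here is the core single-scattering loop behind most real-time volumetric cloud renderers, sketched in Python: step through a density field, accumulate in-scattered sunlight attenuated by a secondary march towards the sun, and track the view transmittance. The toy density function, step counts and extinction coefficient are all placeholder assumptions.

```python
import math

def density(x, y, z):
    # toy "cloud layer" occupying altitudes 2 to 4
    return 0.5 if 2.0 <= z <= 4.0 else 0.0

def march(origin, direction, t_max, steps=32, sigma_t=1.0):
    dt = t_max / steps
    transmittance, radiance = 1.0, 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        p = tuple(o + d * t for o, d in zip(origin, direction))
        rho = density(*p)
        if rho > 0.0:
            # secondary march towards the sun (straight up in this sketch)
            sun_depth = sum(density(p[0], p[1], p[2] + 0.5 * s) * 0.5
                            for s in range(1, 9))
            sun_light = math.exp(-sigma_t * sun_depth)
            radiance += transmittance * sun_light * rho * sigma_t * dt
            transmittance *= math.exp(-sigma_t * rho * dt)
    return radiance, transmittance

# looking straight up through the cloud layer from the ground
radiance, trans = march((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), t_max=6.0)
```

The self-shadowing comes from the secondary sun march; precomputed-transfer or slab approaches replace that inner loop with baked data.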


Dynamic volumetric smoke/fire simulation

Old-school particle billboards are useful in lots of cases and very fast to render, but they are not practical when it comes to simulating detailed smoke or fire, especially when one desires that extra quality required for some cinematic shots. To this end, we want to be able to simulate and render smoke/fire for a higher final image quality. Grid-based, i.e. Eulerian, simulations are a perfect fit for this use case.
We would like our artists to be able to run such high-visual-quality simulations in real time in their game while paying attention to performance. As such, the simulation will potentially have to scale from very simple smoke, to complex heat/black-body simulation with vorticity confinement, to more advanced detail augmentation such as wavelet turbulence. The right parameters and workflow will have to be chosen, researched and developed in Frostbite. We would like the simulation to be real-time; further work would include support for interactions with all scene elements if required: wind, forces, particles, meshes, etc.
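One central step of any Eulerian smoke solver is semi-Lagrangian advection: trace each cell centre backwards through the velocity field and sample the previous density there. The 1D, nearest-neighbour Python sketch below shows just that step under heavy simplification; a real solver adds pressure projection, buoyancy and vorticity confinement in 3D with linear interpolation.

```python
def advect(density, velocity, dt, dx):
    """One semi-Lagrangian advection step on a 1D grid."""
    n = len(density)
    out = [0.0] * n
    for i in range(n):
        # backtrace: where did the material now in cell i come from?
        src = i - velocity[i] * dt / dx
        j = min(max(int(round(src)), 0), n - 1)   # clamp to the grid
        out[i] = density[j]
    return out

# a blob of smoke in cell 2 drifts one cell to the right per step
field = [0.0, 0.0, 1.0, 0.0, 0.0]
vel = [1.0] * 5
step1 = advect(field, vel, dt=1.0, dx=1.0)
```

Semi-Lagrangian advection is unconditionally stable, which is why it is the usual choice for real-time grid solvers despite its numerical diffusion.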


Object-space shading

Material and lighting calculations are normally performed directly on visible surface fragments. One alternative is to move either the material evaluation, or both material and lighting, into object space. Several parameterizations could be used for this, such as UV-mapped textures, Mesh Colors, or spatial hashing. This could allow caching and amortization of shading calculations, and reduce aliasing.



Dynamic volumetric rendering

Rendering volumetric effects such as smoke using particle billboards gives questionable results with very dynamic cameras and changing viewpoints. This is especially true when one desires that extra bit of quality required for some cinematic shots. To this end, we want to be able to render volumetric data in real time to achieve advanced volumetric effects such as smoke, fire or explosions.
We would like our artists to be able to render such high-visual-quality simulations in real time in their game while paying attention to performance. As such, the rendering will have to scale from very simple smoke to more complex emissive heat/black-body data. The right input parameters and workflow will have to be researched and developed in Frostbite. We would like the render complexity to scale from real-time pre-visualization, to game runtime, to high-visual-fidelity cinematic shots. These different levels of complexity will be key when rendering the volumetric data into multiple sub-views or for shadow estimation. As such, different pre-computed data representations as well as efficient data structures for empty-space skipping will be important. Another area of interest would be to investigate better temporal integration adapted to volumetric rendering.


Adaptive shading

Screen resolutions and framerates are constantly increasing, especially in the land of VR. This means that the amount of GPU work required to render a scene also grows. At the same time, there is an increase of spatial and temporal redundancy, meaning that pixels nearby in screen-space and in the temporal domain will look similar. Exploiting redundancy in rendering and shading calculations (in either or both domains) could significantly boost rendering performance.

Sparse volumetric rendering

Our current volumetric rendering simulation is bounded in space to a maximum distance from the main camera view. We would like to investigate porting it to an adaptive sparse system. The investigation would involve defining page/tile allocation strategies based on participating-media primitives and experimenting with hierarchies/LoD. The goal of such a research project would be a faster volumetric rendering solution when primitives are sparsely distributed in a scene, while allowing volumetric effects to be rendered to a virtually infinite distance.
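To make the page/tile allocation idea concrete, here is a hypothetical Python sketch: participating-media primitives (spheres here) mark which world-space tiles they overlap, and only those tiles receive volume pages from a fixed pool. The tile size, the sphere primitives and the simple bump allocator are all illustrative assumptions.

```python
def allocate_tiles(primitives, tile_size, pool_size):
    """primitives: list of ((cx, cy, cz), radius). Returns {tile: page}."""
    page_of, next_page = {}, 0
    for (cx, cy, cz), r in primitives:
        # conservative bounds of the primitive in tile coordinates
        lo = tuple(int((c - r) // tile_size) for c in (cx, cy, cz))
        hi = tuple(int((c + r) // tile_size) for c in (cx, cy, cz))
        for x in range(lo[0], hi[0] + 1):
            for y in range(lo[1], hi[1] + 1):
                for z in range(lo[2], hi[2] + 1):
                    tile = (x, y, z)
                    if tile not in page_of:
                        if next_page >= pool_size:
                            continue   # pool exhausted: skip (or evict by LoD)
                        page_of[tile] = next_page
                        next_page += 1
    return page_of

# one small smoke primitive touches only a handful of tiles
pages = allocate_tiles([((4.0, 4.0, 4.0), 1.0)], tile_size=4.0, pool_size=64)
```

Memory and ray-march cost then scale with the number of occupied tiles rather than with the distance covered, which is what allows an effectively unbounded domain.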

Higher fidelity particle rendering

Particles in Frostbite are only rendered as simple textured quads, optionally with a normal. Star Wars Battlefront raised the quality bar by using a pre-rendered transmittance texture, basically containing volumetric shadows for a few directions. We would like to investigate more advanced ways to render particles with higher quality. Pre-integrated lighting and shadowing would be interesting to investigate: view-dependent baking of volumetric data would help bring particle quality up. This would likely end up with a higher memory requirement, so compression of this data will also need to be investigated.

SPH simulation and transformation

Particle simulation in Frostbite is simple in the sense that each particle has no knowledge of its neighborhood or of other particles. To do more advanced simulations such as water or lava, we would need a Lagrangian particle simulation, e.g. Smoothed Particle Hydrodynamics (SPH), taking into account the medium's viscosity, etc. Other simulations could also be used, such as n-body simulation. Such complex simulations will have to run on the GPU with appropriate acceleration data structures.
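The basic building block of such a solver is the SPH density estimate with a neighbour search, sketched below in 2D Python with a uniform-grid acceleration structure. A production solver would run this on the GPU with a parallel counting-sort grid; the kernel choice (poly6) and all constants here are illustrative.

```python
import math

def sph_density(positions, h, mass=1.0):
    """Per-particle density via the 2D poly6 kernel and a hash grid."""
    poly6 = 4.0 / (math.pi * h ** 8)   # 2D poly6 normalisation constant
    grid = {}
    for idx, (x, y) in enumerate(positions):
        grid.setdefault((int(x // h), int(y // h)), []).append(idx)
    densities = []
    for x, y in positions:
        cx, cy = int(x // h), int(y // h)
        rho = 0.0
        for gx in range(cx - 1, cx + 2):        # scan the 3x3 cell block
            for gy in range(cy - 1, cy + 2):
                for j in grid.get((gx, gy), []):
                    dx, dy = x - positions[j][0], y - positions[j][1]
                    r2 = dx * dx + dy * dy
                    if r2 < h * h:
                        rho += mass * poly6 * (h * h - r2) ** 3
        densities.append(rho)
    return densities

# two close particles plus one isolated one: the pair has higher density
d = sph_density([(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)], h=0.5)
```

Pressure and viscosity forces are then derived from these densities using kernel gradients, and the grid keeps the neighbour search near linear in the particle count.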

Simulating fluid, lava or slime is not going to be enough: we will need to convert the particles into a visual representation that matches the approximated medium. To this end, different approaches could be used, such as meta-balls, screen-space splatting, etc.


GPU particle experimentation

We have a basic framework for GPU particles in which we can start experimenting with:

  • GPU collision
  • Sorting optimization
  • Temporal sorting experimentation
  • Efficient simulation batching for different emitters

Translucent materials

Rendering translucent materials is a complex process, as they exhibit complex visual features such as internal details, internal self-shadowing and light scattering. Such materials are important for rendering plastic, marble, eyes or even complex alien skin. The existing solutions in Frostbite will have to be improved in order to handle these more advanced use cases.

Geometry filtering

Investigate methods for getting rid of geometric aliasing (height field/displacement map filtering, screen-space NDFs, etc.). There are several methods we would like to look at, both at the shading level and as screen-space methods for G-buffer representations.
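One well-known shading-level geometry filter in this family is Toksvig-style specular anti-aliasing: the length of the averaged (mip-mapped) normal encodes how much the underlying normals vary, and is used to reduce the effective specular exponent so that minified bumpy surfaces do not sparkle. The Python sketch below shows the factor; the exponent values are illustrative.

```python
import math

def toksvig_exponent(avg_normal, spec_power):
    """Reduce a Blinn-Phong exponent based on averaged-normal length."""
    na = math.sqrt(sum(c * c for c in avg_normal))   # |N| <= 1 after filtering
    na = max(na, 1e-4)                               # avoid division by zero
    ft = na / (na + spec_power * (1.0 - na))         # Toksvig factor
    return ft * spec_power

smooth = toksvig_exponent((0.0, 0.0, 1.0), 64.0)   # unit normal: unchanged
rough = toksvig_exponent((0.0, 0.0, 0.5), 64.0)    # varied normals: much lower
```

The same variance-from-normal-length idea generalizes to roughness remapping for physically based BRDFs.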

GPU particle simulation

All our particles are currently CPU simulated, however for some types of effects we could improve performance by simulating on the GPU. Investigate and implement a system that interfaces with the existing workflows but allows for GPU simulated particles. This includes simulation, limited interaction with the world (collision) and possibly rendering dispatching.

Non-classical and hybrid rendering

Current real-time rendering uses primarily triangle rasterization in order to generate images. A few algorithms use alternate representations, and are practical enough for the current generation of hardware. Ray-marching has been used for screen-space reflections and relief mapping. Capsule-based ambient occlusion has proven to be a great tool for improving realism at an affordable GPU cost. Point splatting coupled with temporal super-sampling has found use in non-photorealistic rendering. Voxel and SDF-based approaches have been applied to global illumination and shadowing. There are many more techniques and avenues which could be explored either in their own context, or in a hybrid scheme with rasterization.
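As a taste of one such non-rasterization technique, here is sphere tracing of a signed distance field sketched in Python: step along the ray by the distance to the nearest surface until close enough to call it a hit. The scene (a single sphere SDF) and the thresholds are placeholder assumptions.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance from point p to a sphere."""
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, max_t=100.0, eps=1e-4):
    t = 0.0
    while t < max_t:
        p = tuple(o + d * t for o, d in zip(origin, direction))
        dist = sphere_sdf(p)
        if dist < eps:
            return t          # hit: distance along the ray
        t += dist             # safe step: no surface is closer than this
    return None               # miss

hit = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))    # looks at the sphere
miss = sphere_trace((0.0, 0.0, 0.0), (0.0, 1.0, 0.0))   # looks past it
```

In a hybrid scheme the same traversal can run against a coarse SDF of the rasterized scene, e.g. for soft shadows or ambient occlusion.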

Creation tools

BC6/BC7 compression

These relatively new texture compression formats have decompression support in newer graphics hardware, but we have not yet found a good library for compressing our source textures, either at tool time (pipeline) or at runtime. Research existing libraries or implement our own version.


Displacement and surface mapping

PTEX parametrization or LEADR (Linear Efficient Anti-aliased Displacement and Reflectance) mapping. Investigate, compare and implement these recent techniques for displacement mapping and surface mapping.


Follow this link to send in your application:

Fill in the form and we’ll get back to ya