SPITE: CURSE OF ARCHIBALD
DESCRIPTION
Game project #6 at The Game Assembly
MY CONTRIBUTIONS
- Navigation mesh
- Deferred rendering
- Boss mechanics
GENRE
Hack and slash
TIMEFRAME
14 weeks (20h/week)
TEAM
5 programmers, 5 artists, 3 level designers
ENGINE
Our custom engine

"Awaken, oh mighty Axemaiden! Behold your village in ruin, torched by the monk whose hand in marriage you spurned. Let your fierce spirit guide you through his evil hordes and quench your burning thirst for revenge."
As we started up project 6, we learned that the bar had been raised significantly in two areas: one was the enemy AI, which now required pathfinding and flocking, and the other was the rendering pipeline, which now had to support more advanced techniques than bog-standard forward rendering.
Most of my time during this project was spent implementing, debugging, and optimizing our navmesh system, as it affected most aspects of gameplay and had to function correctly to even be able to play through a level. Other than this, I set up our engine's deferred rendering pipeline and shadow mapping and also implemented the boss fight.
From this point on my group was also using the entity component system EnTT, and one of my most important takeaways from this project was the experience of delivering a full game while working in an ECS - a very pleasant one!
NAVIGATION MESH

BACKGROUND
Since the game was designed to be a Diablo-like, the player needed to be able to move by pointing and clicking, and enemies had to be able to find and walk up to the player to attack her, navigating around obstacles along the way. The natural solution is a navigation mesh, or navmesh, and it was my job to implement its in-game functionality.
Our level designers initially built their navmeshes using Unity's built-in NavMesh baking tool, which would then be exported to an .obj file through a script written by our programmer Viktor Rissanen. Unfortunately, we later found out that Unity's built-in tools had several issues that made them unsuitable for our purposes: for one, NavMesh.CalculateTriangulation typically produces meshes with very skinny triangles, which lead to suboptimal paths during pathfinding. Moreover, the triangulation would often fail to match the level geometry at or near slopes.
The real dealbreaker, however, was that the resulting mesh would sometimes simply fail to be connected. All of these problems eventually led our pipeline programmer Marcus Cazzola to figure out how to use Unity's ProBuilder to manually build and export a navmesh, and this worked like a charm!
Once the navmesh had been successfully exported, that is where I came in.
IMPLEMENTATION
Step one was to parse the .obj file and set aside its vertices and indices in some internal structure, which I did manually using stringstreams. Then I iterated through all triangles and built an adjacency list by checking which triangles shared exactly two vertices.
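The adjacency-building part of step one can be sketched as follows. This is an illustrative reconstruction, not the project's actual code: two triangles sharing exactly two vertices means they share a full edge, so mapping each undirected edge to the first triangle that used it finds all neighbor pairs in one pass.

```cpp
#include <map>
#include <utility>
#include <vector>

// Given the index buffer parsed from the .obj file, build a per-triangle
// adjacency list. Two triangles are neighbors if they share an edge.
std::vector<std::vector<int>> BuildAdjacency(const std::vector<int>& indices)
{
    const int triCount = static_cast<int>(indices.size()) / 3;
    std::vector<std::vector<int>> adjacency(triCount);

    // Map each undirected edge (lowIndex, highIndex) to the first triangle
    // that used it; the second triangle to use it becomes its neighbor.
    std::map<std::pair<int, int>, int> edgeToTri;
    for (int t = 0; t < triCount; ++t)
    {
        for (int e = 0; e < 3; ++e)
        {
            int a = indices[t * 3 + e];
            int b = indices[t * 3 + (e + 1) % 3];
            if (a > b) std::swap(a, b);

            auto [it, inserted] = edgeToTri.try_emplace({a, b}, t);
            if (!inserted)
            {
                adjacency[t].push_back(it->second);
                adjacency[it->second].push_back(t);
            }
        }
    }
    return adjacency;
}
```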
Step two began when the player clicked on the screen: I used the position of the mouse along with the camera matrix to construct a direction vector, then raycast against the navmesh to find the clicked triangle. I implemented this raycast check manually, since the math is quite simple. (A completely different approach would have been render picking.) If two or more triangles intersected the ray, I picked the one closest to the camera.
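One standard way to do this manual check is the Möller-Trumbore ray-triangle intersection test, sketched below. The Vec3 type is a stand-in for whatever math type the engine uses; picking the triangle closest to the camera then amounts to keeping the hit with the smallest distance t.

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  Cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x}; }
static float Dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns the distance t along the ray to the hit, or empty on a miss.
std::optional<float> RayVsTriangle(Vec3 origin, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2)
{
    const float kEpsilon = 1e-6f;
    Vec3 edge1 = Sub(v1, v0);
    Vec3 edge2 = Sub(v2, v0);
    Vec3 p = Cross(dir, edge2);
    float det = Dot(edge1, p);
    if (std::fabs(det) < kEpsilon) return {};  // ray parallel to triangle

    float invDet = 1.0f / det;
    Vec3 s = Sub(origin, v0);
    float u = Dot(s, p) * invDet;              // first barycentric coordinate
    if (u < 0.0f || u > 1.0f) return {};

    Vec3 q = Cross(s, edge1);
    float v = Dot(dir, q) * invDet;            // second barycentric coordinate
    if (v < 0.0f || u + v > 1.0f) return {};

    float t = Dot(edge2, q) * invDet;
    return t > kEpsilon ? std::optional<float>(t) : std::nullopt;
}
```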
If the previous step found a triangle, step three would be to use the A* pathfinding algorithm to calculate the most efficient route from the player's current triangle to the target triangle. I chose to use the distance between centroids as the heuristic, which worked well as long as most triangles weren't too skinny.
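A minimal sketch of this A* search over triangles, assuming precomputed centroids and the adjacency list from step one (the flat layout and names are illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <functional>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

struct Centroid { float x, y, z; };

static float Dist(const Centroid& a, const Centroid& b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// A* from the start triangle to the goal triangle, using centroid distance
// as both edge cost and heuristic. Returns the triangle indices along the
// path, or empty if the goal is unreachable (e.g. a disconnected navmesh).
std::vector<int> AStar(const std::vector<Centroid>& centroids,
                       const std::vector<std::vector<int>>& adjacency,
                       int start, int goal)
{
    const int n = static_cast<int>(centroids.size());
    std::vector<float> gScore(n, std::numeric_limits<float>::infinity());
    std::vector<int> cameFrom(n, -1);
    using Node = std::pair<float, int>;  // (fScore, triangle index)
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;

    gScore[start] = 0.0f;
    open.push({Dist(centroids[start], centroids[goal]), start});

    while (!open.empty())
    {
        int current = open.top().second;
        open.pop();
        if (current == goal)
        {
            std::vector<int> path;
            for (int t = goal; t != -1; t = cameFrom[t]) path.push_back(t);
            std::reverse(path.begin(), path.end());
            return path;
        }
        for (int next : adjacency[current])
        {
            float tentative = gScore[current] + Dist(centroids[current], centroids[next]);
            if (tentative < gScore[next])
            {
                gScore[next] = tentative;
                cameFrom[next] = current;
                open.push({tentative + Dist(centroids[next], centroids[goal]), next});
            }
        }
    }
    return {};  // goal unreachable
}
```

Stale entries left in the priority queue are harmless here, since a popped node whose score has since improved simply fails every relaxation check.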
The final step was to run the so-called "simple stupid funneling algorithm" in a 2D plane to straighten the naive path into a series of line segments going as straight as possible from start to finish. After this I also had to compensate for slopes by snapping path vertices vertically onto their containing triangles, and subdivide the path with new vertices where the funneled path cut straight across a triangle edge.
This final path was then fed to the steering system of the player. Enemies worked similarly, but could use the player's current triangle directly as input to step three.


RESULT
The navmesh was used to guide every moving entity in the game, including the player, the enemies, and even the decorative rats!
DEFERRED RENDERING

IMPLEMENTATION
Under the historically most common mode of rendering - forward rendering - pixels are shaded by iterating over all lights in the scene, and this happens on every draw call. This is potentially very wasteful, as a fully shaded pixel belonging to one mesh may later be overdrawn by another mesh, making the first round of shading unnecessary. Therefore, I added a different mode of rendering - deferred rendering - to our pipeline.
First, I used the DirectX 11 API to create a so-called geometry buffer to store 2D representations of certain useful mathematical quantities present in the scene as seen from the camera, with a texture each for positions, normals, etc. Next, I wrote a HLSL shader which would use the technique of multiple render targets to render a mesh and these quantities onto the geometry buffer. Finally, I adapted our engine's physically based light shaders to use the geometry buffer instead.
For this project we used an entity component system to store and iterate over components, namely the header-only library EnTT. Luckily, the library is incredibly convenient to use, and the rendering pipeline ended up being the most straightforward of all our projects so far. With this setup, all I had to do was create a view over all ModelInstances in the scene and render them to the geometry buffer, then create a view over all lights and render them with a separate shader with additive blending turned on.
Since this was my first time setting up a geometry buffer, it wasn't very space-efficient. For example, I used one texture to store the world position, but later learned that it can be reconstructed from the depth buffer and the camera frustum. I also stored two separate normals: one from interpolating vertex normals and one from normal mapping. Both were used in one of our lighting shaders, but presumably the visual effect was not worth the overhead of two textures.
RESULT
Although the benefit wasn't immediately apparent other than a slight increase in performance, having the new pipeline in place opened up the door to more advanced rendering techniques. For example, I later used the information in the geometry buffer to create a shockwave VFX, and in our next project I would use it again when adding screen space ambient occlusion to our game.
One drawback of using deferred rendering is that we had to commit to a particular material model, but this was not an issue as every discipline at The Game Assembly already had a shared material model. Another drawback was that transparent objects still had to use forward rendering due to them requiring blending; this part of the pipeline was later set up by another programmer in our team.


APPLICATION: UNBOUNDED NUMBER OF LIGHTS
In previous projects we were limited to no more than 8 analytic lights at a time, and while this number could be increased by changing a shader macro, the fundamental problem had always been that the amount of rendering work scales as O(meshes * lights). With deferred rendering, the rendering work instead scales as O(meshes + lights) since we first draw only the meshes during the geometry pass and then draw only the lights during the light pass. Although our level designers still had to be careful to not insert too many lights in our scenes, we at least no longer had an upper bound.
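The scaling difference can be made concrete with a toy calculation (the function names are purely illustrative): forward rendering shades every mesh against every light, while the deferred pipeline draws meshes once in the geometry pass and lights once in the light pass.

```cpp
// Forward: each of the meshes runs its light loop over every light.
int ForwardShadingWork(int meshes, int lights) { return meshes * lights; }

// Deferred: one geometry pass over the meshes, one light pass over the lights.
int DeferredShadingWork(int meshes, int lights) { return meshes + lights; }
```

For a scene with 200 meshes and 50 lights, that is 10,000 mesh-light combinations versus 250 draws, which is why the hard cap on light count could be dropped.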
I used the technique of light volumes to increase performance, meaning I drew a bounding mesh when rendering each light so that not all pixels needed to run the light calculations. For this purpose I used a bounding cube, and an improvement would therefore have been to instead use spheres for point lights and cones for spot lights.
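Sizing such a light volume can be sketched as follows, assuming a point light with inverse-square falloff (the falloff model and names are assumptions, not necessarily what our engine used): solve intensity / d² = threshold for d to get the radius beyond which the light's contribution is visually negligible, then scale the bounding mesh by that radius.

```cpp
#include <cmath>

// Radius at which a point light's inverse-square contribution
// drops below a chosen cutoff threshold.
float LightVolumeRadius(float intensity, float threshold)
{
    return std::sqrt(intensity / threshold);
}
```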
BOSS MECHANICS

ROUNDHOUSE PUNCH
Archibald does his best to deliver a roundhouse punch to the player, but what with his size and being stuck in the ground, only really manages a slap.
In order to implement this attack, I had to create a method to recursively traverse the animation skeleton and compute the world transform of his hand joint so I could couple a damaging collider to it.
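A simplified sketch of that skeleton traversal is shown below. Real code would multiply full 4x4 local-pose matrices down the hierarchy; plain translations stand in for those here, and the Joint layout is an assumption.

```cpp
#include <vector>

struct Joint
{
    int parent;        // index of the parent joint, -1 for the root
    float tx, ty, tz;  // local translation relative to the parent
};

// Resolve a joint's world position by recursing up to the root and
// accumulating local offsets on the way back down.
void JointWorldPosition(const std::vector<Joint>& skeleton, int index,
                        float& x, float& y, float& z)
{
    const Joint& j = skeleton[index];
    if (j.parent == -1) { x = j.tx; y = j.ty; z = j.tz; return; }
    JointWorldPosition(skeleton, j.parent, x, y, z);
    x += j.tx; y += j.ty; z += j.tz;
}
```

The damaging collider for the slap can then be re-centered on the hand joint's world position every frame.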

ACID POOLS
Archibald uses his wretched, demonic powers (and maybe some of his knowledge as a former monk?) to spawn pools of acid under the player. Luckily, they don't last for long.
The pools themselves spawn randomly, but only at certain points specified by our level designers. The VFX for the pools was coded by Elias Pettersson.

INHALE ROCKS
Using his impressive lung capacity, Archibald inhales surrounding rocks into his gaping maw, possibly hitting the player in the process. Those can't be good for your teeth!
For this attack, I used my previously mentioned method to find a joint in Archibald's jaw, and used its coordinate system to construct a forward-facing cone that would attract rocks inside it.
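The cone containment check can be sketched like this (all names are illustrative, and the forward vector is assumed to be normalized): a rock is inside the cone if it is within range of the jaw joint and the angle between the cone axis and the direction to the rock is below the cone's half-angle.

```cpp
#include <cmath>

struct Float3 { float x, y, z; };

// Is `point` inside a cone with apex at the jaw joint, axis along the
// jaw's (normalized) forward vector, a maximum range, and a half-angle?
bool InsideCone(Float3 apex, Float3 forward, float range, float halfAngleRad, Float3 point)
{
    Float3 toPoint = {point.x - apex.x, point.y - apex.y, point.z - apex.z};
    float dist = std::sqrt(toPoint.x * toPoint.x + toPoint.y * toPoint.y + toPoint.z * toPoint.z);
    if (dist < 1e-6f || dist > range) return false;
    // Cosine of the angle between the cone axis and the direction to the point.
    float cosAngle = (toPoint.x * forward.x + toPoint.y * forward.y + toPoint.z * forward.z) / dist;
    return cosAngle >= std::cos(halfAngleRad);
}
```

Rocks passing this test each frame can then be steered toward the apex to produce the inhale effect.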

?
Still not good for your teeth, Archibald!