Developed over 2 months in early 2024 (full-time)
Work: Gameplay, UI, Tools
S.L.I.M.E is a 3D puzzle-platformer game that I made together with 11 other game developers in the course "Spelprojekt 2". In the game you play as a slime who wakes up in an abandoned laboratory and must jump, squish, and slime your way out in order to unravel the mystery of how you got there and what happened.
Itch.io link: S.L.I.M.E
My Contributions
Player movement and collision
Because the player's movement was such a central part of our game, we wanted full control over it and therefore created both the movement and collision systems from scratch. The movement is built around a state machine with one state for each form the slime can be in; the active state determines the collision hitboxes, the animations, and the types of movement available to the player.
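A minimal sketch of what a state machine like this can look like in Unity C# is shown below. The state names, inputs, and fields are placeholders of my own rather than the project's actual code.

```csharp
using UnityEngine;

// Illustrative sketch of a state machine for the slime's movement states.
// Names, inputs, and fields are placeholders, not the project's actual code.
public class SlimeMovement : MonoBehaviour
{
    private enum SlimeState { Normal, Squished, Airborne }

    [SerializeField] private Collider normalHitbox;
    [SerializeField] private Collider squishedHitbox;

    private SlimeState state = SlimeState.Normal;

    private void SetState(SlimeState newState)
    {
        state = newState;

        // Each state uses its own collision hitbox (and its own animation).
        normalHitbox.enabled   = state != SlimeState.Squished;
        squishedHitbox.enabled = state == SlimeState.Squished;
    }

    private void Update()
    {
        switch (state)
        {
            case SlimeState.Normal:
                // Full move set: walking and jumping are available here.
                if (Input.GetKeyDown(KeyCode.LeftControl)) // placeholder input
                    SetState(SlimeState.Squished);
                break;

            case SlimeState.Squished:
                // Restricted move set: crawling only. In the real game the
                // transition back also depends on the headroom checks.
                if (Input.GetKeyUp(KeyCode.LeftControl))
                    SetState(SlimeState.Normal);
                break;

            case SlimeState.Airborne:
                // Air control and landing checks would live here.
                break;
        }
    }
}
```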
One challenge I encountered was the level of freedom the player had when exploring the world. The environment contained many objects of varying sizes and shapes, and the player character itself also had a unique shape. This required a collision system that was precise enough to correctly detect and interact with these objects while still feeling smooth and responsive. At the same time, it needed to be optimized enough to avoid performance issues.
With these constraints in mind, I created an event-based system that triggers when any of the box casts or raycasts shown in the image are activated. The box casts in yellow act as the primary colliders and, along with the other raycasts, are anchored to different joints of the character model. The green raycasts check whether there is enough space above the player to return to an unsquished state.
I chose to use multiple raycasts instead of a single large box cast because there were situations where the player felt that they should be able to unsquish but could not, as a small edge of the box cast was still being triggered. The same reasoning applies to the blue raycast beneath the slime, which checks whether the character should be considered grounded.
Although using this many raycasts introduces a performance cost, given the scope of the project we felt the trade-off was worthwhile for improved player satisfaction.
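As a rough illustration of the kind of probes described above, a simplified version of the grounded and headroom checks might look like the sketch below (assuming Unity's Physics API; the distances, anchor points, and layer mask are placeholder values).

```csharp
using UnityEngine;

// Illustrative sketch of the grounded and headroom checks described above.
// Distances and the layer mask are placeholder values.
public class SlimeCollisionProbes : MonoBehaviour
{
    [SerializeField] private LayerMask environmentMask;
    [SerializeField] private Transform[] headroomAnchors; // joints on the character model
    [SerializeField] private float headroomHeight = 1.0f;
    [SerializeField] private float groundCheckDistance = 0.1f;

    // "Blue" raycast beneath the slime: is the character grounded?
    public bool IsGrounded()
    {
        return Physics.Raycast(transform.position, Vector3.down,
                               groundCheckDistance, environmentMask);
    }

    // "Green" raycasts above the slime: is there room to unsquish?
    // Several thin rays are used instead of one wide box cast, so a small
    // overhanging edge does not block the entire check.
    public bool HasHeadroom()
    {
        foreach (var anchor in headroomAnchors)
        {
            if (Physics.Raycast(anchor.position, Vector3.up,
                                headroomHeight, environmentMask))
                return false; // this ray hit something overhead
        }
        return true;
    }
}
```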
In combination with the player controller, I also implemented controller support, both for playing the game and for navigating its various menus.
Puzzle elements
Since the game is partly a puzzle game, it needed some puzzle elements. In our case, these were buttons, cables, weights, and doors. I created the logic for the doors and cables so that the cables light up when they are powered and the doors open when the right buttons are pressed.
Due to time constraints, I did not have the opportunity to create a fully refined tool for this task. Instead, I implemented a basic system based on Unity Events that triggers when the relevant button is activated. This event iterates through a list of objects and changes their materials, which is how the cables change color. The event also sends a boolean value to the door to indicate that the button is active.
When the button is deactivated, all of these events occur in reverse.
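A stripped-down sketch of how a Unity Events based button like this can be wired up is shown below; the class and member names are hypothetical and only illustrate the flow described above, with the cable group and door hooked up as listeners in the inspector.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical button: its UnityEvent is invoked with true when the button
// is activated and with false when it is deactivated.
public class PuzzleButton : MonoBehaviour
{
    [SerializeField] private UnityEvent<bool> onPoweredChanged;

    public void Activate()   => onPoweredChanged.Invoke(true);
    public void Deactivate() => onPoweredChanged.Invoke(false);
}

// Listener that iterates through a list of cable renderers and swaps their
// materials when the power state changes.
public class CableGroup : MonoBehaviour
{
    [SerializeField] private Renderer[] cableSegments;
    [SerializeField] private Material poweredMaterial;
    [SerializeField] private Material unpoweredMaterial;

    // Hooked up to the button's UnityEvent in the inspector.
    public void SetPowered(bool powered)
    {
        var material = powered ? poweredMaterial : unpoweredMaterial;
        foreach (var segment in cableSegments)
            segment.material = material;
    }
}

// Listener that receives the boolean from the button and opens the door.
[RequireComponent(typeof(Animator))]
public class PuzzleDoor : MonoBehaviour
{
    private Animator animator;
    private void Awake() => animator = GetComponent<Animator>();

    public void SetPowered(bool powered) => animator.SetBool("Open", powered);
}
```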
Menu
Since we also wanted controller support for the game, we had to make the menus navigable with a controller as well. It was also the first time any of us had worked with Unity's new Input System, which led to some mistakes, such as missing the button that converts UI control to the new system.
Because I missed that button and didn’t know it existed until after the game was finished, I had to create my own system for controlling the menu with a controller. It’s not the prettiest solution or the easiest to work with, but it works, and given the time constraints we had, I’m still pretty happy with how it turned out.
The controller support system I created was based on using a Scriptable Object for each interactable menu element, where I manually defined all relevant information such as position, default slider values, and which Scriptable Object should become selected based on button inputs. I then implemented a manager script that handled input and kept track of the currently selected element’s Scriptable Object, allowing the player to interact with it.
Because this system was built on Unity’s built-in UI framework, the logic for what should happen when a button is pressed was implemented directly on the UI elements themselves. As a result, each Scriptable Object needed to be linked to its corresponding button when the scene loaded, enabling the button to be “pressed” programmatically.
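In outline, the setup resembled the sketch below. It is heavily simplified: the asset fields, the name-based linking, and the legacy input calls are my own placeholders rather than the project's actual implementation.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical per-element asset describing one interactable menu item.
[CreateAssetMenu(menuName = "Menu/Menu Element")]
public class MenuElement : ScriptableObject
{
    public string buttonName;                 // name of the matching Button in the scene
    public MenuElement up, down, left, right; // which element gets selected on d-pad input

    [System.NonSerialized] public Button linkedButton; // assigned at scene load
}

// Hypothetical manager: links the assets to their scene buttons, reads input,
// and "presses" the currently selected element programmatically.
public class MenuControllerManager : MonoBehaviour
{
    [SerializeField] private MenuElement[] allElements;
    [SerializeField] private MenuElement startElement;

    private MenuElement current;

    private void Start()
    {
        // Link each Scriptable Object to its corresponding button in the scene.
        foreach (var element in allElements)
        {
            var buttonObject = GameObject.Find(element.buttonName);
            if (buttonObject != null)
                element.linkedButton = buttonObject.GetComponent<Button>();
        }
        current = startElement;
    }

    private void Update()
    {
        // Input reading is simplified here; the project used the new Input System.
        if (Input.GetKeyDown(KeyCode.DownArrow)  && current.down  != null) current = current.down;
        if (Input.GetKeyDown(KeyCode.UpArrow)    && current.up    != null) current = current.up;
        if (Input.GetKeyDown(KeyCode.LeftArrow)  && current.left  != null) current = current.left;
        if (Input.GetKeyDown(KeyCode.RightArrow) && current.right != null) current = current.right;

        if (Input.GetKeyDown(KeyCode.Return) && current.linkedButton != null)
            current.linkedButton.onClick.Invoke(); // programmatic "press"
    }
}
```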
Lights
We also wanted to establish a specific atmosphere for the game, as it is set in a long-abandoned laboratory. Because of this, we wanted the lamps to flicker and blink, not only to enhance the atmosphere but also to help guide the player along the correct path, since flickering lights naturally draw more attention.
To achieve this, I created a simple tool that allows you to toggle whether a light should flicker between intensities randomly within a set time interval. The tool can also make the light turn on and off at random within a set time interval, and both features can be combined.
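A compact version of such a flicker tool could look like the sketch below; the intensity and interval ranges are placeholder values, and both behaviours can be toggled independently or combined.

```csharp
using UnityEngine;

// Hypothetical flickering-light tool: randomizes intensity and/or toggles
// the light on and off at random intervals within configurable ranges.
[RequireComponent(typeof(Light))]
public class FlickeringLight : MonoBehaviour
{
    [SerializeField] private bool flickerIntensity = true;
    [SerializeField] private bool blinkOnOff = false;

    [SerializeField] private Vector2 intensityRange = new Vector2(0.4f, 1.2f);
    [SerializeField] private Vector2 intervalRange = new Vector2(0.05f, 0.3f);

    private Light lightSource;
    private float nextChangeTime;

    private void Awake() => lightSource = GetComponent<Light>();

    private void Update()
    {
        if (Time.time < nextChangeTime) return;
        nextChangeTime = Time.time + Random.Range(intervalRange.x, intervalRange.y);

        if (flickerIntensity)
            lightSource.intensity = Random.Range(intensityRange.x, intensityRange.y);

        if (blinkOnOff)
            lightSource.enabled = Random.value > 0.3f; // mostly on, occasionally off
    }
}
```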