As part of our preparation for the games industry, BUas’s IGAD study program put us through a simulated pre-production phase.
I acted in the role of producer during the 8 weeks of this phase. It was my responsibility to guide the team, write the project plan, and enable the developers to work optimally.
A minimal overview of which milestones would be achieved on which dates.
I tailored the content of the milestones to follow an unconventional, piece-by-piece approach that complemented the structure of the game.
I had to produce a scope breakdown that comprehensively detailed the time and resource cost of the game's features.
I'm particularly proud of it: I built a custom breakdown model that merges the most fitting parts of several models used across different studios.
I made several schematics detailing the flow and dependency between tasks.
Only the most important processes received a pipeline at this early stage.
Articulating the main pipelines helped me customize the structure of both the High Level Plan and the Scope Breakdown. Both were restructured for a safer development process, taking advantage of the game's granular nature.
A multifaceted look at where our product could/would/should fit in the market as well as how we could make it more efficiently given our circumstances.
I prioritised completing those two artefacts early on as a team activity. This established a sense of security in the development process, and I knew they would help inform the creative vision by providing much-needed constraints.
As we were inexperienced in pre-production, some challenges proved tense and abstract. In those moments, I had to meet my leadership responsibilities and maintain peace and balance within the team.
In some instances, stakeholder feedback would land as a hard hit. As a result, team members would get angry with themselves or each other, and tension would rise and break out.
In other instances, people would grow frustrated with the work habits of some of their teammates and become bitter towards them.
Ultimately, the core team stuck together during the production of the game, proving my management efforts served their purpose.
I guided the team during our Sprint Planning Process, constantly rethinking and redesigning our production practices such that they fit our needs, capabilities, and general team dynamics.
I refined our practices to allow us to plan together quickly and align our goals for the sprint, then split to individually produce estimates and kick the sprint off.
We regularly communicated our progress to stakeholders. They would raise concerns and question our moves, forcing us to ground our decisions in solid reason.
My job was to shield the team from these external conflicts and allow them to work creatively. To do this, I maintained constant communication with our stakeholders, always verifying our next moves, thus providing peace within the team.
The brief required us to work externally with engine programmers who were developing a custom voxel engine.
If our project ever moved on to full production, it would be driven by whatever that team produced.
We would inform each other about changes in pipelines and creative plans, then make coordinated adjustments to meet our development goals.
At the end of Pre-Production, our project got approved to continue to full-scale production for the next part of the study program. It was on display during the university’s Playday, where many people got to play it and give their feedback which helped us fully develop the product later on.
Our little First Playable prototype at the end was warmly received as quick, fresh, and fun, thanks to how it let the two players gradually find their rhythm and pull off impressive cinematic combos.
The project then went into Production, where the challenge would be to preserve and execute our vision with the addition of 20+ team members.
This phase lasted 8 weeks during which I acted in the role of Gameplay Lead and Systems/Gameplay designer.
For a long time we had a major problem – stakeholders insisted it wasn’t fun to shoot and pass back and forth. The difficult part was defining the problem precisely.
I investigated and uncovered the actual problem: an imbalance in the frequency of events in the core loop – too much positioning, not enough throwing and catching.
Once I had a concrete problem definition, solving the issue became trivial. I sketched the solution out on the whiteboard.
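The kind of imbalance described here can be made concrete by tallying core-loop events from a session log. A minimal sketch in Python, where the event counts and phase names are illustrative placeholders rather than the project's actual telemetry:

```python
# Hypothetical sketch: tally core-loop events from a session log to expose
# an imbalance (too much positioning, too little throwing and catching).
# The log below is illustrative sample data, not real session output.
from collections import Counter

event_log = ["position"] * 70 + ["throw"] * 12 + ["catch"] * 10 + ["score"] * 8

def loop_balance(events):
    """Return each core-loop phase's share of all logged events."""
    counts = Counter(events)
    total = sum(counts.values())
    return {phase: round(n / total, 2) for phase, n in counts.items()}

print(loop_balance(event_log)["position"])  # → 0.7
```

Seeing one phase dominate the distribution turns a vague "it isn't fun" complaint into a measurable target to rebalance.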
A complaint about the game's speed surfaced from multiple unrelated sources – some said it was too fast, others that it was too slow.
As Gameplay Lead, I took control of the situation and gathered data to help us make our next move.
To save time, I had the testers simply run through our already balanced art-passed levels. Then I queried them about the experience on two axes – fun factor and accessibility – more info in the pictures.
As soon as patterns emerged and overlapped, I trusted my intuition and acted on it.
The end results seemed satisfactory across the board.
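Aggregating that two-axis feedback can be sketched briefly; the variant names and ratings below are illustrative assumptions, not the actual survey data:

```python
# Hypothetical sketch: average two-axis playtest feedback (fun factor,
# accessibility) per speed variant so overlapping patterns stand out.
# All names and numbers are illustrative placeholders.
from statistics import mean

responses = {
    "slow":    [(3, 5), (4, 5), (3, 4)],   # (fun, accessibility) per tester
    "current": [(4, 4), (5, 4), (4, 3)],
    "fast":    [(5, 2), (4, 3), (5, 2)],
}

def summarize(responses):
    """Average each axis independently for every variant."""
    return {
        variant: {
            "fun": round(mean(f for f, _ in ratings), 2),
            "accessibility": round(mean(a for _, a in ratings), 2),
        }
        for variant, ratings in responses.items()
    }

print(summarize(responses)["fast"])  # fast reads as fun but less accessible
```

Plotting each variant on those two axes makes trade-offs visible at a glance, which supports an intuition-led decision with data.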
There was an issue with the camera's viewing angle: to complement the game's visuals, it would have to tilt and reduce its angle to the ground.
Art wanted the camera to be less top-down so that more of the art could be shown, and at better angles. I advised caution, focusing on a particular gameplay concern: the readability of the XY plane, which contains the entire playspace.
Since the entire art department overruled me on that front, I opted for a fallback solution – making the hitboxes bigger. It worked like a charm.
Since we developed the final game in a different engine than the one we used for the first playable, a lot of what made the game feel good was lost.
In order to recreate the fun, I had to examine the prototype and take it apart using relative metrics.
I measured the size of the character relative to the screen and gauged its speed using the character width as measurement.
The process was quick and easy, but proved invaluable, as this was not the only time that well-tuned metrics would be invalidated by later changes.
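The relative-metrics idea can be sketched in a few lines: express size as a fraction of the screen and speed in character-widths per second, so the tuning survives a change of engine or units. All numbers below are illustrative assumptions, not the project's actual values:

```python
# Hypothetical sketch of relative metrics: engine-agnostic ratios derived
# from engine-specific units. Values are illustrative placeholders.

def relative_metrics(char_width, screen_width, speed):
    """Convert engine units into ratios that transfer between engines."""
    return {
        "size_ratio": char_width / screen_width,   # character vs screen
        "speed_in_widths": speed / char_width,     # widths per second
    }

def match_speed(char_width, target):
    """Solve for the new engine's speed that preserves the target ratio."""
    return target["speed_in_widths"] * char_width

# Prototype (UE4-style units) measured once, then reused as the target:
prototype = relative_metrics(char_width=100, screen_width=1920, speed=450)
new_speed = match_speed(char_width=8, target=prototype)  # voxel-engine units
print(round(new_speed, 2))  # → 36.0
```

Because the ratios are unitless, the same target values can be reapplied each time an engine or design change invalidates the absolute numbers.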
I developed and used a custom systems documentation framework.
The full breakdown is elsewhere.
In short, I cycle through multiple ‘system views’ which I complement with conceptual models to communicate complex information.
System View – architecture data about the system that relates strictly to the specific concerns at hand.
Conceptual Model – a visual abstraction that shows a system's elements interacting through their attributes.
Essentially, I zero in on a specific set of details that I visually represent using relatable abstractions.
The trick is to know which details to focus on. And that depends on who you’re presenting to.
Tech constraints made the project very dynamic – major changes were introduced frequently, invalidating previously balanced variables.
To manage this risk, I set up a playtesting framework with two playtesting approaches.
Rapid Iterative Testing and Evaluation – the RITE template was purposed for internal team testing with quick, easy setups. The focus was on player performance and qualitative feedback.
Large-Scale Testing – past the RITE process, a design had to be tested thoroughly so it could be ready for release. This phase would take into account additional variables, like user profiling cross-referenced with objective quantitative data collected by the engine during the session.
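The two approaches collect different kinds of records, which could be modelled as two session types: quick qualitative RITE notes versus large-scale sessions that cross-reference a user profile with engine-collected telemetry. The field names below are assumptions for illustration:

```python
# Hypothetical sketch of the two playtesting record types the framework
# distinguishes. Field names and sample values are illustrative.
from dataclasses import dataclass

@dataclass
class RiteSession:
    build: str
    observations: list        # qualitative notes on player performance
    tweak_applied: str = ""   # change made before the next quick iteration

@dataclass
class LargeScaleSession:
    build: str
    user_profile: dict        # e.g. tester experience level
    telemetry: dict           # quantitative data logged by the engine
    survey: dict              # structured questionnaire answers

rite = RiteSession("v0.3", ["overshoots the catch window"], "slower projectile")
large = LargeScaleSession(
    "v0.9",
    {"experience": "casual"},
    {"throws": 42, "catches": 31},
    {"fun": 4, "accessibility": 5},
)
print(round(large.telemetry["catches"] / large.telemetry["throws"], 2))
```

Keeping the two record shapes separate makes it explicit that RITE findings drive quick tweaks, while large-scale sessions feed release-readiness analysis.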
Initially, it was my job to recreate the movement dynamics and feel from the first playable made with UE4 in our custom voxel engine.
It was a matter of balancing the camera orientation and movement speed.
To do it effectively, I used the Rapid Iteration method, where I adjusted the values after each test.
My approach was to develop 3 presets, each mostly emphasising one aspect of the original movement.
Since a perfect conversion was near impossible, this let me see which aspect of the original movement would translate best to voxel-based gameplay.
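The preset approach can be sketched as a small table of overrides swapped in between test runs; the preset names and parameter values below are illustrative assumptions, not the actual tuning data:

```python
# Hypothetical sketch of 3 movement presets, each biased toward one aspect
# of the original UE4 feel. Names and values are illustrative placeholders.
PRESETS = {
    "responsiveness": {"accel": 60.0, "top_speed": 9.0,  "camera_pitch": 55},
    "momentum":       {"accel": 25.0, "top_speed": 12.0, "camera_pitch": 55},
    "readability":    {"accel": 40.0, "top_speed": 9.0,  "camera_pitch": 70},
}

def apply_preset(name, engine_settings=None):
    """Overlay one preset onto the current engine settings for a test run."""
    settings = dict(engine_settings or {})
    settings.update(PRESETS[name])
    return settings

# Rapid Iteration: swap presets between runs and note which one reads best.
print(apply_preset("momentum")["top_speed"])  # → 12.0
```

Because each preset isolates one aspect, a tester's preference points directly at the quality worth preserving in the final tuning.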
To address the stakeholder feedback, I needed to rebalance the projectile-related metrics: operating it had to become more accessible.
I strategised an approach:
– First, the other designers and I agreed on a rebalance.
– Then, I drew a table where the rows were the design pillars and the columns were the elements of the core gameplay loop. Each intersection held key questions and performance metrics.
– Lastly, I designed 3 environments and several survey questions to test all the setups.
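The pillars-by-loop table can be sketched as a generated matrix; the pillar and loop-element names below are assumptions for illustration, not the project's actual ones:

```python
# Hypothetical sketch of the test matrix: design pillars as rows, core-loop
# elements as columns, one key question and one metric per intersection.
# Pillar and element names are illustrative placeholders.
PILLARS = ["accessibility", "rhythm"]
LOOP_ELEMENTS = ["aim", "throw", "catch"]

matrix = {
    (pillar, element): {
        "question": f"Does {element} support {pillar}?",
        "metric": f"{element}_{pillar}_score",
    }
    for pillar in PILLARS
    for element in LOOP_ELEMENTS
}

cell = matrix[("accessibility", "throw")]
print(cell["question"])  # → Does throw support accessibility?
```

Generating every intersection up front guarantees no pillar/loop pairing goes untested, which is the point of drawing the table.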
In the end, the projectile proved to be a mostly intuitive tool for the large majority of testers, although the lack of visual and aural polish skewed the results.
Our project was displayed at Playday, where lots of people had the chance to play the game and enjoy its dynamic combat, appealing visuals, and catchy tunes.
We had two full level sets with 3 enemy types, alongside the entire player-character gameplay model working as intended.
Additionally, we had soundtrack and audio, several systems that complemented the main combat system (combo, leaderboard, etc.), and a local arena mode.