Charles Gbadamosi

AI Tools for Competitive Game Balance

For many developers, content balance is a bottleneck for scope. While flexible tools such as quest and level editors can be built to streamline content creation, balancing that content remains largely manual. Large-scale esports titles with access to vast quantities of player data can apply a variety of analytics techniques to draw conclusions about their games, such as the work by M. Aung et al., which used data from over 400,000 League of Legends players [1]. This data is invaluable for discovering and fixing balance issues in live content, but pre-release content, and less popular games - which are not necessarily smaller in content scale - do not have access to the same volume of data and cannot be balanced this way.


Our team encountered this problem first-hand on the mobile title “Dragon’s Watch RPG” - an online competitive 6v6 turn-based RPG with over 700 possible characters and a 300-quest single-player campaign. With a total development team of 8 and myself as the only designer, it became apparent in the few months leading up to release that a traditional balance approach was not feasible. To solve this problem, I built an abstract representation of the “Dragon’s Watch” combat system and simulated thousands of bot-vs-bot scenarios. This simulator assisted both in balancing single-player quest difficulty and in balancing individual characters for multiplayer by establishing character matchup win rates. Unfortunately, continued development of these tools was shelved as we moved on to other systems, but they allowed us to release a majority of the planned content in a balanced state and accelerated our content releases after launch.
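
As an illustration of that approach, the following is a minimal sketch of the matchup win-rate loop; `simulate_battle` is a hypothetical stand-in for the abstracted combat module, not the actual “Dragon’s Watch” code.

```python
import itertools
import random

def simulate_battle(team_a, team_b, rng):
    """Hypothetical stand-in for the abstracted combat module: a real
    implementation would step the turn-based combat rules to completion."""
    score_a = sum(rng.random() * c for c in team_a)  # placeholder outcome model
    score_b = sum(rng.random() * c for c in team_b)
    return "a" if score_a >= score_b else "b"

def matchup_win_rates(characters, trials=1000, seed=0):
    """Estimate a win rate for every 1v1 character pairing."""
    rng = random.Random(seed)
    rates = {}
    for a, b in itertools.combinations(characters, 2):
        wins = sum(simulate_battle([a], [b], rng) == "a" for _ in range(trials))
        rates[(a, b)] = wins / trials
    return rates

for (a, b), rate in matchup_win_rates([1, 2, 3], trials=500).items():
    print(f"character {a} vs character {b}: {rate:.1%}")
```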


Proposed Research

The research question of this proposal is: how might AI techniques be developed and used to assist in the balance of games in order to increase their content scope or reduce their development cost? Specifically, how might AI techniques assist in the balance of competitive games and esports, particularly those that feature teams of characters? Dr Jeremy Gow has agreed to supervise this research.


At a high level, the goal would be to build and experiment with a suite of AI tools that can be practically useful to game designers in game balance, either by combining and generalizing existing research or by developing new techniques. To this end, there are several research areas of interest, but the basic approach is to replicate current industry-standard balance workflows, usually performed on large volumes of human player data, by generating similar data sets using bots.
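
To make that concrete, a sketch under stated assumptions: if simulated matches are written out in the same record schema a live telemetry pipeline produces, the same downstream analysis can run on bot data and player data alike. The schema and `run_bot_match` below are illustrative assumptions, not an existing pipeline’s format.

```python
import csv
import random
import uuid

def run_bot_match(rng, roster_size=700, team_size=6):
    """Hypothetical bot-vs-bot match: draft two teams, return picks and winner."""
    team_a = rng.sample(range(roster_size), team_size)
    team_b = rng.sample(range(roster_size), team_size)
    winner = rng.choice(["a", "b"])  # placeholder for a full simulation
    return team_a, team_b, winner

def write_match_dataset(path, n_matches=1000, seed=0):
    """Emit bot matches in the same tabular schema as live match telemetry."""
    rng = random.Random(seed)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["match_id", "team_a", "team_b", "winner"])
        for _ in range(n_matches):
            team_a, team_b, winner = run_bot_match(rng)
            writer.writerow([uuid.uuid4().hex,
                             " ".join(map(str, team_a)),
                             " ".join(map(str, team_b)),
                             winner])

write_match_dataset("bot_matches.csv")
```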


Simulation-based approaches to balance, which attempt to mirror the vast data sets available to popular live titles by using bot players, are an interesting area for exploration. The degree to which a bot can convincingly play “like a human” can affect the reliability of simulation results, depending on the game being simulated.
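
One hedged way to make “human-likeness” an explicit, tunable quantity is softmax action selection over the bot’s action values, with a temperature parameter interpolating between near-optimal and noisy play; the action values here are placeholders for whatever evaluator the bot uses.

```python
import math
import random

def softmax_pick(action_values, temperature, rng):
    """Pick an action index: low temperature approximates optimal play,
    high temperature yields noisier, more 'human-like' mistakes."""
    scaled = [v / max(temperature, 1e-6) for v in action_values]
    peak = max(scaled)  # subtract the max for numerical stability
    weights = [math.exp(s - peak) for s in scaled]
    r = rng.random() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(weights) - 1

rng = random.Random(0)
values = [1.0, 0.8, 0.1]  # placeholder action values from the bot's evaluator
print(softmax_pick(values, temperature=0.05, rng=rng))  # almost always action 0
print(softmax_pick(values, temperature=5.0, rng=rng))   # close to uniform
```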


Some degree of game logic abstraction - reducing a game to its essential underlying rules and goals - is potentially helpful in building AI design tools, allowing them to manipulate content more easily by giving them access to models that reflect the parts of the game deemed interesting for balance. This was important for “Dragon’s Watch RPG”: the pure-data game logic for combat was refactored out into a separate module that was used by the balance tool, allowing it to run performantly while still accurately simulating the game. Defining abstract rule sets that closely model a game is a challenge that varies wildly between games, and there should be some consideration of practical methods by which such an abstraction can be defined. The degree to which a designer must alter their current development process to accommodate tools is a critical consideration for industry adoption, and this should be reflected in the design of such tools.
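
As a sketch of what such an abstraction might look like (the names and rules are illustrative, not the actual “Dragon’s Watch” module): the combat logic is expressed as plain functions over value types, with no engine, rendering, or networking dependencies, so a balance tool can drive it directly.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Character:
    """Pure-data character definition: no engine or UI references."""
    name: str
    max_hp: int
    attack: int
    defence: int

@dataclass(frozen=True)
class CombatState:
    """Changing combat facts, kept separate from static character data."""
    hp: int

def resolve_attack(attacker: Character, defender: Character,
                   defender_state: CombatState) -> CombatState:
    """One essential rule: damage is attack minus defence, minimum 1."""
    damage = max(1, attacker.attack - defender.defence)
    return replace(defender_state, hp=max(0, defender_state.hp - damage))

# The same module can serve both the game client and the balance simulator.
knight = Character("Knight", max_hp=100, attack=30, defence=10)
wyvern = Character("Wyvern", max_hp=80, attack=40, defence=5)
state = resolve_attack(knight, wyvern, CombatState(hp=wyvern.max_hp))
print(state.hp)  # 80 - (30 - 5) = 55
```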


Balance issues can often be the result of “degenerate” mechanic combinations: content that seems balanced prior to release is revealed to be unbalanced when used in a specific or unusual way that the game designers did not consider. In simple games where all possible action combinations can reasonably be explored, these issues can be discovered prior to launch given enough processing power. However, in the more interesting case where a game is too complex for all options to be considered, human creativity is arguably what leads to the discovery of such combinations. Getting agents in simulation-based balance tools to “creatively” seek out “degenerate” strategies is an interesting search problem. It may be beneficial to build agents that can learn from simulation results and adjust their strategies to intelligently seek out and abuse discovered balance issues, or to give designers degrees of control over the personalities and play styles of agents to target suspect areas.
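
As a rough sketch of that search problem, an exploit-seeker can be as simple as hill climbing over team compositions: mutate the current best team, keep mutations that raise the simulated win rate, and flag anything pinned far above parity for designer review. `evaluate_team` is a placeholder for the real simulator, and a real tool would also forbid duplicate picks.

```python
import random

def evaluate_team(team, rng, trials=200):
    """Placeholder for the simulated win rate of `team` against a
    reference field; a real tool would call the combat simulator."""
    strength = 0.4 + 0.01 * (sum(team) % 20)  # arbitrary stand-in landscape
    return sum(rng.random() < strength for _ in range(trials)) / trials

def find_exploit(pool_size=700, team_size=6, steps=500, seed=0):
    """Hill-climb over compositions, hunting degenerate combinations."""
    rng = random.Random(seed)
    team = rng.sample(range(pool_size), team_size)
    best = evaluate_team(team, rng)
    for _ in range(steps):
        candidate = team[:]
        candidate[rng.randrange(team_size)] = rng.randrange(pool_size)  # mutate one slot
        rate = evaluate_team(candidate, rng)
        if rate > best:
            team, best = candidate, rate
    return team, best

team, rate = find_exploit()
print(f"most suspicious composition {team} wins {rate:.1%} of simulations")
```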

There is also the possibility of using general video game playing agents, such as those discussed in the work of D. Perez-Liebana et al. [2], to lower the barrier to entry for industry adoption of a simulation-based tool by providing a one-size-fits-all AI solution that requires little or no game-specific training. Whether these tools are analytical or advisory is another consideration. In the case of “Dragon’s Watch RPG”, simulation results were used to guide design decisions, but the actual content balance changes were still performed manually. A more sophisticated tool might suggest new parameter values, or give a designer the ability to specify balance targets and automatically rebalance game parameters to reach those targets. Similar approaches to automated game balance have been explored before, such as the work by M. Moroşan and R. Poli [3], which focused on tuning character parameters using genetic algorithms. However, these have historically been tightly integrated into specific game projects rather than being generalized tools, and there remains the more general problem of assessing entirely new mechanics whose impact on the game may only be partially parametrically defined.
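
A minimal sketch of the advisory mode, assuming a single tunable parameter and a `win_rate` placeholder for the simulator: nudge the parameter by simple proportional feedback until the simulated win rate hits a designer-specified target. Approaches such as [3] search many parameters jointly with genetic algorithms; this one-dimensional loop only illustrates the idea.

```python
def win_rate(attack):
    """Placeholder: simulated win rate as a function of one character's
    attack stat; a real tool would run thousands of bot matches here."""
    return min(1.0, max(0.0, 0.5 + (attack - 30.0) * 0.01))

def rebalance(initial_attack, target=0.5, gain=50.0, steps=100, tol=0.005):
    """Suggest an attack value whose simulated win rate hits `target`."""
    attack = initial_attack
    for _ in range(steps):
        error = win_rate(attack) - target
        if abs(error) < tol:
            break
        attack -= gain * error  # proportional feedback on the win-rate error
    return attack

suggested = rebalance(initial_attack=45.0)
print(f"suggested attack: {suggested:.1f} (win rate {win_rate(suggested):.1%})")
```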


Critical to the construction of AI game design tools is the understanding that “balanced” is not synonymous with “fun”. How such a tool might need to consider “fun” when suggesting balance changes is also an area of interest.


Industry Engagement

Industry engagement is valuable for this project in two ways. Firstly, with a focus on practical application, it would be valuable to understand the established balance workflow of a popular esports title, such as a MOBA like League of Legends, in order to learn what types of data are most valuable in the balance process and what data science methodologies are applied, so that simulation-based tools can aim to fill the same role for content that has no such data. Large companies with successful esports titles, such as Riot Games and Blizzard, are at the forefront of data science in games and would be excellent candidates for this type of relationship. Observation of their existing data science practices and pipelines could yield valuable insight into the construction of new tools that either fit into or mirror existing workflows.

Secondly, an industry relationship where developed tools can be trialed on unreleased content would be beneficial for assessing their practicality and testing hypotheses, especially where the game has a large enough player base to perform traditional post-release balance analytics and compare the results against simulated balance analytics. In this case, a company that does not typically apply the same rigor to data-science-based balance might be more suitable, in that both sides of the process - tools for the generation and assessment of pre-release balance data, and the integration of tracking tools and assessment of post-release balance data - can be introduced at the same time and in the same way for a more direct experimental comparison. Companies such as Chucklefish, which have popular multiplayer game releases but a relatively small company size that does not yet include a large data science team, are good candidates for this second type of relationship. Chucklefish also has a close relationship with my former employer, The Secret Police, the developers of “Dragon’s Watch”, who are currently porting the Chucklefish title “Stardew Valley” to mobile, and would potentially be willing to work with me again.


References

[1] M. Aung, V. Bonometti, A. Drachen, P. Cowling, A. V. Kokkinakis, C. Yoder and A. Wade, “Predicting Skill Learning in a Large, Longitudinal MOBA Dataset”, Proceedings of the IEEE Conference on Computational Intelligence and Games (CIG), 2018.

[2] D. Perez-Liebana, S. Samothrakis, J. Togelius, T. Schaul and S. Lucas, “Analyzing the Robustness of General Video Game Playing Agents”, Proceedings of the IEEE Conference on Computational Intelligence and Games (CIG), 2016.

[3] M. Moroşan and R. Poli, “Lessons from Testing an Evolutionary Automated Game Balancer in Industry”, IEEE Games, Entertainment, Media Conference (GEM), 2018. DOI: 10.1109/GEM.2018.8516447.


Home institution: Queen Mary

Supervisor: Dr Jeremy Gow
