Stay informed

Sign up to our low-volume newsletter for news, viewpoints and events from Spirit AI.

Apply for access!

We are currently running an exclusive beta program. Enter your details and we’ll be in touch within 48 hours.

Protect and cultivate your online spaces through nuanced detection and intervention

Make in-game player communities, chatrooms and online social platforms safer and more inclusive environments. Using the power of machine learning and predictive analytics, Ally detects potentially abusive language and behaviours by monitoring all chat.

Protection that promotes player engagement.

The system takes a player-centric approach – after all, our interactions are nuanced. We may be fine with playful taunts and language from our friends, but from a stranger the context changes entirely. By combining player and community preferences, Ally can take quick action when needed.

Ally investigates abuse incidents further through a multi-stage triage process of analysis. We look at the deeper context of the whole conversation and of the history of interactions between the players. Ally highlights repeat perpetrators who display similar abusive behaviours on multiple occasions, and automatically compiles a dossier of evidence for your safeguarding team to act upon.
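
The repeat-perpetrator tracking described above could be sketched roughly as follows. This is an illustrative sketch only; all names, thresholds and data shapes are assumptions, not Ally's actual implementation.

```python
# Hypothetical sketch: group analysed incidents by player, and when the
# same abusive behaviour recurs, compile a dossier of evidence for the
# safeguarding team. Names and thresholds are illustrative assumptions.

from collections import defaultdict

REPEAT_THRESHOLD = 3  # assumed: occasions of the same behaviour before flagging

# player_id -> list of (behaviour, chat_excerpt) pairs
incident_log = defaultdict(list)

def record_incident(player_id, behaviour, chat_excerpt):
    """Log one analysed incident; return an evidence dossier once the
    player shows the same behaviour on REPEAT_THRESHOLD occasions."""
    incident_log[player_id].append((behaviour, chat_excerpt))
    matching = [e for b, e in incident_log[player_id] if b == behaviour]
    if len(matching) >= REPEAT_THRESHOLD:
        return {"player": player_id, "behaviour": behaviour, "evidence": matching}
    return None  # not yet a repeat pattern

record_incident("p42", "harassment", "msg 1")
record_incident("p42", "harassment", "msg 2")
dossier = record_incident("p42", "harassment", "msg 3")  # third occasion flags
```

The key design point is that isolated incidents only accumulate quietly; a dossier is produced only when a pattern of similar behaviour emerges.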

Create a loyal player community by making sure every player feels welcomed and included from the start. Ally maximises retention and ensures you deliver against your duty of care by safeguarding players against unwanted antisocial or abusive behaviours. And because our solution is centred on the player, asking them what’s safe, players feel fully engaged and free to explore.

The key components that power Ally

Multi-level modular system

We take a multi-level monitoring approach which can be tailored according to your architecture and needs.

Language and behavioural analysis

Based on what each player sees and experiences, our analysis enables Ally to identify stalking, begging or even non-verbal abuse, in both one-to-one and many-to-one scenarios.

Longer-term predictive analysis

We look for new patterns of behaviour, enabling you to find situations you may not have been looking for.


Triage Manager

Based on custom settings defined by the Community Safeguarding team, our Triage Manager decides how the system responds to each abusive scenario as it is detected.
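
A rules-driven triage step of this kind could look something like the sketch below. Every name, category and threshold here is a hypothetical illustration; the source does not describe Ally's actual API.

```python
# Hypothetical sketch: custom per-category settings, defined by a
# safeguarding team, map each detected abuse scenario to a response.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Incident:
    category: str        # e.g. "harassment", "stalking", "begging"
    severity: int        # 1 (mild) .. 5 (severe)
    repeat_offender: bool

# Custom settings: thresholds per category, tuned by the safeguarding team.
TRIAGE_RULES = {
    "harassment": {"mute_at": 2, "escalate_at": 4},
    "stalking":   {"mute_at": 1, "escalate_at": 2},
    "begging":    {"mute_at": 3, "escalate_at": 5},
}

def triage(incident: Incident) -> str:
    """Decide how the system responds to a detected abusive scenario."""
    rules = TRIAGE_RULES.get(incident.category,
                             {"mute_at": 3, "escalate_at": 5})
    if incident.severity >= rules["escalate_at"] or incident.repeat_offender:
        return "escalate_to_safeguarding_team"
    if incident.severity >= rules["mute_at"]:
        return "auto_mute"
    return "log_only"
```

For example, under these settings a moderately severe stalking incident escalates to the human team, while the same severity of harassment would only trigger an automatic mute.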

Community Safeguarding team

Intervene against abusers with automated actions from the Allybot, such as muting, automatic reminders of community rules, or suspension of the abuser, or with direct action from the Community Safeguarding team for the most serious abuse.

Context-aware reporting system

We can create a case file for further analysis by the community team, whether a player proactively reports an instance of abuse or responds to an Ally enquiry.