© 2019 Spirit AI Ltd. All rights reserved.
Ally helps community moderators intervene and disrupt the behavioural patterns that ruin your customers’ experiences. A small number of bad actors can change the dynamic of an entire community.
– Ally allows moderators to understand community behaviour in near real time.
– The power of Ally comes into its own when we consider modes of moderation.
– Chat moderation typically involves reviewing and moderating individual messages.
– For a decently sized community, the number of messages sent per day can easily be in the tens of millions.
– Ally allows moderators to evaluate behaviour at the conversation and player levels.
– Not only does this make moderation more effective, because your team can act on large chunks of the chat stream with targeted actions, it also makes moderators less likely to burn out.
Automatic interventions via webhooks trigger muting, rule reminders, or suspensions. The most serious abuse is brought to the attention of your team, who can take additional action against the worst actors in your community.
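A webhook-driven intervention like the one described above might be handled as follows. This is a minimal sketch: the payload fields (`severity`, `player_id`) and the action names are illustrative assumptions, not Ally’s actual webhook schema.

```python
# Minimal sketch of an automated intervention handler for webhook events.
# The payload fields ("severity", "player_id") and the action names are
# illustrative assumptions, not Ally's actual webhook schema.

def choose_action(event: dict) -> str:
    """Map a classified-behaviour event to a moderation action."""
    severity = event.get("severity", 0)
    if severity >= 9:
        return "escalate_to_moderator"  # worst actors go to a human
    if severity >= 6:
        return "suspend"
    if severity >= 3:
        return "mute"
    return "rule_reminder"

print(choose_action({"player_id": "p42", "severity": 7}))  # suspend
```

The thresholds here stand in for whatever policy your community guidelines define; the point is that routine cases resolve automatically while only the most serious reach a human.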
Ally uses a combination of artificial intelligence (AI), machine learning, natural language processing (NLP), and natural language understanding (NLU) to analyze any message stream. The insights surfaced by Ally have the power to transform the efficiency of your community moderation.
Toxic speech is only one aspect of negative behaviour. The algorithms behind Ally’s classification system have been trained on millions of messages. Ally recognizes and tags harmful conversational patterns and notifies your community managers.
Detection algorithms are tailored to reflect your community norms. Ally targets and tags both the positive and negative behaviour that impacts your customers most.
Don’t let a vocal minority be the whole story. Ally provides in-depth analysis of your whole community and represents the breadth of community sentiment. Bad actors are highlighted, as are the positive influencers who make your community an attractive and pleasant place to be.
All you need to do is push your message stream to Ally’s API. Ally’s insights into your community can then be accessed in the cloud via a web interface, an API, or deployed onto your own infrastructure via Docker containers.
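Pushing a message stream to an ingestion API typically means batching chat messages into JSON POST requests. The sketch below shows the general shape; the endpoint URL, bearer-token auth, and JSON field names are hypothetical placeholders, not Ally’s documented API.

```python
# Sketch of batching chat messages for an HTTPS ingestion API.
# The endpoint URL, auth scheme, and JSON fields are hypothetical;
# consult the actual API documentation for the real schema.
import json
import urllib.request

API_URL = "https://api.example.com/v1/messages"  # placeholder endpoint

def build_request(messages: list, api_key: str) -> urllib.request.Request:
    """Package a batch of chat messages as a JSON POST request."""
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
        method="POST",
    )

req = build_request(
    [{"channel": "global", "user": "p42", "text": "gg everyone"}],
    api_key="YOUR_KEY",
)
# urllib.request.urlopen(req) would send the batch.
```

Batching messages rather than posting them one at a time keeps request overhead manageable at the message volumes described earlier.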
Get a bird’s-eye view of the latest activity in your community. Seeing the changing patterns in the behaviour of your community members will help your moderation team find where to intervene quickly and efficiently.
It is critical to understand the context of a message when determining whether a conversation is positive or negative. Ally shows the whole conversation, allowing your moderators to be confident that both Ally’s and their own interpretation of the participants’ intentions is correct. We call these classified conversations “stories”.
With Ally, your moderation team can search and browse messages from the data stream. This lets them discover newly evolving patterns of behaviour and create new stories when needed. If these patterns persist, we can work with your customer service team and software to automate the classification process.
The actions performed by your moderation team can be audited within Ally itself, leaving you confident they are applying your community management guidelines rigorously and consistently.
Using AI to combat toxic behavior in game communities.
Smarter Than I: How Spirit Ally aims to tackle toxicity
Intel and Spirit AI to Battle Toxicity in Online Games
Like to know more about Ally?
Enter your details and we’ll be in touch within 48 hours.