Intel Is Using AI to Fight Toxic Voice Chat in Games
Intel is teaming with Spirit AI to provide automated voice chat moderation designed to curb toxic behavior.
SAN FRANCISCO - The rise of online gaming and livestreaming has made gamers and creators more connected than ever. But it has also opened the floodgates for toxic behavior.
And while there are plenty of manual and automatic tools for moderating foul play in chat rooms and comments sections, curbing toxicity becomes much more difficult when it comes to actual voice chat.
That's where Intel and Spirit AI come in.
The computing giant is teaming with Spirit, whose Ally software already moderates text chat automatically, to bring the same degree of filtering to voice chat so that developers can ensure a safe, inclusive environment in their online games.
Intel's voice integration with Spirit's technology is still in its early stages, but I saw a few demos that seem promising. Intel started by having Spirit's AI transcribe a produced NPR radio segment, simply to show that the technology can capture a clean voice recording nearly word for word.
Of course, actual in-game chat is rarely crystal clear, nor is it often polite. That's why Intel then showed Spirit's AI analyzing a voice clip from a heated League of Legends match, automatically flagging terms like "mentally retarded" that could be construed as hate speech.
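To picture how that kind of pipeline might fit together, here's a rough sketch in Python. Intel and Spirit AI haven't published an API, so every name here is purely illustrative: the idea is simply to run a voice clip through speech-to-text, then scan the transcript against a list of phrases a studio wants flagged.

```python
# Hypothetical sketch only -- not Intel's or Spirit AI's actual tooling.
# Step 1: transcribe the audio with whatever speech-to-text engine you have.
# Step 2: scan the transcript for phrases the developer has chosen to flag.

FLAGGED_PHRASES = {"mentally retarded"}  # example entries a studio might flag


def flag_transcript(transcript: str) -> list[str]:
    """Return any flagged phrases that appear in a chat transcript."""
    lowered = transcript.lower()
    return [phrase for phrase in FLAGGED_PHRASES if phrase in lowered]


def moderate_voice_clip(audio_clip: bytes, transcribe) -> list[str]:
    """Transcribe a voice clip with the supplied speech-to-text function,
    then flag any offensive phrases found in the result."""
    return flag_transcript(transcribe(audio_clip))
```

In practice you would pass in a real speech-to-text function (for example, `moderate_voice_clip(clip, transcribe=my_asr_engine)`); the point of the sketch is just the two-step shape of the process the demo illustrated.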
While Intel's Spirit voice integration still seems a ways off from wide availability, the company noted that it will ultimately be up to developers to decide how to implement it.
Kim Pallister, CTO for VR, gaming and esports at Intel, noted that one game maker might choose to issue a warning or ask the harassed player whether they want the offender banned, while another could ban offenders automatically the moment foul language is detected.
"Developers will have to learn over time what works best for their game community," said Pallister.
The company did note that there are some obvious challenges to overcome in automatically detecting and acting on real-world speech, including telling the difference between an offensive word and a game-specific term. But even if Intel and Spirit AI's detection technology takes a while to reach developers, it could eventually go a long way toward making online gaming a better place.
"Somebody has to be rolling up the sleeves and working on these kinds of problems if we’re going to make gaming welcoming to everyone," said Pallister.
Be sure to check out our GDC 2019 hub page for all of the latest gaming news and hands-on impressions straight out of San Francisco.
Mike Andronico is Senior Writer at CNNUnderscored. He was formerly Managing Editor at Tom's Guide, where he wrote extensively on gaming, as well as running the show on the news front. When not at work, you can usually catch him playing Street Fighter, devouring Twitch streams and trying to convince people that Hawkeye is the best Avenger.