FACEIT with Maria Laura Scuri
Oct 30, 2019 · 38m 35s
Description
Happy Halloween! Today, Jon Foust and Brian Dorsey chat with Maria Laura Scuri of FACEIT about ways they are reducing toxicity in gaming. FACEIT is a competitive gaming platform that connects gamers with competition and tournament organizers. To do this well, FACEIT has put a lot of energy into finding ways to keep the experience positive for everyone.
Because gaming toxicity can involve anything from verbal jabs to throwing a game, FACEIT uses a combination of data-collecting programs and input from players to help identify toxic behavior. In identifying this behavior, FACEIT has to consider not only the literal words spoken or actions taken, but the context around them. Is that player being rude to strangers, or are they egging on a friend? The answer could change the behavior from unacceptable to friendly banter. Using their own machine learning model, FACEIT then gives each interaction a score reflecting how toxic the player was in that match.
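For a concrete feel of what context-aware scoring might look like, here is a minimal Python sketch. The phrase list, weights, and class names are illustrative assumptions, not FACEIT's actual model, which is trained on their own data.

```python
# Hypothetical sketch of context-aware toxicity scoring: the same words are
# weighted differently between friends than between strangers.

from dataclasses import dataclass

# Illustrative flagged phrases; a real system would use a trained model,
# not a keyword list.
FLAGGED_PHRASES = {"uninstall", "trash", "throwing"}


@dataclass
class Interaction:
    speaker: str
    target: str
    text: str
    are_friends: bool  # whether speaker and target regularly play together


def interaction_score(interaction: Interaction) -> float:
    """Return a toxicity score in [0, 1] for a single interaction."""
    hits = sum(phrase in interaction.text.lower() for phrase in FLAGGED_PHRASES)
    base = min(1.0, hits / 3)
    # Banter between friends is discounted; the same words aimed at a
    # stranger keep their full weight.
    return base * (0.3 if interaction.are_friends else 1.0)


def match_toxicity(interactions: list[Interaction], player: str) -> float:
    """Aggregate a player's per-interaction scores into a match-level score."""
    scores = [interaction_score(i) for i in interactions if i.speaker == player]
    return sum(scores) / len(scores) if scores else 0.0
```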
These toxicity scores, together with their program Minerva, determine whether any bans should be placed on a player. FACEIT focuses on punishing the player's behavior, rather than the player themselves, in an effort to help players learn from the experience and change the way they interact with others in the future.
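Below is a rough sketch, with hypothetical thresholds and durations, of how match-level scores might feed an escalating, behavior-focused sanction policy like the one described; it is not Minerva's real logic. Escalating on repeated behavior rather than jumping straight to a permanent ban leaves players room to change how they interact.

```python
# Assumed sanction policy sketch: escalating bans driven by match-level
# toxicity scores. All thresholds and durations are illustrative placeholders.

from datetime import timedelta

TOXICITY_THRESHOLD = 0.7  # hypothetical cutoff for calling a match "toxic"

# Escalating ban durations tied to repeated toxic behavior.
ESCALATION_LADDER = [
    timedelta(hours=2),
    timedelta(days=1),
    timedelta(days=7),
]


def decide_sanction(match_scores: list[float], prior_offenses: int) -> timedelta | None:
    """Return a ban duration if the latest match crossed the threshold, else None."""
    if not match_scores or match_scores[-1] < TOXICITY_THRESHOLD:
        return None
    step = min(prior_offenses, len(ESCALATION_LADDER) - 1)
    return ESCALATION_LADDER[step]
```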
Maria’s advice to other companies looking to reduce toxicity on their platforms is to know the context of the toxic event: understand the ways toxicity can express itself on your platform and find ways to address each of them. She also suggests tackling the issue of toxicity in small portions and celebrating the small wins! Her final piece of advice is to focus on criticizing the user's behavior rather than attacking them personally.
Information
Author: Bill Scott
Organization: Bill Scott