Welcome or welcome back to Trust & Safety Lately, your monthly recap of online trust & safety news presented by Modulate. This month we've got T&S news in gaming, social media, global regulation, and cybersecurity.
Here's the lineup:
⚠️ Impacts of In-game Toxicity on Match Outcome
🚫 Andrew Tate: Gone But Not Forgotten on YouTube
Meta Struggles to Enforce Its Own Rules Around Manipulated Content
YouTube Goes "Free Speech"
🗣️ AI Deepfakes Target Company Leadership
🎮 Endorsement System Aims to Reduce Toxicity in Marvel Rivals
Ofcom's Latest Proposals
Why Care About Gaming in Online Safety Conversations?
Let's get started!
Data & Reports
⚠️ Impacts of In-game Toxicity on Match Outcome
Activision and Caltech have teamed up to study the effects of player-to-player toxicity in Call of Duty games, analyzing data from ToxMod to measure how harassment and negative behavior affect match outcomes and player engagement.
Players exposed to toxicity took longer to join another match -- anywhere from about 16 hours to 60 hours, depending on whether the toxicity came from opponents or teammates and whether the player won or lost their most recent match. Exposure to toxicity also increased the probability that a player would go on to use similar language themselves.
🚫 Andrew Tate: Gone But Not Forgotten on YouTube
While infamous misogynist and alleged human trafficker Andrew Tate has been banned from having his own YouTube channel since late 2022, his content continues to proliferate on the video platform through other accounts and re-uploads.
The Center for Countering Digital Hate released a report in June finding that 100 of the most-viewed YouTube videos showing Tate had amassed almost 54 million views -- from the last year alone. Some of the video content featuring Tate's misogynist rants was even used in advertisements viewable by teens and boys as young as 13. According to the report, more than half of the videos analyzed violated YouTube's policies on hate speech (read on for an update on YouTube's content moderation policy).
Meta Struggles to Enforce Its Own Rules Around Manipulated Content
"Inconsistent" and "incoherent:" two words that Meta's Oversight Board used to describe the company's approach to moderating AI-generated and manipulated media.
Engadget reports that the Board released its scathing statement in response to a post circulating on Meta's social platforms that claimed to be recorded audio of Iraqi politicians scheming to meddle in elections. Although the post was reported as misinformation, Meta never assigned the content to a human moderator for review, and it was allowed to remain up. The Board notes that the lack of effective measures against false information appearing and spreading on Meta's platforms is shocking, considering the company is among the world's leading tech giants.
YouTube Goes "Free Speech"
Following in the footsteps of X and Meta, YouTube is now asking its content moderators to favor "free speech" over reducing harm and false information.
The New York Times reviewed internal training materials, which were quietly rolled out in mid-December 2024 with no public announcement. Prior to this policy shift, YouTube would remove a video if one quarter of its duration contained content policy violations. Under the new policy, a video must contain violations across at least half of its runtime before removal is considered. Here's a peek at one training example from YouTube, according to The New York Times:
A video from South Korea featured two commentators talking about the country's former president Yoon Suk Yeol. About halfway through the more-than-three-hour video, one of the commentators said he imagined seeing Mr. Yoon turned upside down in a guillotine so that the politician "can see the knife is going down."
The video was approved because most of it discussed Mr. Yoon's impeachment and arrest. In its training material, YouTube said it had also considered the risk for harm low because "the wish for execution by guillotine is not feasible."
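To make the reported threshold shift concrete, here's a minimal sketch of how a runtime-share rule like this could be evaluated. The function and field names are illustrative assumptions, not YouTube's actual tooling, and real moderation decisions involve far more than a single percentage.

```python
from dataclasses import dataclass

@dataclass
class FlaggedSegment:
    start_sec: float  # start of a segment flagged as violating policy
    end_sec: float    # end of the flagged segment

def violating_share(segments: list[FlaggedSegment], video_length_sec: float) -> float:
    """Return the fraction of a video's runtime covered by flagged segments."""
    flagged = sum(max(0.0, s.end_sec - s.start_sec) for s in segments)
    return flagged / video_length_sec if video_length_sec > 0 else 0.0

def should_remove(segments: list[FlaggedSegment], video_length_sec: float,
                  threshold: float = 0.5) -> bool:
    """Per the NYT report: the old rule used roughly a 25% threshold, the new one 50%."""
    return violating_share(segments, video_length_sec) >= threshold

# Example: 40 flagged minutes in a 3-hour video is ~22% of runtime,
# below both the old 25% and new 50% thresholds in this simplified model.
example = [FlaggedSegment(start_sec=5400, end_sec=7800)]
print(should_remove(example, video_length_sec=3 * 3600))  # False
```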
🗣️ AI Deepfakes Target Company Leadership
A recent study from the Ponemon Institute reports that more executives are being targeted by deepfake attacks.
It's not just fake video and audio impersonating executives, either. Scammers are also pretending to be family members and reaching executives directly at their homes in an effort to extract money or sensitive data from their victims.
🎮 Endorsement System Aims to Reduce Toxicity in Marvel Rivals
Harassment and disruption are rampant in the popular Marvel Rivals game. Now its developers are launching a new endorsement system to address the issue.
Similar to the endorsement system in Overwatch 2, Marvel Rivals players will be prompted to "give credit" to fellow players at the end of each match, rating their sportsmanship and communication skills. These scores will factor into subsequent matchmaking. The idea is to incentivize better behavior -- though it could also end up lumping the most disruptive players into matches with one another.
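As a rough illustration of how endorsement scores might be folded into matchmaking, here's a hedged Python sketch. The weighting, rating scale, and field names are assumptions for illustration only, not the actual implementation in Marvel Rivals or Overwatch 2.

```python
from dataclasses import dataclass, field

@dataclass
class PlayerProfile:
    player_id: str
    skill_rating: float  # conventional matchmaking rating
    endorsements: list[int] = field(default_factory=list)  # post-match ratings, e.g. 1-5

    @property
    def endorsement_score(self) -> float:
        """Average of recent post-match ratings; neutral 3.0 if the player has none."""
        recent = self.endorsements[-20:]  # only the most recent matches count
        return sum(recent) / len(recent) if recent else 3.0

def matchmaking_key(player: PlayerProfile, behavior_weight: float = 50.0) -> float:
    """Blend skill with behavior so players with similar keys get grouped together.

    A side effect the article notes: the lowest-scored (most disruptive)
    players tend to end up in lobbies with one another.
    """
    return player.skill_rating + behavior_weight * (player.endorsement_score - 3.0)

# Example: two players with equal skill but very different endorsement histories
# land far apart in the queue.
polite = PlayerProfile("p1", skill_rating=2400, endorsements=[5, 4, 5, 5])
toxic = PlayerProfile("p2", skill_rating=2400, endorsements=[1, 2, 1, 1])
print(matchmaking_key(polite), matchmaking_key(toxic))  # 2487.5 vs 2312.5
```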
Ofcom's Latest Proposals
The BBC reports on Ofcom's latest round of proposals to stop illegal content from going viral, encourage adoption of proactive prevention tools, and strengthen child protection.
Among Ofcom's recommendations are proposals that online platforms include a user reporting function to flag depictions of physical harm in livestreams. Large platforms should adopt proactive detection and removal technologies to address content deemed harmful to children. Ofcom is seeking public responses to these proposals now through October 20, 2025.
Why Care About Gaming in Online Safety Conversations?
An estimated 80% of children between the ages of 5 and 18 play video games, and as screen time for children and teens continues to grow, the World Economic Forum writes that safety in gaming is the cornerstone of building a safer overall online environment for kids.
The Forum shares perspectives on two recent reports from Xbox and k-ID. For Xbox's part, a relatively new safety toolkit empowers parents with educational materials that go beyond putting the onus on users to report bad behavior and content. Meanwhile, k-ID gets kudos for creating age-assurance technology that helps gaming platforms ensure they're reaching the right audience for their games.