Welcome back to Trust & Safety Lately, your monthly recap of all things trust and safety, with a particular focus on the gaming industry.
A quick note that Modulate will be heading to San Francisco next month for GDC. Stop by our booth on the show floor, sit in on a panel talk, or meet us at one of our networking events. See you soon at GDC!
In this month's issue:
☯️ eSafety Commissioner's New Study
🚫 Grooming in Games
⚖️ SCOTUS on Social Media Content Moderation
⤴️ New Trust & Safety Leadership at Social Media Giants
⭐ Yelp's Latest Transparency Report
💸 Meta and TikTok Push Back
☢️ No PVP, No Problem!
February might have been a short month, but there's no shortage of news, so let's jump (leap?) into it!
Data and Reports
☯️ The Good and the Bad: eSafety Commissioner's New Study
Australia's eSafety Commissioner released new research this month on young people's experiences in online multiplayer games, and the findings cut both ways. First, the bad:
"Between 28% and 65% of young gamers who experienced negative behaviour while gaming said they were adversely impacted. For example, almost 2 in 3 (65%) young gamers who received or were asked to send nude images or sexual information, and 1 in 2 (52%) who had experienced hate speech, reported at least one negative impact."
It's easy to be all doom and gloom when talking about online safety, especially for young people, so we're glad the study also touches on the positives of social gaming:
"Most (94%) young people surveyed described positive feelings associated with their online gaming and reported that it benefited their skill development and learning (76%), social connections (58%) or emotional wellbeing (41%)."
🚫 Grooming for Violence and Sexual Exploitation in Games
Early in February, the Global Network on Extremism and Technology published Grooming for Violence: Similarities Between Radicalisation and Grooming Processes in Gaming Spaces. Authors Rachel Kowert and Elizabeth D. Kilmer outline how gaming environments prime users for relationship building, often in unmoderated spaces. Relationship building through shared interests like gaming is not inherently nefarious, but it is one building block in the pathway to grooming for both radicalization and sexual exploitation. As anyone working in the online games space can tell you, this is a complex, multi-modal issue with no easy solution. Kowert and Kilmer share some guidance in their conclusion:
"... companies should examine their games’ community for norms that may make it easy for perpetrators to hide and groom potential targets."
It also goes without saying:
"... normalisation of hate and harassment in many gaming communities may make it easier for perpetrators to operate undetected."
Industry News
⚖️ SCOTUS Considers Content Moderation
Can states prevent social media companies from removing or moderating false, harmful, objectionable, or hateful content on their own platforms? This past Monday, the US Supreme Court heard oral arguments on two state laws out of Texas and Florida that may shape the future of the internet as we know it. Unsurprisingly, both sides point to the First Amendment's protection of free speech as grounds for (or against) the legality of the controversial laws. NPR has a helpful roundup of takeaways from Monday's SCOTUS session, but here's the scoop:
Censorship or editorial freedom: Portraying content moderation as censorship, solicitors general from Texas and Florida argue that social media is a public forum and that social media companies shouldn't be allowed to remove or stifle content based on political views. NetChoice, an association representing the tech companies under scrutiny, argues that censorship is a mischaracterization of content moderation practices -- instead, these companies are exercising editorial freedom.
Justices grappled with applying legal precedents: Should social media platforms be considered more like newspapers, which have editorial freedom, or more akin to "common carriers" like phone service providers, which cannot discriminate against customers based on their political views?
Another key point was Section 230: In Monday's hearing, the Justices discussed how the Texas and Florida laws intersect with Section 230 of the Communications Decency Act of 1996, which protects platforms from liability for content published by their users. SCOTUSblog summarizes an exchange between Justice Gorsuch and Paul Clement, representing NetChoice:
"Justice Neil Gorsuch told Clement that, in his view, there is a tension between the idea that a tech company can’t be held liable for its users’ speech and the idea that moderating that content is the tech company’s speech. Is it speech for purposes of the First Amendment, he asked, but not for purposes of Section 230?"
⤴️ TikTok and Bluesky Announce New Trust & Safety Leadership
CNN reports that TikTok has a new Head of Global Trust & Safety: Adam Presser will take over the function, while two other senior executives -- Theo Bertram, VP of government relations and public policy in Europe, and Rich Waterworth, GM of EMEA operations -- have announced their departures. These leadership changes follow CEO Shou Chew's promise to invest $2 billion in trust and safety, made during his late-January testimony before the US Senate.
The same day, Bluesky added Aaron Rodericks, formerly a co-lead of trust & safety at X, to their team. Rodericks worked on X's election integrity team until Elon Musk axed half of it.
In a statement on his hiring, Rodericks says, "People expect social media to provide a healthy level of built-in moderation, with clearly stated rules that are applied consistently. However, we’ve seen that this alone is not enough. Communities also need the ability to self-organize around more opinionated moderation principles and have the tooling to keep these efforts sustainable. I’m excited that Bluesky is taking both of these layers seriously, and I believe that their fresh approach to user choice with stackable moderation is poised to become a key part in guiding and growing healthy online conversations." TechCrunch has more.
⭐ Yelp's 2023 Transparency Report
Yelp published their 2023 Transparency Report this month, outlining safety and moderation practices through December 2023. Yelp uses large language models to help detect fraudulent accounts and reviews, as well as other content that violates its Terms of Service. Of the 22 million reviews contributed to the platform in 2023, 4% were removed by Yelp moderators, and more than 278,600 user accounts (and their associated reviews) were closed for Terms of Service violations, including suspected deceptive or abusive behavior.
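For the curious, here's a minimal sketch of what LLM-assisted review screening can look like. Yelp hasn't published the details of its pipeline, so the model, prompt, and API below (an OpenAI-style chat completion) are illustrative assumptions, not Yelp's actual system:

```python
# Hypothetical sketch of LLM-assisted review screening -- illustrative only,
# not Yelp's actual pipeline. Assumes the `openai` package (v1+) is installed
# and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

PROMPT = (
    "You are a content-moderation assistant. Classify the review below as "
    "GENUINE or SUSPICIOUS (likely fake, incentivized, or abusive), then "
    "give a one-sentence reason.\n\nReview: {review}"
)

def screen_review(review: str) -> str:
    """Ask the model to flag a review; a human moderator makes the final call."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "user", "content": PROMPT.format(review=review)}],
        temperature=0,  # keep classification output stable
    )
    return response.choices[0].message.content

print(screen_review("Best pizza ever!!! Also visit my site for free coupons..."))
```

In a production system, flags like these would typically route to human review queues rather than trigger automatic removals.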
💸 Meta and TikTok Sue the European Commission
With the EU's Digital Services Act (DSA) in full swing, Meta and TikTok don't want to pay the fees the European Commission charges to fund enforcement. Under DSA rules, social media platforms operating in the EU are required to pay up to 0.05% of annual profits toward DSA enforcement costs -- for a company like Meta, that could mean up to $11.8 million this year. PCMag has more.
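For a sense of scale, the arithmetic behind that figure works out as below. Note that the profit number is simply back-calculated from the reported $11.8 million, not a published Meta financial:

```python
# Back-of-the-envelope check of the DSA supervisory fee (up to 0.05% of
# annual profit). The profit figure is back-calculated from the article's
# $11.8M number -- an inference, not a reported financial result.
FEE_RATE = 0.0005                     # 0.05% expressed as a fraction
max_fee = 11.8e6                      # reported maximum fee in USD
implied_profit = max_fee / FEE_RATE   # profit level that would yield that fee
print(f"Implied annual profit: ${implied_profit / 1e9:.1f}B")  # -> $23.6B
```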
☢️ Preventing Toxicity: That's One Way to Do It (Maybe)
The new third-person shooter Helldivers 2 released on February 8 and has received generally positive reviews. Some players even compared the game's early success to the Call of Duty franchise, suggesting that player vs. player (PVP) combat be added to Helldivers 2. In response, Johan Pilestedt, CEO and creative director of developer Arrowhead, said on X that PVP isn't coming -- keeping it out of the game is meant to keep toxicity out of the community.
We're all for keeping gaming environments fun and supportive, but you have to wonder whether this approach to player protection actually works. GamesRadar reports that players are still finding ways to be less than kind to one another, even without a PVP mode.
Industry Events
Modulate is heading back to the annual Game Developers Conference this year! We've got two talk sessions lined up about emerging tech and how data can influence the way you approach community safety. Plus, join us at one of a handful of networking and social events we're hosting or just swing by our show floor booth. See you soon!
Hosted by the Trust & Safety Professional Association, this year's TrustCon will be here before you know it. You can register now -- keep an eye out for the full lineup of speakers and talks coming later this year.
Just announced, the third annual Trust & Safety Research Conference is back! The conference brings together trust and safety researchers and practitioners from academia, industry, civil society, and government. Submit proposals by April 30 and register beginning June 2024.
Modulate hosted a virtual chat with experts from Schell Games about the impact of content moderation in the popular Among Us VR game. If you missed the chat, click the button below to watch a recording!