Welcome back to Trust & Safety Lately, your monthly recap of all things trust & safety! From regulation to Supreme Court opinions on the legality of content moderation and platforms updating safety tools and policies, we've got it all right here -- plus, reach out to connect with the Modulate team at devcom/gamescom this year!
Without further ado, in this issue of Trust & Safety Lately we're covering:
Ask the Industry: Combating Extremism and Toxicity in Games
The TrustLab 2024 Online Safety Report
Roblox has a problem (still)
The Uphill KOSA Climb
NetChoice opinion
eSafety Commissioner's deadline
Snapchat safety measures
Fortnite has its own problems
Valorant updates its enforcement
Origins of video game addiction
Let's get started!
Data & Reports
Ask the Industry: Combating Extremism and Toxicity in Games
Games research experts Elizabeth Kilmer, Zeynep Aslan, and Rachel Kowert published findings from game industry professional focus group interviews on the prevalence, challenges, and solutions around toxicity and extremism in gaming. Focus group participants were asked questions like "What do you think is the 'root cause' of extremist behavior in games?" and "What do you think is your/your team's responsibility to combat extremism in [GAME]?" Major themes include:
Games can and should be designed with positive player behavior in mind.
Modeling cultural norms through character design and representation.
The use of platform moderation services.
Rewarding players who display positive and prosocial behaviors.
One overwhelming sentiment:
"... the need for increased understanding of the critical role community management plays in shaping community norms. Community management roles and teams were consistently reported as being misunderstood, isolated, underfunded, and excluded from much of the development process."
The TrustLab 2024 Online Safety Report
This report shines a light on the challenges, emerging threats, and new solutions that Trust & Safety professionals encounter on the frontlines of preventing online harms. According to interviews with over 100 T&S professionals, universal metrics for online safety still don't exist, and third-party tools don't always get the job done. We also see a familiar refrain: investment in T&S capabilities often only happens after something has already gone awry. Talk about feeling seen and heard!
Industry News
Roblox has a problem (still)
Bloomberg released a disturbing but necessary report on child predation in Roblox. With over 78 million daily active users and 40% of those users being preteens, Roblox has become one of the world's largest virtual social spaces used by children and teens -- and this comes with its own safety challenges.
Bloomberg's lengthy piece gives insights into the safety team at Roblox through interviews with anonymous former employees, many of whom claimed that safety was nothing more than a "watchword." In cases that led to detection and law enforcement involvement, predators often used the in-game currency Robux to lure and groom victims. Bloomberg also reports that Roblox's daily active users and overall revenue have grown alongside reports of suspected child exploitation.
The report is alarming to say the least, so proceed to the full piece with caution.
Following the Bloomberg story, Roblox provided responses to Eurogamer alongside a blog post from chief safety officer Matt Kaufman, saying the original piece failed to account for the sheer size of the Roblox community and the complexity of keeping players safe at that scale. Kaufman also points to the millions of otherwise positive interactions players have with one another every day.
The Uphill KOSA Climb
Since February 2022, the US Kids Online Safety Act (KOSA) has made its way through legislative committees and subcommittees, with a lot of early support from advocacy groups. While the idea of "protecting children online" has overwhelming support, the specifics of a law to achieve that goal have been called into question by both the left and the right.
What are the major pieces of pushback from critics? Both LGBT advocates and conservatives argue that vague language in the bill would give states too much latitude in determining what constitutes "harmful content" -- is a post about how to find gender-affirming care in your town considered harmful, or is a post questioning the legitimacy of elections dangerous? Another rarely discussed piece of the bill is Section 4.3, which calls for maximizing both the privacy and the safety of children. Sounds nice! But is it feasible?
Grieving parents are among the most vocal supporters of KOSA, arguing their children have been harmed or worse by internet content, specifically social media content.
The New York Times shares a summary of the bill's history, reiterating the strange alignment between left and right in opposing KOSA:
"While many child advocacy groups, teachers and parents voiced early support, free speech groups quickly raised concerns. They argued that the bill would give social media sites and regulators too much latitude to define what was harmful to youths."
Next up for KOSA: the House of Representatives.
NetChoice opinion
In early July, the US Supreme Court published its opinion on two state-level laws that seek to ban social platforms from moderating content that appears on their sites. Writing for the majority, Justice Kagan signaled a lean toward interpreting algorithm-driven content delivery and content moderation as forms of expression protected by the First Amendment. In other words, a social platform's ability to moderate content is itself a form of speech, which ought to be protected.
The Verge shares more on the implications of NetChoice from early July.
eSafety Commissioner's deadline
The October 3, 2024 deadline is approaching for internet platforms to create and share policies for preventing children from seeing pornography and other inappropriate material. Reuters has more.
Plus, read this handy breakdown on The Conversation, which calls out the age-old privacy vs. safety debate, parental controls, and age verification/assurance tools, concluding:
"However, it would require tech firms to work together to implement an integrated and comprehensive set of safety measures to enhance online child safety. That goal is laudable and may well be achievable. However, whether it can be done in just six months remains to be seen."
Snapchat safety measures
Snapchat announced its newest safety tools and measures to protect teens from sexual exploitation. Features include expanded in-app warning popups and enhanced blocking capabilities. Previously, a blocked user could simply make a new Snapchat account to get around a block -- now all Snapchat accounts made on the same device will be automatically blocked.
These new safety tools come as the FBI warns of increased cases of teen sexual exploitation.
Fortnite has its own problems
While Roblox struggles with child safety, Fortnite has its own challenge: extremist and violent content. The Global Project Against Hate and Extremism shared recent findings uncovering over a dozen player-generated game modes centered around antisemitic and extremist imagery or ideology, including one game map that recreates a World War II-era concentration camp. Despite Fortnite's content policies barring hate symbols and depictions of real-world terrorist or violent organizations, the problem persists. More from Wired.
These challenges aren't unexpected, especially for a game as large as Fortnite where user generated content is produced so quickly and prolifically. But transparency and stories like this are still important for the industry at large.
Valorant updates its enforcement
Over two months after popular streamer Taylor Morgan shared a recording of abuse she received over voice chat while playing Valorant, Riot Games published its latest patch in hopes of improving policy enforcement.
"Behavioral Intervention Update: People who engage in comms abuse will experience increased penalties."
Origins of video game addiction
From The Guardian: maybe there are factors outside of kids themselves and the games they play that lead to compulsive video game addiction. Offering alternative perspectives on a story originally published in The Observer magazine, games editors Keith Stuart and Keza MacDonald point to overly anxious parents and caregivers tracking kids' every movement through smartphones, excessive surveillance in public spaces like parks, and a lack of reliable access to mental health resources for young people. It's no wonder, they write, that "teens withdraw to online video game worlds, the last spaces they have left that remain unmediated by their parents or other authority figures -- the last places where they are mostly beyond the reach of adult control."
Industry Events
Devcom / Gamescom 2024
August 18-25, 2024
Cologne, Germany
Modulate is headed to devcom and gamescom later this month! Devcom is the official game developer event of gamescom and Europe's biggest community-driven game developer conference.