Welcome to the first edition of Trust & Safety Lately, your source for industry insights, updates, and thought-provoking content — from legal regulation impacting the gaming industry and other online social spaces to the latest data and reports to inform your work.
Here's what we're covering today:
🏛️ Upcoming internet regulations and how they'll affect game studios
👪 The concerns parents have for their children online
We're excited to be here with you. Let's get into it. 🔸
The Upcoming Wave of Regulations
Over a dozen new and pending internet trust and safety regulations are slated to seriously impact game developers in the near future, from the United States and the EU to Australia, Ireland, the U.K. and Singapore. The regulations target hate speech, harassment and misinformation driven by major world events, including COVID-related misinformation, potential election influence and the rise of white supremacist extremism. Privacy laws are also being updated, with California adopting the Age-Appropriate Design Code Act (modeled on the UK's Children's Code) and changes to COPPA under discussion at the federal level.
And as the DSA and other regulations begin kicking into force in 2024, experts expect enforcement to become only more common. Unfortunately for some studios, “No one reported it! We didn’t know there was illegal content! 🙈” won’t cut it anymore. Today, regulators and consumers are looking for evidence that studios are taking the problem seriously, with a focus on harm reduction. In other words, game studios must now proactively minimize harm on their platforms, since they could be liable for that harm even if users never report it. 🔸
🙉 Who’s moderating the moderators? The final word from Australia’s eSafety Commissioner
eSafety Commissioner Julie Inman Grant announced final decisions on world-first industry codes to regulate online content. The major takeaway? Inman Grant declined to take up drafted codes created by games industry companies “due to the failure of the codes to provide appropriate community safeguards, which is a requirement for registration.”
😑 Not so surprising: Extremism proliferates in gaming spaces
Ask anyone who’s played a multiplayer game in the last decade and they’ll tell you radicalism and identity-based hate aren’t new. The Times follows up on NYU Stern’s report on extremism in gaming with an apt summary of the issue at hand: “In any case, with three billion people playing worldwide, the task of monitoring what is happening at any given moment is virtually impossible.”
🍗 More shakeups: Twitter’s head of trust and safety has resigned
Twitter’s head of trust and safety, Ella Irwin, resigned earlier this month after taking over the role from Yoel Roth in late 2022. Her departure follows drastic cost cutting and layoffs that hit teams working to prevent harmful and illegal content, protect election integrity, and surface accurate information on the site.
🔦 White House shines a spotlight on harmful effects of social media
We’re seeing more and more government entities like the White House turning their attention to the online sector, especially as young people grow up with technologies and social avenues that weren’t available to previous generations.
Will Meta release its open-source LLM more widely? It seems Meta is unfazed by Congressional questioning on generative AI and by moves toward creating an AI regulatory body.
NYU Stern Center for Business and Human Rights: “Gaming the System: How Extremists Exploit Gaming Sites and What Can Be Done to Counter Them” The NYU Stern Center’s representative survey, conducted in January 2023, found that 51% of gamers in five of the top video game markets globally had come across some form of extremist statement or narrative while playing multiplayer games in the past year.
(Relatedly: ToxMod now detects violent radicalism and extremism in games.)
Teleperformance: “Protecting Children Online: The Importance of Content Moderation” A 2021 report by the Pew Research Center found that 81% of parents in the United States are concerned about the potential risks their child may face online, including exposure to inappropriate content and online predators.
U.S. Surgeon General: “Social Media and Youth Mental Health” The Surgeon General's advisory on social media found that 64% of adolescents are “often” or “sometimes” exposed to hate-based content, and almost 75% of adolescents say social media sites are doing a fair to poor job of addressing online harassment.