Tune into our upcoming webinar with Tech Coalition on navigating child safety online
Trust & Safety Lately

Welcome back to Trust & Safety Lately, and a special welcome to those of you who joined us through our friends at the Everything in Moderation newsletter! November was a busy month in the trust & safety space, from Xbox's latest transparency report to the state of trust & safety as a service. Plus, as big tech companies continue to slash their T&S teams and functions, where are those experts going?

 

We've got all this and more:

  • 📬 Xbox releases its Q3 2023 transparency report
  • ✨ Kids want video games this holiday season
  • 🤖 Roblox faces lawsuit
  • 📣 Former head of Twitter's trust & safety speaks out
  • 🚨 New Ofcom rules impacting organizations operating in the UK
  • 🚧 Trust & safety as a service?
  • 💃🏽 Upcoming events -- tune into our Modulate x Tech Coalition webinar

Strap in and let's get into it! 🔸


Data and Reports

 

📬 Xbox Transparency Report

Xbox published their third transparency report this month. From Xbox Wire, here are a few top takeaways:

  1. "87% (17.09M) of total enforcements this period were through our proactive moderation efforts." 
  2. "...over 4.7M pieces of content were blocked before reaching players, including a 135k increase (+39% from the last period) in imagery..."
  3. "Early insights indicate that the majority of players do not violate the Community Standards after receiving an enforcement and engage positively with the community."

While Xbox is leaning into proactive, tech-driven content moderation tools, Microsoft is also taking steps to offer "more traditional" safety resources to its Minecraft players. Take, for example, the new Official Minecraft Server List, developed in partnership between Minecraft and GamerSafer to point users (especially underage players and their parents) toward third-party servers that meet the Minecraft Usage Guidelines.

 

✨ ESA Survey Shows What Kids Want

A timely survey of Americans ages 10-17 from the Entertainment Software Association shows that video games are overwhelmingly the top choice for holiday gifts, cited by 72% of respondents. The ESA also shares some tips for parents to help keep kids safe, like following ESRB ratings, using parental controls, and having a conversation about appropriate online behavior.


Industry News 

 

 

🤖 Roblox Faces Class-action Lawsuit 

Parents in California have filed a class-action lawsuit against Roblox Corporation, alleging "intentional and negligent misrepresentation, unjust enrichment, violations of California’s Unfair Competition Law, False Advertising Law, Consumer Legal Remedies Act, and State Consumer Protection Acts." The suit also raises concerns about the ease of in-app purchases. While further details haven't been made public yet, the case certainly underscores the increased attention being paid to this space. Read the full press release here.

 

Unsurprisingly, the basis of this lawsuit echoes findings from a recent survey by WebPurify, which found that parents of underage gamers are most concerned about multiplayer games and user-generated content (UGC).

 

📣 Twitter's Trust & Safety (back then)

Twitter's former head of trust & safety, who goes by the name Del Harvey, gave an exclusive interview to Wired this month. Harvey recounts her time at Twitter, which was just a scrappy startup when she first joined the company. There's a lot to unpack, from decision-making around geopolitical strife to processes for handling election misinformation. Not least of all, here's a quote from Harvey that will surely resonate with anyone working in trust & safety:

 

"When trust and safety is going well, no one thinks about it or talks about it."

 

🚨 Ofcom Publishes First Set of Proposed Guidelines

UK telecommunications regulator Ofcom released its first set of compliance proposals earlier this month. The biggest shift for any company operating online social or UGC spaces (online and mobile games, social media, and even adult sites) is the requirement to implement proactive policies to quash the spread of content like CSAM, extremist messaging, violent content, and revenge porn, to name a few. This is only the first stage of a multi-stage rollout of regulatory guidance from Ofcom as it prepares to enforce the UK's Online Safety Act. And if companies don't comply? From The Verge: "If a firm is found to be in breach, Ofcom can levy fines of up to £18 million (around $22 million) or 10 percent of worldwide turnover — whichever is higher. Offending sites can even be blocked in the UK."

 

🚧 Trust & Safety as a Service? 

Also from Wired: are big tech companies turning to external service providers to moderate content and guide policy on their platforms... even as they gut internal trust & safety teams and eliminate ethics functions? There's certainly room for growth, as the "UK Department for Science, Innovation and Technology estimated that 'Safety Tech,' which includes everything from content moderation to scam detection, was on track to hit £1 billion ($1.22 billion) in revenue by the mid-2020s." Increasingly, safety tech means machine learning tools that help humans make more (and more accurate) moderation decisions. Scale is great, but on the flip side, a single error in a machine learning model can mean widespread issues for every customer using that tool.

 

So if big tech is ditching internal trust & safety, where are these former content moderation and trust & safety experts going? CNBC also covers the industry shift, speaking with former Meta and Google employees who are taking advantage of the void in the market to grow (and sell) the content moderation piece of trust & safety.


Industry Events


Play Safe, Play Smart: Navigating Child Safety in Gaming Communities

December 7, 2023 | Virtual

Register here

 

Join Modulate's CEO and Co-founder Mike Pappas for a free webinar on all things content moderation and child safety in gaming. Mike is joined by Lauren Tharp, Technology Innovation Lead at The Tech Coalition, which brings member organizations together to stop child sexual exploitation and abuse online.

 


Modulate, 212 Elm St, Suite 300, Somerville, MA 02144, USA
