🤖 Roblox Faces Class-action Lawsuit
Parents in California have filed a class-action lawsuit against Roblox Corporation for alleged "intentional and negligent misrepresentation, unjust enrichment, violations of California’s Unfair Competition Law, False Advertising Law, Consumer Legal Remedies Act, and State Consumer Protection Acts." The lawsuit also raises concerns about how easy it is for children to make in-app purchases. While further details haven't been shared publicly yet, the filing underscores the increased attention being paid to this space. Read the full press release here.
Unsurprisingly, the basis of this lawsuit echoes findings from a recent WebPurify survey, which found that parents of underage gamers are most concerned about multiplayer games and UGC.
📣 Twitter's Trust & Safety (back then)
Twitter's former head of trust & safety, who goes by the name Del Harvey, gave an exclusive interview to Wired this month. Harvey recounts her time at Twitter, which was just a scrappy startup when she first joined the company. There's a lot to unpack, from decision-making around geopolitical strife to processes for handling election misinformation. Not least of all, here's a quote from Harvey that will surely resonate with anyone working in the trust & safety space:
"When trust and safety is going well, no one thinks about it or talks about it."
🚨 Ofcom Publishes First Set of Proposed Guidelines
UK communications regulator Ofcom released its first set of compliance proposals earlier this month. The biggest shift for any company operating online social or UGC spaces (online and mobile games, social media, and even adult sites) is the requirement to implement proactive policies to quash the spread of CSAM, extremist messaging, violent content, and revenge porn, among other harms. This is only the first stage of a multi-part rollout of regulatory guidance from Ofcom as it prepares to enforce the UK's Online Safety Act. And if companies don't comply? From The Verge: "If a firm is found to be in breach, Ofcom can levy fines of up to £18 million (around $22 million) or 10 percent of worldwide turnover — whichever is higher. Offending sites can even be blocked in the UK."
🚧 Trust & Safety as a Service?
Also from Wired: are big tech companies turning to external service providers to moderate content and guide policy on their platforms... even as they gut internal trust & safety teams and eliminate ethics functions? There's certainly room for growth, as the "UK Department for Science, Innovation and Technology estimated that 'Safety Tech,' which includes everything from content moderation to scam detection, was on track to hit £1 billion ($1.22 billion) in revenue by the mid-2020s." Increasingly, safety tech means machine learning tools that help humans make more (and more accurate) moderation decisions. Scale is great, but on the flip side, a single error in a machine learning model can mean widespread issues for every customer using that tool.
So if big tech is ditching internal trust & safety, where are these former content moderation and trust & safety experts going? CNBC also covers the industry shift, speaking with former Meta and Google employees who are taking advantage of the void in the market to grow (and sell) the content moderation piece of trust & safety.