🎢 The Trust & Safety Simulator You Never Knew You Needed
TechCrunch reports on the newest sim game (no, not EA's latest installment in the Sims franchise) that lets you play as a trust & safety officer at a fictional social media company. Created by the Atlantic Council's Task Force for a Trustworthy Future Web and developed by Techdirt, the simulator pits you against industry regulation, bots, misinformation, and more as you try to balance revenue, user trust, and your CEO's confidence. Check out the game for yourself, and if you enjoyed cosplaying as a trust and safety team leader, try Techdirt's earlier game, Moderator Mayhem.
💸 AU Safety Laws Already Impacting the Industry
Both X and Google are under scrutiny from Australia's eSafety Commissioner for failing to adequately report on their policies and practices for combating child sexual exploitation on their platforms. ABC News reports that, among other shortcomings, X did not provide an employee count for its trust and safety team, and the company was ultimately fined roughly $385,000 USD.
👊🏽 Content Moderation in the Age of Wartime Misinformation
Social platform companies like Meta and TikTok have announced content moderation measures to combat misinformation around the Israel-Hamas war. Meta has established a special operations center staffed with fluent Hebrew and Arabic speakers to review violent content that violates its community guidelines, and users in the region can "lock" their profiles. Similarly, TikTok created a command center staffed with Hebrew and Arabic speakers and added an "opt-in" screen that users encounter before viewing potentially shocking or graphic content.