Stanford researchers find Mastodon has a massive child abuse material problem. Instagram too.
"Mastodon, the decentralized network viewed as a viable alternative to Twitter, isrife with child sexual abuse material (CSAM), according to a new study from Stanford’s Internet Observatory (via The Washington Post). In just two days, researchers found 112 instances of known CSAM across 325,000 posts on the platform — with the first instance showing up after just five minutes of searching.
…
“We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” David Thiel, one of the report’s researchers, said in a statement to The Washington Post. “A lot of it is just a result of what seems to be a lack of tooling that centralized social media platforms use to address child safety concerns.”
https://www.theverge.com/2023/7/24/23806093/mastodon-csam-study-decentralized-network
Also by David Thiel (no relation to Peter Thiel that this anon could find) last month:
"Cross-Platform Dynamics of Self-Generated CSAM
"A Stanford Internet Observatory investigation identified large networks of accounts, purportedly operated by minors, selling self-generated illicit sexual content. Platforms have updated safety measures based on the findings, but more work is needed.
David Thiel, Renee DiResta, Alex Stamos - June 6, 2023
"A new Stanford Internet Observatory report investigates networks onInstagram and Twitterthat are involved inadvertising and trading self-generated child sexual abuse material (SG-CSAM). Our investigation finds that large networks of accounts,purportedly operated by minors, are openly advertising SG-CSAM for sale on social media.Instagram has emerged as the primary platform for such networks, providing features that facilitate connections between buyers and sellers."
https://fsi.stanford.edu/publication/cross-platform-dynamics-self-generated-csam