Twitter Failing To Deal With Child Sexual Abuse Material, Says Stanford Internet Observatory
Twitter has failed to remove images of child sexual abuse in recent months, even though they were flagged as such, a new report will allege this week.
Stanford Internet Observatory researchers say that the company failed to deal with 40 items of Child Sexual Abuse Material (CSAM) between March and May of this year.
The researchers used Microsoft's PhotoDNA to search for images containing CSAM. PhotoDNA automatically hashes images and compares them against a database of known illegal images of minors held by the National Center for Missing & Exploited Children (NCMEC); the tool flagged 40 matches.
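PhotoDNA itself is proprietary and access to NCMEC's hash database is restricted, but the general pattern it describes, computing a perceptual hash of each image and checking it against a set of known hashes, can be sketched with the open-source imagehash library. Everything below (the file path, the hash value, the distance threshold) is a hypothetical placeholder, not PhotoDNA's actual API:

```python
# Illustrative sketch only: this is NOT PhotoDNA, just the same matching
# pattern built on the open-source "imagehash" perceptual hash library
# (pip install imagehash pillow). Hash values and paths are placeholders.
from PIL import Image
import imagehash

# Hypothetical set of hashes of known images. In practice such hashes are
# supplied by a clearinghouse like NCMEC, never assembled locally.
known_hashes = {
    imagehash.hex_to_hash("d1c4c4ccc4c4e4f0"),
}

MAX_DISTANCE = 5  # tolerance for near-duplicates (crops, re-encodes)

def matches_known_image(path: str) -> bool:
    """Hash an image and compare it against the known-hash set."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known < MAX_DISTANCE for known in known_hashes)

if matches_known_image("suspect_image.jpg"):
    print("Match found: flag for removal and reporting.")
```

The distance threshold is the key design choice in systems like this: an exact-match check would miss trivially altered copies, so matching tolerates a small Hamming distance between hashes.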
The team reports that "the investigation found problems with Twitter's CSAM detection mechanisms. We reported this issue to NCMEC in April, but the problem persisted."
Having no Trust and Safety contacts remaining at Twitter, the team says it approached a third-party intermediary to arrange a briefing. Twitter was notified of the problem, and the issue appears to have been resolved by May 20.
Research like this is about to become far harder, or at any rate far more expensive, following Elon Musk's decision to start charging $42,000 per month for Twitter's previously free API. The Stanford Internet Observatory has recently been forced to stop using the enterprise tier of the software, and the free tier offers only read-only access. There are also concerns that researchers will be forced to erase data previously collected under agreement.
The observatory has been a continuing thorn in Twitter's side since highlighting the disinformation spread on the platform during the 2020 U.S. presidential election; Musk called the platform a "propaganda system" at the time.
The Wall Street Journal will publish more findings from the research later this month.
The report states that Twitter "is not the sole platform dealing with CSAM, nor is it the primary focus of our upcoming study," and thanks the company for its help in improving child safety.
In January, Twitter Safety announced that it was "moving faster than ever" to remove CSAM.
Several reports since then have shown that CSAM remains a problem on the platform. The New York Times reported in February that, after Elon Musk's takeover, Twitter took twice as long to remove CSAM flagged by child safety groups.
Twitter still responds to all press inquiries with a poop emoji.

