Michael H. Keller / New York Times:
To avoid erroneously flagging CSAM, Meta’s training docs tell content moderators to “err on the side of an adult” when judging people’s age in photos or videos — The company reports millions of photos and videos of suspected child sexual abuse each year.
