Filters and moderators are essential for a clean experience, said Claire Quinn, safety chief at Wee World, a smaller site aimed at kids and young teens. But the programs and people cost money and can depress ad rates.

Signals such as too many 'unrequited' messages, ones that go unanswered, also factor into the scanning, because they correlate with spamming or with attempts to groom children in quantity, as does analysis of the actual chats of convicted pedophiles. In addition, Facebook doesn't probe deeply into what it thinks are pre-existing relationships.

Metaverse Chief Executive Amy Pritchard said that in five years her staff had intercepted something 'terrifying' only once, about a month ago, when a man on a discussion board for a major media company asked for the email address of a young site user. Software recognised that the same person had been making similar requests of others and flagged the account for Metaverse moderators. They called the media company, which then alerted authorities.

Other sites aimed at children agree that such crises are rarities; most sex crimes against children are committed by people the children know rather than by strangers. Even companies with state-of-the-art defenses spend far more time trying to stop online bullying and attempts to sneak profanity past automatic word filters than they do fending off sex predators.
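The repeated-request pattern that tripped Metaverse's software can be sketched as a simple per-account heuristic: flag any sender who asks many distinct users for personal details. This is only an illustration of the general idea; the keyword list, threshold, and message format below are invented, not anything the companies in this article have disclosed.

```python
from collections import defaultdict

# Hypothetical keywords suggesting a request for personal contact details.
CONTACT_REQUEST_WORDS = {"email", "address", "phone", "number"}
DISTINCT_TARGET_LIMIT = 3  # invented threshold for illustration

def looks_like_contact_request(text: str) -> bool:
    """Crude check: does the message mention contact-detail words?"""
    return bool(set(text.lower().split()) & CONTACT_REQUEST_WORDS)

def flag_accounts(messages):
    """messages: iterable of (sender, recipient, text) tuples.
    Returns the senders who asked for contact details from at least
    DISTINCT_TARGET_LIMIT distinct recipients."""
    targets = defaultdict(set)
    for sender, recipient, text in messages:
        if looks_like_contact_request(text):
            targets[sender].add(recipient)
    return {s for s, t in targets.items() if len(t) >= DISTINCT_TARGET_LIMIT}
```

A real system would combine many more signals, such as the unanswered-message ratios mentioned above, but the core shape is the same: aggregate behaviour per account, then route anything over a threshold to human moderators.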
Under a 1998 law known as COPPA, for the Children's Online Privacy Protection Act, sites directed at those 12 and under must have verified parental consent before collecting data on children.
Some sites go much further: Disney's Club Penguin offers a choice of viewing either filtered chat that screens out blacklisted words or chat that contains only words the company has pre-approved.
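The difference between those two modes is easy to see in code. Below is a minimal sketch, with invented word lists, of a blacklist filter (block messages containing banned words) versus a whitelist filter (allow only pre-approved vocabulary); it is not Club Penguin's actual implementation.

```python
# Invented example word lists for illustration only.
BLACKLIST = {"badword"}                        # words that block a message
WHITELIST = {"hello", "lets", "play", "fun"}   # pre-approved vocabulary

def passes_blacklist(message: str) -> bool:
    """Blacklist mode: reject only messages containing a banned word."""
    return not any(w in BLACKLIST for w in message.lower().split())

def passes_whitelist(message: str) -> bool:
    """Whitelist mode: accept only messages built entirely from
    pre-approved words."""
    return all(w in WHITELIST for w in message.lower().split())
```

The whitelist mode is far stricter: any out-of-vocabulary word, including creative misspellings meant to slip past a blacklist, is rejected outright, which is why it suits sites aimed at the youngest users.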