Making Club Penguin Safe for Kids Is Never "Done," Disney Says
If you have young children, you probably know Club Penguin: a massively multiplayer game owned by Disney that’s a runaway hit with kids. At its Kelowna, Canada, headquarters this week, the company took members of the media behind the scenes and explained its efforts to preserve parents’ trust that those kids are safe.

“We believe Club Penguin is a safe start to social,” said Chris Heatherly, Disney Interactive’s VP and GM, who replaced Club Penguin founder Lane Merrifield last October. Heatherly and other Club Penguin employees described the site as a social destination, similar in some ways to Facebook, which officially does not allow users under the age of 13 (not that that has ever stopped anyone). Facebook’s 13-plus rule derives from laws like the Children’s Online Privacy Protection Act, which restricts the online collection of personal information from children under 13. So, for Club Penguin’s primary audience of 8- to 12-year-olds, moderators manually screen out usernames that contain players’ real names, and the site asks for parents’ email addresses — not players’ — during registration.

The company announced an upgrade to its “Safe Chat” technology, which screens out banned words, phrases and slang, as well as players’ “dictionary dancing” attempts to circumvent the filter by swapping in innocuous words that sound like banned ones. For example, to stop bullies from typing “you’re gay,” the company also blocks sound-alike phrases like “you’re grey.” But the effectiveness of auto-moderation based on whitelisted and blacklisted words is limited, company reps said.
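To make the "you're grey" example concrete, here is a minimal sketch of how a blacklist filter with a sound-alike table might work. The phrase list, the substitution table, and the function names are all invented for illustration; Club Penguin has not published its actual lists or matching logic.

```python
# Illustrative sketch of a "Safe Chat"-style blacklist filter.
# BANNED_PHRASES and SOUND_ALIKES are made-up examples, not
# Club Penguin's real data.

BANNED_PHRASES = {"you're gay"}

# Hypothetical sound-alike substitutions used to catch "dictionary
# dancing" (e.g. "grey" standing in for "gay").
SOUND_ALIKES = {"grey": "gay"}

def normalize(message: str) -> str:
    """Lower-case the message and map known sound-alike words to their targets."""
    words = message.lower().split()
    return " ".join(SOUND_ALIKES.get(word, word) for word in words)

def is_blocked(message: str) -> bool:
    """Block the message if its normalized form contains a banned phrase."""
    normalized = normalize(message)
    return any(phrase in normalized for phrase in BANNED_PHRASES)
```

With this approach, `is_blocked("you're grey")` is caught because normalization rewrites it to "you're gay" before the blacklist check. The trade-off, as the reps note, is that blanket sound-alike rules sweep up harmless speech too.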
Social tech director Marc Silbey gave the example of the word “beach,” which some players have tried to use in lieu of the banned word “bitch.” Trying to censor phrases that sound like “you are a beach,” Silbey said, can also ensnare harmless sentences like “you want to come to the beach?” The new upgrade, officially called “dynamic validation,” feeds the entirety of what kids type into a real-time search engine that tries to divine meaning and appropriateness. Silbey said about 80 percent of what Club Penguin players type is ultimately ruled “safe,” and that the new technology can automatically validate 90 percent of that without the need for a human moderator’s judgment.