Facebook Unveils New Controls For Kids Using Its Platforms | Chicago News

In this June 4, 2012 file photo, an unidentified 11-year-old girl logs into Facebook on her iPhone at her home in Palo Alto, Calif. (AP Photo / Paul Sakuma, File)

NEW YORK (AP) – Facebook, following damning testimony that its platforms harm children, will introduce several new features, including prompting teens to take a break from its Instagram photo-sharing app and “nudging” adolescents away from content they repeatedly view that is not conducive to their well-being.

Facebook, based in Menlo Park, Calif., also plans to introduce new optional controls so parents or guardians of teens can supervise what their teens are doing online. The initiatives come after Facebook announced late last month that it was pausing work on its Instagram for Kids project. But critics say the plan lacks detail, and they are skeptical that the new features will be effective.

The new controls were described on Sunday by Nick Clegg, Facebook’s vice president for global affairs, who appeared on various Sunday news broadcasts, including CNN’s “State of the Union” and ABC’s “This Week with George Stephanopoulos,” where he was grilled about Facebook’s use of algorithms as well as its role in spreading harmful misinformation ahead of the January 6 Capitol riots.

“We are constantly iterating to improve our products,” Clegg told Dana Bash on Sunday on “State of the Union.” “We can’t just wave a magic wand to make everyone’s life perfect. What we can do is improve our products, so that our products are both safe and enjoyable to use.”

Clegg said Facebook has invested $13 billion in the past few years to keep the platform safe, and that the company has 40,000 people working on these issues.

The flurry of interviews came after whistleblower Frances Haugen, a former Facebook data scientist, appeared before Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens, and of being dishonest in its public fight against hate and misinformation. Haugen’s accusations were supported by tens of thousands of pages of internal research documents that she secretly copied before quitting her job in the company’s civic integrity unit.

Josh Golin, executive director of Fairplay, a watchdog group for the media and children’s marketing industry, said he didn’t think the introduction of controls to help parents supervise teens would be effective, because many teens create secret accounts anyway. He also doubted the effectiveness of nudging teens to take a break or steer away from harmful content. He said Facebook needs to show exactly how it would implement these tools and produce research demonstrating that they work.

“There are huge reasons to be skeptical,” he said. He added that regulators need to restrict what Facebook does with its algorithms.

He said he also believes Facebook should cancel its Instagram project for kids.

When Clegg was pressed by Bash and Stephanopoulos in separate interviews about the use of algorithms to amplify misinformation ahead of the January 6 riots, he responded that if Facebook removed its algorithms, people would see more, not less, hate speech, and more, not less, misinformation.

Clegg told the two hosts that the algorithms served as “giant spam filters.”

Democratic Senator Amy Klobuchar of Minnesota, who chairs the Senate Commerce Subcommittee on Competition Policy, Antitrust and Consumer Rights, told Bash in a separate interview on Sunday that it was time to update children’s privacy laws and provide more transparency in the use of algorithms.

“I appreciate that he’s ready to talk about things, but I think the time for conversation is over,” Klobuchar said, referring to Clegg’s plan. “The time to act is now.”