Social Platforms Flamed over Child Safety

Tech giants Snap, TikTok, and YouTube are being grilled by US senators over worries that their apps, like Facebook's, are harmful to children.

Senator Richard Blumenthal, a Connecticut Democrat, told the companies that everything they do is designed to add users, especially kids, and to keep them on their apps for longer, CNET reports.

Blumenthal said he has heard from parents about the “rabbit hole” teenagers go down when they log onto Snap, TikTok, and YouTube.

His office, which created accounts on TikTok and YouTube as part of its own research, also found that extreme dieting and eating disorder content is easy to find on these platforms.

“Like Big Tobacco, Big Tech has lured teens despite knowing its products can be harmful,” he said.

The hearing comes amid heightened scrutiny of social media giant Facebook, the subject of a series of stories based on leaked documents suggesting the company knows about the harm its platforms cause to the mental health of teenagers, as well as to democracy and to developing countries.

Frances Haugen, the former Facebook product manager who collected the cache of internal research and communications, has already testified before the Senate subcommittee. She also testified before the UK Parliament on Monday.

The hearing also marks the first time Snap and TikTok have testified before Congress.

Snap is being represented by Jennifer Stout, the vice president of global public policy; TikTok by Michael Beckerman, vice president and head of public policy, Americas; and YouTube by Leslie Miller, vice president of government affairs and public policy.

Stout tried to distinguish Snapchat from its rival Facebook, noting that the company was built as “an antidote to social media”.

Unlike Facebook, Snapchat doesn't have a news feed, and its disappearing messages are used by people to communicate privately with their friends.

“We have a moral responsibility to take into account the best interests of our users in everything we do. And we understand that there is more work to be done,” she said.

Beckerman said TikTok has built features to protect younger users. People under 16 have their TikTok accounts set to private automatically.

“There is no finish line when it comes to protecting children and teens. The challenges are complex and we are determined to work hard and keep the platform safe and create age-appropriate experiences,” he said.

YouTube told Congress in prepared remarks that it removed 7 million accounts believed to belong to young children and preteens in the first three quarters of 2021.

The company said that on both YouTube and YouTube Kids, autoplay is off by default for users under 18.

YouTube also plans to launch more parental controls in the YouTube Kids app, including the ability for a parent to choose a locked default autoplay setting.
