San Francisco: As social media platforms face increasing scrutiny over their privacy protections, TikTok on Thursday became the latest technology company to announce stricter safeguards for teenagers.
The short video-sharing app will roll out a number of features in the coming months, including switching in-app direct messaging off by default for 16- and 17-year-olds unless they actively change the setting.
Under-16s will see a pop-up message when they publish their first video, asking them to choose who can watch it.
Users aged 16 and 17 will see a pop-up asking them to confirm who can download their videos. Downloads are already disabled on content posted by under-16s.
The Chinese-owned platform will also stop sending push notifications to users aged 13 to 15 from 9pm, and an hour later for 16- and 17-year-olds, with the aim of reducing their screen time at night.
Alexandra Evans, TikTok's head of public policy for child safety, and Aruna Sharma, its head of global privacy, said the changes build on previous measures to protect young users from predators, bullies, and other online dangers.
Evans and Sharma said: “It is important to take stronger, proactive measures to help keep young people safe. We have continually introduced changes to support age-appropriate experiences on our platform.”
“In particular, we hope to help young teenagers develop positive digital habits as early as possible.”
Google, YouTube, and Facebook-owned Instagram have recently strengthened protections for young users, and critics have been urging Facebook to abandon its plans to launch a children’s version of Instagram.
According to data from market tracking agency App Annie, TikTok was the world’s most downloaded app last year, surpassing Facebook and its messaging platforms.
Despite former President Donald Trump’s efforts to ban the video app or force its sale to US investors, its popularity is still soaring.