Snapchat owner Snap has unveiled a number of features aimed at enhancing online safety for 13–17-year-olds. The changes, which will roll out over the next several weeks, are designed to protect teenagers from unwanted contact, offer a more age-appropriate content experience, and make it easier to remove accounts that promote questionable content.
The company is also providing more family-friendly resources, including an updated parents’ guide available at parents.snapchat.com. The guide covers the platform’s safeguards for teenagers, parental controls, and a new YouTube series offering further details.
“Our latest features are thoughtful in-app tools designed to empower teens to make smarter choices and openly discuss online safety,” said Uthara Ganesh, Snap’s Head of Public Policy for South Asia. “Our top priority is the safety and well-being of our community in India, which consists of more than 200 million users.”
One of the new features is in-app warnings. If someone tries to add a teen as a friend without sharing any mutual contacts or appearing in the teen’s contacts, the feature shows the teen a pop-up warning. The warning urges the teen to think carefully before connecting with that person, and discourages the connection when there is no basis for trust.
The platform already requires that users between the ages of 13 and 17 share several mutual connections with another user before appearing in that user’s search results. “This bar is being raised to require an even greater number of mutual friends, determined by the number of friends a Snapchatter has,” the company said. The goal is to further reduce the likelihood that teenagers interact with strangers.
A new strike system has also been introduced to deal with accounts that promote content inappropriate for teenagers. The mechanism allows objectionable content that has been detected or reported to be removed quickly. According to Snap, accounts that repeatedly attempt to violate the rules will be banned.