Snapchat to introduce family safety tools to protect minors using its app

Following moves by several big internet companies to better safeguard children on their services, Snapchat is gearing up to launch a set of “family engagement” features in the near future.

In an interview at the WSJ Tech Live conference this week, Snap CEO Evan Spiegel revealed the upcoming offering. He said the new product will essentially serve as a family center, giving parents more visibility into how their teens use the service and into its privacy settings.

According to Spiegel, Snapchat’s more private character as a tool for communicating with friends is what sets it apart from some of its social media rivals, and user accounts are already private by default.

“This service is built to promote a safe experience for everyone, regardless of age, but we never market our service to anyone under the age of 13,” he stated, adding that Snap is currently working on additional features to make parents more comfortable with the app.

“There is a family center for young people and their parents to use Snapchat together that we haven’t revealed yet,” Spiegel added. With these tools, parents will be able to see who their teenagers are messaging on Snapchat and how their privacy settings are configured without having to dig through their kid’s phone.

As Spiegel put it, “I hope that at least helps start a dialogue between young people and their parents about what they’re experiencing on our service.” For both parents and teens, these conversations can be a valuable learning experience, giving parents a chance to guide their children through social media’s challenges, such as how to handle the uncomfortable situation of being approached by a stranger.

After the parents of a boy who died from a drug overdose asked Snap to collaborate with third-party parental control software, Snap said in June that this kind of work was on its agenda. The company stated at the time that it was cautious about disclosing private user data to third parties and was exploring parental controls of its own as a possible solution. In response to the same problem, Snap recently launched measures to combat illegal drug sales on Snapchat.

A Snap representative, asked about Spiegel’s statements at the WSJ event, confirmed that the new family engagement tools will include both an educational component and tools for parents to use.

“We want to help parents partner with their children as they navigate the digital world, to help educate and empower young people,” a company spokesperson said.

“Parental tools are being developed to help parents better protect their children in ways that don’t compromise privacy or data security, are legally compliant, and are offered free of charge to families within Snapchat,” the company said. “When we build new products or features, we try to do it in a way that reflects natural human behaviors and relationships.”

The company promised to share more information on the family-focused products “soon.”

Jacqueline Beauchere, Snap’s new global head of Platform Safety, recently joined the company from Microsoft, where she served as chief online safety officer; she will be in charge of the parental controls at Snap.

Regulators are paying more attention than ever to social media giants like Facebook and Google as well as other Big Tech businesses.

A bill being considered by American lawmakers would compel internet firms to adopt new protections for children using their services, and companies are already preparing for a potential crackdown by rolling out their own versions of such safeguards.

Several of the major tech platforms used by teenagers have already added parental controls or changed their default privacy settings.

TikTok, for example, led the way with its “Family Safety Mode” feature in 2020, having recently put a multimillion-dollar FTC fine for children’s privacy violations behind it. The tools rolled out globally in the spring of that year.

At the beginning of this year, TikTok announced that it would modify the privacy settings and defaults for all of its users under the age of 18.

Later that summer, Instagram restricted ad targeting, changed its default settings for minors, and rolled out new teen safety measures. YouTube updated its parental controls in February, and in August Google expanded its protections for minors across all of its services, including making default settings more private and restricting ad targeting.

Although Snap routinely touts how popular its app is with a younger audience, it has yet to take any such steps. The company currently estimates that its app reaches 90% of 13- to 24-year-olds in the United States, a figure that has stayed relatively constant over the past few years.

However, the company does not oppose further legislative requirements for protections aimed at minors.

Spiegel joined other tech executives in recently acknowledging that legislation may be required, but he was careful to point out that regulating Big Tech won’t solve all of society’s problems.

“Regulation is not a substitute for moral responsibility and for corporate practices that enhance the well-being of your community,” Spiegel said. “Regulation just occurs much too late.” Until companies actively promote the health and well-being of their communities, he added, regulators will constantly be playing catch-up. “So, I think regulation absolutely may be essential in some of these areas.”
