If You Are Under 18, Google Has Updated Its Policy for You; Take a Look

Google is strengthening its rules to make the internet a safer environment for children and teenagers. In a recent announcement, the search engine giant said it would give young people greater control over their online presence. Most notably, anyone under the age of 18 will now have the option of requesting that their pictures be removed from Google Search results. If a young user is unable to submit the request on their own, a parent or guardian may contact Google and do so on their behalf.

Google Updates Policy for Under-18 Users

That said, Google has clarified that the pictures will not be removed from the web entirely. “In the coming weeks, we will roll out a new policy that will allow anyone under the age of 18, as well as their parent or guardian, to request that their pictures be removed from Google Image results. Of course, deleting a picture from Search does not remove it from the web, but we think that this adjustment will assist in giving young people more control over their photos on the internet,” the search engine giant said in a blog post.

At the moment, Google does not allow children under the age of 13 to register for a Google account on their own. However, it does not have a reliable way to determine whether a user has lied about their age. To address these issues, Google is rolling out changes across its applications, including YouTube, the Google Search app, the Google Assistant, and others.

SafeSearch Feature

A Google spokesperson said that Search would not surface adult material that young people have not specifically sought out. The Google SafeSearch feature, which helps filter out explicit results when activated, is already turned on by default for all signed-in users under the age of 13 whose accounts are managed by Family Link, according to the company.

The company says it will enable SafeSearch protection for existing users under the age of 18 and make it the default for teenagers creating new accounts. Google will also build its SafeSearch technology into the web browser on smart displays in the future.

Google is also introducing a new safety section on the Google Play Store that will help parents determine whether apps meet their family’s safety standards. Specifically, Google said that “apps will be obliged to explain how they use the data that they gather in more detail, making it simpler for parents to determine whether the app is appropriate for their kid before they download it.”

Beyond these adjustments to its applications, Google will also let parents set screen time limits and reminders on the devices their children use under supervision. In the coming months, it plans to release new Digital Wellbeing filters for Assistant-enabled smart devices that allow users to restrict access to news, podcasts, and websites.