Young online users can be exposed to harmful content, and critics say the big technology companies are not doing enough to limit those risks. Google announced rules against ads targeting users under the age of 18, said it will further expand the age-sensitive ad categories blocked for those users, and will turn on safe-search filters for users up to that age.
Anyone under 18, or their parents, will be able to request removal of their pictures from Google Image results.
“Of course, removing an image from Search doesn’t remove it from the web, but we believe this change will help give young people more control of their images online,” wrote product and UX director Mindy Brooks in a blog post.
In the coming months, the company will globally block ad targeting based on the age, gender, or interests of people under 18.
“Some countries are implementing regulations in this area, and as we comply with these regulations, we’re looking at ways to develop consistent product experiences and user controls for kids and teens globally,” Brooks wrote.
YouTube will change the default upload setting to the most private option available for teens ages 13–17. Other measures meant to protect young people’s privacy include turning off location history on Google accounts for users under 18, with no option to turn it back on.
Addressing the difficulty of identifying users’ ages and enforcing age-based rules effectively, Brooks wrote:
“Having an accurate age for a user can be an important element in providing experiences tailored to their needs. Yet, knowing the accurate age of our users across multiple products and surfaces, while at the same time respecting their privacy and ensuring that our services remain accessible, is a complex challenge. It will require input from regulators, lawmakers, industry bodies, technology providers, and others to address it – and to ensure that we all build a safer internet for kids.”