Instagram Will Begin Blurring Explicit Images Sent To Minors
Amid increased concerns about the emotional and mental impact that social media can have on young people, several major tech companies have announced new strategies aimed at protecting minors from some of the most prevalent dangers.
Meta, which owns Facebook and Instagram, among other platforms, announced this week that it would begin blurring images containing nudity in messages to users under the age of 18. The company said it will rely on on-device machine learning to determine whether a photo contains nudity and should be obscured.
Instagram will automatically blur nude images in direct messages sent to users under 18 by default and encourage adult users through a notification to turn on the feature, the company announced. https://t.co/UzQe3dRDW2
— The Hill (@thehill) April 11, 2024
In addition to automatically blurring such images for minors, Meta confirmed that it would notify adult users of the option and encourage all users to enable the new feature.
“Because the images are analyzed on the device itself, nudity protection will also work in end-to-end encrypted chats, where Meta won’t have access to these images — unless someone chooses to report them to us,” a corporate statement explained.
Instagram direct messages are not currently end-to-end encrypted, as they are on some other Meta platforms, including WhatsApp, but the company has announced its intention to encrypt them in the near future.
The image-blurring feature is just the latest effort by Meta, which has faced global backlash from users and regulators regarding harmful content, to protect minors on its platforms.
Earlier this year, the company confirmed plans to more aggressively hide posts deemed particularly sensitive.
In October, attorneys general from 33 states representing both major political parties sued Meta over allegations that it engaged in deceptive practices in assuring the public that it had taken steps to protect young users from prevalent dangers on its platforms.
“We share the attorneys general’s commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families,” a company spokesperson said at the time. “We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”
YouTube, which is owned by Google’s parent company Alphabet, also announced a new measure this week aimed at protecting children, indicating that minors with supervised accounts will soon be unable to post comments on videos.