– Meta is testing new features on Instagram to protect young people from unwanted nudity or sextortion scams
– The new features include Nudity Protection in DMs, safety tips for those sending or receiving nudes, and technology to detect potential sextortion accounts
– Meta has been criticized in the past for its slow and patchy approach to protecting young users, but is now taking steps to improve safety on the platform
Meta, the parent company of Instagram, is rolling out new features to protect young people from unwanted nudity and sextortion scams. A feature called “Nudity Protection in DMs” automatically blurs images that contain nudity; for users under 18 it will be turned on by default. Recipients will see warnings encouraging them not to respond to such images and to block the senders. Meta is also providing safety tips that explain the risks involved in sending or receiving nudes.
Meta is also developing technology to identify accounts potentially involved in sextortion scams. Accounts flagged as potential sextortionists will face limits on how they can message and interact with other users. Safety notices will appear for users chatting with these accounts, urging them to report any threats to share private images. The company is additionally testing measures to hide potential sextortion accounts from teenagers and restrict them from contacting teens at all.
These measures are part of Meta’s ongoing efforts to improve child safety on its platforms. The company has faced criticism in the past for its slow and patchy handling of issues like revenge porn and sextortion. With growing regulatory scrutiny, including laws such as the EU’s Digital Services Act, tech giants like Meta are under pressure to prioritize the protection of minors, and these new features are a step toward making Instagram safer for young users.