Twelve years after its launch, Instagram is reportedly testing software that detects unsolicited nude photos and stops them from reaching users' DMs.
In 2010, Instagram took the world by storm, its picture-sharing trend pulling users' time away from Facebook and Twitter as they shared photos of their food. The app was initially available only on iPhone, then branched out to other operating systems and gained even more popularity. Facebook eventually purchased the app, added video, and then added direct messaging. With direct messaging, we watched the app go from friendly and harmless to a place where people voice their hatred and unwanted opinions in the DMs.
Instagram Reportedly Working On A Tool To Protect Users From Receiving Unwanted Nude Photos
According to The Verge, Instagram is working on a way to protect users from receiving unwanted nude photos in their direct messages. Meta has revealed the feature will be similar to the 'Hidden Words' feature, which allows users to filter out messages containing offensive material. While the feature is still in the early stages of development, it is long overdue, not just for Instagram but for social media platforms in general.
You can get a sneak peek at the feature below.