Tech News Desk – Meta has introduced a 'Take It Down' tool for Facebook and Instagram that gives teens the option of having nude pictures they have previously uploaded removed. The tool is operated by the National Center for Missing and Exploited Children (NCMEC). The feature is aimed at curbing sextortion: young people share nude pictures with each other on Facebook and Instagram and are then blackmailed or intimidated with those pictures, often under the threat that they will be spread across the internet. To avoid this, victims can end up doing whatever the blackmailer demands, sometimes for years. Meta has launched this tool to put an end to that.
With the help of this new tool, children or their parents can have any such photo (nude or semi-nude) that was uploaded to these platforms in the past deleted and prevent it from spreading further. As soon as a user requests deletion, the photo is converted into a digital fingerprint called a hash, and that hash is shared with NCMEC. If someone later tries to upload the same photo, Facebook blocks it using this hash-matching technique and does not allow the upload.
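To illustrate the idea, here is a minimal Python sketch of hash-based blocking. The function names and the use of SHA-256 are assumptions for illustration only; the article does not specify which hashing technology Meta and NCMEC actually use, only that the image is turned into a fingerprint and matched on upload.

```python
import hashlib

# Hypothetical blocklist of fingerprints received via a takedown request.
# In the real system only the hash is shared, never the picture itself.
reported_hashes = set()

def fingerprint(image_bytes: bytes) -> str:
    """Convert an image file into a fixed-length digital fingerprint (hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """Simulate a takedown request: store only the hash, not the photo."""
    reported_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Block any upload whose fingerprint matches a reported image."""
    return fingerprint(image_bytes) not in reported_hashes

# Once an image is reported, re-uploading the exact same file is blocked.
original = b"...raw bytes of the reported photo..."
report_image(original)
print(allow_upload(original))  # False -> upload blocked
```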
But here's the catch
Note that Meta has released this tool only for Facebook and Instagram. If someone shares such a picture on WhatsApp, it cannot be removed this way. Also, if someone crops or otherwise edits a previously reported picture, it will not be deleted from the platform, because the edited file counts as a new picture whose hash no longer matches. It has to be reported again separately.
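The limitation follows from how exact hash matching works. Continuing the sketch above (again assuming a SHA-256 style fingerprint, which is an illustrative assumption, not Meta's confirmed method), even a one-byte change produces a completely different hash, so the edited file slips past the blocklist until it is reported on its own:

```python
import hashlib

original = b"...raw bytes of the reported photo..."
edited = original + b"\x00"  # a crop, filter, or re-save changes the bytes

print(hashlib.sha256(original).hexdigest()[:16])  # fingerprint of the reported photo
print(hashlib.sha256(edited).hexdigest()[:16])    # entirely different fingerprint, so no match
```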
Meta recently launched a paid verification service
Meta CEO Mark Zuckerberg announced a paid verification service for Instagram and Facebook some time back. At present the service has been rolled out in Australia and New Zealand, where web users pay around Rs 990 per month and Android and iOS users pay around Rs 1,240 per month.