Some accounts also allowed buyers to perform “commission specific tasks” or arrange “meet ups”.
At the same time, the investigation found that Instagram allowed users to search for child sexual abuse hashtags such as #pedowwhore, #preteensex and #pedobait.
Responding to the report, which noted that such accounts are often linked to “off-platform content trading sites”, Meta issued a statement.
“We have comprehensive and robust policies against child nudity, abuse and exploitation, including child sexual abuse material and inappropriate interactions with children,” a Meta spokesperson told NDTV.
The spokesperson added, “We remove content that sexually exploits minors, and we remove accounts, groups, pages and profiles dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signals.”
Focus on keeping teenagers safe: Meta
The company also highlighted that it focuses on keeping teens safe by preventing unwanted contact between teens and adults they don’t know.
“We prevent potentially suspicious adults from finding, following or interacting with teens, automatically place teens on private accounts when they join Instagram, and notify teens when these adults try to follow or message them.”
Claims to remove material related to child sexual abuse
Meta said it has invested heavily in developing technology “that finds child abuse material before someone reports it to us”. A company spokesperson said that in the fourth quarter of 2022, this technology removed more than 34 million pieces of child sexual abuse material from Facebook and Instagram.