The Federal Investigation Agency (FIA) stated yesterday that only 343 child sexual abuse cases had been reported to it in the last five years, and urged people to register complaints against sexual predators.
This was revealed by its Director General (DG), Mohammed Tahir Rai, at the Child Protection and Digital Safety Dialogue organized by Zindagi Trust and Meta (formerly known as Facebook) at the Government Elementary College of Education in Hussainabad, Karachi. Also in attendance were representatives from the FIA’s Cyber Crime Wing.
The DG FIA said, “We have a responsibility towards the protection of children, all of us do. And my duty is to enforce the laws here. The protection of the rights of the child in cyberspace is extremely important. Unfortunately, we are among the countries with the highest incidence, which amounted to 2.1 million in 2021.”
“But our nation is not reporting these, neither to the law enforcement agencies nor to Meta,” he continued. “Because of this, only 343 such cases have been reported in the last five years. I am glad that Meta is removing such material from its platform.”
Meta revealed that over two million instances of child exploitation-related imagery were shared on Facebook in 2021 in Pakistan alone.
Its Manager of Trust and Safety, Law Enforcement, and Outreach, Michael Yoon, who attended the event online, said that Meta is working with the FIA and other law enforcement agencies in Pakistan to apprehend people involved in cybercrimes such as blackmail and extortion.
Meta’s Director of Public Policy for South Asia, Sarim Aziz, emphasized that it has a zero-tolerance policy for content related to child exploitation and abuse.
“We use the latest ground-breaking technology to prevent, detect and remove content related to child abuse and violence,” he affirmed.
“Since this is a highly sensitive matter, we take a comprehensive approach to keep our family of apps free of such malicious content,” he added. “We continue to collaborate with local authorities, rights bodies, and relevant stakeholders to ensure that victims are not repeatedly traumatized by the sharing of evidence of their ordeal. Instead, such content should be reported through proper channels so that government agencies can move against those responsible.”
Member of the National Commission on the Rights of Child (NCRC), Iqbal Detho, urged a review of the Prevention of Electronic Crimes Act (PECA) and amendments to the Pakistan Penal Code (PPC) to protect children in the digital space. He also called for policymaking at the provincial level for the dispensation of justice and the provision of legal assistance to victims.
Speaking online, Meta’s Head of Safety Policy, Shireen Vakil, discussed the safety measures available that young Meta users can avail of to protect themselves from online predators. She also mentioned that Meta uses artificial intelligence and machine learning to detect objectionable content and remove it accordingly.
The founder and President of Zindagi Trust, Shehzad Roy, remarked that whenever he came across disturbing content, his first instinct was to share it in a bid to help trace the culprits.
“However, I have realized that the content can be distressing for many viewers and is a breach of privacy of the victims,” he said.
“We want to push for justice for victims and advocate for safer digital spaces instead of sensationalizing traumatic events,” he added. “Zindagi Trust takes pride in being a vocal advocate of policy-level change for children’s safety and education. We rely on the support of government institutions and social media companies like Meta to help us achieve our goals.”
In addition to discussing how to make digital platforms safer for children and sharing policy recommendations with government agencies for better reporting mechanisms, the collaboration also entailed the launch of a campaign titled #ReportDontShare. The campaign urges users to report inappropriate content so it can be removed, rather than sharing it further on social media platforms.
The organizers screened a video they had created to educate people about the dangers of sharing inappropriate content and to urge them to report it through proper channels instead of disseminating it.