Apple has made a bold decision to scan iPhones in order to detect child abuse-related material. The scanning technology is intended to identify images linked to child sexual abuse material (CSAM), an initiative that breaks new ground in the technology industry. The company states that it already possesses the software it will use for this plan. Since the spread of sexual abuse content on the internet was, to some extent, aided by technological advancements, technology must also be used to stamp out these heinous acts. The plan stands as one of the notable efforts by a technology company to put innovative ideas to work for the good of humanity.
By using this scanning technology, Apple hopes to pinpoint content that is inappropriate for public consumption, especially the kind that harms children. The abuse and molestation of children have grown rampant in recent years in many countries. Initially, girls were seen as the ones most exposed to abuse by men, but legal protections have broadened to cover all children deemed vulnerable under most nations’ laws. Many convictions for paedophilia are handed down each year, and many adults are incarcerated. The issue remains marred by controversy, particularly over the “appropriate age of consent,” and differing views on the matter have left young people exposed to abuse by older generations.
As a technology company, Apple feels the need to curb the abuse of its devices by users. The digital age has brought a new form of sexual content that fills the internet. Although the internet has made life easier for humanity, it remains an open door for anyone willing to enter, with few consequences for unruly behavior. An entity so all-encompassing can cause serious problems, as evidenced by the increase in sexual content on the web. Devices allow users to distribute their content on a global scale, and as those devices improve, sexual content now circulates in high-resolution graphics.
Against this background, Apple is prepared to extend its operations to uncover abusers who misuse iPhones. The plan is to scan for known CSAM among the images a user intends to upload to iCloud Photos. If such images are found, Apple personnel will examine them and report them to law enforcement agencies. According to Apple, the system compares images on a device to known child sexual abuse images stored in a database compiled by the US National Center for Missing and Exploited Children (NCMEC) and other organizations involved in children’s welfare [Source]. The known images are rendered into numerical codes called “hashes,” which are then matched against the images on a device. The system can also match edited versions of an image to the original copy [Source].
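To make the matching idea concrete, the sketch below shows hash-based lookup against a database of known hashes. It is only an illustration under stated assumptions: Apple’s actual system uses its NeuralHash perceptual hash and on-device cryptographic protocols, whereas the type names, the precomputed hash strings, and the simple set lookup here are hypothetical.

```swift
import Foundation

// Hypothetical illustration of hash-based matching. The real system relies on
// a perceptual hash (NeuralHash); the names and logic below are assumptions
// made only to show the general idea of comparing device images to a
// database of known hashes.
struct CSAMMatcher {
    // Hashes of known abuse images, as supplied by organizations such as NCMEC.
    let knownHashes: Set<String>

    // Stand-in for a perceptual hashing function; here we simply reuse a
    // precomputed hash string attached to each photo.
    func hash(of photo: (name: String, precomputedHash: String)) -> String {
        photo.precomputedHash
    }

    // Returns the names of photos whose hashes appear in the known-hash database.
    func flaggedPhotos(in library: [(name: String, precomputedHash: String)]) -> [String] {
        library
            .filter { knownHashes.contains(hash(of: $0)) }
            .map { $0.name }
    }
}

let matcher = CSAMMatcher(knownHashes: ["a1b2c3", "d4e5f6"])
let library = [(name: "holiday.jpg", precomputedHash: "ffee00"),
               (name: "unknown.jpg", precomputedHash: "a1b2c3")]
print(matcher.flaggedPhotos(in: library))   // ["unknown.jpg"]
```

Because matching is done on hashes rather than on the pictures themselves, the comparison can happen without anyone viewing the photos unless a match is found, which is central to Apple’s privacy argument.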
Officials at Apple announced that the iOS and iPadOS versions expected to be released later this year would include “new cryptography applications to help limit the spread of CSAM online while designing for user privacy.” Regarding the system’s accuracy, they also stated that it has an “extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.” Apple noted that personnel would be assigned to review every flagged account to ensure that a real match has been found. Measures such as disabling a user’s account can then be taken and a report sent to law enforcement agents [Source].
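The review step can be pictured as a simple threshold rule: an account is only queued for human inspection once enough matches accumulate. The sketch below is a hedged illustration of that idea; the threshold value, structure names, and data are assumptions, not Apple’s published parameters.

```swift
import Foundation

// Hedged sketch of threshold-based flagging. Apple describes flagging an
// account for human review only after matches accumulate; the threshold
// value and the data shapes here are illustrative assumptions.
struct AccountScanner {
    let matchThreshold: Int

    // Returns the accounts whose match counts meet the threshold and should
    // therefore be queued for manual review.
    func accountsForReview(matchCounts: [String: Int]) -> [String] {
        matchCounts
            .filter { $0.value >= matchThreshold }
            .map { $0.key }
            .sorted()
    }
}

let scanner = AccountScanner(matchThreshold: 30)          // illustrative threshold
let counts = ["user-001": 2, "user-002": 45]
print(scanner.accountsForReview(matchCounts: counts))     // ["user-002"]
```

Requiring many matches before any human looks at an account is one way to keep the rate of false flags extremely low, which is the spirit of the “one in one trillion” claim quoted above.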
Some users are concerned about privacy issues if this system is launched. Others argue that it could be manipulated to spy on people, especially by powerful governments and influential figures. In the modern world, people’s cell phones contain their whole lives embedded in documents, pictures, videos, and audio, so it is unsettling to know that anyone else might gain access to those personal files. Moreover, after the data leaks linked to the Pegasus spyware, many people fear being spied on by their governments. Apple, however, has assured users that their right to privacy will be respected, since a person’s photos will only be accessed if CSAM content is found. If there is no CSAM, the images will not be viewed by Apple, so the system is safe to use and no one need fear misconduct. How effective the system proves to be will only become clear once it is implemented, because technology is always prone to abuse by humans.
Apple’s strategy has gained support from other stakeholders, who argue that prohibited content should be identified before it is posted online. The system will deter some abusers from storing CSAM on their devices. Such small steps toward ending child abuse should be commended, while ensuring that the system is not manipulated for selfish gain or used for illegal spying on citizens.