Security & Privacy




Apple Reportedly Plans To Begin Scanning iPhones In The US For Child Abuse Images (Updated)


Mike Segar / Reuters

 


August 6th, 2021  |  16:21  |  301 views

CALIFORNIA, UNITED STATES

 

The 'neuralMatch' system would 'continuously scan' US iPhones for illegal images.

 

Apple is reportedly planning an update that would allow it to scan iPhones for images of child sexual abuse. According to the Financial Times, the company has been briefing security researchers on the “neuralMatch” system, which would “continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system.”

 

The system would “proactively alert a team of human reviewers if it believes illegal imagery is detected,” and human reviewers would alert law enforcement if the images were verified. The neuralMatch system, which was trained using a database from the National Center for Missing and Exploited Children, would be limited to iPhones in the United States to start, the report says.
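The reported flow amounts to fingerprinting each photo, checking that fingerprint against a database of known-abuse hashes, and only then surfacing matches to human reviewers. The Swift sketch below is a hypothetical illustration of that loop only: SHA-256 stands in for Apple's proprietary perceptual hash, and the names (fingerprint, scanLibrary, knownHashes, MatchReport) are invented for the example rather than taken from the report.

import Foundation
import CryptoKit

// Hypothetical illustration of the reported flow: fingerprint each photo,
// compare the fingerprint against a database of known-abuse hashes, and queue
// any matches for human review. SHA-256 is only a stand-in for Apple's
// proprietary perceptual hash; a real perceptual hash tolerates re-encoding
// and resizing, which a cryptographic hash does not.
struct MatchReport {
    let photoURL: URL
    let matchedHash: String
}

// Stand-in fingerprint: hex digest of the raw image bytes.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Scan local photos and report those whose fingerprint appears in the known-hash
// database. In the reported design, matches would go to human reviewers, who
// could notify law enforcement if the match is verified.
func scanLibrary(photoURLs: [URL], knownHashes: Set<String>) -> [MatchReport] {
    var reports: [MatchReport] = []
    for url in photoURLs {
        guard let data = try? Data(contentsOf: url) else { continue }
        let digest = fingerprint(of: data)
        if knownHashes.contains(digest) {
            reports.append(MatchReport(photoURL: url, matchedHash: digest))
        }
    }
    return reports
}

A real deployment would also need tolerance for near-matches and, as Apple later described, a threshold number of matches before anything is flagged for review; none of that is modelled in this sketch.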

 

The move would be somewhat of an about-face for Apple, which has previously stood up to law enforcement to defend users’ privacy. The company famously clashed with the FBI in 2016 after it refused to unlock an iPhone belonging to the man behind the San Bernardino terror attack. CEO Tim Cook said at the time that the government’s request was “chilling” and would have far-reaching consequences that could effectively create a backdoor for more government surveillance. (The FBI ultimately turned to an outside security firm to unlock the phone.)

 

Now, security researchers are raising similar concerns. Though there’s broad support for increasing efforts to fight child abuse, researchers who spoke to the FT said such a system could open the door for authoritarian regimes to spy on their citizens, since a system designed to detect one type of imagery could be expanded to other types of content, such as terrorism-related material or other content perceived as “anti-government.”

 

At the same time, Apple and other companies have faced mounting pressure to find ways to cooperate with law enforcement. As the report points out, social media platforms and cloud storage providers like iCloud already have systems to detect child sexual abuse imagery, but extending such efforts to images on a device would be a significant shift for the company.

 

Apple declined to comment to the FT, but the company could release more details about its plans “as soon as this week.”

 

Update 8/5 4pm ET: Apple confirmed plans to start testing a system that would be able to detect images of child sexual abuse stored in iCloud Photos in the United States. "Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations," the company wrote in a statement. "Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."
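As a rough picture of what transforming the database into "an unreadable set of hashes" stored on the device might mean, the sketch below uses a keyed hash (HMAC-SHA256) as a stand-in for whatever transformation Apple actually applies; the statement does not describe the real scheme, and the names (blind, buildOnDeviceDatabase, matchesKnownEntry) are hypothetical.

import Foundation
import CryptoKit

// Hypothetical sketch only: a keyed hash (HMAC-SHA256) stands in for whatever
// transformation Apple applies before the database ships to devices. The device
// holds only opaque digests, so it cannot read or enumerate the original
// entries, yet a locally computed fingerprint can still be tested for
// membership once it is transformed the same way.
func blind(_ fingerprint: String, using key: SymmetricKey) -> Data {
    Data(HMAC<SHA256>.authenticationCode(for: Data(fingerprint.utf8), using: key))
}

// Built once (conceptually, by the provider) and distributed to devices.
func buildOnDeviceDatabase(knownFingerprints: [String], key: SymmetricKey) -> Set<Data> {
    Set(knownFingerprints.map { blind($0, using: key) })
}

// On-device matching against the opaque database, with no raw entries present.
func matchesKnownEntry(_ localFingerprint: String, in database: Set<Data>, key: SymmetricKey) -> Bool {
    database.contains(blind(localFingerprint, using: key))
}

Note that this simplification would require the matching key to be present on the device; Apple's subsequently published technical summary instead describes a blinded-hash, private set intersection design in which the device itself never learns whether a photo matched.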

 

The update will be rolling out at a later date, along with several other child safety features, including new parental controls that can detect explicit photos in children's Messages.

 


 

Source: courtesy of Engadget

By Karissa Bell

 


 
