Apple expands iCloud encryption as it moves away from controversial CSAM scan plans

It will soon be harder for hackers to grab your iCloud data - and even Apple is rethinking its access to sensitive content. The company is introducing a set of security measures that range from expanded end-to-end encryption to a reversal of a controversial program meant to identify potential child abusers. The launch is headlined by Advanced Data Protection, an optional feature that applies end-to-end encryption to more categories of iCloud data. Where Apple previously protected 14 data categories, the new offering covers 23, including iCloud device backups, Photos, and Notes. Calendar, Contacts, and iCloud Mail remain outside end-to-end encryption so they can keep interoperating with global email, contacts, and calendar systems.

Advanced Data Protection is available today in the US through Apple's beta software program, with broader US availability by the end of 2022 and other countries following in early 2023. You will need to set up an alternative recovery method if you enable the feature, since Apple will no longer hold the keys needed to recover your data.

The two additional safeguards are aimed more at preventing misuse of accounts and devices. iMessage Contact Key Verification will help those who face "extraordinary" threats (such as activists, government officials, and journalists) verify that chat participants are who they claim to be. You will receive an automatic alert if a state-sponsored hacker or similar intruder manages to add a rogue device to an account. Users with the feature enabled can even compare verification codes over FaceTime, secure calls, or in person.

iCloud users will also have the option of using hardware security keys as part of two-factor authentication. This includes both plug-in keys and NFC keys that only need to be held near your iPhone. iMessage Contact Key Verification and Security Keys will be available worldwide in 2023.

At the same time, Apple is backing away from its controversial effort to scan for child sexual abuse material (CSAM). The company tells Wired it has set aside technology that would have detected known CSAM photos in iCloud and flagged accounts for review if they held a certain number of offending images. The change of heart comes after "extensive consultation" with experts, according to Apple - the company decided it could protect children without combing through this data. Instead, it is focusing on opt-in Communication Safety features that warn parents about nudity in iMessage photos, and on intercepting CSAM search attempts in Safari, Siri, and Spotlight.

Apple plans to expand Communication Safety to recognize nudity in videos, as well as content shared through other communication apps. The company also hopes to enable third-party support for the feature so that more apps can report child abuse. There is no timeline for these additions, but Apple said it will continue to make it easier to report exploitation.

The tech giant touts the new security features as useful tools for its most privacy- and security-conscious users, whether they're high-profile targets or simply people willing to trade a little convenience for peace of mind. However, they could also create new friction between Apple and law enforcement. The FBI and other agencies have frequently criticized Apple for making it difficult to access suspects' iPhones because of iOS's encryption. Now police may also be locked out of iCloud data they could previously obtain.
