Here's how Apple's keeping your cloud-processed AI data safe (and why it matters)

With Private Cloud Compute, Apple is using custom Apple silicon servers that it says are more secure than traditional servers and can be verified by outside security experts.
Written by Tiernan Ray, Senior Contributing Writer

On the heels of Microsoft's debacle with its Copilot+ PC -- whose "Recall" feature has been lambasted as a massive AI security risk -- Apple on Monday used its annual developer conference, WWDC, to promise "groundbreaking" privacy protections in AI. 

Also: Everything Apple announced at WWDC 2024, including iOS 18, Siri, AI, and more

In conjunction with a broad artificial intelligence offering across MacOS "Sequoia," iPadOS 18, and iOS 18, called "Apple Intelligence," Apple's head of software engineering, Craig Federighi, announced that the company will run some AI models on-device and others in a secure cloud computing environment when they require extra horsepower.

Called "Private Cloud Compute," the service "allows Apple Intelligence to flex and scale its computational capacity and draw on even larger server-based models for more complex requests while protecting your privacy," said Federighi. 

The servers underlying Private Cloud Compute are "servers we've especially created using Apple silicon," said Federighi, confirming rumors last month that Apple would use its own custom silicon in place of Intel and AMD chips that typically power data center servers.

The servers and their chips "offer the privacy and security of your iPhone from the silicon on up, draw on the security properties of the Swift programming language, and run software with transparency built in," said Federighi. 

Also: Forget LastPass: Apple unveils 'Passwords' manager app at WWDC 2024

"When you make a request, Apple Intelligence analyzes whether it can be processed on device," he explained. "If it needs greater computational capacity, it can draw on private cloud compute and send only the data that's relevant to your task."
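The routing Federighi describes -- process on-device when possible, otherwise upload only task-relevant data -- can be sketched as follows. This is a hypothetical illustration, not Apple's implementation: the `route_request` function, the complexity budget, and the payload shape are all invented for clarity.

```python
# Hypothetical sketch of Apple Intelligence's routing decision.
# ON_DEVICE_LIMIT is an invented complexity budget, not a real Apple value.
ON_DEVICE_LIMIT = 3_000

def route_request(prompt: str, task_data: list[str]) -> dict:
    """Return where the request runs and exactly what data leaves the device."""
    complexity = len(prompt) + sum(len(d) for d in task_data)
    if complexity <= ON_DEVICE_LIMIT:
        # Small enough: handled entirely on-device, nothing is uploaded.
        return {"target": "on-device", "uploaded": []}
    # Needs greater computational capacity: send only the data that is
    # relevant to this task to Private Cloud Compute.
    return {"target": "private-cloud-compute", "uploaded": task_data}

print(route_request("Summarize my note", ["a short note"]))
# A large request would instead be routed to private-cloud-compute,
# carrying only its own task data.
```

The key privacy property in this description is that the cloud never receives more than the data needed for the single request being processed.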

Apple emphasized that user data will not be gathered by the company, in contrast to the general AI industry practice of using individuals' and companies' data for training AI models. "Your data is never stored or made accessible to Apple," said Federighi. 

The company also announced a partnership with OpenAI to integrate ChatGPT into Siri.

Federighi emphasized outside scrutiny of the Private Cloud Compute servers by security experts, stating, "And just like your iPhone, independent experts can inspect the code that runs on these servers to verify this privacy promise. In fact, private cloud compute cryptographically ensures your iPhone, iPad, and Mac will refuse to talk to a server unless its software has been publicly logged for inspection."
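The guarantee Federighi describes amounts to a transparency-log check: before connecting, the device verifies that a cryptographic measurement of the server's software appears in a publicly inspectable log. The sketch below is purely illustrative -- the log contents, build names, and SHA-256 measurement scheme are assumptions, not Apple's actual protocol.

```python
import hashlib

# Stand-in for the public transparency log of approved server builds.
# In the real system, researchers could inspect the logged software;
# these entries are invented for illustration.
PUBLIC_LOG = {
    hashlib.sha256(b"pcc-build-2024.06").hexdigest(),
}

def server_allowed(server_build: bytes) -> bool:
    """Connect only if the server's measured software is publicly logged."""
    measurement = hashlib.sha256(server_build).hexdigest()
    return measurement in PUBLIC_LOG

print(server_allowed(b"pcc-build-2024.06"))  # logged build: allowed
print(server_allowed(b"tampered-build"))     # unlogged build: refused
```

The point of such a scheme is that even Apple could not silently swap in modified server software: any build the devices will talk to must first be published for inspection.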

Federighi did not go into detail about how the Private Cloud Compute servers will be inspected or audited by security researchers.

Also: AI advancements in medicine and education lead ZDNET's Innovation Index

Said Federighi, "This sets a brand new standard for privacy and AI and unlocks intelligence you can trust."

Apple also announced during the keynote that it is partnering with OpenAI to offer free use of ChatGPT, powered by GPT-4o, on its devices. The company emphasized that any use of ChatGPT will first require the device user's permission. 

"Your requests and information will not be logged," said an Apple spokesperson in the recorded video of the keynote, adding, "Of course, you're in control over when ChatGPT is used and will be asked before any of your information is shared."
