Apple will pay up to $1 million to anyone who finds a privacy flaw inside Apple Intelligence
Apple Intelligence just got its first bug bounty
Apple made a very big deal about Apple Intelligence’s privacy credentials when it launched the AI suite earlier this year. There has been some skepticism about those claims, especially from people like Elon Musk, who took particular offense at Apple’s partnership with OpenAI to bring ChatGPT to its devices. But now Apple is putting its money where its mouth is, launching the first Apple Intelligence bug bounty.
Specifically, Apple is inviting hackers to investigate the Private Cloud Compute (PCC) feature. While on-device AI is inherently more private because all the data stays on the phone, cloud computing is a different matter. PCC is Apple’s attempt to fix that, offering cloud-based AI processing without compromising data security or user privacy.
But clearly Apple isn’t expecting us all to take its word for it, and is actively inviting security researchers and “anyone with interest and a technical curiosity” to independently verify the company’s claims about PCC. It would be a huge blow to Apple if this system were somehow compromised and bad actors got access to supposedly secure user data.
The point of bug bounties is to incentivize hackers and other security professionals to probe systems and report what they find, rather than exploit it. Hackers are an intrepid bunch, and can often find ways to stress-test systems that in-house developers never thought of. By reporting any problems they come across, they form a mutually beneficial arrangement with Apple: Apple gets to fix security flaws quietly, without user data being exposed to the wrong people, and the hackers get paid for their effort.
In the case of PCC, Apple is offering various rewards depending on the issue reported, with the maximum now increased to $1 million. That top sum is only available for “arbitrary code execution with arbitrary entitlements” achieved through a “remote attack on request data.” That should tell you how seriously Apple is taking this, or how confident it is that PCC is secure.
Overall, a million dollars is a small price to pay to avoid the PR disaster that would occur if criminals found a way in.
To facilitate this, Apple is offering various tools and resources to aid bug bounty hunters in their work. These include a security guide covering PCC’s technical details, source code for “certain key components of PCC that help to implement its security and privacy requirements,” and a “Virtual Research Environment” (VRE) for performing security analysis of PCC. The latter requires a Mac with Apple Silicon, at least 16GB of RAM, and access to the macOS Sequoia 15.1 developer preview.
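If you’re curious whether your own Mac clears that bar, a few lines of Swift can check the stated requirements using public macOS APIs. To be clear, this is just an illustrative sketch based on the requirements listed above; the meetsVRERequirements function is our own invention, not part of Apple’s tooling, and the checks are approximations.

import Foundation

// Illustrative sketch: checks the three stated VRE prerequisites
// (Apple Silicon, at least 16GB of RAM, macOS 15.1 or later).
// Not Apple's official tooling.
func meetsVRERequirements() -> Bool {
    // Apple Silicon check: query the arm64 capability flag via sysctl.
    // On Intel Macs this key doesn't exist, so the call fails and we return false.
    var isARM64: Int32 = 0
    var size = MemoryLayout<Int32>.size
    let gotFlag = sysctlbyname("hw.optional.arm64", &isARM64, &size, nil, 0) == 0

    // RAM check: physicalMemory is reported in bytes; 16GB = 16 * 1024^3.
    let hasEnoughRAM = ProcessInfo.processInfo.physicalMemory >= 16 * 1024 * 1024 * 1024

    // OS check: macOS Sequoia 15.1 or later.
    let sequoia151 = OperatingSystemVersion(majorVersion: 15, minorVersion: 1, patchVersion: 0)
    let osIsNewEnough = ProcessInfo.processInfo.isOperatingSystemAtLeast(sequoia151)

    return gotFlag && isARM64 == 1 && hasEnoughRAM && osIsNewEnough
}

print(meetsVRERequirements()
    ? "This Mac meets the stated VRE requirements."
    : "This Mac falls short of the stated VRE requirements.")

You can run this directly with the swift command-line tool on any Mac with Xcode’s developer tools installed.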
Privacy is always a concern when you’re using online services, and cloud-based AI is no different. Thankfully, Apple does still seem to be sticking with its usual privacy-centric mandate, and is openly looking for ways to ensure things are secure. It won’t please everyone, but it’s better than nothing.
Tom is Tom's Guide's UK Phones Editor, tackling the latest smartphone news and vocally expressing his opinions about upcoming features or changes. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He’s usually found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining about how terrible his Smart TV is.