Ahead of next week's debut of its private AI cloud, dubbed Private Cloud Compute, Apple says it will pay security researchers up to $1 million to find vulnerabilities that could compromise the service's security.
In a post on Apple's security blog, the company said it would pay the maximum $1 million bounty to anyone who reports exploits capable of remotely running malicious code on its Private Cloud Compute servers. Apple said it would also award up to $250,000 to researchers who privately report exploits capable of extracting users' sensitive information or the prompts that customers submit to its private cloud.
Apple said it would "consider any security issue that has a significant impact" outside of a published category, including up to $150,000 for exploits capable of accessing sensitive user information from a privileged network position.
"We award maximum amounts for vulnerabilities that compromise user data and inference request data outside the [private cloud compute] trust boundary," Apple said.
The bounty is the latest extension of Apple's bug bounty program, which offers hackers and security researchers financial rewards for privately reporting flaws and vulnerabilities that could be used to compromise its customers' devices or accounts.
In recent years, Apple has opened up the security of its flagship iPhones by creating a special researcher-only iPhone designed for hacking, an effort to improve the security of a device frequently targeted by spyware makers.
In the blog post, Apple revealed more about the security of its Private Cloud Compute service, along with its source code and documentation.
Apple bills Private Cloud Compute as an online extension of its customers' on-device AI model, dubbed Apple Intelligence, one that can handle far more demanding AI tasks in a way that Apple says preserves customer privacy.
This article originally appeared on TechCrunch at https://techcrunch.com/2024/10/24/apple-will-pay-security-researchers-up-to-1-million-to-hack-its-private-ai-cloud/