Last week, Apple announced a new security research challenge inviting hackers to test the security of the company’s servers that host its just-launched Apple Intelligence features.
If you’re successful, you could earn up to $1 million.
Apple is trying to protect its Private Cloud Compute (PCC) servers, which will process some Apple Intelligence requests, from bad actors and cyberattacks, ZDNet reports.
The company is looking to identify vulnerabilities in three main areas: accidental data disclosures, external compromises from user requests, and physical or internal access, according to the outlet.
Apple’s Private Cloud Compute Security Guide explains the ins and outs of how PCC works for anyone who thinks they can hack into the system. ZDNet notes that Apple tested the system with internal experts and other researchers in the lead-up to Apple Intelligence’s launch on Monday.
If you think you have what it takes, here is how much Apple is paying and why:
Remote attack on request data:
Arbitrary code execution with arbitrary entitlements – $1,000,000
Access to a user’s request data or sensitive information about the user’s requests outside the trust boundary – $250,000
Attack on request data from a privileged network position:
Access to a user’s request data or other sensitive information about the user outside the trust boundary – $150,000
Ability to execute unattested code – $100,000
Accidental or unexpected data disclosure due to a deployment or configuration issue – $50,000
For more information on the challenge, click here.