Privacy Washing Through PETs: The Case of Worldcoin

At a time when governments are promoting greater uptake of artificial intelligence, privacy-enhancing technologies (PETs) can be essential. Yet PETs are also used for privacy washing.

Google develops and deploys a range of PETs, not to protect people’s privacy, but for privacy washing through misleading privacy claims. Examples include Google’s deal with Mastercard to track retail sales, the Privacy Sandbox initiative, and digital fingerprinting. Google is a leader in—but does not have a monopoly on—privacy washing.

Recently, EU data protection regulators, led by BayLDA, the Data Protection Authority for the German state of Bavaria, published a decision on Worldcoin’s attempt to privacy wash through PETs. The decision shows that using PETs does not guarantee privacy.

PETs are a range of techniques that claim to protect people’s privacy, both when they communicate and when data about them is processed. An example of a PET that promises to process data privately and securely is secure multiparty computation (SMPC). One approach to SMPC splits the data into shares held by different parties, instead of processing it on a central server.
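
To make this concrete, here is a minimal sketch of additive secret sharing, one common building block of SMPC. This is an illustrative example, not Worldcoin’s actual protocol; the modulus, function names, and the stand-in value are all assumptions for the sketch. A value is split into random-looking shares, any one of which reveals nothing on its own, while summing all of them reconstructs the original.

```python
import secrets

# Additive secret sharing over the integers mod a public prime.
# Each share alone is uniformly random and reveals nothing about
# the secret; only the sum of all shares reconstructs it.
PRIME = 2**61 - 1  # an illustrative public modulus

def split(secret: int, n_parties: int) -> list[int]:
    """Split `secret` into n_parties additive shares mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    last = (secret - sum(shares)) % PRIME
    return shares + [last]

def combine(shares: list[int]) -> int:
    """Reconstruct the secret by summing all shares mod PRIME."""
    return sum(shares) % PRIME

iris_code_fragment = 0b101101  # hypothetical stand-in for a sensitive value
shares = split(iris_code_fragment, n_parties=2)
assert combine(shares) == iris_code_fragment
```

The privacy claim rests entirely on the shares staying apart: each party holds one share, and as long as no one sees both, the secret stays hidden.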

Worldcoin provides a World ID that it claims can be used to prove that a person is a ‘real person’ in the digital environment. The company turns iris data into a binary code, which it calls an ‘iris-code.’ For ‘privacy’ reasons, Worldcoin has proposed splitting the iris-code into shares held by two parties.

Worldcoin believes that SMPC allows it to avoid data protection law in the EU. This belief rests on at least two assumptions: that (1) splitting the data into shares and (2) contractually preventing the parties from combining the shares together mean that no personal data is processed. If no personal data is processed, the processing falls outside the scope of EU data protection law. To the best of my knowledge, these assumptions had not been tested by regulators or the courts in the EU until the recent decision.

Deployments of SMPC rely on the assumption that the parties will not collude. Worldcoin stated that it relied on contractual provisions to prevent collusion and the merging of the shares. However, such contractual provisions are insufficient unless a specific law (for example, medical secrecy) prevents collusion, and Worldcoin could not rely on any such law for its purpose.

BayLDA argues in its decision that Worldcoin relies on “the inherent individuality of each person’s iris” and that “[w]hether the SMPC shares can or cannot be merged is irrelevant for the characterisation of the SMPC shares as personal data” when the purpose is to identify or recognize a person. In the regulator’s words: “If data is processed for the purpose of identifying a person, the argument that it is not personal data constitutes a contradiction in itself.”

Furthermore, Worldcoin uses AWS servers offered by Amazon for both parties in the SMPC, so all the shares are stored and processed on AWS infrastructure. In the academic literature on SMPC, the threat model typically covers the parties that participate in the protocol, but not the cloud service provider on whose servers they run. In real-world deployments, this gap matters. In BayLDA’s assessment, “The assumption that AWS could merge the SMPC shares on its own authority is also not so remote that this possibility is to be excluded from consideration … the merger does not involve a practical disproportionate effort, regardless of contractual limitations.” A similar concern was previously observed in a privacy washing proposal from Apple, where the two parties were iCloud and iPhone, both controlled by Apple.
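
Continuing the illustrative sketch above: any actor that can read both shares, such as a single cloud provider hosting both parties, can reconstruct the secret with one modular addition. Contractual limitations do not change the mathematics.

```python
# Continuing the sketch above: if both shares sit on infrastructure
# one actor can read (e.g., a single cloud provider hosting both
# SMPC parties), reconstruction is a single modular addition.
share_party_a, share_party_b = shares  # both reachable by one provider
recovered = (share_party_a + share_party_b) % PRIME
assert recovered == iris_code_fragment  # a contract does not change this
```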

The BayLDA decision might be the first instance of privacy washing through PETs to be assessed by regulators. It will likely not be the last.

Greater use of PETs might result in a wide spectrum of privacy washing. The computing community should prevent privacy washing and support the use of PETs only for the public good. A recent Dagstuhl seminar on privacy washing that I co-organized is a step in that direction.

Kris Shrishak is a public interest technologist and an Enforce senior fellow at the Irish Council for Civil Liberties. His work focuses on tech policy, privacy tech, anti-surveillance, emerging technologies, and algorithmic decision-making.
