Deception, Exploited Employees, and Free Money: How Worldcoin Recruited Its First Half Million Test Users

In the end, it was something Blania casually said during our interview in early March that helped us finally understand Worldcoin.

“We’ll have privacy experts take our systems apart again and again before we actually deploy them on a large scale,” he said, answering a question about last fall’s privacy-related backlash.

Blania had just explained how his company had gotten 450,000 people to sign up for Worldcoin — meaning the orbs had scanned 450,000 sets of eyes, faces, and bodies, and stored all that data to train its neural network. The company recognized this data collection as problematic and wanted to stop it, yet it had not offered these early users the same privacy protections it promised future ones. We were stunned by this apparent contradiction: were we the ones missing the bigger picture? After all, compared with the company’s stated goal of signing up one billion users, 450,000 may seem small.

But each of those 450,000 is a person, with his or her own hopes, lives and rights that have nothing to do with the ambitions of a Silicon Valley startup.

Talking to Blania helped us untangle something we had struggled to understand: how a company could speak so passionately about its privacy-preserving protocols while violating the privacy of so many. Our interview made clear that, for Worldcoin, these legions of test users were, for the most part, never meant to be end users. Rather, their eyes, bodies, and patterns of life were simply grist for Worldcoin’s neural networks. The lower-level orb operators, meanwhile, were paid pennies to feed the algorithm, often privately wrestling with their own moral scruples. The massive effort to teach Worldcoin’s AI to recognize who or what counted as human was, ironically, dehumanizing for those involved.

When we submitted seven pages of findings and questions to Worldcoin, the company’s response was that almost all of the negative behavior we had uncovered amounted to “isolated incidents” — and that it ultimately wouldn’t matter anyway, because the next (public) iteration would be better. “We believe that rights to privacy and anonymity are fundamental, which is why in the coming weeks, anyone signing up for Worldcoin will be able to do so without sharing any of their biometric data with us,” the company wrote. That nearly half a million people had already undergone its tests seemed of little consequence.

What really matters are the results: that Worldcoin will have an attractive user number to bolster its sales pitch as Web3’s preferred identity solution. And when the real monetization products – be it the orbs, the Web3 passport, the currency itself, or all of the above – are launched for the intended users, everything will be ready, with no messy signs of the labor or the human body parts behind it.
