The founder of dating app Bumble recently predicted we will soon have personalized AI assistants dating each other on our behalf. In an age of already rampant social atomization, the prospect promises to cocoon us into ever more insular social milieus.

Whitney Wolfe Herd, founder and chief executive officer of the Bumble dating app, speaking in San Francisco, California, on May 9, 2024. (David Paul Morris / Bloomberg via Getty Images)

Comments about the future of dating by the founder of online dating app Bumble have recently been making the rounds in the media. According to Whitney Wolfe Herd, it won’t be long before our own personal AI assistants are dating the AI counterparts of our potential matches. AI “concierges” at apps like Bumble will have conversations with each other to screen potential dates and determine whether a match would be worth making, before the human even enters the decision-making process.

The suggestion has produced a slew of hilarious memes featuring imaginary chats between flirtatious robots. But the episode also makes visible some significant developments in the relationship between our everyday lives and our contemporary form of platform capitalism. It shows us that even as more people become closer to one another in terms of our economic situations, the logic of our technology is to stratify us along lines of class, race, and other categories and mystify that fact.

Bumble founder Whitney Wolfe Herd says the future of dating is having your AI date other people’s AI and recommend the best matches for you to meet pic.twitter.com/9GEEvpuiKZ

— Tsarathustra (@tsarnick) May 10, 2024

The first thing to note about the AI dating concierge idea is the potentially distracting anthropomorphic element. When robots and algorithms appear in our society as humanlike, they are often presented as new innovations (Microsoft’s Tay, Hanson Robotics’ Sophia, and so on). In fact, these products generally use older AI technologies, which are “humanized” for marketing reasons when the product is ready for the next step in its commercialization.

So when we see humanlike robots, we should be thinking less about what is new and emerging (or asking whether they are really becoming like us) and more about what these products show us about the computation that is already operational. What does the emergence of an algorithm humanized into a dating concierge tell us about what our technology has already been doing?

Stratification Machines

One thing the dating concierge indicates is that whatever transformation the digital world orchestrates in the world of love and dating, it is now imposing certain universal harms. For a long time, a leading concern with data-driven dating apps was that they would privilege certain kinds of subjects over others. Bumble’s initial “unique selling proposition” was that it would be the dating app for women (it requires women to send the first message to any match); Tinder was at the center of a debate about whether it inherited racial biases, while OKCupid data was used by a far-right fake science laboratory to argue for eugenics. There was a prominent fear that the digitization of dating might benefit or harm some groups disproportionately, along lines of race, gender, and the like.

With the introduction of the AI concierge, the change in optics is clear: no longer will the algorithm work like an independent matchmaker (think Alfred Hitchcock’s Frenzy) that operates behind the scenes to make objective calls about who might suit whom. Instead, it will function more like a personal lawyer, negotiating and advocating for the interests of its user. With the former, the worry was that the matchmaker might be biased in one direction or the other. With the latter, we all hope for equal treatment because we can all avail ourselves of the services of our own personal lobbyist.

But this doesn’t mean that we are all treated well. In fact, it shows that the app is simply discriminating against each of us, on behalf of each of us, along a potentially huge variety of dimensions that we may or may not be aware of.

One of the most important examples I found in doing research for my book on dating was Baihe, one of the largest Chinese dating apps. Baihe uses the Zhima Credit score, a complicated credit-checking and performance-ranking system developed by Ant Financial Services Group, a company closely affiliated with the Alibaba Group. The score combines a typical credit check with other social credit points, including judgments on “the online characteristics of a user’s friends and the interactions between the user and his/her friends.” In short, the app ensures we match with those who have a similar credit score and whose social milieus are comparable. It is an algorithm that reflects and reinforces social stratification.
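To make that sorting logic concrete, here is a deliberately crude, purely hypothetical sketch of this kind of score-band filtering. None of the names, numbers, or thresholds below come from Baihe or Zhima Credit; the point is only the shape of the filter, in which candidates outside a user’s score band and social milieu never enter the pool at all.

```python
# Hypothetical illustration only: a toy filter that surfaces matches
# within a narrow score band and the same "milieu" label. The names
# and thresholds are invented, not drawn from Baihe or Zhima Credit.

from dataclasses import dataclass


@dataclass
class Profile:
    name: str
    credit_score: int    # stand-in for a composite "social credit" score
    social_circle: str   # stand-in for a coarse milieu label


def similar_enough(a: Profile, b: Profile, max_gap: int = 50) -> bool:
    """True when two users fall in the same score band and milieu."""
    return (abs(a.credit_score - b.credit_score) <= max_gap
            and a.social_circle == b.social_circle)


def candidate_pool(user: Profile, everyone: list[Profile]) -> list[Profile]:
    """Keep only the candidates the sorting logic deems 'comparable'."""
    return [p for p in everyone if p is not user and similar_enough(user, p)]


if __name__ == "__main__":
    users = [
        Profile("A", 720, "urban professional"),
        Profile("B", 690, "urban professional"),
        Profile("C", 540, "service worker"),
    ]
    # A is only ever shown B; C never appears in A's pool at all.
    print([p.name for p in candidate_pool(users[0], users)])
```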

Though such Chinese credit systems have been perceived as dystopian in the West, their use in apps like Baihe is not far from the logic of the most popular Western apps, and the Bumble concierge concept makes that logic visible. It offers us the idea of an assistant that works for us to vet and check matches, and potentially even to blacklist and block others. It may make us feel special — and different — because it is designed to identify the differences that matter for romantic compatibility. But it affects us all in the same way: all users are subjected to this kind of social sorting, in which the decisions we make about who we meet and who we desire are outsourced to algorithms that work to confine us to certain patterns and social circles.

Reflecting and Obscuring Class

Of course, there is nothing particularly new in using wealth as a factor in matchmaking, and there are many “elite” dating apps explicitly based around doing so. No doubt throughout history we have picked our partners based on socioeconomic status, among other demographic features. But the effect of dating apps on the whole today — one likely to be exacerbated by the introduction of AI handling the “early in the game” dates for us — is to narrow our potential dating pool and amplify differences between us even more.

Part of the appeal of dating app algorithms may be that users believe they need to be neatly defined and sorted according to their special characteristics and preferences — including, perhaps, their racial or ethnic identity, or their class position. AI concierges now promise to aid us in our quest to find true love by narrowing down the field with unprecedented precision.

Yet with the gap between median incomes and mean incomes widening like never before, it is not a big stretch to say that the vast majority of app users (like the population as a whole) are becoming more and more alike in terms of their economic position — just as the 1 percent gets further away from the rest of us, the 99 percent get closer together. Now more than ever, it makes sense to notice our shared universal experience of an unequal and exploitative economic system, rather than focusing on the minute differences between us and those proximate to us.

While our interests converge in this way as we all face the pressures of contemporary capitalism, dating-app technologies work harder and harder to stratify us, in terms of differences of wealth and income, race, sexuality, and so on. There are so many differences between us all, the Wolfe Herds of the world imply, that we need our own personal AI to root out the CVs of unqualified candidates. This process suggests that what is preventing us from finding the right person is that the differences haven’t been correctly understood or appreciated.

This fantasy appeals to us not just because it promises that we will find love, but because it promises that the system’s ever-improving efficiency is carrying us toward it. But plenty of people met partners without AI dating servants. And the relentless stratification and increasing insularity that the dating app algorithms promote are at odds with the wider sociability and solidarity (including with those who are not just like us) that we will require to demand the systemic change we so badly need.
