Dating apps are like ice cream: They come in all different flavors and styles but offer users the same glimmering hope — love at first swipe. The technological advancements that have defined the digital age, specifically the internet and mobile devices, have led to an increased interest in creating an online presence to remain connected to others.
But under the hood of dating apps lie complex algorithms that detect, track and store everything users click and interact with upon logging in. These dating apps, also pegged “swipe apps,” create convenient milieus for the amplification of unconscious forms of racism.
Internet dating is not a new concept: It originated with the onset of the internet in the mid-1990s. It was 1995 when Match.com first launched publicly as a global online dating service. Within a decade, online dating had become the second most popular category of paid online content, raking in $1.9 billion annually.
Following the launch of online dating services like Match.com and Plenty of Fish, location-based mobile dating applications emerged in 2007, which quickly became a daily fixture for users, as they could enjoy the luxury of bringing dating apps along with them wherever they went. Internet dating has now become a tool millions put to frequent use across modern society.
From the original formats of Grindr, Tinder and OkCupid that have long held the reins of the online dating world, users now have access to all kinds of apps, each offering to make the process of searching for a sweetheart more convenient. As of 2020, a Pew Research Center report revealed 30 percent of American adults disclosed they have used a dating site or app. Demographically, online dating is most common among Americans in their mid-20s through mid-40s.
Nancy Jo Sales is a New York Times bestselling author and award-winning journalist who has long raised the alarm about the dark side of dating apps. “The marketing of the online dating industry is very powerful. They invest a lot in marketing and they have very powerful and successful narratives that you can find your soulmate with the touch of a screen,” she said. “And those are things that sound really good and sound really easy for something that’s actually very difficult, and this made all the worse, all the harder, and all the riskier by this very industry that purports to be helping by doing the opposite.”
The simplicity, efficiency and minimal effort needed to start dating is, as Sales said, particularly attractive. In a matter of seconds, an internet search can provide you with hundreds of options. Beyond the influence of technological and social change, dating apps fulfill modern society’s need for convenience. Registration then takes mere minutes: a couple of taps on your phone, an optional bio about as long as a tweet and a few photos, and the software analyzes that information to display potential matches.
“I just thought, it’s so instant. The swiping looks addictive,” Sales said. “I immediately thought, oh God, this is gonna change everything. This is just gonna change everything about dating, relating to mating. And it felt very ominous.”
The kind of interface that dating apps like Tinder popularized made finding a date an addictive quest rather than a tiring pursuit. But these intimate platforms have become centralized marketplaces for finding romantic partners, aided by easy-to-use features that let users apply racial preferences directly — or have them enforced obliviously by the app’s algorithm — and thus perceive such choices as non-racist.
Most dating apps use collaborative filtering, meaning the algorithm bases its predictions on both personal preferences and the opinions of the majority. According to a Buzzfeed reporter’s findings in 2016, the dating app CoffeeMeetsBagel was showing users potential partners of their own race even when those users had selected no racial preference. Furthermore, Tinder’s recommender system uses an “attractiveness” scale that is likely to magnify racially based outcomes. Its hierarchical ranking system recommends users to each other, embedding everything from outright racism to unconscious preference into its recommendations. Swipe apps thus quietly create a racial hierarchy of users by scoring them according to pooled preferences and using those scores as the basis for recommendations.
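The feedback loop described above can be illustrated with a minimal sketch of user-based collaborative filtering. All names and swipe data here are hypothetical; real apps use far more elaborate pipelines, but the core mechanism — scoring candidates by the pooled right-swipes of similar users — is the same, which is why a majority’s skewed swiping can shape recommendations even for a user who states no preference.

```python
# Hypothetical sketch of user-based collaborative filtering.
# swipes maps each user to {candidate: 1 (right-swipe) or 0 (left-swipe)}.

def recommend(swipes, target, candidates):
    """Rank candidates for `target` using the swipes of users whose
    swipe history agrees with target's (simple overlap similarity)."""
    def similarity(a, b):
        shared = set(swipes[a]) & set(swipes[b])
        if not shared:
            return 0.0
        agree = sum(swipes[a][c] == swipes[b][c] for c in shared)
        return agree / len(shared)

    scores = {}
    for cand in candidates:
        total, weight = 0.0, 0.0
        for other in swipes:
            if other == target or cand not in swipes[other]:
                continue
            sim = similarity(target, other)
            total += sim * swipes[other][cand]  # pooled opinion, weighted
            weight += sim
        scores[cand] = total / weight if weight else 0.0
    return sorted(candidates, key=scores.get, reverse=True)

# Invented swipe logs: the majority right-swipes profile "a1" and
# left-swipes "b1". The target has expressed no view on either.
swipes = {
    "target": {"x": 1, "y": 1},
    "u1": {"x": 1, "y": 1, "a1": 1, "b1": 0},
    "u2": {"x": 1, "y": 1, "a1": 1, "b1": 0},
    "u3": {"x": 1, "y": 0, "a1": 1, "b1": 0},
}

print(recommend(swipes, "target", ["a1", "b1"]))  # → ['a1', 'b1']
```

Even though the target never swiped on either candidate, the pooled bias of the other users puts “a1” first — the “opinion of the majority” becomes the recommendation.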
Following the death of George Floyd, certain apps removed ethnicity filters in solidarity. For example, Grindr removed its ethnicity filter to “stand in solidarity with the #BlackLivesMatter movement.” Additionally, gay dating app Scruff announced on Twitter it would “fight against systemic racism and historic oppression of the Black community” by removing its ethnic filters. So why does Hinge still have an ethnicity filter, among other specific filters unavailable on more popular apps like Tinder and Bumble? The app justifies the choice by citing “frequent requests from users of minority groups – including Black people searching for Black Love, a complex topic of love and healing stemming from generations of injustice and South Asian communities seeking someone with the same traditional values – we created the ethnicity preference option to support people of color looking to find a partner with shared cultural experiences and background.”
The extreme oppression people of color can face on dating apps rests on racial animus and overt prejudice — a belief that different races are unworthy of respect or affection — with users allowed to explicitly or implicitly demote, fetishize, or flat-out exclude potential partners on the basis of race and other protected characteristics. A platform’s menu of specific categories for searching, sorting, and filtering legitimizes them as socially reasonable bases for including or excluding potential partners, with certain people screened out of the “dating pool” entirely before ever being recognized as potential partners.
Dr. Meredith Clark is a leading journalist, associate professor at Northeastern University and social media scholar with extensive knowledge in race, ethnicity and activism.
She warned, “The user level can prove problematic though, because it suggests that in the design of the app, there’s almost a way to — and I really hate this term — a sort of an ethnic cleansing of sorts.”
In other words, the mindless swiping on profiles feels less direct than checking a box that blatantly indicates a preference, making it seem less racist. Focusing on the individual “ism” of racism often distracts from the subtle ways racial bias gets embedded in code, as it does in online algorithms. Each algorithm ultimately reproduces a pattern of human bias and discrimination that deepens pre-existing divisions in the dating world.
For example, White people are 10 times less likely to message Black people than the other way around. According to a study by Cornell researchers, men who used online applications heavily viewed multiculturalism less favorably, and viewed sexual racism as more acceptable. According to a study in 2019, “the dating app environment itself — in which whiteness is ‘the hallmark of desirability’ — led to higher rates of depression and negative self-worth.”
“We negotiate, in our physical lives, the people we intend to select for friends, for mates, for partners, what have you — both with whom we feel an affinity to and who we sort of identify with,” Clark said. “When you couple that almost innate preference with the way that we have been socialized to think about what is attractive in the United States in particular, you find that racial biases and biases that are predicated on phenotype essentially are reproduced by dating apps.”
Dr. Jess Carbino, relationship and online dating expert, and former sociologist for Tinder and Bumble, reaffirms this. “Obviously [online dating] could be perpetuating other forms of inequality, by virtue of how that is occurring. Absolutely,” she said. “But what I’m saying is I don’t believe online dating is theoretically enhancing the likelihood of somebody engaging in that type of behavior, but rather mirroring what happens in daily life and a daily institutional context as it relates to assortative mating patterns.”
Many app users don’t particularly care how their matches get to them, or what that reflects about their swiping. On apps like Hinge, which offer options to filter by race or religion, users like 20-year-old liberal musician Derek feel it is acceptable to privately “discriminate” in their choices of matches.
Carbino doesn’t believe that people fundamentally are trying to discriminate against people in the context of dating. “I think that in dating it is reasonable to expect that people might have certain preferences that would not be acceptable in other contexts of social life…” she said. “In theory, they are engaging in a form of discrimination, but I don’t believe that it is from most people coming from a place of bigotry. I think that is the fundamental difference between dating and other forms of institutional racism.”
Others are more concerned about the effect of the algorithm itself. “You have to tell a tiny bit about yourself,” said Massachusetts resident Alexiah Jaelyn, “but if they are showing you people based on that little bit of information, who aren’t they showing you, you know?”
John Benkovich, a 24-year-old system administrator at Draper Labs in Cambridge, echoes this sentiment. “I think we don’t understand the full ramifications of the technology we’re using,” he said. “And what sincere negative consequences it might incur.”
Many users like Julissa, a 23-year-old Black, queer lesbian from Roxbury, see the algorithm of dating apps as an echo chamber, repeating and amplifying information that we, the users of dating apps, give to it. It targets millennials and Gen Z, in her opinion, because they “have the most access and societal stress to use the internet.”
And Sales reiterates how these apps further “trap” users on their platforms. “They have designed it to be addictive. They’re in the business of connecting people allegedly and having people make relationships with each other, but it seems like their actual goal is to get you to have a relationship with their app,” she said. “Whether or not you have a relationship with a person is secondary to that. And actually if you were to have a relationship with a person and leave the app, that would be antithetical to their actual business model, which is that they want you on there because they want to be collecting your data.”
Or, as Audrey, a 23-year-old queer musician from Brockton, jokes, when it comes to our trust in technology — “It’s pitch black, it’s nighttime and Mark Zuckerberg is sneaking out of the corner with a pair of VR glasses to put on your face while you’re in Velma cosplay.”
The architects of these intimate platforms need to examine more thoroughly the way preferences are used and how they may be unduly mapping onto historical patterns of discrimination. While human bias cannot be accounted for entirely, it is necessary to expect developers of online platforms to design them to combat bias and discrimination. There must be a conscious and purposeful intervention in the decision-making standards of users.
“I think there’s a possibility of creating those apps so that they do the best that they can to mitigate those stereotypes. But our social reality is structured by a history that we cannot change, that we are forever, simply trying to repay,” Clark said.
“The best that I can hope for is design that takes that history into account.”