DL and Looking? So Are the Data Miners, and They Already Know What You’re Into

This piece was written by Sergio E. Molina and featured in the Federal Bar Association’s LGBTQ+ Law section.

Ask any member of the LGBTQ+ community about their coming out process, and the majority will likely tell you a similar story: episodes of self-doubt and self-loathing, fear of judgment and abandonment, experiences of trust and mistrust, and an eventual courage to live freely and truthfully. But what if, while you were online, your data outed you before you even realized it could? In a tech-driven society where apps like Grindr have redefined the LGBTQ+ experience, corporate data miners have begun to move our concerns from the pages of science fiction into the realm of our modern reality, with our laws failing to develop quickly enough to keep pace.

Over the last two decades, our society has seen a technological revolution that has brought about developments like personal computers, cellphones, and the marriage of the two. These advancements have uniquely benefited a once underground and geographically fragmented LGBTQ+ community that existed only in the shadows of gay bars and communicated only through coded languages. With the advent of the smartphone, LGBTQ+ individuals held in the palms of their hands the key to freely finding resources and communities that they once worked secretly to identify. Now, the Internet plays a major role in self-discovery; be it through surfing the web to explore interests and sexual preferences or meeting people online who offer comfort and relatability, the LGBTQ+ community has gained a visibility that has, in many ways, advanced its acceptance. However, as the pendulum swings from secrecy to visibility with little legislation to regulate it, we have begun to recognize all too well technology’s double-edged sword, and the threat it poses to the LGBTQ+ community in particular.

Coming out—and being outed—poses a very significant risk, often with few legal protections or remedies. There are, unfortunately, too many instances in which sexual orientation and gender identity have been the cause of embarrassment, harassment, and the loss of employment, friends, family, and even life. These risks were at the center of tragedies like those of Tyler Clementi, Channing Smith, and many other LGBTQ+ individuals who took their lives following invasions of their privacy through computers, smartphones, webcams, and social media. Sometimes, however, these risks, be they to those in the closet or not, manifest themselves not as human antagonists but as the ghosts in our machines and the entities that profit off of them. Where there is the use of technology and the monetization of data, there is a trail of breadcrumbs that ultimately identifies, exposes, and threatens its user, and no company has greater access to the LGBTQ+ community’s data breadbasket than Grindr.

Since its creation in 2009, Grindr has revolutionized the way LGBTQ+ individuals communicate and, ultimately, hook up. The platform describes itself as the largest social networking app for gay, bisexual, trans, and queer people, with millions of daily users across the globe. Grindr’s business model centers on the use of geolocation services to connect users with others in their immediate vicinity—a major evolution from the trips to local bathhouses that were prominent in the past. As with most social platforms, registering a Grindr profile requires accepting terms and policies that most users never read. According to Grindr’s Privacy and Cookie Policy, the company collects personal user data such as e-mail addresses, Internet Protocol (IP) addresses, messages, photos, locations, audio, and video—data that, as many Grindr users know, is more often than not highly sensitive. By simply clicking through the policy agreement, users who registered to discreetly explore their sexuality may not realize they have also consented to Grindr’s disclosure of their personal data to various third-party affiliates. With that information, Grindr and its affiliates have all the tools necessary to build individualized user profiles that, in the wrong hands, can be used to expose and exploit their subjects.

In 2018, BuzzFeed broke a story reporting that Grindr had shared its users’ available HIV statuses with two third-party companies, Apptimize and Localytics, along with users’ location, phone, and email data. While many who joined Grindr were selectively open with other users about their participation, sexuality, and HIV status, Grindr’s sharing of their personally identifiable information (PII) outed them without their knowledge. Shortly after the public became aware of this data transfer, Grindr announced that it would stop sharing its users’ HIV statuses with third parties. However, perhaps the greatest and most obvious risk to Grindr users is the focus of the app itself: publicized user location data.

Grindr operates by displaying how far one user is from another, sometimes down to mere yards. Every user has one of two options: display their location or keep it hidden. Although the latter seems like a sensible safeguard against stalkers, criminals, or even foreign powers, both options are easily defeated. For instance, if a user’s displayed location data is accurate, their precise position can be determined through trilateration:

Imagine a man shows up on a dating app as ‘200m away.’ You can draw a 200m (650ft) radius around your own location on a map and know he is somewhere on the edge of that circle. If you then move down the road and the same man shows up as 350m away, and you move again and he is 100m away, you can then draw all of these circles on the map at the same time and where they intersect will reveal exactly where the man is.
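
To make the mechanics concrete, here is a minimal sketch in Python of the trilateration the quote describes. It assumes a flat plane, exact reported distances, and coordinates already projected into meters from an arbitrary local origin; the function name and all positions are hypothetical, not drawn from any real attack tool.

    def trilaterate(p1, p2, p3, d1, d2, d3):
        """Return the (x, y) point whose distances to p1, p2, p3 are d1, d2, d3.

        Subtracting the circle equations (x - xi)^2 + (y - yi)^2 = di^2
        pairwise yields a 2x2 linear system, solved here with Cramer's rule.
        """
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3

        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
        b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
        b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2

        det = a11 * a22 - a12 * a21
        if abs(det) < 1e-9:
            raise ValueError("observation points are collinear; measure from elsewhere")
        return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

    # The scenario from the quote: three distance readings taken from three
    # spots as the observer walks around. All coordinates here are made up.
    target = (120.0, 160.0)
    spots = [(0.0, 0.0), (300.0, 0.0), (0.0, 300.0)]
    dists = [((target[0] - x) ** 2 + (target[1] - y) ** 2) ** 0.5 for x, y in spots]
    print(trilaterate(*spots, *dists))  # -> (120.0, 160.0)

Note that rounding the displayed distance only enlarges each circle into a ring; taking more readings from more spots shrinks the uncertainty again, which is why merely coarsening the numbers is a weak defense.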

Even for users who choose to hide their location, the solution is illusory: there remain methods to determine where they are. Researchers in 2016, for example, demonstrated that it was possible to locate a target by surrounding them with several fake profiles and moving those profiles around the map to narrow down the target’s position, as sketched below.
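
The fake-profile technique works even with a far weaker signal than a distance readout. The following is a hedged sketch in the same spirit, assuming only that the attacker controls fake profiles at chosen coordinates and that the app reveals which fake profile the target sorts closer to; the helper names (closer_to_a, narrow_axis) and all coordinates are hypothetical. The point is that a simple bisection recovers each coordinate in a few dozen queries.

    def closer_to_a(target, a, b):
        """Stand-in for the app's oracle: does the target appear nearer to
        fake profile a than to fake profile b? (a and b are attacker-placed.)"""
        da = (target[0] - a[0]) ** 2 + (target[1] - a[1]) ** 2
        db = (target[0] - b[0]) ** 2 + (target[1] - b[1]) ** 2
        return da <= db

    def narrow_axis(target, axis, lo=-5000.0, hi=5000.0, tol=1.0):
        """Bisect one coordinate of the target (in meters) by planting fake
        profiles at the ends of the current interval and querying the oracle."""
        fixed = 0.0  # hold the other coordinate constant for both probes
        while hi - lo > tol:
            a = (lo, fixed) if axis == 0 else (fixed, lo)
            b = (hi, fixed) if axis == 0 else (fixed, hi)
            if closer_to_a(target, a, b):
                hi = (lo + hi) / 2  # target lies in the lower half
            else:
                lo = (lo + hi) / 2
        return (lo + hi) / 2

    # A hidden user at a made-up position is recovered to within a meter.
    hidden_user = (1234.0, -567.0)
    print(narrow_axis(hidden_user, axis=0), narrow_axis(hidden_user, axis=1))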

In 2019, these concerns moved past mere paranoia and into governmental oversight when the Committee on Foreign Investment in the United States (CFIUS) determined that Grindr’s ownership by Beijing Kunlun Tech Co., Ltd., a Chinese gaming company, constituted a national security risk. Although CFIUS did not elaborate on its rationale, it’s reasonable to deduce that the ease with which Grindr’s user data can be weaponized was a major contributing factor. It’s within the realm of possibility that, with the app under Chinese ownership, the Chinese government could compel Grindr to produce user information and use it to blackmail prominent users. This kind of exploitation is very much in line with older, cruder forms of data-driven persecution that have threatened the LGBTQ+ community at various points in history.

During the height of the Nazi regime, the Gestapo raided sex research institutions and confiscated extensive lists containing the names and addresses of local homosexuals. Those listed became the targets of the Reich Central Office for the Combating of Homosexuality and Abortion. The Nazis arrested roughly 100,000 men as homosexuals and sent thousands of them to concentration camps, where they were denied even the support of fellow prisoners, experimented on, and brutally murdered. Similarly, in the United States during the McCarthyist anti-communism campaign of the mid-twentieth century, the federal government gathered data on homosexuals by interrogating community members and raiding community safe spaces, and a congressional subcommittee was created for the particular purpose of investigating “the alleged employment of homosexuals in the government service.” Any federal employees suspected of being homosexual were terminated and publicly outed; hundreds lost their jobs and were exposed to harm.

With the LGBTQ+ community’s data a historical target for persecution, the question now becomes what can—or must—be done to shelter the community from exploitation and danger. While the Supreme Court of the United States has looked favorably on broadly interpreting the Fourteenth Amendment’s guarantee of “liberty” as encompassing a right of privacy in Meyer v. Nebraska and its progeny—particularly Lawrence v. Texas, for its application to the LGBTQ+ community—it has yet to address whether this right to privacy extends to user data, as some foreign courts have. Nonetheless, where the American judiciary might be slow to tackle this issue, domestic legislation seems to be driving the discussion.

Today, although various sector-specific laws include data-privacy provisions, there exists no comprehensive federal user data protection law. Instead, state laws like those in Massachusetts and New York have provided a foundation for American data protection legislation, with the California Consumer Privacy Act (CCPA) being perhaps the most comprehensive among them. The CCPA imposes numerous data collection disclosure requirements on companies and data miners. Most notable, however, is that the CCPA grants California consumers the rights to opt out of having their personal information sold to third parties and to request access to, and deletion of, their personal data, rights that can be of great significance to the safety of LGBTQ+ individuals threatened by the exploitation of their data. It’s likely for this reason, among many others, that the CCPA has inspired recent action on Capitol Hill.

On November 5, 2019, California Congresswomen Anna Eshoo and Zoe Lofgren introduced the Online Privacy Act (OPA) in the United States House of Representatives. The bill offers individuals many of the same user data rights as the CCPA, such as the rights to request access to, and deletion of, their data, in addition to the right to revoke consent given for its collection. Where the OPA excels is not only in its recognition of the economic, physical, psychological, and reputational harms that may arise from the collection and disclosure of data, but also in its provisions prohibiting the discriminatory processing of data against members of protected classes, which the OPA explicitly defines to include the actual or perceived sexual orientation and gender identity of an individual or group. However, whether, or when, the bill will be enacted into federal law remains to be seen. Although there seems to be hope on the horizon for tomorrow, there is still much to be done before the concerns of today can be put to rest.

The impacts that data collection has had on members of the LGBTQ+ community are widespread; just as we now have the world’s information at our fingertips, so too do tech companies have ours at theirs. In this brave new world, where our private online activity has the power to out us to the public, we must take a two-pronged approach: in the short term, advocating for the enactment of federal legislation such as the OPA or a bill like it, and in the long term, advocating for judicial recognition of data privacy as a basic human right. While data privacy should be a priority for any tech user, it is of paramount importance to the LGBTQ+ community, and putting it at the forefront of our fight for our civil rights will protect all of us, especially those who prefer to explore discreetly, still looking on the DL.