A Tipping Point for DNA Tests and Surveillance Technologies?
On January 23, CNBC broke the news that 23andMe is laying off 100 of its 700 staff. That may not mean that the consumer genetics fad is “over,” but it does confirm a growing trend of public skepticism. This is happening in a broader context of concerns about privacy and corporate control that may lead to better regulation of these technologies.
In the weeks leading up to the 23andMe news, there was a remarkable spate of media stories involving problems with or concerns about DNA-based technology. The articles cover a range of subjects and raise a number of important issues. Some stories point to flaws in data collection or analysis, some uncover abuse of accurate information, and some reveal how companies test the appropriate limits of personal privacy.
The list presented here is far from complete (just look at these search results from The New York Times, The Washington Post, and Google):
- The dark side of our genealogy craze (Washington Post, December 13)
- We’re entering a new phase in law enforcement’s use of consumer genetic data (Slate, December 19)
- A genetic test led seven women in one family to have major surgery. Then the odds changed. (Wall Street Journal, December 20)
- Pentagon warns military members DNA kits pose ‘personal and operational risks’ (Yahoo, December 23)
- Why are you publicly sharing your child’s DNA information? (New York Times, January 2)
- Hobbyist DNA services may be open to genetic hacking (EurekAlert, January 7)
- 23andMe has sold the rights to develop a drug based on its users’ DNA (New Scientist, January 10)
- You need a good reason to curb privacy. None exists for collecting DNA at the border. (Washington Post, January 11)
For example, several women who relied on the best science of 2016 underwent double mastectomies and removal of their ovaries and fallopian tubes, although the best science of 2019 suggests these invasive surgeries were unnecessary. The US military is interested in using DNA to identify enemies but is also concerned about reciprocity – and explicitly worried about potential inaccuracies in health information. There seems to be increasing unease about the idea that the police could access your DNA ancestry records, even in the service of catching murderers.
This shift may represent a healthy contrast to what used to be quite pernicious hyperbole about the usefulness of DNA testing. Such tests do have significant medical applications, and many people do have an understandable interest in their ancestors. But commercial interests have oversold the promise of these technologies.
None of this should really be news. Helen Wallace published a piece titled “The misleading marketing of genetic tests” in the April 2005 edition of GeneWatch. The ETC Group published a report in March 2008 on “Direct-to-Consumer DNA Testing and the Myth of Personalized Medicine.” Biopolitical Times in 2008 published posts about “making money off of the public’s often-misplaced fascination with DNA,” as well as “The spitterati and trickle-down genomics” and “Genomes of the Rich and Famous.”
We told you so.
The good news is that mainstream media, at last, seems ready to grapple with both the limitations and the threats of these technologies.
The bad news, however, is that these instances of actual and potential abuse of DNA technologies are only a subset of what Professor Shoshana Zuboff has termed “Surveillance Capitalism.” Roger McNamee, a Silicon Valley veteran and the subject of a long profile in The New Yorker (December 2), explains the evolution of the web in “A Brief History of How Your Privacy Was Stolen” (New York Times, June 3):
The early threats to privacy – identity and financial theft – were replaced by a greater threat few people recognized: business models based on surveillance and manipulation.
Technologies that often seem innocuous, even cool (e.g., finding out where your ancestors came from or sharing photos on social media) are combining with each other and morphing into dangerous tools that governments or corporations can use in massive violations of human rights.
Facial recognition is a good example. A company called Clearview has developed facial recognition software of a kind that even Google has declined to build because of the potential for abuse. It works by “scraping” images from public web pages, in apparent defiance of Facebook’s terms of service. According to The New York Times:
The tool could identify activists at a protest or an attractive stranger on the subway, revealing not just their names but where they lived, what they did and whom they knew.
Those business models, in turn, are being adapted by governments. Increasingly, and globally, the use of video surveillance is being combined with facial recognition software to generate “a roll-call of human rights violations.”
And soon, cameras may not be required: Already, China is attempting to control the Uighurs by using DNA samples “to create an image of a person’s face.” Meanwhile, the U.S. government is beginning to collect DNA from immigrants and to store it permanently in the criminal database.
There is a chance of challenging these ominous developments. Nationally, the 2008 Genetic Information Nondiscrimination Act (GINA) does seem to have done some good, not so much in preventing genetic discrimination in hiring, but “as a privacy law” – or at least the start of one. It is helpful that The New York Times has a Privacy Project and the Washington Post, fearing a “privacy doomsday,” has been featuring the issue and calling for Congressional action.
Facial recognition is the current focal point: A politically diverse coalition, including the right-wing Representative Jim Jordan and the leftist Representative Alexandria Ocasio-Cortez, opposes facial recognition technology. Indeed, a number of cities – including San Francisco, Oakland, and Somerville, all with close ties to the tech industry – and three states have banned its use by the police and other agencies. There may be a realistic chance of further legislation.
The AI Now Institute at New York University is keeping tabs on issues of privacy, bias, and predictive policing. Amnesty International issued a report in November on Surveillance Giants. Nature published an article last week on “The battle for ethical AI.” Even Google’s CEO has called for “regulation of artificial intelligence.”
This is the context for the slowdown in the consumer DNA sector, which has been coming for a while. Bioinformatics and computational biology are established fields; “machine learning algorithms” are extremely relevant to genomic analysis and prediction. We can expect concerns about genetic privacy and genetic surveillance to catch up with the belated emergence of concerns about digital privacy and surveillance. People are getting wary.
We may be a long way from reining in the technology behemoths that look to exploit our preferences, and from protecting ourselves from intrusive and authoritarian governments, but informed skepticism about accuracy, surveillance, and manipulation would be a good start. Research, legislation, and public opinion need to work together. Could this recent spate of stories indicate that we are reaching a tipping point?
Update: Since this post was published, conversations around surveillance, privacy, and related issues have continued to develop in the media. We will add to this list as related articles are released.
- Are You in a Gang Database? (New York Times, February 4)
- Facial Recognition Moves Into a New Front: Schools (New York Times, February 6)
- Myriad Genetics Shares Drop as CEO Resigns, Sales Fall Short (Bloomberg, February 6)
- DNA firms are set to profit from your data as testing demand falls (New Scientist, February 7)
- Federal Agencies Use Cellphone Location Data for Immigration Enforcement (Wall Street Journal, February 7)
- He Combs the Web for Russian Bots. That Makes Him a Target. (New York Times, February 9)
- Consumer DNA testing is a bust: Here’s how companies like Ancestry and 23andMe can survive (CNBC, February 9)
- Data of All 6.5 Million Israeli Voters Is Leaked (New York Times, February 10)
- Clearview brings privacy concerns from facial recognition into focus (Axios, February 10)
- California senator proposes tighter regulations on direct-to-consumer genetics testing companies (TechCrunch, February 11)