
> Why would ANYONE in their right mind give their DNA to a company in Silicon Valley that is explicitly using Google as a model for data privacy? (The founder of 23andMe is the ex-wife of a Google founder.)

I don't think a former personal relationship indicates the respective companies operate the same way.

> Even worse, people are actually PAYING quite a lot to get the privilege of having that company playing with your most private data.

Paying for something is a good thing. Historically in the Valley, companies whose services you don't pay money for tend to abuse your data, because the data is their source of revenue. I think companies generally take more care of your data when the contract is explicit: I pay you money for this service, and I expect corresponding control over my privacy. Now, I'm not saying that's the case at every company, just that I think the trend you mention runs in reverse.

> This field needs to be heavily regulated.

I agree. We absolutely should have better laws, and people should be able to delete their data without question (and not have remnants stored). In the EU, keeping those remnants wouldn't fly.

Disclaimer: I worked at 23andMe.




> > > I just want to pay for a service or product and have that be the end of my interaction with the company as well as the end of the company's use, possession, and monetization of my data.

> Do you really compare DNA and phone numbers?

I think you've let the prior context of the discussion influence your interpretation of my point. My comment had two main points. First, that we've never really had a system where transactions did not convey some additional information about us, our preferences, and the market (what's more, "my data" is an interesting and often under-explored topic, IMO: if data wants to be free, and you go spreading it around, do you really have a singular claim to it anymore?). Second, that there are competing interests between the individual and society when we talk about aggregating data that might yield substantial benefits for people in general (as in the human race). We've weighed considerations like this before, such as with vaccination.

I'm not making a case that a private company should be able to keep all info about your genetic makeup and profit from it privately. I am taking a position that we should carefully consider the possible benefits of aggregating genetic info, the possible downsides in loss of privacy and other possible problems, see if there are ways we can increase the benefits and/or decrease the downsides, and try to make an informed decision, with the minimum FUD possible.


>Commercialization of my data without consent or compensation is my fear.

I sort of agree, but it's interesting how many people seem to really care when it's their immutable genetic information going to 23andMe, yet when it's their habitual and social information going to Google or Facebook they seem to care a lot less.


> Have you ever entered your phone number or member number at a business for a discount?

But I can choose not to.

"May I have your email for our records?"

"Thanks for asking, but you may not."

> Personal liberty is important, but it doesn't trump all considerations. If we can actually utilize the information to help research new treatments and further our knowledge of the human body, allowing the collection, and careful protection and anonymization of the data, is for the greater good, and worth the risk IMO.

That's fine, and it brings to mind organ donation, but this should be absolutely crystal clear. If I'm giving them my DNA, I want to know exactly how it can be used, how it will be anonymized, to whom they can sell it, whether I can opt out at some point, etc.

Personal liberty absolutely ought to trump considerations here; one reason I haven't sent a DNA sample to 23andMe, etc., is because I can't control what happens to my data after that point, and I'm not even sure if I can find out.


> The information that has been exposed from this incident includes full names, usernames, profile photos, sex, date of birth, genetic ancestry results, and geographical location.

Is 23andMe going to actually be held responsible?

I think both our industry and our information infrastructure would be vastly better if companies were forced to be serious about security when they are collecting and holding private data.


>Back then, few people had the mindset of, "if they own my data, they own me." But we're starting to see it take hold.

Really? You're being either very generous or very naive here, because even back then it seemed blindingly obvious that it's a bad bloody idea to trust a tech company of nearly any kind to safeguard your data securely or honestly. Then double the paranoia when it comes to your genetic information. For somebody working in the tech space in particular not to have been cynical about this is plainly absurd.


> I know [privacy is] an issue around 23&me

In what regard? I’ve not heard of any privacy related issue with 23andme, and I follow the field quite closely (professional interest). On the contrary, 23andme has been known to uphold customer privacy in the face of government requests [1]. The FTC is investigating 23andme and other companies (which is a good thing), but there is no indication that they’ve found any violation.

[1] https://www.23andme.com/transparency-report/


> In the article there is a quote, “There is no personal data more sensitive than our DNA.“

> This seems a bogus assertion to me. I can imagine many diagnostic health test results that would be more sensitive to leak than DNA (e.g., STD and drug tests).

> DNA sequencing is eventually going to be so cheap and ubiquitous that it will happen to everyone anyway.

> Having published my own results (https://enki.org/2017/10/17/publishing-my-genome/), I really don’t buy into the idea that DNA is the most personal data that can be leaked.

> (Nevertheless, there should be more obvious warnings to customers about how their data will be monetized.)

> In the article there is a quote, “There is no personal data more sensitive than our DNA.“

> This seems a bogus assertion to me. I can imagine many diagnostic health test results that would be more sensitive to leak than DNA (e.g., STD and drug tests).

It's still PHI, and in this case subject to at least the Data Protection Act. The walk-in centers at Gatwick and Heathrow may have tested non-UK citizens. If so, they may run into compliance issues with GDPR or other privacy regs. Not sure how that plays out.

Guess I'll have to wait for the investigation results....

> Having published my own results (https://enki.org/2017/10/17/publishing-my-genome/), I really don’t buy into the idea that DNA is the most personal data that can be leaked.

Given that (at least in the US) DNA is used as evidence to conclusively identify (or exclude) those accused of crimes (rape kits, etc.), I'd say a court of law would consider it quintessentially personal info.


>No it does not, it would be irresponsible to do that on private data.

Doing irresponsible things with private data is the hot business model of the day. I'm not saying it's Google; I'm saying common expectations about "responsibility" are worse than useless.

>We have always maintained that you control your data and we process it according to the agreement(s) we have with you.

Ah the "we surveil you fair and square, get over it" clause.


> In the long run, I think keeping your genetic information private will be untenable- the potential benefits will outweigh the drawbacks.

Can you give an example?

> Plus, anyone sufficiently motivated could get your DNA somehow, you shed your DNA everywhere you go, no getting around that.

That assumes there's someone out to get you specifically. That's like saying there's no point in having 2FA or strong passwords, because the FSB, the FBI and Mossad can get in anyway. Having my DNA because you vacuumed it up off the subway floor is significantly less useful to anyone without it being explicitly tied to me.


> Most people assume there is no way a legit company would do something terribly evil.

Are you kidding? Cynicism is endemic in the U.S. today; likewise for anti-corporatism. You can see it in the comments.

Personally, I'm in the same boat as the OP: I was a very early 23&Me customer and persuaded many family members to also participate, with the knowledge that 23&Me would monetize the data through research. They never claimed to be a non-profit. They were relatively transparent about their opt-in and opt-out policies, though after the FDA debacle their "simplified" website became significantly more complex and opaque. I also participated in a Kaiser Permanente whole genome sequencing research program, donated my blood samples, and then never heard anything ever again. I hope they're doing good science, especially because donating my whole genome is significantly more invasive from a privacy perspective than a few thousand SNPs.

If we sit around and wait for the perfect environment that addresses everybody's concerns, we'll be waiting forever. In the meantime, the less selfish among us are putting our own privacy at risk for the greater good. And the more of us who do that, the more political pressure there will be to sustain and expand existing legal protections prohibiting discrimination.

Not only am I glad 23&Me is selling the data (and this isn't the first time), I hope that over time they'll have enough income that they can start sharing much of their data for free with schools and labs around the world. The latter may never come to pass, but I can't let perfection be the enemy of the good. And I won't let fear and uncertainty turn me into a perpetual privacy victim.

EDIT: I just received an e-mail from KP regarding their research project. It's the first I've noticed, but judging by my e-mail log they seem to have sent a handful over the past several years which I apparently deleted as spam. Here's the project website: https://researchbank.kaiserpermanente.org/


> how will my life be negatively impacted by this?

Your would-be future employers may reject you because of this data. Why hire someone with a higher risk of certain diseases or disabilities? It'd be illegal, but companies don't care about breaking the law if it's profitable, and it'd basically take a whistleblower for anyone to know it happened. They certainly won't tell you that's why you weren't hired.

You could be denied housing or be targeted by extremists. More likely though, you'll be targeted by pharmaceutical companies. If the police didn't already have a copy of your DNA on file you might now have a place in every police line up, in any state in the US, for every crime committed where DNA evidence is collected. You could get wrongly flagged as a match through human error or statistics but either way it'll be on you to hire the lawyer who will have to prove your innocence.

We're moving toward a digital caste system (several really) where the data governments and corporations have on you will determine what you're allowed to do, how much you'll pay for things, and what opportunities you'll have. Every scrap of data you surrender will be used against you by anyone willing to pay for it, used in whatever way they think will benefit them, at any time, and you'll probably never even realize what happened. Just like right now, where companies don't tell you that they used your personal data to determine how long to leave you on hold. There's no telling what kinds of harms this could bring you, and there's no taking your data back to prevent any of it either.

I hope that data never comes back to haunt you. I'd sure hate to need to count on that never happening though.
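The police-lineup worry above has a concrete statistical shape: searching a large database with a crime-scene sample turns even a tiny random-match probability into an expected handful of coincidental hits. A back-of-the-envelope sketch (the match probability and database size are illustrative assumptions, not real figures):

```python
# Base-rate sketch: why database-wide DNA trawls produce coincidental hits.
# Both numbers below are illustrative assumptions for the arithmetic.
random_match_prob = 1e-6      # chance an unrelated profile matches by luck
database_size = 10_000_000    # profiles compared against one crime-scene sample

expected_false_hits = random_match_prob * database_size
print(expected_false_hits)    # expect about 10 coincidental matches per search
```

Each of those expected hits is an innocent person who, as the comment says, would bear the burden of proving it was a statistical fluke.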


> The data they collected about you is their data.

That's my point. I don't think it should be. They can have access to my data to conduct our business transaction, but I disagree with the idea that my information somehow belongs to them just because they had access to it at some point.


> i think maybe that this is not a technical problem, but more an ethical one. under the open data approach, if you want to study humans you probably would need to get express informed consent that indicates that their data will be public and that it could be linked back to them.

As someone who wants science to advance, I want highly trusted researchers to be able to do studies that involve my private, personal data, that I would not consent to being public and linked back to me.

It is highly important to me that we allow these studies to not use open data.

A great example of this is the US college scorecard, which uses very private tax returns to measure how much college degrees and majors contribute to income (not the only value of college education, but certainly an important one):

https://collegescorecard.ed.gov/

Only high degrees of trust allowed this data to be published on extremely private information, and I think that makes for a better world. I am pro-open data, but research on non-open data should absolutely exist.

For instance, should any research about mental health for transgender people be abolished? Because anything on that subject is not going to be open, or at the least those who would be open to their data being public are a probably non-representative subset.


> You can not completely anonymize data with any reliability.

Well... there's actually a field for that. I forget what it's called because of how niche it is, but my friend at Google works on exactly this.

He said there are mathematical theorems that prove data is sufficiently anonymized.

He gave the example of the Netflix Prize competition, where researchers were able to deanonymize the dataset Netflix released to them. His job is to prevent that kind of thing at Google.

I can see why, if you're trying to sell user data while maintaining privacy.
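For what it's worth, the niche field being gestured at here sounds like differential privacy, where calibrated noise is added to query results so that any one individual's presence in the dataset is provably hard to detect. A minimal sketch of the core idea (the epsilon value, dataset, and query are illustrative assumptions, not anything Google or 23andMe actually uses):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_count(records, predicate, epsilon: float) -> float:
    """Differentially private counting query. A count has sensitivity 1
    (adding or removing one person changes it by at most 1), so Laplace
    noise with scale 1/epsilon gives epsilon-differential privacy."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: report roughly how many people carry some variant
# without revealing whether any particular individual is in the data.
people = [{"id": i, "carrier": i % 7 == 0} for i in range(1000)]
noisy = private_count(people, lambda p: p["carrier"], epsilon=0.5)
print(noisy)  # close to the true count of 143, but never exact
```

The theorems the commenter mentions are of this flavor: they bound how much any single record can shift the output distribution, rather than promising the data is "scrubbed" of identifiers.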


> They effectively sell user information by way of allowing their customers to target individuals for a price.

That's not selling user information, in the same way that Ford selling cars isn't selling automobile factories.

> They aren't directly selling the data itself

Nor are they indirectly selling the data. No one else gets the data.

> they are selling a service that is an extension of that data.

They are using the data to produce a service that they are selling, in the same way that Ford uses a factory to make a car. No one else gets the data, in the same way that no one else gets the factory from Ford.

> I think the semantics are negligible.

I think there is a substantive difference between transferring data to third parties (over whom neither the person the data describes nor Google exercises control) and retaining the data internally and using it to provide a service to third parties. And I think that is a critical difference when it comes to privacy.

I mean, you're erasing the distinction between an armed security guard (who sells a service that involves a firearm) and an arms dealer (who sells firearms).


>That sounds very nice, do tell us if you have come up with or know of a (successful) business model where you do not have to collect the data of users. Nothing comes for free.

Really? I'm not so sure about that.

If the FBI wants my data, they'll need to come to my premises with a warrant.

Otherwise, they're SOL. And how much does that cost me over and above what I spend for the infrastructure I require? Zero.

How many companies that mine my personal information (for whatever reasons) do I use? Zero.

You are substituting your trained-in prejudices for the laws of nature, IMHO.


>If the company did not disclose this in massive font to potential customers ahead of time, then I do think this company should very quickly go out of business.

Disclose what? The point of most of these services is that you are adding yourself to a searchable database where you can find relations. The FBI seems to be using the service for the explicit reason it exists.

I'm very uncomfortable with the thought that one of these services could hand over specific DNA information to groups that could use it to profile you (employers, insurers, etc.), but this use case really shouldn't come as a surprise. It seems like complaining that the FBI found you by searching for your name on Facebook.


> And even if you disagree on that, I think it should be their right to sell their privacy for convenience if they want to.

Then it also has to be a right to not sell your privacy for convenience.

Clearly communicate that you sell my data to bolster your revenue, and give me an option to pay more instead.

It's not a legitimate business model to simply not tell me that you're selling my data and then magically have lower prices.


> I wish the privacy cost was stated in as clear terms at transaction time as the monetary cost

Privacy Center link directly from the home page: https://www.23andme.com/privacy/

Privacy Policy Highlights https://www.23andme.com/about/privacy/

Full Policy https://www.23andme.com/about/privacy/#Full

This information isn't hard to find--you just have to not assume it doesn't exist, I guess.

* Disclosure: former 23andMe employee.

