Maybe... But if this is a real problem with Omegle (the article mentions that it is a common grooming platform) then it's not like there's nothing they could do. Did it even have age verification?
On balance I still think Omegle should win the case, but I don't think it's entirely without merit.
And so now they’ll have to choose one of the other, even seedier bars. The kids aren’t any better off.
You’ve made a poor analogy, though, since obviously it’s very trivial for a bar to validate age at the door. If such a thing were easy for Omegle to do, I’m sure they would.
Omegle cannot perfectly protect their users from other users, and cannot determine who is a minor without ID verification (as we have seen in Virginia and Mississippi for porn sites).
They clearly want the entire product category illegal. Omegle might be gone, but many similar services remain.
> But even so, they very early on faced the same kind of criticism as Omegle, shifted to registration-required and adults-only very early on
Over the years, the handful of times I’ve gone on CR, I’ve never seen any form of required registration or age verification beyond maybe an “I’m 18+” checkbox or something.
Until October 2022 the terms of service stated that users 13+ could use the site with a parent's permission; between Sept 30th [1] and Oct 6th [2] the terms changed to 18+ (a couple of months after the A.M. lawsuit was filed, or at least after the 2nd amended complaint was).
"In or about 2014" (the lawsuits wording) A.M. was paired with the abuser the terms stated " Do not use Omegle if you are under 13. If you are under 18, use it only with a parent/guardian's permission." [3]
I'm not saying they had a legal obligation, but personally I think if you allow minors on a site, especially where you know people get their junk out to flash other users, you probably should segregate those <18 yo and those >=18 yo. How do you do that effectively? I dunno, but age gating (even minimal age gating) makes it easier to argue that people are willfully misrepresenting themselves to your service and that you can't be expected to police EVERY user on the site, especially when they lie to you about their age. (It also would have helped against the claims that Omegle was serving ads for adult sites to minors.)
EDIT: However, taking from the lawsuit
> 39. In or about 2014, the Omegle Predator logged onto Omegle and was paired via text chat with A.M., an 11-year-old girl living with her family in Michigan. This was A.M.’s first time using Omegle alone. Other times, she and her friends had used it to have age-appropriate video chats at sleepovers.
> 40. On the Omegle platform, the Omegle Predator asked A.M. her age to which she responded, “Eleven.” The Omegle Predator continued the conversation and convinced A.M. that it was okay for them to keep communicating.
> 41. By the end of this 15-minute chat, A.M. found herself believing the Omegle Predator and trusting that he would help her “feel better”—something he had promised her.
> 42. The Omegle Predator asked A.M. for her contact information so they could stay in touch after the video chat ended
> 43. That same night, the Omegle Predator strategically gained A.M.’s trust and induced A.M. to send him photos of herself. First of her smile, and eventually, of her breasts, vagina, and other parts of her body. The Omegle Predator convinced A.M. that it was integral to her “healing” to trust him even if she felt uncomfortable
So I'm not 100% sure that anything bad even happened between A.M. and the predator on Omegle itself (though I'm still skimming through the complaint); it seems it happened off-site afterwards. Not sure how you can police users' interactions when they take conversations off-site.
EDIT 2: Also, A.M. stated in the initial chat that she was 11, so she wasn't allowed to use the site per Omegle's terms. But as in other cases, if sites "know" they have users below the age of 13 and are collecting personal information about those users, they are running afoul of COPPA (just to name one child protection law).
EDIT 3: Would have been an interesting "Product Liability" case if it had gone to trial. The plaintiff's argument seems to be "because Omegle knew it had issues in the past with predators using the site, they should have and could have done more to protect other users from such predators, and so the product itself is faulty". The defense would probably have said something along the lines of "no bad actions between A.M. and her abuser happened on the site during their initial chat; they then took their chat off-site, and Omegle can't be expected to police users off-site", among other defenses. Personally I believe the outcome would have been a crapshoot, because in civil cases the standard isn't "beyond a reasonable doubt" but "preponderance of the evidence" (more likely to be true than false), and jurors don't like it when bad things happen to children.
I didn’t claim that it’s impossible to imagine ways that this could be done. I was pointing out that no such methods are actually in common use, which means that when people think about validating their age online, they are concerned about their privacy.
Plus, as a user, I refuse to use it even if the consequence is that I have to skip the service in question. I may be in the minority compared to people who like being parented, but I don't see the incentive for a service to use this tech.
There are enough problems to be solved and age verification doesn't solve any at all.
Heh. Reminds me of a public debate a few years ago about forcing porn platforms to securely verify the age of their users. I think a government member said something like "We could use FranceConnect (the government SSO service) for authentication in these cases".
How are you verifying age? The "this is impossible"s are about getting an accurate age in the first place, not about disabling chat once you have one. The point being, the law is useless theatre that does nothing to prevent kids from using it but annoys everyone else anyway.
And from Archive.org, here's how Omegle's homepage looked when the girl who is now suing them joined the site. "Tiny text" isn't an exaggeration. The call to action to start a text chat is a 200x50 button -- the 'don't use if you're under 13' text is 0.75em font:
It should be obvious to everyone, not just 'some lawyer' that the former is more of a barrier than the latter. There's also the concept of overt acts in many statutes - lying to a website by clicking a button that says "I'm over 18" when you're not demonstrates that you read the disclaimer and disregarded it, where you can plausibly claim you never saw the copy when it's just legalese on the bottom of the page.
They're not saying age verification is wrong, but that having dozens of skeezy sites, which should have no business handling personally identifiable information, collect that information is a bad idea.
Pornhub advocates for devices doing the ID authentication (e.g. Apple already has Face ID and ID-reading capabilities as part of its digital ID initiative) and then attesting that the user is of age. This could remain entirely anonymous and would be more secure than a kid inputting his dad's driver's license number stolen from his dad's wallet.
> You seem to believe that the burden to verify age is too onerous for online dating apps. I think that we should require age verification for dating services across the board, and it should be up to the online apps to compete with offline dating services on equal footing.
It is a burden for the apps themselves, but it's not something they can't overcome. The real issue is end-users wanting some degree of privacy and not wanting to submit their real identity to who knows what on the other end. For example, on a gay dating app, an app for people into BDSM or whatever the thing might be that they're into, or for someone already in a relationship who wants to see what's out there, a significant percentage of those people will never submit their ID and won't use the app. Even if it's just some vanilla dating app, how do you know the operators aren't just in it to easily skim tons of IDs for some vast identity fraud scheme? I certainly wouldn't submit my ID to such services. It also doesn't solve the original issue, since teens will find a way around the ID check: use someone else's account, use Photoshop to alter an ID, get a fake ID, etc.
> Condoms aren't 100% effective. Why should laws be 100% effective?
Yes, nothing is 100% effective. Yet proponents of these laws try to push things to 100% without stopping to check how we're doing so far. The article says 60 cases of child sex offenses since 2015. So 15 / year, about 1 a month, out of how many hundreds of millions/billions of people using these apps? Also, the article counts 16/17 year olds as children, despite them being above the age of consent in the UK and in many states in the US, so who knows how many of those 60 are actually under 16 and not just 16 or 17. Sounds like we have this issue 99% solved with current enforcement methods. Do we really need to institute onerous burdens on apps and end-users for the sake of that last 1%? We can eliminate automobile accident deaths by instituting a nationwide speed limit of 10 MPH on all roads at all times but we don't do that because of the obvious cost/benefit concerns.
If you're concerned about your child going on these apps, give them a feature phone until they're of age. That's what I'd do.
Note that they do not object to age verification. They object to the specific way North Carolina is requiring it to be done:
> Aylo has publicly supported age verification of users for years, but we believe that any law to this effect must preserve user safety and privacy, and must effectively protect children from accessing content intended for adults.
> Unfortunately, the way many jurisdictions worldwide have chosen to implement age verification is ineffective, haphazard, and dangerous. Any regulations that require hundreds of thousands of adult sites to collect significant amounts of highly sensitive personal information is putting user safety in jeopardy. Moreover, as experience has demonstrated, unless properly enforced, users will simply access non-compliant sites or find other methods of evading these laws
Using modern cryptographic techniques (such as blind signatures or zero-knowledge proofs) it is possible to design a system whereby you can prove your age to porn site P without P receiving any information they did not already have other than that you are older than their age threshold. In particular this would even work for anonymous users.
There would be another site V involved in the verification. You would have to give V your real identity and show them your proof-of-age documents, but V would not get any information about which site you are trying to get verified for.
If V were a site that already has your real identity then using V for age verification would not be giving them anything that they didn't already have.
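The P/V flow described above can be sketched with a textbook RSA blind signature. This is a toy illustration only: the primes, the token string, and the variable names are made up for the example, and a real deployment would use 2048+ bit keys and a vetted scheme (e.g. the RSA blind signatures standardized in RFC 9474) or a zero-knowledge proof instead.

```python
import hashlib
import secrets
from math import gcd

# Toy RSA key for the verifier V. These tiny primes are illustrative
# only and offer no real security.
p, q = 104729, 1299709
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

def digest(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# 1. The user blinds an "over 18" token before sending it to V,
#    so V cannot link the signed token back to this session.
token = b"over-18-attestation"
m = digest(token)
while True:
    r = secrets.randbelow(n - 2) + 2
    if gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n

# 2. V checks the user's ID documents, then signs the blinded value.
#    V learns the user's real identity but not the token or the site P.
blind_sig = pow(blinded, d, n)

# 3. The user unblinds: (m^d * r) * r^-1 = m^d mod n.
sig = (blind_sig * pow(r, -1, n)) % n

# 4. Site P verifies V's signature on the age token without learning
#    who the user is -- P only learns "this person is over 18".
assert pow(sig, e, n) == m
print("age attestation verified")
```

The unlinkability comes from the random blinding factor r: V signs `blinded`, which looks uniformly random, yet the unblinded `sig` verifies against the plain token at P.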
It might be possible for someone who obtains records of both P and V to get an idea of the real identities of porn site account owners by trying to match up the timing. This risk can be greatly reduced by having just one or two V sites, so that they are high traffic, and by having some random delays in the verification protocol.
That way someone trying to figure out if I was using say Pornhub might find out from V that I was doing the V side of a verification at say 2024-06-01 01:44:21, and they might be able to find out from Pornhub if they had any verifications using V that started within a few minutes before that and completed within a few minutes after that.
But with only one or two V sites, there will be way more verifications that happened at V at times compatible with those Pornhub verifications. They would not be able to tell if mine at 2024-06-01 01:44:21 is one of those Pornhub ones or one of the many more going on around that time for other sites.
It is a little counterintuitive, but the more sites that require age verification the better the privacy protection, and the fewer the number of V sites, the better the privacy protection.
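A back-of-the-envelope simulation shows why concentrating traffic on one shared V yields a large anonymity set for any single verification. All the traffic numbers here (site count, per-site rate, attacker window) are assumptions for illustration, not data from any real system:

```python
import random

# Hypothetical traffic model: one shared verifier V serving many sites.
# An attacker who sees a V-side verification at time t asks how many
# OTHER V-side verifications fall inside the same timing window.
random.seed(0)
DAY = 86_400                 # seconds in a day
sites = 500                  # sites all using the same V (assumed)
per_site = 1_000             # verifications per site per day (assumed)
window = 300                 # attacker correlates within +/- 5 minutes

events = [random.uniform(0, DAY) for _ in range(sites * per_site)]
t = DAY / 2                  # the verification under attack
anonymity_set = sum(1 for x in events if abs(x - t) <= window)
print(f"verifications indistinguishable by timing: {anonymity_set}")
```

With these assumed numbers, thousands of unrelated verifications land in the attacker's window, so timing alone cannot single out which site any one of them was for; splitting the same traffic across many small V sites would shrink that set dramatically.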
That suggests that if we are going to require some sites to do age verification, to do it in the most privacy preserving way (1) it should be done nationally rather than as a patchwork of state verification laws, and (2) V should be a government site.
How is age verification supposed to work? I don't suppose users of the site are going to provide legal documents just to use it. It's tantamount to shutting it down.
I ask because there was a similar moral outrage around age verification for access to porn sites that I recall being a big issue a while ago. I don't recall exactly how it played out in court, but it appeared to amount to nothing, which I can't help but feel was because mechanisms to verify someone's age online are either trivial to circumvent or present such a high barrier to entry that no reasonable user would surmount it.