
Yes, they should. The moment they prohibit minors on their site, they have to implement measures to enforce that prohibition.

Omegle became a safe haven for pedophiles and sex predators, and they are responsible for enabling them and not protecting their users.

There are other chat and video-chat sites that not only enforce their rules but also protect their users and ban those who break them.

No, don't expect that from car manufacturers; they make cars, not rules. Omegle, by contrast, made 'the car' and also made the rule barring minors from the site in order to dodge its legal responsibilities. They didn't enforce that rule, and minors were endangered as a result.




Another case of not making the perfect the enemy of the good. Some percentage of children who see a disclaimer saying "Do not use if you're under 18; click here to confirm you're 18+" will decide not to lie and won't log in. So, as a baseline, sites that are dangerous for kids should do that. They should also do a bunch of other things, and it certainly should mitigate Omegle's liability that they were doing a bunch of other things, but they apparently skipped a few easy ones, and that may cost them.

The legal situation is more complicated than blaming the parents. To extend your analogy: If someone had a business that rented cars and somehow 11 year olds were renting the cars and driving them, the rental car company couldn’t shrug it off and blame the parents.

That's why this is complicated: if a business knows criminal or dangerous activity is taking place on its platform, it has some obligation to make a good-faith effort to address the situation. The expectation isn't perfect enforcement, because it's not reasonable to shut every large business down as soon as one incident occurs, but if a platform becomes known as a haven for certain types of behavior, its liability keeps going up. Given how many people in this thread are joking about how Omegle was known as a free-for-all platform for people exposing themselves and as a platform for bored kids, it's not surprising that the lawsuits are coming. Also, given their limited monetization options, it's not surprising that they chose to close rather than deal with legal battles.


Why? Anything on Omegle could happen in a park. Or a library. Omegle can't keep kids off their platform: their parents need to.

I think parents should take some responsibility here.

The park analogy that the creator of Omegle made is interesting: we don't shut down parks because a crime may take place there.

And why not just remove video chat completely and keep the text chat? With heavier moderation and better tooling, it should have been possible to keep the problems at bay.

We might try to out-reason the following argument, but I do believe it demands reflection: Omegle was about the only popular mass medium that was fully distributed and allowed for random encounters with people.

I am reminded of Aaron Swartz. And I have a deep suspicion that there is a lot more to the story of Omegle's shutdown being part of the lawsuit settlement. The creator probably also had to sign an NDA or something.


This may be an unpopular view, but predators will always find tools to meet children online. Facebook, Snapchat, TikTok, Roblox, Minecraft, and hundreds of other sites exist. It's the equivalent of taking out a cartel boss: there are plenty of other cartel members who will fill the void.

From what I understand, Omegle made significant efforts in terms of moderation. Parents and guardians need to have ultimate accountability for educating and moderating their children’s internet consumption.


Focusing on nothing but parental neglect doesn't do much for the victims, though.

Are we to look at all the kids that get groomed and manipulated by predators on a platform like Omegle and say "lol that sucks, wish you had better parents tho" or can we also elevate our expectations of platforms that connect kids to adults?

For a platform that connects kids to rando adults, I would expect some sort of filter. Even a $1 join fee would have been better than what Omegle had (nothing).


The worst thing that could happen on Omegle is that a child shows herself naked. This, in fact, is not the end of the world, and certainly doesn't justify some bullshit where we have to ask for permission before making a website or internet service, plus photo ID for this and that party, or whatever the hell you consider part of your solution.

In that article she was 11 when using Omegle. She was using the internet without appropriate supervision. That's almost like letting a kid drive a car, leading to an accident, and then trying to ban cars because they are dangerous to kids.

The internet is a great place, but it's an adult place. You can find absolutely horrible (adult) things on Wikipedia that a kid should not be learning about at the age of 11, and I don't think we want to close Wikipedia.

Holding internet platforms responsible is needed, but it's not an excuse for parental neglect.


Well, parents should be responsible for looking after their children, but laws exist partly to protect the vulnerable. Not everyone has parents who are up to the job, for one reason or another; that doesn't mean the law doesn't have a role in protecting them. In fact, I would argue that the law has a greater role in that case. Lots of products have mandatory child-safe protections built in (e.g. bottles of bleach are required to have a child-safe cap), so we as a society have decided that some protections, over and above deferring to parents, can be appropriate for products that can cause harm.

I don't know Omegle, so I don't know what the balance should be here, but lots of tech products are built with a "move fast, figure out the complicated bits later" mindset, which has its merits but doesn't fit well with these sorts of nuances.


Walmart doesn't invite people of all ages to hang out in a private room together, with no supervision, no rules, no limits.

Parents tend to assume that "the internet" is regulated, somehow, whether by laws or market pressures. The thinking goes something like "Instagram is safe, right, because how could it not be? It's used by so many people, and if it could harm our kids, how would it be allowed to exist?" - right or wrong, people expect platforms to be held to some standard, and, right or wrong, put trust in the platforms to meet their expectations of safety.

The thing about Omegle was that it very much was the private-room scenario I described above. I left out the part that made the room "safe": the eject button. But persuasive people can persuade other people, especially children, to avoid that eject button, and while that only happened to some of the 74 million people using the site, it happened to people. And for those it happened to, those encounters wouldn't have happened without Omegle's help.

If you don't believe that, consider all those commenting here about how unique and special Omegle was for people who were good to one another. There are, thankfully, a lot of those comments.

But both things can be true, and were true when Omegle was operating. With 74 million people using it, the smallest of fractions of a percent still represent more than zero people experiencing harm that Omegle enabled.

The parents blame the platforms because the platforms enabled the harm.


Realistically, kids are on the Internet.

I don't know when you were born, but my relationship with the internet started around the time I was 7 or 8. My school had computers with internet access, and there were two computers at home. My parents could have limited my internet use, but they couldn't have stopped me. There is no guard standing by every computer stopping me from being online if I am under 18 years of age.

I still don't think Omegle is at fault, but we have to assume kids are on the Internet.


I expect at least some kids to be scared off by this.

The BBC article above states that Omegle has been mentioned in 50 pedophilia cases over the last two years. If 20% of kids were scared off by having to click "I'm older than 13", that would be roughly 10 fewer cases (20% of 50 = 10).


Quite frankly: that line is the parents' to draw and enforce, within reason. The liability should not be on the dating sites to prevent it from happening, nor should they be forgiven for ignoring it when it does happen.

I'm in the unenviable situation of having had one of my big projects shut down for precisely this reason: it's too expensive to moderate sites that kids may use. The reason for that is actually very simple, proved by data, and verboten to say.


Somewhere along the way I feel it became normal to just let your children do whatever they want online with no supervision and no parental controls.

And at the same time, I do think the OS vendors (Windows, macOS, and the rest) don't offer good enough parental controls.

Age verification is a problem as well, but it's foolish to think every website and app will implement proper safeguards. I mean, Omegle could simply be replaced by some darker Russian clone with even less effort put towards fighting crime.

Instead there should be opt-in. When a child user is logged in to Windows et al., an allow list should always be in place, and only apps and websites that claim to be child safe should be on it.
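
A minimal sketch of that opt-in model (all names here are hypothetical; this is not a real Windows or macOS API), assuming child accounts get a default-deny allow list of self-declared child-safe destinations:

    # Hypothetical sketch of a default-deny allow list for child accounts.
    # Domains here are made up; a real registry would be vendor-maintained.
    CHILD_SAFE_ALLOWLIST = {"kids.example.org", "school-portal.example.edu"}

    def may_visit(account_type: str, domain: str) -> bool:
        if account_type != "child":
            return True  # adult accounts are unrestricted
        # Child accounts: deny by default, allow only opted-in sites.
        return domain in CHILD_SAFE_ALLOWLIST

    print(may_visit("child", "omegle.com"))  # False: not on the allow list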

And parents must make sure to only let their kids use child accounts.

The idea that some KYC would be forced on all online websites and apps just doesn't make sense otherwise.

And then it would be fair to sue websites that claim to be child safe and have opted in, if they turn out not to be.


Social media companies say you have to be over 13 to use the service, and then they fail to enforce their own rules a lot of the time. It's a disclaimer to avoid legal action by shifting responsibility to the user more than a real feature to stop children accessing their applications.

Laws like the one proposed in the article are a way to push social media companies to start enforcing those rules better.


Your claim that parents have clicked yes to authorize all their kids' accounts and activities is a total joke.

When you say "parental rights" - the issue is that parents DON'T have admin rights to their kids social media stuff.

Most of these companies have some kind of checkbox that the 13-year-old checks themselves, confirming that if they are under 13, their parents have approved. The idea that this is "parental rights" has to be a sick, sick joke.

NONE of these companies send anyone anywhere to actually check that the parents OK'ed going on Omegle, etc.

Tweak the law so that the companies have to actually verify parental approval, i.e., something in person with the parent and the kid. Parents would then be the admins on the account and could both see all activity and control minutes per week of activity on the platform.


>“Omegle for 12 years old” prompted Bing to suggest searching for “Kids On Omegle Showing”,

Results titled "kids on omegle showing" suggest that kids were being prompted by predators to produce child pornography on social networks. There has got to be some rule about letting kids access these social networking platforms. Who could possibly think it's a good idea to let a child post their photos, videos, and profile information online and leave all that open to the public, for any predator who wants to reach out to them? And what's worse, these kids are probably using these things unsupervised.

I wonder how a search company could hope to really effectively combat this content considering it's probably constantly being produced and circulated on a daily basis. Although one should expect them to keep track of and closely monitor keyword phrases routinely associated with child porn.


I don't disagree, but I also don't think it's unreasonable that a site like that, which also hosts children's forums, is going to have parents wanting regulation.

Pornhub doesn't have a section of its site for young teenagers or Minecraft content. Neopets doesn't have a porn or gore section. Even 4chan separates its SFW and NSFW boards onto different sites (though obviously even its SFW boards are not child friendly and don't pretend to be).

You can't do anything at the DNS level for reddit (short of blocking it all) because it's all one site. And it uses TLS, so you'd need to MITM to do partial filtering, which is beyond the capability of most people. I assume Instagram and tiktok are similar/even harder to filter.
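
To make the DNS point concrete, here is a minimal sketch (hypothetical rule set, not any real filtering product) of why resolver-level blocking is all-or-nothing: the resolver only ever sees the hostname, while the URL path stays inside the TLS tunnel.

    # A resolver-level filter only sees hostnames, never URL paths.
    BLOCKED_DOMAINS = {"reddit.com"}  # hypothetical blocklist

    def dns_filter(hostname: str) -> bool:
        """Return True if the DNS lookup should be refused."""
        parts = hostname.lower().rstrip(".").split(".")
        # Match the domain itself and any subdomain (www., old., ...).
        return any(".".join(parts[i:]) in BLOCKED_DOMAINS
                   for i in range(len(parts)))

    assert dns_filter("www.reddit.com")  # the whole site is blocked
    # There is no way to express "block only some subreddits" here:
    # with TLS the path is encrypted, so partial filtering needs a
    # MITM proxy, which is beyond the capability of most people.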

Porn sites at least used to ask for credit cards. Now they just ask a yes/no "are you 18?", and they have children's sections. They should really be working to clean up their act in a privacy- and autonomy-friendly way (e.g. through labeling and partnering with browsers) before they're forced into these kinds of laws. Or stop targeting children as a market and ban anyone who hints that they are under 18.


The law itself disagrees. It's quite explicit that the point is to not allow kids into chat rooms at all without parental consent. It says nothing about preventing kids from accidentally joining a chat, just that they are not the ones able to provide the consent that allows chat.
