
I personally don't know whether ChatGPT can expose underage kids to unsuitable information (which was the regulator's reasoning), and in general I'm not a huge fan of putting rules in place, because it's the internet: you can always work around blocks. But at least Google has SafeSearch, which hides some content until the kid becomes too smart to find it anyway.



How do you make sure that children don't get inappropriate content? I know ChatGPT is pretty good at filtering already, but to me it seems like a high-risk undertaking: a single lapse can sink your ship.

Kids often get around those limitations pretty creatively: http://lanayarosh.com/2014/06/whitelist-chat-as-a-strategy-t...

Books are under control: children cannot walk into a library and get adult books. The internet in general can have filters, and many parents set up personal firewalls for this purpose. Even Google has SafeSearch, which filters explicit adult or violent content out of results. ChatGPT right now does not have these "parental controls". Yes, you can bypass most of them, but you need to know the technical details; they cannot be bypassed just by saying "disable the filters".

Books are definitely not under control, and neither is internet search.

Parents can try to block ChatGPT with a firewall if they wish. It's no less likely to work than blocking other internet sites.

Edit: Also, LLMs do have mandatory parental controls. They work about as well as book censorship, safe search or internet blacklists (very, very poorly).


What content you do not want your child to interact with is a personal decision, and of course it varies from parent to parent. I cannot imagine any form of administrative oversight of this issue that would satisfy everyone reasonably, nor one that would not have chilling effects on free speech.

When it comes to content that is specifically illegal, that of course should fall on tech companies to moderate. But that is the exception, in my view.

In short, your child's oversight is not one-size-fits-all - it is strictly your business, and perhaps your school's and childcare professionals'.


I feel like there are three places that you can conceivably do age restrictions, each with their respective pros and cons.

On the client: The typical parental controls situation with a blocklist. Responsible adult sites could send an HTTP header like X-Adult-Content or something to ensure they would be blocked by clients with the controls on. This could be enforced by regulation, with devices for children required to respect that header in addition to shipping with a blocklist. This couldn't be bypassed with something like a VPN (which you probably couldn't install with parental controls on anyway), because it would be baked into the client.
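To make the client-side idea concrete, here is a rough sketch of what the check could look like (the X-Adult-Content header is hypothetical, as above, and a real implementation would live in the OS or browser rather than in application code like this):

    import urllib.request

    def fetch_if_allowed(url, child_mode=True):
        # HEAD request first, to look for the hypothetical flag header.
        head = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(head) as resp:
            if child_mode and resp.headers.get("X-Adult-Content"):
                raise PermissionError("blocked by parental controls")
        # No flag header (or controls off): fetch the page normally.
        with urllib.request.urlopen(url) as resp:
            return resp.read()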

In the network: Similar blocklist-style situation; this already exists to a certain extent in some countries. Mobile phone providers and ISPs would be required to block adult content (via DNS or similar) by default, with a toggle to switch it off available to the adult account owner. We already block illegal content this way.
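As a toy illustration of the resolver-level version (the domain names are invented, and real ISP filtering is more involved, but the idea is the same):

    import socket

    BLOCKLIST = {"adult-example.test", "another-example.test"}  # made-up names

    def filtered_resolve(hostname):
        labels = hostname.lower().rstrip(".").split(".")
        # Block a listed domain and any subdomain of it.
        if any(".".join(labels[i:]) in BLOCKLIST for i in range(len(labels))):
            return "0.0.0.0"  # sinkhole address, as many filtering resolvers return
        return socket.gethostbyname(hostname)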

On the server side: This requires adult websites to be cooperative, which they will only be if they care about the jurisdiction in question. This potentially pushes kids/others to access sketchier sites which wouldn't be blocked.

I am strongly for pushing this down the stack to the client. I don't have kids, but I'm pretty sure iOS and Android, as well as macOS and Windows, all have robust parental control systems. I'm sure people use them, and we can safely encourage people to use them more. If they aren't on by default already for children, they could be.

Short of biometric verification with presence detection I don't see how it's possible to do remote age verification in a good enough way that you couldn't just use an adult's credit card or device to bypass it.

Anyone who knows anything about tech knows this is just a wedge issue, and people aren't interested in solutions to the actual problem of kids accessing inappropriate material, only soundbites.


The law itself disagrees. It's quite explicit that the point is to not allow kids into chat rooms at all without parental consent. It does not say anything about preventing kids from accidentally joining chat, just that they are not the ones able to provide consent to allow it.

It's pretty hard to block these sites because you have to know the domain name/information for each one. Assuming large-scale voluntary compliance (and I think the chances are good that all the "big" producers will move over), it's much easier to just filter the .xxx domain and be done with it.
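The appeal is that a TLD check is a one-liner, versus maintaining a per-site blocklist forever. Roughly (hostnames invented):

    def is_blocked(hostname, extra_blocklist=frozenset()):
        # TLD filter, plus an optional per-site blocklist as a fallback.
        host = hostname.lower().rstrip(".")
        return host.endswith(".xxx") or host in extra_blocklist

    # is_blocked("video.example.xxx")                    -> True
    # is_blocked("sketchy.example", {"sketchy.example"}) -> True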

Usually the idea is to keep it away from children because children will willfully look at it without understanding. A page that says "Absolutely DO NOT look at this if you're under 18" has no relevance to an under-18-year-old.

It's the same reason we put all other restraints on teenagers; if they were capable of detecting trouble and reliably refusing to participate therein, then they wouldn't need all of the things that are in place to protect them from themselves, foremost among these being parents.


It seems they're worried about peer-to-peer interactions.

My experience regulating my daughter's internet access is that interactions with other kids have the highest potential for toxicity.

They seem ok with kids passively consuming a stream of adult generated content.

They’re also excluding SaaS which just emulates desktop software.

I think that’s about right.


Here's a better compromise, if we must find one, and it's one Google won't be so in favor of:

"No parent, or guardian, shall permit their children under the age of 13, to possess or easily have access to any device, which provides unfiltered and unrestricted access to the internet in such a way that said parent, or guardian, is not fully aware of each and every website that the child may access with said device."

[As for anyone asking "what about the vulnerable?": there's this thing called a library, and don't tell me public schools don't have resources.]


What about choosing not to encrypt chats involving children? Good compromise?

The alternative is to take advantage of tools already on the market, developed privately, to help people lock down and filter content from the internet. Content and privacy filters have been around forever; use those instead of the government heavy-handedly imposing something.

My opinion (and strictly an opinion) is that if diverse types of information are more readily available, you can either lock down your child's access to that information OR prepare them for what they might encounter. Separately, it's also good to guide them on access to and responsible use of the world's knowledge (good, bad, moral, immoral), which is literally at their fingertips.


I'd argue that kids need the tools and techniques to understand how to deal with dodgy online content, rather than locking it away. Prohibition tends to drive things underground; I'd rather have (and am having - I have 2 teenage kids!) realistic and honest conversations about porn and other content. I don't lock my router to stop adult site access, instead we're having a conversation about it.

I'm under no illusions, btw. I know they'll access it. I know I did as a teen (albeit magazines rather than the web) but I think a dialogue is sorely missing about damage, the impact on women, body image, etc etc. The more this is pushed into a black/white old enough / not old enough over-simplified scenario, the less we can have nuanced conversations about it all.

I suspect this whole move is purely an optics piece. That doesn't make it any less dangerous, in fact it maybe makes it more dangerous, but it explains the incompetence and lack of thought behind it.


When it comes to parents approving such measures, I feel it is mostly an easy way out for them, due either to ineffective parenting or to extreme insecurity/overprotectiveness.

If the kid is very young and not too knowledgeable about the world, then perhaps it would make sense not to leave them alone with an -online- tablet. If the kid is older, then by that time the parents should have invested the time to talk with their child and let them know what dangers may await online.

Finally, with regard to the 'indecent'/'porn' aspect of the filter: if a child is traumatized after viewing a pair of boobs or a vagina, then there is something wrong with the upbringing the parents gave them. Having sex-education websites blocked by the filter makes the matter all the worse.


Not sure you have had kids?

You generally can't supervise them 100% of the time, and that would also be harmful to their development.

It's almost impossible to control what your kid can do or see or who they interact with once they get on the internet. I had long conversations with my son about the dangers, and thankfully he seems to have absorbed them, but not all do.

I wish there had been safe, non-toxic areas for kids, equivalent to playing in the back yard.

For the record, I don't think banning social media for under 14s will work particularly well. It's more about providing safer places than banning existing ones IMHO.


I generally trust my kids and have had the appropriate-content talk with both of them several times. But they're kids. My 8 y/o boy will still search for "sexy girls" sometimes. If it would just show him swimsuit models or Playboy-level material, I'd be happy enough. But with any search, he's only a couple of clicks away from some of the most hardcore fetish porn ever created.

I still prefer monitoring / reporting to filtering / blocking though. It's more important for me to know what they're looking at and be able to talk to them about it than it is for me to block everything that might be offensive.


Blocking something is very different from looking at what they click. After 12-14 it can very well get dystopian, in my opinion. Kids have, and need to have, secrets from their parents at that point.

You also don't want moral busybodies at school making mountains out of molehills, making the experience more traumatic than the content they consumed, which perhaps wasn't age-appropriate. A while ago parents were oblivious to what content their children consumed and didn't even have blocks. The children survived that too, although a bit more engagement might indeed be sensible.

As I said, it was criticism of surveillance, not blocking content or in general curating content for very young children.


I mean, legally they shouldn't be, with COPPA and all that; otherwise far more regulations apply in relation to their privacy, and the two ideas of toning things down for children while not following children's privacy standards would visibly contradict each other. That doesn't stop children, though, since the most that blocks them from registering for most websites prematurely is a text box that might as well say "I am above 13, or willing to claim I am". A more cynical person would say that's by design, but there really is no unintrusive way to verify it either; otherwise such a thing probably would be enforced for legal reasons.

One thing I notice is often missing in these discussions: sometimes it isn't up to the parents to decide at all.

Imagine your 8-to-11-year-old daughter's social circle, and a school where her classmates are allowed very little internet at home or at school. Then you wouldn't have a problem enforcing whatever internet rules you like.

Now imagine every one of her classmates is playing Minecraft, and she is the only one being left out.

The point is, if everyone at her school is spending time on goddamn stupid Chinese TikTok, then 10-15 minutes of TikTok for her becomes a necessary evil.

So far most of their internet use is entertainment only: pop music, anime, viral videos, etc. While not productive, it is harmless. Educating them not to use real names or talk to strangers on the internet seems to have worked so far, along with only keeping track of the topics they look into (at least before the age of 12 or 14). Generally speaking, the internet is still fairly safe under some guidance.

But I have witnessed a teenager (the son of a close friend) wander into politics and the culture war at age 14+, and it is absolutely destructive. That is the age where they start doing things without telling you and going onto Reddit or whatever internet forums. I don't have a good solution to that.

That is part of the reason why I have been thinking about age-restricted participation on web forums: you could only reply if you are over the age of X.
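The gate itself would be trivial; the hard part is having a trusted date of birth in the first place. A sketch, assuming the forum somehow already has a verified DOB per account (the minimum age below is just a placeholder):

    from datetime import date

    MIN_REPLY_AGE = 16  # the "age of X" above; the value here is only an example

    def age(dob, today=None):
        today = today or date.today()
        return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

    def can_reply(dob):
        # Only accounts with a (verified) date of birth old enough may post replies.
        return age(dob) >= MIN_REPLY_AGE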

