I see how you made this connection, but it's not entirely fair. A gun is a tool with one use: hurting people (in self-defense or not, that's the outcome). On the other hand, something like ChatGPT has many uses. So I don't think it's a great analogy. 3D printers are a closer one - you can use them to make guns, but also lots of other stuff.
You could think of a gun as just a malicious piece of mechanical engineering. You could think of the software in the article as a malicious piece of software engineering.
Please don't talk about murdering the person you're talking to. It's intended to provoke painful emotion.
I'll clarify my comment, though. The object is really a model that's shaped like a turtle but with pictures of rifle parts on it. It's neither a rifle nor a turtle. Both the human and the computer are too confident in their classifications, and both are equally wrong.
Using your analogy: you could actually hide a gun inside a lunchbox and fool humans.
I don't know what you are arguing here. A gun is a physical item, as is ammunition. You can stop somebody, search them, and find out if they have a gun and/or ammunition. You cannot do any such thing with encryption, as it is not a physical entity. Whether your gun is a regular gun, 3d printed or made of candy cane does not change the fact that it is a physical item. I don't care where it comes from or how you built it, as a physical item, it still follows a very different logic than virtual entities.
I think you're correct in recognizing that inanimate objects like guns or software do not have agency and cannot have moral value. The actors who use them should be judged on the morality of the ends to which guns or Tor are the means. This doesn't mean that you can't advocate for the free use of one inanimate object and simultaneously advocate for the regulation of another. Just because I'm in favor of gun rights doesn't mean I have to be OK with nuclear-weapons rights or dangerous-exotic-animal rights.
There's another similarity between Tor and guns (at least in the US): neither can be eliminated anyway, so it's better to discuss how to manage their existence. Tor has no central server that can be shut down. Guns are durable hunks of metal that can be easily hidden, and they exist in huge quantities across the US and could never be confiscated.
I think it's fair to consider what the tool is designed to do as well. Generative AI is capable of distorting reality, but I would argue that distortion is not the primary purpose of the tool. Likewise, knives can be used to kill, but for the most part they are tools made for cutting, not killing. On the other hand, I fail to see a purpose for guns apart from killing - and for some guns, such as handguns, apart from killing people.
Yes, in retrospect, guns are instruments created to destroy or hurt rather than kill. Still, the original point stands: guns are created and used for a specific purpose, while other tools, among which I include AI, are much more general in use, even with the potential for abuse.
Creation of a new type of weapon cannot be separated ethically or conceptually from the ways the weapon will be used. The design of a weapon influences the manner of its use, and that is unavoidable. The same is true of all objects -- design influences behavior. This link is inextricable.
A shotgun cartridge would be vastly less effective, and in many countries, even getting your hands on ammunition is difficult.
My objection is that it's mostly fright porn for drumming up mindshare. I don't think it's particularly compelling because a) imagining such systems is trivial, and b) there are already ways for us to achieve the same goal, so it's not even a novel threat.
But of course, that's just my opinion. Others obviously feel differently.
Can you downvote me more than once? If so, please feel free.
A gun is a tool, not a concept. You seem no less inclined to argue semantics than I am; the difference is that I don't consider a semantic argument and a meaningful argument mutually exclusive. Perhaps I'm wrong in that.
This is Hacker News -- you are expected not to be sloppy with your argumentation here. If you say "inanimate object" rather than gun, then we're going to assume you are making a more general statement and react accordingly. I cannot read your mind, so please do not assume I (or anyone else) knows your precise opinion on gun control without being precise with your words.
Guns don't hallucinate. Guns also (currently) require a human to pull the trigger. Even if the human is a dirtbag, that's an important limitation on the gun's potential to do damage because the human is vulnerable. The analogy doesn't work, even before we get to things like limited ammunition or potential for self-enhancement or omnipresence.
This is a 'guns don't kill people' argument, which historically has been a bit more complicated than that. Humans are not context-free actors independent of their environment - our environment shapes us. And the argument here is that this technology would create an environment that would have a tendency to bend us into a shape we may not like.
It does nothing to those barriers. They are still absolutely the same. Unless you're trying to argue that somehow guns magically imbue people with the intent to kill.
I assure you, they do not. In point of fact, the hobby can get rather onerous to keep up, due to maintenance costs and the burden of the magical thinking that individuals like yourself employ, which necessitates constant vigilance and correction.
People kill people.
AI, gun, explosive - it makes no difference. As long as there are two blokes around with irreconcilable opinions/worldviews, somebody's gonna want someone else dead. And that is the problem. The tools do not move until the mind employs them.
Reductively, a gun is a pile of atoms, like everything else, but unlike child pornography, it has the capability of acting upon the material world.