Copilot and IP law as it exists are incompatible. They cannot coexist and still make sense. If all it takes to get around licensing is piping code through an ML model, that sets a legal precedent legitimizing "math washing": dodging personal accountability because "the machine did it, not me."
Further, Copilot encourages exactly the opposite of what good software engineering is about. We should understand the underlying consequences of the code we write, including our libraries, dependency graphs, license-incurred obligations, and so on.
Also, Microsoft quite literally bought the most popular version-control-as-a-service company, then leveraged it to build a machine-learned code generation product.
They didn't ask for opt-in. They didn't do due diligence. They didn't let anyone know ahead of time. They didn't ask anyone; they just did it.
You may look at my last paragraph and think, "Yeah, so what? Welcome to innovation, move fast and break things!"
You may not have noticed if you only pay attention to the tech world, but you live alongside plenty of non-tech people who regularly follow far more rules than tech companies have ever been held accountable to, and many of them are realizing that tech's relative competitiveness probably comes from its ability to skirt regulations that were put in place for good reasons.
Yes, society sometimes turns a blind eye and selectively enforces laws and regulations, but it generally does so when socio-political actors are confident that harsh, resource-intensive enforcement wouldn't produce enough real-world value to justify the effort. Over the last decade, a lot of non-tech folks have become more aware that tech isn't acting, or even intending to act, responsibly. Tech companies are out to make money and to maneuver themselves into centralized positions of power and outsized influence.
Copilot is one more example of tech people being so concerned with whether they could that, once again, no one sat down and wondered whether they should.
That is why people are angry.