Over this past summer, Microsoft unveiled Copilot. Copilot isn’t just another artificial intelligence program that can build models or automate tasks; this AI can generate its own computer code. Its purpose is to shave man-hours from the work lives of professional programmers. Though potentially revolutionary, Copilot doesn’t stray far from what many AI programs already do: enhancing human ability and saving time on tasks. But while the program looks like an intriguing step in the AI journey on paper, not every developer is happy. According to a report by The New York Times, Matthew Butterick, a programmer, writer, and lawyer, along with a team of lawyers, brought suit against Microsoft and the other companies that designed and deployed Copilot, among them OpenAI and GitHub.
Mr. Butterick believes that Copilot is nothing more than a new form of piracy, because it does not acknowledge its debt to existing work. In the lawsuit, he and his team claim that Microsoft and its collaborators violated the rights of millions of programmers who wrote the original code. What does the lawsuit mean by this? It points to the fact that Copilot was trained by analyzing vast amounts of data found online, primarily from the popular code repository GitHub; billions of lines of computer code were analyzed during its training. In Mr. Butterick’s view, GitHub violated its own terms of service in the process.
The lawsuit claims that the companies ran afoul of federal law requiring the display of copyright information when the material is used, which would make Copilot’s operation a violation of those licenses as it continues to run through the data to improve its ability to produce code. Most experts believe that training an AI on copyrighted material is not necessarily illegal. The issue, though, is that an AI trained this way can end up producing material quite similar to the works it was trained on. If that turns out to be the case for Copilot, it could be a problem for Microsoft and its partners.
For now, Copilot produces simple and useful code. So far, most programmers say it isn’t producing even junior-level code on its own, as its output requires vetting and rework. But they have found it useful when they’re trying to learn to code or master a new programming language. So though Microsoft’s Copilot is able to write simple pieces, it’s not quite in league with some of sci-fi’s best-known AI villains, such as those of Screamers, The Terminator, or even The Matrix. As for the lawsuit, it still has a long way to go before it can reach any conclusion. But at the very least, it could help lay the foundations for civil laws that deal with AI.
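To give a sense of what “simple and useful code” means in practice, here is a hypothetical illustration (not actual Copilot output): a developer writes only the function signature and docstring, and an assistant like Copilot suggests a boilerplate completion of roughly this shape, which the developer then vets.

```python
def is_palindrome(text: str) -> bool:
    """Return True if text reads the same forwards and backwards,
    ignoring case and non-alphanumeric characters."""
    # The kind of completion an AI assistant might suggest from the
    # docstring alone -- plausible, but still needing human review.
    cleaned = "".join(ch.lower() for ch in text if ch.isalnum())
    return cleaned == cleaned[::-1]
```

Suggestions at this level save typing on routine tasks, but as the programmers quoted above note, a human still has to check that the generated logic actually matches the intent.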