Reactions are starting to come in for GitHub’s new Copilot coding tool, which one site calls “a product of the partnership between Microsoft and AI research and deployment company OpenAI — which Microsoft invested $1 billion into two years ago.”
According to the tech preview page: "GitHub Copilot is currently only available as a Visual Studio Code extension. It works wherever Visual Studio Code works — on your machine or in the cloud on GitHub Codespaces. And it's fast enough to use as you type."

"Copilot looks like a potentially fantastic learning tool — for developers of all abilities," said James Governor, an analyst at RedMonk. "It can remove barriers to entry. It can help with learning new languages, and for folks working on polyglot codebases. It arguably continues GitHub's rich heritage as a world-class learning tool. It's early days but AI-assisted programming is going to be a thing, and where better to start experiencing it than GitHub…?"
The issue of scale is a concern for GitHub, according to the tech preview FAQ: "If the technical preview is successful, our plan is to build a commercial version of GitHub Copilot in the future. We want to use the preview to learn how people use GitHub Copilot and what it takes to operate it at scale." GitHub spent the last year working closely with OpenAI to build Copilot, and GitHub developers, along with some users inside Microsoft, have been using it daily for months.

Guillermo Rauch, founder and CEO of developer software provider Vercel and creator of Next.js, tweeted a statement from the Copilot tech preview FAQ page: "GitHub Copilot is a code synthesizer, not a search engine: the vast majority of the code that it suggests is uniquely generated and has never been seen before."
To that, Rauch simply typed: “The future.”
Rauch's post is relevant because one of the knocks against Copilot is the concern that it will generate code identical to code released under open source licenses that don't allow derivative works, which a developer might then use unknowingly…
GitHub CEO Nat Friedman has responded to those concerns, according to another article, arguing that training an AI system constitutes fair use.
Friedman is not alone — a couple of actual lawyers and experts in intellectual property law took up the issue and, at least in their preliminary analysis, tended to agree with Friedman… [U.K. solicitor] Neil Brown examines the idea from an English law perspective and, while he's not so sure about the idea of "fair use" if the idea is taken outside of the U.S., he points simply to GitHub's terms of service as evidence enough that the company can likely do what it's doing. Brown points to passage D4, which grants GitHub "the right to store, archive, parse, and display Your Content, and make incidental copies, as necessary to provide the Service, including improving the Service over time."

"The license is broadly worded, and I'm confident that there is scope for argument, but if it turns out that Github does not require a license for its activities then, in respect of the code hosted on Github, I suspect it could make a reasonable case that the mandatory license grant in its terms covers this as against the uploader," writes Brown. Overall, though, Brown says that he has "more questions than answers."
Armin Ronacher, the creator of the Flask web framework for Python, shared on Twitter an interesting example, apparently from the game Quake III Arena, in which Copilot reproduces a chunk of code including not only its original comment ("what the fuck?") but also its original copyright notice.
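For context, the Quake III Arena snippet in question is widely understood to be the famous "fast inverse square root" routine, whose magic-constant line carries the "what the fuck?" comment. A rough, hedged sketch of the technique (rewritten here with memcpy for well-defined type punning, rather than the original pointer cast) looks like this:

```c
#include <stdint.h>
#include <string.h>

/* Approximate 1.0f / sqrtf(x) using the trick made famous by Quake III
 * Arena: reinterpret the float's bits, subtract them (shifted) from a
 * magic constant to get a rough guess, then refine that guess with one
 * Newton-Raphson iteration. This is an illustrative sketch, not the
 * verbatim id Software source. */
float fast_inv_sqrt(float x) {
    float half = 0.5f * x;
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);      /* view the float as raw bits */
    bits = 0x5f3759df - (bits >> 1);     /* the "what the fuck?" step  */
    float y;
    memcpy(&y, &bits, sizeof y);         /* back to a float: crude guess */
    y = y * (1.5f - half * y * y);       /* one Newton iteration refines it */
    return y;
}
```

After a single Newton iteration the result is typically within a fraction of a percent of the true value, e.g. `fast_inv_sqrt(4.0f)` comes out near 0.5. The snippet's fame (and its distinctive comment) is precisely why its verbatim reappearance in a Copilot suggestion drew attention.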
Read more of this story at Slashdot.