A code analysis firm sees no major benefit from AI dev tools when measuring key programming metrics, though others report incremental gains from coding copilots, with an emphasis on code review.
“Many developers say AI coding assistants make them more productive, but a recent study set forth to measure their output and found no significant gains. Use of GitHub Copilot also introduced 41% more bugs, according to the study from Uplevel.”
When used metaphorically like this (meaning code assistants are not literal weapons), it means making a thing readily available to the masses. In context, my comment means that AI has streamlined the task of copy/pasting from Stack Overflow. You don't even have to search for what you're trying to do; the AI will use answers it has scraped from the site to fill out your code for you.
This is the first time I've heard the term "weaponizing" used the way you describe it. I would have said "AI obliterated the chore of finding the right answer on SO" or something similar. To me, weaponizing means using something against a person or entity.
For making things available to the masses, I think "democratizing" would be a better word for that intent.
"Weaponizing" something like this specifically has a connotation of danger or harm to some (unspecified) target.
In this context, copy/pasting Stack Overflow answers without understanding them (a practice Considered Harmful) previously required deliberate intent. Now your editor instantly suggests solutions, and all you have to do is press the Tab key to accept lines of code you've never read.