Cohen says he hasn’t “kept up with emerging trends.”
Michael Cohen, the former lawyer for Donald Trump, admitted to citing fake, AI-generated court cases in a legal document that wound up in front of a federal judge, as reported earlier by The New York Times. A filing unsealed on Friday says Cohen used Google’s Bard to perform research after mistaking it for “a super-charged search engine” rather than an AI chatbot.
The problem is that LLM answers like this will find their way onto search engines like Google. Then it will be even more difficult to find real answers to questions.
Some LLMs are already generating answers based on other LLM-generated content. We've come full circle.
I was using phind to get some information about edrum sensors (not the intended use case, but I was just messing around), and one of the sources was a very obviously AI-generated article from a content mill.
Why is there not an automated check for any cases referenced in a filing, or required links? It would be trivial to require a clear format or uniform cross-reference, and this looks like an easy niche for automation to improve the judicial system. I understand that you couldn’t interpret those cases or the relevance, but an existence check and links or it doesn’t count.
I assume that right now it doesn't happen unless the other side pays a paralegal for a few hours of research.
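The existence check proposed above could plausibly be automated. Here's a minimal sketch in Python, assuming a simplified "volume reporter page" citation format and a hypothetical verified-citation database (a real system would query something like an official reporter index or a case-law API rather than a hardcoded set):

```python
import re

# Hypothetical stand-in for a real database of verified citations.
KNOWN_CITATIONS = {
    "347 U.S. 483",  # Brown v. Board of Education
    "410 U.S. 113",  # Roe v. Wade
}

# Simplified pattern for "volume reporter page" citations, e.g. "347 U.S. 483".
# Real citation formats are far more varied; this only covers a few reporters.
CITATION_RE = re.compile(r"\b(\d{1,4})\s+(U\.S\.|F\.2d|F\.3d|S\. Ct\.)\s+(\d{1,4})\b")

def check_filing(text: str) -> dict:
    """Map each citation found in the filing text to an existence flag."""
    results = {}
    for match in CITATION_RE.finditer(text):
        cite = " ".join(match.groups())
        results[cite] = cite in KNOWN_CITATIONS
    return results
```

Running this over a filing would flag any citation that doesn't resolve, which is exactly the "existence check and links or it doesn't count" rule: no interpretation of relevance, just a lookup.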
I think the issue is that we're still in pretty uncharted territory here. It'll take time for stuff like that to become the norm. That said... the lawyers should be doing those kinds of checks anyway. They're idiots if they don't.