MDN now providing LLM-generated explainer text for code samples

github.com — MDN can now automatically lie to people seeking technical information · Issue #9208 · mdn/yari

Summary: MDN's new "AI Explain" button on code blocks generates human-like text that may be correct by happenstance, or may contain convincing falsehoods. This is a strange decision for a technical ...

Seems pretty bad?
