
Copilot stops working on gender-related subjects

Copilot deliberately stops suggesting on code that contains words from a hardcoded GitHub ban list, such as "gender" or "sex". And if you prefix transactional data fields with trans_, Copilot will refuse to help you. 😑
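For illustration, here is a minimal sketch (the names are made up, not from any real project) of the kind of transactional code that reportedly gets no suggestions because of the trans_ prefix:

```python
# Hypothetical transactional model using trans_-prefixed field names.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Payment:
    trans_id: str              # transaction identifier
    trans_amount: float        # amount charged
    trans_timestamp: datetime  # when the transaction occurred


def total_transactions(payments: list[Payment]) -> float:
    """Sum the amounts of all transactions."""
    return sum(p.trans_amount for p in payments)
```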

54 comments
  • It’s almost as if it’s better for humans to do human things (like programming). If your tool is incapable of meeting your and your company’s needs, it’s time to ditch the tool.

  • It will also not suggest anything when I try to assert things: I type "ass"; it waits... I type the final "e"; completion!

  • I think it's less of a problem with gendered nouns and much more of a problem with personal pronouns.

    Inanimate objects rarely change their gender identity, so those translations should be more or less fine.

    However, when translating Finnish to English, for instance, the gender-neutral third-person pronoun has to become "he" or "she", so you either have to make an assumption or write both the masculine and feminine versions with a slash ("he/she"), which sort of breaks the flow of the text.
