It acknowledged ‘inaccuracies’ in historical prompts.
Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis: Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.
They never claimed they had diverse data sources — and they probably don’t.
My point is that when minorities are underrepresented, which is the default case in GenAI, the (white, male) public tends to accept that.
I like that they tried to fix the issue of GenAI being racist and sexist. Even though the solution is obviously flawed, better this than a racist model.
I can't believe someone has to spell this out for you, but here we go: an accurate picture of people from an era in which there was no diversity will, by definition, not be diverse.