
Google suspends Gemini chatbot's ability to generate pictures of people

Earlier, Google apologised for “inaccuracies” in historical depictions that it was creating

Published: Thu 22 Feb 2024, 7:04 PM

By AP

Google logo. — AFP file

Google said on Thursday it's temporarily stopping its Gemini artificial intelligence chatbot from generating images of people a day after apologising for “inaccuracies” in historical depictions that it was creating.

Gemini users this week posted screenshots on social media of historically white-dominated scenes with racially diverse characters that they say it generated, leading critics to raise questions about whether the company is over-correcting for the risk of racial bias in its AI model.

“We’re already working to address recent issues with Gemini’s image generation feature,” Google said in a post on the social media platform X. “While we do this, we’re going to pause the image generation of people and will re-release an improved version soon.”


Previous studies have shown AI image-generators can amplify racial and gender stereotypes found in their training data, and without filters are more likely to generate lighter-skinned men when asked to generate a person in various contexts.

Google said on Wednesday that it's “aware that Gemini is offering inaccuracies in some historical image generation depictions” and that it's “working to improve these kinds of depictions immediately.”

Gemini can generate a “wide range of people”, which the company said is “generally a good thing” because people around the world use the system, but that it is “missing the mark” in this case.

When the AP asked Gemini to generate pictures of people, it responded by saying it's “working to improve” the ability to do so. “We expect this feature to return soon and will notify you in release updates when it does,” the chatbot said.
