Meta’s problem with generating races, especially if they are a couple

by Eray Eliaçık
April 5, 2024
in Artificial Intelligence

Meta, the company behind Instagram, is under fire for its AI image tools over concerns about how they represent different races. Recent reporting has shown that Meta's tools often make mistakes when generating images of people from different racial backgrounds.

The problems first surfaced when Mia Sato noticed something strange with Instagram's image generator. In a recent report, she described typing prompts such as "Asian man" paired with a person of another race and getting back pictures in which everyone appeared Asian, even when the prompt said otherwise.

Meta’s AI image generation tools have come under scrutiny for their difficulty in accurately depicting individuals of different racial backgrounds (Image credit)

Then, things got even weirder. When Sato tried again later that day, the tool wouldn't work at all. Instead of pictures, it showed an error message saying something went wrong.


Furthermore, the tools sometimes fail to produce images altogether, displaying error messages instead, adding to concerns about the reliability of Meta’s AI systems (Image credit)

It’s unclear why this happened, but it’s something to keep an eye on. For now, the mystery remains unsolved.

CNN also looked into the issue and found similar problems. When it asked Imagine with Meta AI for images of interracial couples, the results were often inaccurate, failing to reflect the racial mix described in the prompt.

Investigations by CNN revealed similar issues, particularly in generating images of interracial couples, where the diversity of racial backgrounds was often misrepresented (Image credit)

Meta says it is working on fixing these problems and making its tools less biased. But this is not just a Meta problem; other tech companies have had similar issues with their AI tools, which shows that making AI fair and accurate is hard work.

In the future, it's crucial for tech companies to be open about how their AI works and to keep working on making it fair for everyone. Meta's struggles show that there is still a long way to go before AI can represent everyone accurately.


Featured image credit: Catherine Thorbecke/CNN

Tags: AI, Meta
