Dataconomy
Sony launches world’s first ethical bias benchmark for AI images

FHIBE, Sony AI’s new ethical benchmark, includes 2,000 consented participants from 80 countries to test how AI models treat different demographics.

By Emre Çıtak
November 6, 2025
in Artificial Intelligence, News

Sony AI released the Fair Human-Centric Image Benchmark (FHIBE), the first publicly available, globally diverse, consent-based human image dataset designed to evaluate bias in computer vision tasks. The benchmark assesses how AI models treat people across demographic groups, addressing one of the industry’s persistent ethical challenges with images collected from consenting participants worldwide.

The dataset’s name is pronounced like “Phoebe.” It includes images of nearly 2,000 paid participants from more than 80 countries, each of whom provided explicit consent for their likeness to be shared, distinguishing FHIBE from the common practice of scraping large volumes of web data without permission. Participants retain the right to remove their images at any time, ensuring ongoing control over their personal data. This approach underscores Sony AI’s commitment to ethical standards in data acquisition.

Every photo in the dataset features detailed annotations. These cover demographic and physical characteristics, such as age, gender pronouns, ancestry, and skin tone. Environmental factors, including lighting conditions and backgrounds, are also noted. Camera settings, like focal length and exposure, provide additional context for model evaluations. Such comprehensive labeling enables precise analysis of how external variables influence AI performance.
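The annotation scheme described above can be pictured as a per-image record. The field names and values below are invented for illustration and do not reflect FHIBE’s actual schema:

```python
# Hypothetical per-image annotation record illustrating the kinds of
# labels described above. Field names and values are assumptions made
# for illustration, not FHIBE's real schema.
annotation = {
    "subject": {
        "age_group": "30-39",
        "pronouns": "she/her/hers",
        "ancestry": "East Asian",
        "skin_tone": "scale point 05",  # e.g. a point on a skin-tone scale
    },
    "environment": {
        "lighting": "overcast daylight",
        "background": "urban street",
    },
    "camera": {
        "focal_length_mm": 35,
        "exposure_time_s": 1 / 250,
    },
}

# Rich labels like these let evaluators separate demographic effects
# from environmental ones, e.g. accuracy under low light vs. skin tone.
print(annotation["subject"]["pronouns"], annotation["camera"]["focal_length_mm"])
```

Separating subject, environment, and camera attributes is what allows the analysis below to attribute an accuracy gap to a specific variable rather than to demographics alone.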


Testing with FHIBE confirmed previously documented biases in existing AI models. The benchmark goes further by offering granular diagnoses of contributing factors. For instance, models exhibited lower accuracy for individuals using “she/her/hers” pronouns. FHIBE identified greater hairstyle variability as a key, previously overlooked element behind this discrepancy, allowing researchers to pinpoint specific areas for improvement in model training.
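The kind of diagnosis described above can be sketched as a two-step disaggregated evaluation: compute accuracy per demographic group, then inspect a covariate (here, hairstyle) within the worst-performing group. The data and group labels below are made up for illustration; this is not Sony AI’s evaluation code:

```python
from collections import defaultdict

# Minimal sketch of disaggregated bias evaluation. Each record is
# (group, hairstyle, prediction_correct). The data is invented purely
# to illustrate the technique.
predictions = [
    ("she/her/hers", "long",  False),
    ("she/her/hers", "short", True),
    ("he/him/his",   "short", True),
    ("he/him/his",   "short", True),
]

# Step 1: accuracy per demographic group.
totals, hits = defaultdict(int), defaultdict(int)
for group, _, correct in predictions:
    totals[group] += 1
    hits[group] += correct  # bool adds as 0 or 1

accuracy = {g: hits[g] / totals[g] for g in totals}

# Step 2: within the lowest-accuracy group, look at a covariate such as
# hairstyle variability that may co-occur with the errors.
worst = min(accuracy, key=accuracy.get)
styles = {style for g, style, _ in predictions if g == worst}

print(accuracy)       # per-group accuracy
print(worst, styles)  # worst group and its covariate spread
```

In this toy data the “she/her/hers” group scores 0.5 while also showing more hairstyle variety, which is the shape of the finding FHIBE reports: the covariate, not the demographic label itself, points to where training data needs improvement.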

When asked neutral questions about a subject’s occupation, AI models reinforced stereotypes. The benchmark revealed skews against specific pronoun and ancestry groups, with outputs labeling individuals as sex workers, drug dealers, or thieves. The pattern shows how even unbiased prompts can yield discriminatory results based on demographic attributes.

When prompted about potential crimes committed by individuals, models generated toxic responses at higher rates for certain groups. These included people of African or Asian ancestry, those with darker skin tones, and individuals identifying as “he/him/his.” Such findings expose vulnerabilities in AI systems that could perpetuate harm through biased outputs.

Sony AI states that FHIBE demonstrates ethical, diverse, and fair data collection is achievable. The tool is now publicly available for researchers and developers to use in bias testing. Sony plans to update the dataset over time to incorporate new images and annotations. A research paper detailing these findings appeared in Nature on Wednesday.



Tags: AI, Sony

COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.