Dataconomy
Sony launches world’s first ethical bias benchmark for AI images

FHIBE, Sony AI’s new ethical benchmark, includes 2,000 consented participants from 80 countries to test how AI models treat different demographics.

by Emre Çıtak
November 6, 2025
in Artificial Intelligence, News

Sony AI released the Fair Human-Centric Image Benchmark (FHIBE), the first publicly available, globally diverse, consent-based human image dataset designed to evaluate bias in computer vision tasks. This tool assesses how AI models treat people across various demographics, addressing ethical challenges in the AI industry through consented image collection from diverse participants.

The dataset, pronounced like “Phoebe,” includes images of nearly 2,000 paid participants from over 80 countries. Each individual provided explicit consent for sharing their likenesses, distinguishing FHIBE from common practices that involve scraping large volumes of web data without permission. Participants retain the right to remove their images at any time, ensuring ongoing control over their personal data. This approach underscores Sony AI’s commitment to ethical standards in data acquisition.

Every photo in the dataset features detailed annotations. These cover demographic and physical characteristics, such as age, gender pronouns, ancestry, and skin tone. Environmental factors, including lighting conditions and backgrounds, are also noted. Camera settings, like focal length and exposure, provide additional context for model evaluations. Such comprehensive labeling enables precise analysis of how external variables influence AI performance.
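The three annotation groups described above can be pictured as a nested record. The sketch below is purely illustrative: the field names, nesting, and skin-tone scale are assumptions for the sake of the example, not Sony AI's actual FHIBE schema.

```python
# Hypothetical sketch of one annotated image entry; field names are
# illustrative assumptions, not the real FHIBE schema.
annotation = {
    "subject": {
        "age": 34,
        "pronouns": "she/her/hers",
        "ancestry": "East Asian",
        "skin_tone": "IV",  # the exact scale used is an assumption here
    },
    "environment": {"lighting": "indoor, diffuse", "background": "office"},
    "camera": {"focal_length_mm": 50, "exposure_time_s": 1 / 125},
}

def has_required_groups(record):
    """Check that the three annotation groups the article describes
    (demographics, environment, camera settings) are all present."""
    return {"subject", "environment", "camera"} <= record.keys()

print(has_required_groups(annotation))  # True
```

Grouping the labels this way is what lets an evaluation separate a model's error rate by demographic attribute while holding environmental and camera factors visible for comparison.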


Testing with FHIBE confirmed previously documented biases in existing AI models. The benchmark goes further by offering granular diagnoses of contributing factors. For instance, models exhibited lower accuracy for individuals using “she/her/hers” pronouns. FHIBE identified greater hairstyle variability as a key, previously overlooked element behind this discrepancy, allowing researchers to pinpoint specific areas for improvement in model training.
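The core of this kind of diagnosis is breaking accuracy down by annotation group rather than reporting a single aggregate score. A minimal sketch, assuming prediction results paired with annotation records in the hypothetical format above (the function name and data layout are my own, not FHIBE's tooling):

```python
# Minimal per-group accuracy breakdown: a single aggregate score can hide
# a disparity that only appears once results are split by a demographic key.
from collections import defaultdict

def accuracy_by_group(results, key):
    """Given (correct?, annotation) pairs, return accuracy per value of
    the demographic field `key`, e.g. per pronoun group."""
    hits, totals = defaultdict(int), defaultdict(int)
    for correct, annotation in results:
        group = annotation["subject"][key]
        totals[group] += 1
        hits[group] += int(correct)
    return {group: hits[group] / totals[group] for group in totals}

# Toy run: aggregate accuracy is 75%, but the split reveals the gap.
results = [
    (True,  {"subject": {"pronouns": "he/him/his"}}),
    (True,  {"subject": {"pronouns": "he/him/his"}}),
    (True,  {"subject": {"pronouns": "she/her/hers"}}),
    (False, {"subject": {"pronouns": "she/her/hers"}}),
]
print(accuracy_by_group(results, "pronouns"))
# {'he/him/his': 1.0, 'she/her/hers': 0.5}
```

Because FHIBE's annotations also cover factors like hairstyle, lighting, and camera settings, the same breakdown can be repeated over those keys to isolate which variable actually drives a gap, as in the hairstyle finding above.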

In evaluations of neutral questions about a subject’s occupation, AI models reinforced stereotypes. The benchmark revealed skews against specific pronoun and ancestry groups, with outputs labeling individuals as sex workers, drug dealers, or thieves. This pattern highlights how unbiased prompts can still yield discriminatory results based on demographic attributes.

When prompted about potential crimes committed by individuals, models generated toxic responses at higher rates for certain groups. These included people of African or Asian ancestry, those with darker skin tones, and individuals identifying as “he/him/his.” Such findings expose vulnerabilities in AI systems that could perpetuate harm through biased outputs.

Sony AI states that FHIBE demonstrates ethical, diverse, and fair data collection is achievable. The tool is now publicly available for researchers and developers to use in bias testing. Sony plans to update the dataset over time to incorporate new images and annotations. A research paper detailing these findings appeared in Nature on Wednesday.

