

Sony AI Challenges Biased Models With New Tool

2025-11-06 · Will Shanklin · 3 minute read
AI Ethics
Technology
Bias

Introducing FHIBE: A New Standard for AI Fairness

Sony AI has launched a significant new tool to address one of the most critical challenges in artificial intelligence today: fairness and bias. The company released a benchmark dataset named the Fair Human-Centric Image Benchmark, or FHIBE (pronounced like Phoebe), designed to rigorously test how fairly AI models treat people. Described as the first publicly available, globally diverse, and consent-based human image dataset for evaluating bias, FHIBE sets a new standard for ethical AI development. The results from its initial tests were telling, as Sony revealed that not a single existing AI dataset from any company was able to fully meet its comprehensive benchmarks.

To address the AI industry's widespread ethical challenges around data collection, FHIBE was built with a strong focus on consent and global representation. The dataset comprises images of nearly 2,000 paid participants from over 80 countries. This consent-based approach stands in stark contrast to the common, and often controversial, practice of scraping large volumes of web data without permission.

Furthermore, participants in the FHIBE project retain control over their data and can request to have their images removed at any time. Each photo is meticulously detailed with annotations covering demographic and physical characteristics, environmental factors, and even the camera settings used, providing a rich context for analysis.
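To make this concrete, here is a minimal sketch of what a per-image record in a consent-based, richly annotated dataset of this kind might look like. The field names and values below are illustrative assumptions for the sake of the example, not FHIBE's actual schema.

```python
# Illustrative sketch of a per-image annotation record in a consent-based
# dataset. Field names and values are hypothetical, not FHIBE's real schema.
from dataclasses import dataclass, field

@dataclass
class ImageRecord:
    image_id: str
    consent_revocable: bool        # subject may request removal at any time
    pronouns: str                  # self-reported demographic attributes
    ancestry: str
    age_range: str
    skin_tone: str
    hairstyle: str                 # physical characteristics
    environment: str               # e.g. indoor/outdoor, lighting conditions
    camera_settings: dict = field(default_factory=dict)  # e.g. ISO, aperture

record = ImageRecord(
    image_id="example_0001",
    consent_revocable=True,
    pronouns="she/her/hers",
    ancestry="East Asian",
    age_range="30-39",
    skin_tone="Type IV",
    hairstyle="long, curly",
    environment="outdoor, overcast",
    camera_settings={"iso": 200, "aperture": "f/2.8"},
)
```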

What the Benchmark Revealed About Today's AI

The application of FHIBE has already yielded critical insights, confirming previously documented biases in modern AI models. However, the tool's true power lies in its ability to provide granular diagnoses of the factors contributing to those biases. For instance, while some models showed lower accuracy for individuals using "she/her/hers" pronouns, FHIBE identified greater hairstyle variability as a previously overlooked contributing factor.
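As a rough illustration of how rich annotations enable that kind of granular diagnosis, the sketch below breaks a model's accuracy down by any annotated attribute, so a gap across pronoun groups can be traced to factors such as hairstyle. It is a minimal, hypothetical example, not Sony AI's evaluation code.

```python
# Minimal sketch: break model accuracy down by an annotated attribute.
# Illustrative only; this is not Sony AI's FHIBE evaluation code.
from collections import defaultdict

def accuracy_by_group(records, predictions, labels, group_key):
    """Compute accuracy separately for each value of one annotation field."""
    correct, total = defaultdict(int), defaultdict(int)
    for rec, pred, label in zip(records, predictions, labels):
        group = rec[group_key]
        total[group] += 1
        correct[group] += int(pred == label)
    return {group: correct[group] / total[group] for group in total}

# Toy data: first compare accuracy across pronoun groups, then drill into a
# possible contributing factor (hairstyle) exposed by the annotations.
records = [
    {"pronouns": "she/her/hers", "hairstyle": "long, curly"},
    {"pronouns": "he/him/his",   "hairstyle": "short"},
    {"pronouns": "she/her/hers", "hairstyle": "braided"},
]
predictions = ["person", "person", "chair"]
labels      = ["person", "person", "person"]

print(accuracy_by_group(records, predictions, labels, "pronouns"))
print(accuracy_by_group(records, predictions, labels, "hairstyle"))
```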

The benchmark also uncovered that current AI models tend to reinforce harmful stereotypes, even when given neutral prompts about a person's occupation. The tests showed a significant skew against specific pronoun and ancestry groups, with models incorrectly labeling subjects as sex workers, drug dealers, or thieves. Even more alarmingly, when prompted about potential crimes, certain models generated toxic responses at a higher rate for individuals of African or Asian ancestry, those with darker skin tones, and those who identify as male.

Paving the Way for Ethical AI Development

With the release of FHIBE, Sony AI aims to prove that ethical, diverse, and fair data collection is not just an ideal but an achievable reality. By making the tool available to the public, the company hopes to encourage the broader AI community to adopt higher standards for model evaluation and development. The comprehensive research behind the benchmark was also detailed in a paper published in the journal Nature, and the dataset will be updated over time to remain a relevant and powerful resource for building a more equitable AI future.

Read Original Post
