Sony rolls out a standard way to measure bias in how AI describes what it ‘sees’
Images in the test dataset were all sourced with consent
AI models are filled to the brim with bias, whether that's showing you a particular race when you ask for a picture of a criminal or assuming a woman couldn't possibly be a firefighter. To address these issues, Sony AI has released a new dataset for testing the fairness of computer vision models, one its makers claim was compiled in a fair and ethical way. (The Register)