“Microsoft announced Tuesday that it has updated its facial recognition technology with significant improvements in the system’s ability to recognize gender across skin tones. That improvement addresses recent concerns that commercially available facial recognition technologies more accurately recognized gender of people with lighter skin tones than darker skin tones, and that they performed best on males with lighter skin and worst on females with darker skin.
With the new improvements, Microsoft said it was able to reduce the error rates for men and women with darker skin by up to 20 times. For all women, the company said the error rates were reduced by nine times. Overall, the company said that, with these improvements, it was able to significantly reduce accuracy differences across the demographics”.
«These forms of violence affect the health and social lives of victims just as seriously as other forms of violence against women. There is nothing virtual about them. Yet they are largely tolerated. This is shown by an unprecedented audit conducted by the HCE and its partners in July 2017 on the main social networks (Facebook, Twitter and Youtube): 92% of the sexist content reported (insults, rape threats, or incitement to hatred) was not removed by the platforms, with variation across sites: 87% for Facebook, 89% for Twitter, and 100% for Youtube».
«Gender-Specific Language. Targets gendered language which may be perceived as excluding, dismissive, or stereotyping. Consider using gender-inclusive language. Example: We need more policemen to maintain public safety. Policemen is corrected to police officers».
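The kind of rule quoted above can be sketched as a simple substitution pass. This is an illustrative toy, not the quoted tool's actual implementation: the term list, the function name `suggest_inclusive`, and the capitalization handling are all assumptions.

```python
import re

# Toy gender-inclusive language rule: flag known gendered job titles and
# substitute inclusive alternatives. The term list is illustrative only.
INCLUSIVE_ALTERNATIVES = {
    "policemen": "police officers",
    "policeman": "police officer",
    "firemen": "firefighters",
    "chairman": "chairperson",
}

def suggest_inclusive(text: str) -> str:
    """Replace known gendered terms with gender-inclusive alternatives."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        replacement = INCLUSIVE_ALTERNATIVES[word.lower()]
        # Preserve sentence-initial capitalization of the original word.
        return replacement.capitalize() if word[0].isupper() else replacement

    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, INCLUSIVE_ALTERNATIVES)) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(repl, text)

print(suggest_inclusive("We need more policemen to maintain public safety."))
# → We need more police officers to maintain public safety.
```

A production checker would of course use linguistic context rather than a flat lookup table, but the correction in the quoted example follows this same flag-and-substitute pattern.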
« The process of mathematically defining “fair” decision-making metrics also forces us to pin down tradeoffs between fairness and accuracy that must be faced and have sometimes been swept under the carpet by policy-makers. It makes us rethink what it really means to treat all groups equally—in some cases equal treatment may only be possible by learning different group-specific criteria. There is an entirely new field emerging at the intersection of computer science, law, and ethics. It will not only lead to fairer algorithms, but also to algorithms which track accountability, and make clear which factors contributed to a decision. There’s much reason to be hopeful! » – Jennifer T. Chayes.
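The point about group-specific criteria can be made concrete with a minimal, synthetic sketch. All scores below are made up, and the functions are illustrative assumptions, not any cited system: a single global score threshold produces unequal acceptance rates across two groups with different score distributions, while per-group thresholds can equalize them (demographic parity).

```python
# Synthetic illustration: equal treatment in outcomes may require
# group-specific decision thresholds rather than one global cutoff.

def positive_rate(scores, threshold):
    """Fraction of the group scoring at or above the threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

def group_threshold(scores, target_rate):
    """Score cutoff that accepts roughly target_rate of this group."""
    ranked = sorted(scores, reverse=True)
    k = max(1, round(target_rate * len(scores)))
    return ranked[k - 1]

group_a = [0.9, 0.8, 0.7, 0.6, 0.5]  # group whose scores run higher
group_b = [0.6, 0.5, 0.4, 0.3, 0.2]  # group whose scores run lower

# One global threshold: very unequal acceptance rates.
print(positive_rate(group_a, 0.55), positive_rate(group_b, 0.55))  # 0.8 0.2

# Group-specific thresholds: both groups accepted at the same 60% rate.
t_a = group_threshold(group_a, 0.6)
t_b = group_threshold(group_b, 0.6)
print(positive_rate(group_a, t_a), positive_rate(group_b, t_b))  # 0.6 0.6
```

Equalizing acceptance rates this way typically costs some accuracy, which is exactly the fairness–accuracy tradeoff the quote says must be faced explicitly.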