“Contacted, the French Ministry of Education invokes a ‘care protocol.’ ‘When a student is harassed, they can go to a person they trust,’ the ministry assures. The problem: in practice, revenge porn is often not considered harassment stricto sensu. The relevant protocol is therefore not activated: ‘It comes into play when there is a notion of repetition, for example if intimate photos were shared several times,’ explains Olivier Raluy, a CPE (senior education advisor) in a middle school and national secretary of the Syndicat national des enseignements de second degré (SNES-FSU).”
“Google notes in its own AI principles that algorithms and datasets can reinforce bias: ‘We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief.’ Google invited affected developers to comment on its discussion forums. Only one developer had commented at the time of writing, and complained the change was down to ‘political correctness.’ ‘I don’t think political correctness has room in APIs,’ the person wrote. ‘If I can 99% of the times identify if someone is a man or woman, then so can the algorithm. You don’t want to do it? Companies will go to other services.’”
“‘Notability [guidelines] are notoriously byzantine, to say it kindly,’ the anonymous editor says. They hope the push to reform the guidelines will help compensate for the historic underrepresentation of women and minorities, since it’s not just women who find their path into Wikipedia blocked. ‘A lot of prejudice is unconscious and intersectional,’ says Lubbock. ‘Wikipedia is dealing not just with a gender inequality issue, but also racial and geographical inequalities.’”
“Goldman Sachs denied allegations of gender bias and said on Monday that it will reevaluate credit limits on a case-by-case basis for Apple Card customers who received lower credit lines than expected. “We have not and never will make decisions based on factors like gender,” Carey Halio, Goldman’s retail bank CEO, said in a statement. “In fact, we do not know your gender or marital status during the Apple Card application process.” Halio said that customers unsatisfied with their credit line should contact the company. “Based on additional information we may request, we will re-evaluate your credit line,” the statement said.”
“Historically, it has provided only one translation for a query, even if the translation could have either a feminine or masculine form. So when the model produced one translation, it inadvertently replicated gender biases that already existed. For example: it would skew masculine for words like “strong” or “doctor,” and feminine for other words, like “nurse” or “beautiful.””
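The fix Google later shipped was to return both gendered forms when the source sentence is ambiguous, rather than silently picking one. A minimal sketch of that idea, assuming a toy lookup table and a hypothetical `translate_with_gender` helper (none of this is Google's actual implementation or data):

```python
# Toy illustration: for a gender-neutral source sentence, return every
# gendered target-language variant instead of silently choosing one.
# The lookup table below is hypothetical, for illustration only.
AMBIGUOUS = {
    # Turkish "o" is gender-neutral; English forces a pronoun choice.
    ("tr", "en", "o bir doktor"): ["she is a doctor", "he is a doctor"],
}

def translate_with_gender(src_lang, dst_lang, text):
    """Return all gendered variants of a translation, not just one."""
    key = (src_lang, dst_lang, text.lower())
    if key in AMBIGUOUS:
        return AMBIGUOUS[key]  # both feminine and masculine forms
    return [text]              # fallback: no known gendered variants

print(translate_with_gender("tr", "en", "O bir doktor"))
```

Surfacing both forms pushes the gender choice back to the user instead of letting corpus statistics decide it.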
“Smart Compose is an example of what AI developers call natural language generation (NLG), in which computers learn to write sentences by studying patterns and relationships between words in literature, emails and web pages. A system shown billions of human sentences becomes adept at completing common phrases but is limited by generalities. Men have long dominated fields such as finance and science, for example, so the technology would conclude from the data that an investor or engineer is “he” or “him.” The issue trips up nearly every major tech company.”
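The mechanism described above can be seen in a toy next-word predictor built from raw co-occurrence counts: if historical text pairs “engineer” with “he” more often than “she,” the completion inherits that skew. The three-sentence corpus and `next_word_counts` helper below are illustrative assumptions, not Smart Compose's actual model:

```python
from collections import Counter

# Toy corpus skewed the way historical text is skewed: "engineer"
# is followed by "he" more often than "she".
corpus = [
    "the engineer said he would check",
    "the engineer said he was done",
    "the engineer said she would check",
]

def next_word_counts(corpus, prefix):
    """Count which word follows the given word sequence across the corpus."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        for i in range(len(words) - len(prefix)):
            if words[i:i + len(prefix)] == prefix:
                counts[words[i + len(prefix)]] += 1
    return counts

counts = next_word_counts(corpus, ["engineer", "said"])
print(counts.most_common(1)[0][0])  # → "he": the skew becomes the suggestion
```

A count-based completer has no notion of fairness; it simply reproduces whatever distribution the training text contains, which is why Google chose to suppress gendered pronoun suggestions entirely.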
“The group created 500 computer models focused on specific job functions and locations. They taught each to recognize some 50,000 terms that showed up on past candidates’ resumes. The algorithms learned to assign little significance to skills that were common across IT applicants, such as the ability to write various computer codes, the people said. […] With the technology returning results almost at random, Amazon shut down the project.”
“Microsoft announced Tuesday that it has updated its facial recognition technology with significant improvements in the system’s ability to recognize gender across skin tones. That improvement addresses recent concerns that commercially available facial recognition technologies more accurately recognized gender of people with lighter skin tones than darker skin tones, and that they performed best on males with lighter skin and worst on females with darker skin.
With the new improvements, Microsoft said it was able to reduce the error rates for men and women with darker skin by up to 20 times. For all women, the company said the error rates were reduced by nine times. Overall, the company said that, with these improvements, they were able to significantly reduce accuracy differences across the demographics.”
«This violence affects victims’ health and social life with the same severity as other forms of violence against women. There is nothing virtual about it. Yet it is largely tolerated. This is shown by an unprecedented testing campaign conducted by the HCE and its partners in July 2017 on the main social networks (Facebook, Twitter and Youtube): 92% of the sexist content reported (insults, rape threats or incitement to hatred) was not removed by the platforms, with some variation: 87% for Facebook, 89% for Twitter and 100% for Youtube».
«Gender-Specific Language. Targets gendered language which may be perceived as excluding, dismissive, or stereotyping. Consider using gender-inclusive language. Example: We need more policemen to maintain public safety. Policemen is corrected to police officers».
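The checker quoted above can be sketched as a simple term-replacement pass. The replacement table and `suggest_inclusive` function are hypothetical stand-ins; a production tool would flag terms and let the writer decide rather than rewrite silently:

```python
import re

# Hypothetical replacement table in the spirit of the tool quoted above.
INCLUSIVE = {
    "policemen": "police officers",
    "policeman": "police officer",
    "chairman": "chairperson",
}

def suggest_inclusive(text):
    """Replace gendered terms with gender-inclusive alternatives."""
    def swap(match):
        word = match.group(0)
        replacement = INCLUSIVE[word.lower()]
        # Preserve the initial capitalisation of the flagged word.
        return replacement.capitalize() if word[0].isupper() else replacement

    # Longer keys ("policemen") are listed before their prefixes ("policeman")
    # so the alternation matches the whole word.
    pattern = re.compile(r"\b(" + "|".join(INCLUSIVE) + r")\b", re.IGNORECASE)
    return pattern.sub(swap, text)

print(suggest_inclusive("We need more policemen to maintain public safety."))
# → "We need more police officers to maintain public safety."
```

Matching on word boundaries (`\b`) keeps the pass from corrupting substrings inside unrelated words.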