Tag: gender

“Revenge porn,” a “commonplace” practice out of control among students

According to a 2018 study by the French Ministry of Education, 9% of high-school students say they have been “victims of humiliating videos, photos, or rumors on the Internet.”

“Contacted for comment, the Ministry of Education points to a ‘response protocol.’ ‘When a student is harassed, they can go to the person they trust,’ the ministry says. The problem: in practice, revenge porn is often not considered harassment in the strict sense, so the relevant protocol is not triggered: ‘It applies when there is an element of repetition, for example if intimate photos were shared several times,’ explains Olivier Raluy, a school guidance counselor (CPE) at a middle school and national secretary of the Syndicat national des enseignements de second degré (SNES-FSU).”

Source : Le « revenge porn », pratique « banale » et hors de contrôle chez les élèves

Google AI will no longer use gender labels like ‘woman’ or ‘man’ on images of people to avoid bias


“Google notes in its own AI principles that algorithms and datasets can reinforce bias: ‘We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief.’ Google invited affected developers to comment on its discussion forums. Only one developer had commented at the time of writing, and complained the change was down to ‘political correctness.’ ‘I don’t think political correctness has room in APIs,’ the person wrote. ‘If I can 99% of the times identify if someone is a man or woman, then so can the algorithm. You don’t want to do it? Companies will go to other services.’”
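As a concrete illustration of the change, here is a minimal sketch of a label-detection call against the Cloud Vision API, assuming the google-cloud-vision Python client and configured credentials (the file name is hypothetical). Per the article, images of people should now come back tagged with non-gendered labels such as “Person” rather than “Man” or “Woman.”

```python
# Minimal sketch: label detection with the Cloud Vision API.
# Assumes the google-cloud-vision client library (v2.x) and configured
# application credentials; "photo.jpg" is a hypothetical input file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# After the change described above, images of people should surface
# non-gendered labels such as "Person" rather than "Man" or "Woman".
for label in response.label_annotations:
    print(label.description, round(label.score, 2))
```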

Source : Google AI will no longer use gender labels like ‘woman’ or ‘man’ on images of people to avoid bias

Female scientists’ pages keep disappearing from Wikipedia – what’s going on?


“‘Notability [guidelines] are notoriously byzantine, to say it kindly,’ the anonymous editor says. They hope the push to reform the guidelines will help compensate for the historic underrepresentation of women and minorities, since it’s not just women who find their path into Wikipedia blocked. ‘A lot of prejudice is unconscious and intersectional,’ says Lubbock. ‘Wikipedia is dealing not just with a gender inequality issue, but also racial and geographical inequalities.’”

Source : Female scientists’ pages keep disappearing from Wikipedia – what’s going on? | News | Chemistry World

Goldman Sachs to reevaluate Apple Card credit limits after bias claim


“Goldman Sachs denied allegations of gender bias and said on Monday that it will reevaluate credit limits for Apple Card users on a case-by-case basis for customers who received lower credit lines than expected. “We have not and never will make decisions based on factors like gender,” Carey Halio, Goldman’s retail bank CEO, said in a statement. “In fact, we do not know your gender or marital status during the Apple Card application process.” Halio said that customers unsatisfied with their line should contact the company. “Based on additional information we may request, we will re-evaluate your credit line,” the statement said.”
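A point often raised in coverage of this story: not collecting gender does not by itself prevent gender-correlated outcomes, because a model can learn gender through correlated inputs. Below is a purely illustrative sketch on synthetic data, with an invented proxy feature and invented coefficients; it has no relation to Goldman’s actual underwriting.

```python
# Illustrative only: a model trained *without* a gender column can still
# produce gender-correlated outputs through a proxy feature.
# Synthetic data; no relation to any real underwriting model.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, n)             # never shown to the model
income = rng.normal(60_000, 15_000, n)
# Hypothetical proxy feature that correlates with gender in the data
# (e.g. share of credit history held in a spouse's name).
proxy = 0.3 + 0.4 * gender + rng.normal(0, 0.1, n)

# Historical credit limits the model is fit to reproduce.
limit = 0.1 * income - 20_000 * proxy + rng.normal(0, 1_000, n)

# Ordinary least squares on income and proxy only -- gender excluded.
X = np.column_stack([income, proxy, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, limit, rcond=None)
pred = X @ coef

print("mean predicted limit, group 0:", round(pred[gender == 0].mean()))
print("mean predicted limit, group 1:", round(pred[gender == 1].mean()))
# The gap persists even though gender was never an input.
```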

Source : Goldman Sachs to reevaluate Apple Card credit limits after bias claim

Google is fixing gender bias in its Translate service

“Historically, it has provided only one translation for a query, even if the translation could have either a feminine or masculine form. So when the model produced one translation, it inadvertently replicated gender biases that already existed. For example: it would skew masculine for words like ‘strong’ or ‘doctor,’ and feminine for other words, like ‘nurse’ or ‘beautiful.’”
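Turkish is a common illustration of this pattern because its pronoun “o” is gender-neutral, forcing the model to pick a pronoun in English. Here is a minimal sketch using the Cloud Translation API’s v2 Python client (illustrative, not an official example; credentials are assumed to be configured):

```python
# Minimal sketch of the bias pattern using the Cloud Translation API
# (google-cloud-translate, v2 client); illustrative, not an official
# example. Turkish "o" is gender-neutral, so the English output must
# pick a pronoun.
from google.cloud import translate_v2 as translate

client = translate.Client()

for sentence in ("o bir doktor", "o bir hemşire"):
    result = client.translate(sentence, source_language="tr",
                              target_language="en")
    print(sentence, "->", result["translatedText"])

# Historically the single output skewed toward "he is a doctor" and
# "she is a nurse"; the fix surfaces both feminine and masculine forms.
```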

Source : Google is fixing gender bias in its Translate service

Fearful of bias, Google blocks gender-based pronouns from new AI tool

“Smart Compose is an example of what AI developers call natural language generation (NLG), in which computers learn to write sentences by studying patterns and relationships between words in literature, emails and web pages. A system shown billions of human sentences becomes adept at completing common phrases but is limited by generalities. Men have long dominated fields such as finance and science, for example, so the technology would conclude from the data that an investor or engineer is “he” or “him.” The issue trips up nearly every major tech company.”
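The article says only that gendered pronouns were dropped from the suggestion list; one crude way to approximate that behavior is to filter candidate completions against a pronoun blocklist, as in this sketch (an assumption for illustration, not Google’s actual Smart Compose logic):

```python
# Crude sketch of suppressing gendered-pronoun suggestions: drop any
# candidate completion that contains a gendered pronoun. This is an
# illustration, not Google's actual Smart Compose logic.
import re

GENDERED = re.compile(r"\b(he|she|him|her|his|hers|himself|herself)\b",
                      re.IGNORECASE)

def filter_suggestions(candidates: list[str]) -> list[str]:
    """Keep only completions free of gendered pronouns."""
    return [c for c in candidates if not GENDERED.search(c)]

# Example: the biased completion is withheld rather than corrected.
print(filter_suggestions([
    "I am meeting an investor next week. Do you want to meet him?",
    "I am meeting an investor next week. Do you want to join?",
]))
# -> ['I am meeting an investor next week. Do you want to join?']
```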

Source : Fearful of bias, Google blocks gender-based pronouns from new AI tool | Reuters

Amazon scraps secret AI recruiting tool that showed bias against women


“The group created 500 computer models focused on specific job functions and locations. They taught each to recognize some 50,000 terms that showed up on past candidates’ resumes. The algorithms learned to assign little significance to skills that were common across IT applicants, such as the ability to write various computer codes, the people said. […] With the technology returning results almost at random, Amazon shut down the project.”
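The downweighting described here is essentially inverse document frequency: a term that appears on nearly every resume carries almost no signal, so rarer terms dominate. A minimal scikit-learn sketch of the effect (illustrative; Amazon’s actual models are not public, and the resume strings are invented):

```python
# Illustration of the "common terms carry little weight" effect via
# TF-IDF; Amazon's actual models are not public.
from sklearn.feature_extraction.text import TfidfVectorizer

resumes = [
    "python java git agile womens chess club captain",
    "python java git agile kernel development",
    "python java git agile distributed systems",
]
vec = TfidfVectorizer()
tfidf = vec.fit_transform(resumes)

# Terms on every resume (python, java, git, agile) get the minimum idf
# weight; rare terms dominate -- which is how a term like "women's"
# could become influential once the model learned it correlated with
# past rejections.
for term, idf in sorted(zip(vec.get_feature_names_out(), vec.idf_)):
    print(f"{term:12s} idf={idf:.2f}")
```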

Source : Amazon scraps secret AI recruiting tool that showed bias against women | Reuters

Microsoft improves facial recognition to perform well across all skin tones, genders


“Microsoft announced Tuesday that it has updated its facial recognition technology with significant improvements in the system’s ability to recognize gender across skin tones. That improvement addresses recent concerns that commercially available facial recognition technologies more accurately recognized gender of people with lighter skin tones than darker skin tones, and that they performed best on males with lighter skin and worst on females with darker skin.
With the new improvements, Microsoft said it was able to reduce the error rates for men and women with darker skin by up to 20 times. For all women, the company said the error rates were reduced by nine times. Overall, the company said that, with these improvements, they were able to significantly reduce accuracy differences across the demographics.”
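Disparities like these are measured by computing error rates separately for each demographic subgroup, as in the Gender Shades audit that prompted the concerns. Here is a minimal sketch of that bookkeeping, with invented counts chosen only to reproduce the reported ordering (best on lighter-skinned males, worst on darker-skinned females):

```python
# Per-group error rates: the bookkeeping behind claims like "worst on
# females with darker skin". Counts are invented for the illustration.
from collections import Counter

# (skin_tone, gender, correct?) for each test image -- hypothetical.
results = (
      [("lighter", "male",   True)] * 990 + [("lighter", "male",   False)] * 10
    + [("lighter", "female", True)] * 970 + [("lighter", "female", False)] * 30
    + [("darker",  "male",   True)] * 940 + [("darker",  "male",   False)] * 60
    + [("darker",  "female", True)] * 790 + [("darker",  "female", False)] * 210
)

errors, totals = Counter(), Counter()
for tone, gender, correct in results:
    totals[(tone, gender)] += 1
    if not correct:
        errors[(tone, gender)] += 1

for group in sorted(totals):
    print(group, f"error rate = {errors[group] / totals[group]:.1%}")
```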

Source : Microsoft improves facial recognition to perform well across all skin tones, genders

Online violence against women

“These forms of violence affect victims’ health and social lives as severely as other forms of violence against women. There is nothing virtual about them. Yet they are largely tolerated. This is what an unprecedented test conducted by the HCE and its partners in July 2017 on the main social networks (Facebook, Twitter and Youtube) shows: 92% of the sexist content reported (insults, rape threats or incitement to hatred) was not removed by the platforms, with some variation: 87% for Facebook, 89% for Twitter and 100% for Youtube.”

Source : Violences faites aux femmes en ligne : le HCE appelle à une véritable prise de conscience et action des géants du web et des pouvoirs publics – Haut Conseil à l’Egalité entre les femmes et les hommes

Select grammar and writing style options in Office 2016 – Office Support

«Gender-Specific Language. Targets gendered language which may be perceived as excluding, dismissive, or stereotyping. Consider using gender-inclusive language. Example: We need more policemen to maintain public safety. Policemen is corrected to police officers».
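The rule as described amounts to a pattern match plus a suggested replacement. A toy version in Python (the term list is an assumption for illustration, not Microsoft’s actual rule set):

```python
# Toy gender-inclusive language checker in the spirit of the Office
# rule quoted above; not Microsoft's implementation.
import re

INCLUSIVE = {
    "policemen": "police officers",
    "policeman": "police officer",
    "chairman": "chairperson",
    "mankind": "humankind",
}

def suggest(text: str) -> str:
    """Replace gendered terms with inclusive alternatives."""
    for term, repl in INCLUSIVE.items():
        text = re.sub(rf"\b{term}\b", repl, text, flags=re.IGNORECASE)
    return text

print(suggest("We need more policemen to maintain public safety."))
# -> "We need more police officers to maintain public safety."
```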

Source : Select grammar and writing style options in Office 2016 – Office Support
