Tag: bias

“Twitter said it was looking into why the neural network it uses to generate photo previews apparently chooses to show white people’s faces more frequently than Black faces. Several Twitter users demonstrated the issue over the weekend, posting examples of posts that had a Black person’s face and a white person’s face. Twitter’s preview showed the white faces more often.”

Source : Twitter is looking into why its photo preview appears to favor white faces over Black faces – The Verge

Google Vision API

“Google notes in its own AI principles that algorithms and datasets can reinforce bias: ‘We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief.’ Google invited affected developers to comment on its discussion forums. Only one developer had commented at the time of writing, and complained the change was down to ‘political correctness.’ ‘I don’t think political correctness has room in APIs,’ the person wrote. ‘If I can 99% of the times identify if someone is a man or woman, then so can the algorithm. You don’t want to do it? Companies will go to other services.’”

Source : Google AI will no longer use gender labels like ‘woman’ or ‘man’ on images of people to avoid bias


“‘Notability guidelines are notoriously byzantine, to say it kindly,’ the anonymous editor says. They hope the push to reform the guidelines will help compensate for the historic underrepresentation of women and minorities, since it’s not just women who find their path into Wikipedia blocked. ‘A lot of prejudice is unconscious and intersectional,’ says Lubbock. ‘Wikipedia is dealing not just with a gender inequality issue, but also racial and geographical inequalities.’”

Source : Female scientists’ pages keep disappearing from Wikipedia – what’s going on? | News | Chemistry World

“Changing algorithms is easier than changing people: software on computers can be updated; the “wetware” in our brains has so far proven much less pliable. None of this is meant to diminish the pitfalls and care needed in fixing algorithmic bias. But compared with the intransigence of human bias, it does look a great deal simpler. Discrimination by algorithm can be more readily discovered and more easily fixed.”

Source : Biased Algorithms Are Easier to Fix Than Biased People – The New York Times


“Goldman Sachs denied allegations of gender bias and said on Monday that it will reevaluate credit limits for Apple Card users on a case-by-case basis for customers who received lower credit lines than expected. “We have not and never will make decisions based on factors like gender,” Carey Halio, Goldman’s retail bank CEO, said in a statement. “In fact, we do not know your gender or marital status during the Apple Card application process.” Halio said that customers unsatisfied with their line should contact the company. “Based on additional information we may request, we will re-evaluate your credit line,” the statement said.”

Source : Goldman Sachs to reevaluate Apple Card credit limits after bias claim

“Data dredging (also data fishing, data snooping, data butchery, and p-hacking) is the misuse of data analysis to find patterns in data that can be presented as statistically significant when in fact there is no real underlying effect. This is done by performing many statistical tests on the data and only paying attention to those that come back with significant results, instead of stating a single hypothesis about an underlying effect before the analysis and then conducting a single test for it.”

Source : Data dredging – Wikipedia
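The mechanism described above is easy to reproduce: run enough tests on pure noise and some will come out "significant" by chance. A minimal simulation (my own sketch, not from the article, using a one-sample t test at the conventional 5% level):

```python
import random
import statistics

def one_sample_t(sample):
    # t statistic for the null hypothesis that the true mean is 0
    n = len(sample)
    return statistics.mean(sample) / (statistics.stdev(sample) / n ** 0.5)

random.seed(0)

# 200 independent "datasets" of pure noise: no real underlying effect exists.
significant = 0
for _ in range(200):
    data = [random.gauss(0, 1) for _ in range(30)]
    # 2.045 is the two-sided 5% critical value for a t distribution with 29 df
    if abs(one_sample_t(data)) > 2.045:
        significant += 1

print(significant)  # around 5% of the 200 tests come back "significant" anyway
```

A dredger who reports only those hits, and not the 200 attempts, presents chance as discovery; pre-registering a single hypothesis and a single test is the safeguard the quote describes.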


© 2020 no-Flux
