Month: December 2018

Illustration by Taylor Callery

“Taylor Swift fans mesmerized by rehearsal clips on a kiosk at her May 18th Rose Bowl show were unaware of one crucial detail: A facial-recognition camera inside the display was taking their photos. The images were being transferred to a Nashville ‘command post,’ where they were cross-referenced with a database of hundreds of the pop star’s known stalkers.”

Source: The Future of Entertainment – Rolling Stone

Digital mapping: Visualizing population densities in 3D on a global scale

Matthew Daniels, who built this application, rendered the entire world population as 3D blocks, with a color gradient running from light green to dark blue. This kind of mapping is particularly effective at showing population “peaks” in urbanized areas.
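The idea described above — population counts as extruded blocks, colored along a light-green-to-dark-blue gradient — can be sketched in a few lines. This is a minimal illustration on synthetic data, not Matthew Daniels's actual dataset or tooling; the grid values and color endpoints are assumptions.

```python
# Sketch of "population as 3D blocks": synthetic data, not the real dataset.
import numpy as np

# Synthetic population grid: each cell holds a population count.
rng = np.random.default_rng(0)
grid = rng.exponential(scale=1000, size=(10, 10))
grid[4, 5] = 50_000  # one urban "peak" towering over rural cells

def density_color(pop, pop_max):
    """Interpolate from light green (low density) to dark blue (high density)."""
    t = min(pop / pop_max, 1.0)
    light_green = np.array([0.6, 0.9, 0.6])
    dark_blue = np.array([0.0, 0.1, 0.4])
    return (1 - t) * light_green + t * dark_blue

# One color and one block height per cell; the block height is the count itself,
# which is what makes urban peaks visually dominate the map.
heights = grid.ravel()
colors = [density_color(p, grid.max()) for p in heights]
```

From here, a 3D renderer (e.g. Matplotlib's `Axes3D.bar3d`) would draw one bar per cell using `heights` and `colors`.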

Source: Cartographie numérique: Visualiser les densités de population en 3D et à l’échelle mondiale

A 3D-printed head being made at the Backface studio in Birmingham, U.K.

“No such luck with the iPhone X, though. Apple’s investment in its tech – which saw the company work with a Hollywood studio to create realistic masks to test Face ID – has clearly paid off. It was impossible to break in with the model. Microsoft appeared to have done a fine job too. Its new Windows Hello facial recognition also didn’t accept the fake head as real. Little surprise the two most valuable companies in the world offer the best security.”

Source: We Broke Into A Bunch Of Android Phones With A 3D-Printed Head

Robert Hannigan

“This isn’t a kind of fluffy charity providing free services. It’s a very hard-headed international business, and these big tech companies are essentially the world’s biggest global advertisers; that’s where they make their billions. […] these big companies, particularly where there are monopolies, can’t frankly reform themselves. It will have to come from outside.”

Source: Facebook could threaten democracy, says former GCHQ boss – BBC News

“Historically, it has provided only one translation for a query, even if the translation could have either a feminine or masculine form. So when the model produced one translation, it inadvertently replicated gender biases that already existed. For example: it would skew masculine for words like ‘strong’ or ‘doctor,’ and feminine for other words, like ‘nurse’ or ‘beautiful.’”

Source: Google is fixing gender bias in its Translate service

