“At the beginning of March the open source community got their hands on their first really capable foundation model, as Meta’s LLaMA was leaked to the public. It had no instruction or conversation tuning, and no RLHF. Nonetheless, the community immediately understood the significance of what they had been given. A tremendous outpouring of innovation followed, with just days between major developments (see The Timeline for the full breakdown). Here we are, barely a month later, and there are variants with instruction tuning, quantization, quality improvements, human evals, multimodality, RLHF, and more, many of which build on each other. Most importantly, they have solved the scaling problem to the extent that anyone can tinker. Many of the new ideas are from ordinary people. The barrier to entry for training and experimentation has dropped from the total output of a major research organization to one person, an evening, and a beefy laptop.”
“Around the same time, Google, OpenAI and other companies began building neural networks that learned from huge amounts of digital text. Dr. Hinton thought it was a powerful way for machines to understand and generate language, but it was inferior to the way humans handled language.
Then, last year, as Google and OpenAI built systems using much larger amounts of data, his view changed. He still believed the systems were inferior to the human brain in some ways but he thought they were eclipsing human intelligence in others. “Maybe what is going on in these systems,” he said, “is actually a lot better than what is going on in the brain.”
As companies improve their A.I. systems, he believes, they become increasingly dangerous. “Look at how it was five years ago and how it is now,” he said of A.I. technology. “Take the difference and propagate it forwards. That’s scary.”
Until last year, he said, Google acted as a “proper steward” for the technology, careful not to release something that might cause harm. But now that Microsoft has augmented its Bing search engine with a chatbot — challenging Google’s core business — Google is racing to deploy the same kind of technology. The tech giants are locked in a competition that might be impossible to stop.”
“Project Zero, Google’s team dedicated to security research, has found some big problems in the Samsung modems that power devices like the Pixel 6, Pixel 7, and some models of the Galaxy S22 and A53. According to its blog post, a variety of Exynos modems have a series of vulnerabilities that could “allow an attacker to remotely compromise a phone at the baseband level with no user interaction” without needing much more than a victim’s phone number. And, frustratingly, it seems like Samsung is dragging its feet on fixing them.”
“There are many reasons you might want to move away from Google, especially in light of some of the recent policy changes regarding Workspaces. Depending on your exact reasons for leaving, there are more or less attractive alternatives to some of Google’s most popular apps. In particular, those can be divided into online web services that, similar to Google, give you access to services via an online account, and self-hosted options like NextCloud and/or apps that can be installed on your own infrastructure, or using instances of your own infrastructure on cloud hosting or web hosting services. These options are attractive for the fact that they allow you to control your own data and maintain the protection of your data. Migrating to these services can be quite easy, whether for email, file sharing, or other services. With these services, it all starts with your domain name.”
“Viktor Lofgren, a Swedish software developer and consultant who created his own indie search engine called Marginalia, told me, “One part of the sameness is that recommendation and prediction algorithms often seem to work almost too well.” Marginalia, which Lofgren started working on a year ago, is a bare-bones Web site run entirely from a computer in his living room. The search engine’s stated mission is to “show you sites you perhaps weren’t aware of.” Its results, based on its own custom algorithm and data gathering, prioritize text-based Web sites that lack ads, mobile support, encryption, and other features that qualify as good S.E.O. “Google punishes sites that aren’t up to speed with modern Web technologies,” Lofgren said.”
“Wikimedia Enterprise, a first-of-its-kind commercial product designed for companies that reuse and source Wikipedia and Wikimedia projects at a high volume, today announced its first customers: multinational technology company Google and nonprofit digital library Internet Archive. Wikimedia Enterprise was recently launched by the Wikimedia Foundation, the nonprofit that operates Wikipedia, as an opt-in product. Starting today, it also offers a free trial account to new users who can self sign-up to better assess their needs with the product.”
“Augmented reality allows us to spend more time focusing on what matters in the real world, in our real lives. It can break down communication barriers — and help us better understand each other by making language visible. Watch what happens when we bring technologies like transcription and translation to your line of sight.”
“Today at Google I/O, we announced new ways the latest advancements in AI are transforming Google Maps — helping you explore with an all-new immersive view of the world, find the most fuel-efficient route, and use the magic of Live View in your favorite third-party apps.”
“According to a research paper, “What Data Do The Google Dialer and Messages Apps On Android Send to Google?” [PDF], by Trinity College Dublin computer science professor Douglas Leith, Google Messages (for text messaging) and Google Dialer (for phone calls) have been sending data about user communications to the Google Play Services Clearcut logger service and to Google’s Firebase Analytics service. “The data sent by Google Messages includes a hash of the message text, allowing linking of sender and receiver in a message exchange,” the paper says. “The data sent by Google Dialer includes the call time and duration, again allowing linking of the two handsets engaged in a phone call. Phone numbers are also sent to Google.””
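To see why a hash of the message text enables linking, consider a minimal sketch. This is a hypothetical illustration, not Google’s actual telemetry code: the paper reports a truncated hash of the message text, but the exact hash inputs and truncation used here are assumptions.

```python
import hashlib

def message_hash(text: str, timestamp: str) -> str:
    # Assumed scheme: SHA-256 over the message text plus a timestamp,
    # truncated. The real inputs and length may differ from the paper's.
    return hashlib.sha256(f"{timestamp}:{text}".encode()).hexdigest()[:16]

# Both handsets in an exchange log the same message, so their two telemetry
# records carry identical hashes and can be joined server-side, linking
# sender and receiver even without the plaintext.
sender_record = {"device": "A", "hash": message_hash("hello", "2022-03-21T10:00")}
receiver_record = {"device": "B", "hash": message_hash("hello", "2022-03-21T10:00")}
assert sender_record["hash"] == receiver_record["hash"]
```

The point is that the hash need not be reversible: two records sharing a value is enough to tie the two handsets together.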
“We identified four best practices that reduce energy and carbon emissions significantly — we call these the “4Ms” — all of which are being used at Google today and are available to anyone using Google Cloud services.
Model. Selecting efficient ML model architectures, such as sparse models, can advance ML quality while reducing computation by 3x–10x.
Machine. Using processors and systems optimized for ML training, versus general-purpose processors, can improve performance and energy efficiency by 2x–5x.
Mechanization. Computing in the Cloud rather than on premise reduces energy usage and therefore emissions by 1.4x–2x. Cloud-based data centers are new, custom-designed warehouses equipped for energy efficiency for 50,000 servers, resulting in very good power usage effectiveness (PUE). On-premise data centers are often older and smaller and thus cannot amortize the cost of new energy-efficient cooling and power distribution systems.
Map Optimization. Moreover, the cloud lets customers pick the location with the cleanest energy, further reducing the gross carbon footprint by 5x–10x. While one might worry that map optimization could lead to the greenest locations quickly reaching maximum capacity, user demand for efficient data centers will result in continued advancement in green data center design and deployment.
These four practices together can reduce energy by 100x and emissions by 1000x.”
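The quoted 100x and 1000x figures follow from multiplying the best-case factor of each practice. A back-of-the-envelope check (my arithmetic, not code from the paper):

```python
# Best-case improvement factors quoted for the 4Ms.
model = 10          # efficient architectures (3x-10x)
machine = 5         # ML-optimized hardware (2x-5x)
mechanization = 2   # cloud data centers (1.4x-2x)
map_opt = 10        # cleaner-energy locations (5x-10x)

# The first three practices reduce energy consumed.
energy_reduction = model * machine * mechanization      # 10 * 5 * 2 = 100
# Map optimization doesn't cut energy, but cuts emissions per unit of energy.
emissions_reduction = energy_reduction * map_opt        # 100 * 10 = 1000

print(energy_reduction, emissions_reduction)  # 100 1000
```

Note the factors compound because each practice addresses a different part of the stack: the model, the chip, the building, and the grid.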