Google’s own mobile chip is called Tensor


Rick Osterloh casually set his laptop down on the couch and leaned back contentedly. It wasn’t a mic, but the effect was about the same. Google’s head of hardware had just shown me a demo of one of the company’s latest features: computational processing for video, debuting on the Pixel 6 and Pixel 6 Pro. The feature is only possible because of Google’s own mobile processor, which it is announcing today.

He is understandably proud and excited to share the news. The chip is called Tensor, and it is the first system on a chip (SoC) designed by Google. The company has been “working on this for about five years,” he said, although CEO Sundar Pichai wrote in a statement that Tensor has been “four years in the making and builds off of two decades of Google’s computing experience.”

Google’s software expertise is well established. The company pioneered computational photography with Night Sight for low-light shots, and stunned the world with how convincingly its conversational AI, Duplex, could mimic human speech, right down to the “umms” and “ahhs.” Tensor draws on Google’s machine learning prowess and lets the company bring AI experiences to smartphones that it couldn’t before.

Holding a Diet Coke in one hand and gesturing animatedly with the other, Osterloh threw in hyperbolic marketing language like “transforming the computing experience” and “what we consider to be a pretty dramatic transformation.”


He’s alluding to Tensor enabling experiences that earlier chips (the company mostly used Qualcomm Snapdragon processors in its previous phones) could not: things like running multiple intensive AI tasks at the same time without overheating the phone, or having enough power to apply computational processing to video while it’s being recorded.

That belief in Tensor’s importance is part of why Google decided to announce it today, ahead of the Pixel 6’s actual launch in the fall. The company hasn’t yet shared all the details about the processor, nor is it revealing specifics about its upcoming flagships. But “there are a lot of new things here, and we wanted to make sure people have context,” Osterloh said. “We think it’s a really big change, so we want to start early.”

There’s an added benefit, too. “Information is getting out,” Osterloh added. “Things are leaking these days.”

New chip design with AI

Thanks to those leaks, we’ve been hearing rumors about Google’s effort to build its own mobile processor for a while, under the code name Whitechapel. Though the company won’t publicly discuss code names, it’s clear that work on Tensor has been going on for a long time.

The chip’s name is an obvious nod to the company’s open-source machine learning platform, TensorFlow, Osterloh said, and that should tell you how big a role AI plays in this processor. Though Google isn’t ready to share all the details about Tensor yet, Osterloh explained that the ARM-based SoC is designed around a TPU, or Tensor Processing Unit. The mobile chip was designed in collaboration with Google’s AI researchers, and its TPU is based on the larger versions in the company’s data centers.

A render of Google’s Tensor mobile chip.


Tensor isn’t just designed to speed up machine learning tasks on your phone, though. Osterloh said the team also redesigned the image signal processor, or ISP. In particular, he said there are “several points in the ISP where we can insert machine learning, which is new.”

Google also reworked the memory architecture to make accessing RAM easier and to allow data to be manipulated during image processing. In several places, Osterloh said, the team encoded its image-processing algorithms directly into the hardware. This, he says, lets Google do “things that were previously impossible to do on standard SoCs,” though he didn’t share specifics about what Tensor now enables that previous SoCs couldn’t.

Of course, since this is Google’s first mobile chip, Osterloh concedes that people might see the company as unproven. Though he countered: “I think people are pretty aware of Google’s overall capabilities.”

It’s natural to wonder whether the company can compete in areas like power efficiency and thermal management. Osterloh said Tensor was designed to perform some tasks more efficiently than the processors Google used before, while staying within thermal limits. As with existing processors, he said, “the system has a bunch of different subsystems, [and] we can use its most efficient element to perform tasks.”

Side view of a peach/gold Pixel 6 Pro.


And though there’s a global chip shortage, Osterloh is confident Google can meet demand. “There is no doubt that everyone is affected by this,” he said. “The positive thing for us is that this is under our control: we are doing this and we are responsible for it. So we think we should be fine.”

Why make Tensor?

So what can Tensor do that other mobile processors can’t? Google is saving most of the juicy bits for the Pixel 6’s unveiling in the fall, but Osterloh offered two examples of areas that will improve dramatically: photography and speech recognition (and processing). At our meeting, he showed me a few demos of new Tensor-enabled features on the Pixel 6 and 6 Pro, which also gave us a first look at the phones. The handsets have a distinctive new look and bold color options. They also sport a horizontal camera bar that stretches across the width of the back, which is a “very deliberate part of the design,” Osterloh notes. “We’re really known for photography, [so] we wanted to really emphasize that.”

Google has upgraded the cameras themselves, but the promised photography improvements aren’t just about optical hardware. Tensor is behind some of them. With previous chips, the company kept running into limits as it tried to improve photography on its phones. “They’re not designed for machine learning or AI, and they’re certainly not designed to optimize for where Google is going,” he said.

Three Pixel 6 phones. From left to right, their color schemes are black/black, green/blue and red/peach.


So where is Google going? Toward a world of “ambient computing,” a vision Osterloh and many of his colleagues have laid out in the past. They see a future in which all the devices and sensors around us can communicate with Google (sometimes through the Assistant) or the internet. But Osterloh knows that for most people, the most important device will still be the smartphone. “We see the phone as the center of that.”

So when Google wanted to push past the limits of current processors, it had to do something different. “What we’ve done in the past, when we’ve come up against these kinds of engineering and technical constraints, is to take on the problem ourselves,” Osterloh said.

Photo and video processing upgrade

With Tensor, the Pixel 6 can do things like capture images from two sensors simultaneously, with the main sensor shooting at normal exposure and the wide-angle one using a much faster shutter speed. Osterloh said the system runs a number of machine learning models in real time to understand things about the scene, such as whether there are faces present and whether the device is moving or shaking. The Pixel 6 then combines all of that data and uses it to process the photo, so if you’re trying to capture a hyperactive puppy or child, you’re less likely to end up with a blurry image.
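Google hasn’t published how the Pixel 6 merges those two captures, but the general idea of blending a clean long-exposure frame with a sharper short-exposure one can be sketched in a few lines. Everything below (the function name, the per-pixel difference heuristic, the threshold) is a hypothetical illustration, not Google’s algorithm:

```python
def fuse_exposures(long_exp, short_exp, motion_threshold=0.15):
    """Illustrative two-frame fusion: prefer the clean long-exposure
    frame, but fall back to the sharp short-exposure frame wherever the
    two aligned frames disagree strongly (a crude proxy for motion blur).
    Pixel values are floats in the 0..1 range."""
    fused = []
    for lo, sh in zip(long_exp, short_exp):
        diff = abs(lo - sh)
        # Blend weight: 0 = trust long exposure, 1 = trust short exposure
        w = min(diff / motion_threshold, 1.0)
        fused.append((1.0 - w) * lo + w * sh)
    return fused

# Toy example: first two pixels are static, the third is "moving"
long_exp = [0.50, 0.50, 0.90]   # blurred where motion occurred
short_exp = [0.50, 0.52, 0.40]  # sharp but noisier
fused = fuse_exposures(long_exp, short_exp)
```

In the static regions the result stays close to the long exposure; where the frames diverge, the sharp short-exposure pixel wins.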

Tensor will also let Google run computationally intensive tasks while recording video. In the past, Osterloh said, the company couldn’t apply much machine learning to video because it would be too taxing for a phone’s processor. But “that all changes with Tensor,” he said. One thing the team managed to get running is the HDRnet model on video, which drastically improves quality in tricky situations, like when the camera is pointed toward the sun.
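HDRnet comes from Google’s research on real-time image enhancement; it predicts local tone and color adjustments with a small neural network. As a rough intuition for what per-frame tone mapping does, here is a deliberately simplified sketch that substitutes a fixed gamma curve for the learned model; the function and values are illustrative only:

```python
def tone_map_frame(frame, gamma=0.6):
    """Apply a simple global tone curve to one video frame (pixel values
    in 0..1). The real HDRnet predicts *local* adjustments with a neural
    network; a fixed gamma curve just illustrates the idea of tone
    mapping cheap enough to run on every frame during recording."""
    return [p ** gamma for p in frame]

# A backlit frame: shadows crushed toward 0, sky near 1
frame = [0.02, 0.10, 0.95]
mapped = tone_map_frame(frame)
```

A gamma below 1.0 lifts the crushed shadows while leaving the bright sky nearly untouched, which is roughly the effect you want when shooting toward the sun.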

Osterloh showed me demos of the Pixel 6 doing both of these things, including a before-and-after example of a blurry photo of an active child and a video comparison of a campsite at sunset. There was a clear difference, but unfortunately I can’t show you the results. These were also controlled demos from Google; I can’t really judge how impressive or useful these features are until we get to test them in the real world.

Voice and speech improvements

Still, I did get a closer look at one feature. Osterloh also showed me how voice dictation will work in Gboard on the Pixel 6. On the upcoming phone, you’ll be able to tap the microphone button in a compose box, dictate your message, and use hotwords like “send” or “delete” to trigger actions. You can also fix typos via the on-screen keyboard while the microphone is still listening for your dictation.

All of this works through the device’s new Speech API, and I was impressed that the system is smart enough to tell the difference between you saying “send” in “I’ll send the kids to school” and you telling it to send the message. Osterloh told me the algorithm doesn’t just look for the word itself, but also at your intonation and pauses, before triggering an action.
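Google hasn’t said how that decision is made beyond mentioning intonation and pauses, but a toy version of the pause heuristic might look like this (purely illustrative; the function, threshold, and hotword list are assumptions, not the Speech API):

```python
def interpret(transcript_segment, pause_before_ms):
    """Illustrative hotword gating: treat 'send'/'delete' as a command
    only when it arrives as an isolated utterance after a pause, so the
    same word inside a sentence like "I'll send the kids to school" is
    kept as dictated text. (The real system also weighs intonation,
    which is omitted here.)"""
    words = transcript_segment.strip().lower().split()
    if words in (["send"], ["delete"]) and pause_before_ms >= 500:
        return ("command", words[0])
    return ("text", transcript_segment)

# An isolated "Send" after a long pause vs. "send" mid-sentence
action = interpret("Send", pause_before_ms=800)
inline = interpret("I'll send the kids to school", pause_before_ms=80)
```

The point of the sketch is only that context, not the word alone, decides whether an action fires.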

Finally, Osterloh showed me a few more things: Live Caption with Translate, as well as Android 12’s Material You design. Thanks to Tensor, Android’s Live Caption feature, which generates captions for anything playing through your device’s audio, will also be able to translate what’s being said in real time. It all happens on-device, so the next time you watch a TED Talk in a foreign language or your favorite international TV show, it won’t matter if there are no subtitles: Tensor will provide them for you.
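The flow described here is a two-stage on-device pipeline: speech recognition first, then translation, with no network round trip. A minimal sketch of that shape, using stand-in stub models rather than real on-device networks (all names here are hypothetical), could look like:

```python
def live_caption_translate(audio_chunks, recognize, translate):
    """Illustrative on-device captioning pipeline: each audio chunk is
    transcribed, then translated, and the caption is yielded immediately.
    `recognize` and `translate` stand in for on-device ML models."""
    for chunk in audio_chunks:
        source_text = recognize(chunk)
        yield translate(source_text)

# Stub "models" for demonstration: tiny lookup tables
recognize = {b"chunk1": "hola", b"chunk2": "mundo"}.get
translate = {"hola": "hello", "mundo": "world"}.get
captions = list(live_caption_translate([b"chunk1", b"chunk2"],
                                       recognize, translate))
```

Because each chunk flows straight through both stages, captions appear as the audio plays rather than after it finishes.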

Android 12 Beta


A look at Material You

Meanwhile, Material You, which Google first showed off at I/O this year, is what Osterloh called perhaps the biggest change to Android’s user interface since the beginning. I’ve been waiting to see the feature in the Android 12 public beta, but it isn’t available there yet. At our meeting, Osterloh showed me how it works: he changed the Pixel 6’s wallpaper from something pink to a water-surface scene, and the system’s icons and accent elements quickly updated to match. App icons are tinted to coordinate as well, but something new I learned from this demo is that if you’re like me and prefer your icons to keep their original colors, you can leave them untouched.
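Google hasn’t detailed the extraction algorithm; on Android 12 the derived palette is exposed to apps through system color resources. As a rough illustration of wallpaper-based theming, here is a toy dominant-color picker (the function and bucket size are assumptions, not Android’s implementation):

```python
from collections import Counter

def dominant_color(pixels, bucket=32):
    """Illustrative Material You-style extraction: quantize wallpaper
    pixels into coarse RGB buckets and return the most common bucket.
    Android's real implementation derives whole tonal palettes (accent
    and neutral shades), not a single color."""
    quantized = [tuple((c // bucket) * bucket for c in px) for px in pixels]
    color, _count = Counter(quantized).most_common(1)[0]
    return color

# A mostly-blue "wallpaper": three blue pixels, one near-white pixel
pixels = [(10, 20, 200), (12, 25, 210), (15, 22, 205), (240, 240, 240)]
accent = dominant_color(pixels)
```

Quantizing before counting is what lets slightly different blues land in the same bucket and win over the lone white pixel.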

We got a pretty good look at what’s coming this fall, though Google is still holding back plenty of details. We don’t yet know whether the company had help from other manufacturers in making Tensor, and specifics like the CPU and GPU cores, clock speeds and other components will be shared later this year. But with the new chip, Google has managed to realize a long-held dream.

“We see this as the Google phone,” he said. “We set out to build this a few years ago, and we’re finally here.”

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.


