Apple’s iPhone and Samsung’s Galaxy lineup took about five years to really hit their stride. Now with five smartphones of its own in the books, Google is gearing up for some major changes that could propel the Pixel lineup to mainstream success. Ahead of the Pixel 6's official launch later this fall (yes, it’s confirmed), I got a chance to sit down and talk with Google’s Rick Osterloh, senior vice president of devices and services, for a preview of the Pixel 6 and the new Google-designed chip that’s powering it (yep, those rumors were true, too).
However, before we dive into the new stuff, let’s take a look back at the Pixel’s journey. The original Pixel, the Pixel 2, and the Pixel 3 shared a similar two-toned color scheme and relatively straightforward look that was primarily designed to show off Google’s software. The Pixel 1 was the launchpad for Google Assistant, the Pixel 2 brought Google Lens, and the Pixel 3 introduced Night Sight, which changed the way smartphone makers approach mobile photography.
Rick Osterloh, Google’s senior vice president of devices & services, at the new Google Store in NYC. Photo: Sam Rutherford
With the Pixel 4, Google remixed its previous two-tone color scheme and introduced Motion Sense as a tentative first step toward ambient computing. Then came the Pixel 5, which sidestepped typical flagship aspirations. While it had a neat bio-resin coating, it didn’t offer much in the way of improved hardware. These last two Pixels may have made it seem like Google was throwing ideas against a wall to see what stuck, but all the while the company was continuing to build out its AI and machine learning efforts as part of a vision for the future of smartphone computing.
So here we are today with the Pixel 6, which is not only Google’s new flagship, but also the first Pixel to use a Google-designed processor, called Google Tensor.
“Our team’s mission in the hardware division is to try and create a concentrated version of Google, and we do that by combining the best assets the company has in AI software and hardware,” Osterloh told me.
Instead of relying on off-the-shelf silicon from another company as it has in the past, Google decided to design its own system-on-a-chip (SoC) to deliver the kind of AI and machine learning performance Google needs to make its vision a reality.
“We’ve been doing that a little bit with Pixel over the years with HDR+, Call Screener, and Night Sight, and all these things are using various techniques for advanced machine learning and AI,” Osterloh said. “But what has been very frustrating is that we’re not able to do as much as we would like on phones.”
That’s about to change. Osterloh said the new chip “is our biggest smartphone innovation since we first launched the Pixel five years ago.”

“The name is an obvious nod to our open-source AI software development library and platform,” he continued. “The major aim is to try to bring our latest AI innovations to the phone so we can literally run our best [AI and machine learning] models on the Pixel.”
Osterloh didn’t give many details on Tensor’s chip architecture, overall performance, or even who Google partnered with for its production. But he did say that Tensor unlocks the ability to run “data center level” AI models locally on the chip, without the need for help from the cloud. The chip’s support for more powerful on-device AI performance is also a privacy benefit, because the phone won’t need to send your data to the cloud for additional processing.

“No one has developed a true mobile SoC, and that’s where our starting point was,” Osterloh said. “We see the future of where our research is headed, and we co-designed this platform with our AI researchers.”
So what exactly does Tensor make possible that previous chips couldn’t? Using a near-final prototype of the Pixel 6, Osterloh showed me some of the features coming this fall that Tensor makes possible. Unfortunately, Google prohibited Gizmodo from taking photos or videos of the devices during this preview, but I can say the Pixel 6 looks as good in person as it does in the images and videos Google provided.
To start, Osterloh showed me a pretty standard photo of a young child to highlight one of the most challenging and most common issues in mobile photography: trying to snap a sharp picture of a subject that just won’t sit still. Parts of the photo, like the child’s hands and face, looked blurry. But by leveraging Tensor and computational photography, the Pixel 6 was able to convert the photo from one that might end up in the recycle bin into something you’d actually want to keep.
Osterloh says that by designing Tensor to suit Google’s needs, the company was able to change the chip’s memory architecture so data can be manipulated more easily, even while it’s being processed. That makes it easier to offload certain tasks to Tensor’s AI engine instead of leaning heavily on the image signal processor, as a lot of other chips do, which improves both performance and power efficiency.
“What we’re trying to do is turn this physics problem into a solvable data problem by using Tensor,” Osterloh said. “The way we do that is for a scene like this, we will take images through two sensors at once: through the ultra-wide sensor at very fast exposure, so we can get a really sharp picture, and then we take it through the main sensor at normal exposure.”
The computations don’t stop there. “In parallel, we’re also trying to detect motion with one of our machine learning models, and we’re also trying to determine with our face detection model whether there’s a face in the picture,” he said. “And so we use all of these machine learning techniques at the same time in parallel, using the TPU and all available resources in the phone.”
The result was a picture that, while not 100% tack sharp, was still head and shoulders above what had previously been a cute but blurry photo.
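To make the shape of that workload concrete, here’s a minimal, purely illustrative sketch of the kind of pipeline Osterloh describes. This is not Google’s code or Tensor’s actual API: the frames are random arrays, and detect_motion, detect_face, and fuse are hypothetical stand-ins for the on-device models and the frame-fusion step.

```python
# Illustrative sketch only -- not Google's pipeline or Tensor's API.
# Idea: fuse a fast-exposure ultrawide frame (sharp) with a normal-exposure
# main-sensor frame (well lit) while motion and face detection run in parallel.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def detect_motion(frame):
    # Stand-in for an on-device motion model: a crude blur/motion score in 0..1.
    return float(np.clip(np.std(np.diff(frame, axis=0)) * 3.0, 0.0, 1.0))

def detect_face(frame):
    # Stand-in for an on-device face-detection model.
    return bool(frame.mean() > 0.1)

def fuse(sharp, bright, motion):
    # Lean more on the sharp, fast-exposure frame as detected motion rises.
    return motion * sharp + (1.0 - motion) * bright

# Two simultaneous captures: ultrawide at a fast exposure, main at normal exposure.
ultrawide_fast = np.random.rand(480, 640).astype(np.float32)  # sharp but dark
main_normal = np.random.rand(480, 640).astype(np.float32)     # bright but blurred

# Run the detection models "in parallel," then fuse the two frames.
with ThreadPoolExecutor() as pool:
    motion_future = pool.submit(detect_motion, main_normal)
    face_future = pool.submit(detect_face, main_normal)
    fused = fuse(ultrawide_fast, main_normal, motion_future.result())

print("face detected:", face_future.result(), "| fused frame shape:", fused.shape)
```

The takeaway is the structure rather than the math: several models running concurrently on every capture, fed by two simultaneous exposures, which is exactly the kind of parallel machine learning work Tensor is built to handle on the device.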
Image: Sam Rutherford
Tensor’s capabilities aren’t limited to photos. Osterloh also showed a comparison between videos of the same scene captured by an iPhone 12, a Pixel 5, and a Pixel 6. When it comes to video, the demands placed on AI performance increase, but with Tensor, the Pixel 6 is able to do things like provide real-time HDR while also using object detection to identify a sunset, which allows the phone to intelligently adjust white balance and increase dynamic range. Neither the iPhone nor the Pixel 5 was able to properly factor in those aspects.
“You can almost envision Tensor as being built to perform computational photography for video,” Osterloh said. “To be able to process machine learning in real time on our videos as they’re going is a big change from where we’re at.”
But perhaps the most impressive demo I saw was when Osterloh played a video of someone giving a presentation in French. Despite taking six years of French in middle and high school, I couldn’t understand more than a random phrase or two. However, with a couple of quick taps, Osterloh was not only able to turn on live captions, he also enabled live translation, allowing the Pixel 6 to convert the recording from French to English in real time.
Google’s Live Caption and Interpreter Mode features have been available for some time on various devices, but they’ve never been available for use at the same time on a phone, simply because previous chips couldn’t deliver the kind of AI and machine learning performance needed to support them.
Osterloh also demoed another new voice dictation feature in Gboard that lets you speak instead of type while texting, and the Pixel 6 is able to automatically correct many of its mistakes in real time. In cases where it misses, you can correct things yourself without interrupting the message. It’s nice to see that Tensor also supports straightforward improvements like greatly enhancing the speed and accuracy of speech recognition.
Now let’s talk about the Pixel 6 itself. Google isn’t yet releasing detailed specs, but the Pixel 4 and 5 were criticized for being underwhelming. So I asked Osterloh if Google would ever make a flagship-level phone again.
“Yes,” he said. “Here it is: The Pixel 6 and Pixel 6 Pro.”
The new Pixel 6 and Pixel 6 Pro share some design elements with previous Pixels, but reimagined in a fresh, playful, and quite enchanting new way. Instead of a two-toned design, Google opted for a tri-color aesthetic with glass panels in front and back, available in several combinations that to me look like an avant-garde interpretation of a hardware store paint swatch. I mean that in the best way possible.
Here’s the standard Pixel 6 lineup.
Screen bezels are even thinner than before, the Pixel 6’s selfie cam has shifted more toward the center, and both phones have big, bright OLED displays. Both the Pixel 6 and 6 Pro are actually quite large; we’re talking devices with screens of at least 6.5 inches. The Pixel 6 Pro in particular felt similar in size to Samsung’s Galaxy S21 Ultra.
Instead of a standard camera bump in back, the Pixel 6 has what Osterloh described as a “camera band,” which not only adds some visual appeal but calls even more attention to the Pixel 6’s cameras, a design element that Google started exploring on the Pixel 4. And while we don’t know the Pixel 6’s camera specs, the band also highlights the biggest difference between the standard model and the Pro: the base Pixel 6 features wide and ultra-wide cameras, while the Pixel 6 Pro at long last gets a bonus telephoto cam with a 4x optical zoom.
Both the Pixel 6 and Pixel 6 Pro feel very much like premium devices in terms of design, components, and software smarts. To me, this is a hugely encouraging course reversal from last year’s mid-range Pixel, and this is coming from someone who once accused Google of not caring about the Pixel’s hardware.
The Pixel 6 Pro sadly lacks a pink option.
Now let’s return to Apple and Samsung’s phone lineups, which took years to hit their stride. Apple’s sixth phone was the iPhone 5, one of the most beloved iPhones of all time and right up there with the original iPhone and the iPhone X as one of the most important iPhones ever. For Samsung, the S6 and S6 Edge introduced a glass design in front and back along with a curved display that essentially crystallized Samsung’s phone design for the following five years. With the Pixel 6 and the new Tensor chip, we’re about to see how Google will use AI and machine learning to really push its vision of ambient computing.
Between Osterloh’s insights, the demos I saw, and the Pixel 6’s hardware, my biggest takeaway is that Google is incredibly confident about this launch. Usually, companies try to keep details like this locked down (as much as possible in the age of leaks and rumors) until they’re officially announced. But with the Pixel 6, Google isn’t just doubling down on AI and machine learning; it’s trying to capitalize on more than five years of research and development in a significant way. While it’s still too early to call the Pixel 6 a success (we’ll have to wait until the fall to actually spend time with the device), I’m encouraged by what I’ve seen so far.