Welcome to CNET's live coverage of Google I/O. This is Google's big developer conference. I'm Iyaz Akhtar, alongside Vanessa Hand Orellana and Patrick Holland. We'll be talking a bunch about Google before Sundar Pichai heads up on stage for his big keynote, which is scheduled to start at 10:00 a.m. Pacific. You guys can be part of the conversation: tell us what you want to see at I/O and tweet us with the hashtag #CNETLive. So use the hashtag #CNETLive and we'll be able to talk to you guys later.
Now, in the past Google has used I/O to introduce all kinds of new stuff: new versions of Android, Google Home, Android Wear 2.0, Google Play Music. What do you want to see out of Google I/O this year? I want to see a whole bunch of things, but I think I'm probably most excited to see where they're gonna take their Wear OS. They rebranded Android Wear into Wear OS, but I feel like their wearables have kind of been lagging in terms of updates, they haven't kept up, and so I'm interested to see what they do with that. I would love to see hardware from Google, maybe a Pixel-branded watch, but I know that's a long shot. Hashtag Pixel Wear; I would totally wear Pixel Wear. But yeah, I'm just interested to see what they do with that. I'm sure they'll do a lot of partnerships, and then maybe some Project Jacquard type stuff, where they're actually integrating technology into the pieces of clothing that we wear,
like the Levi's smart jacket with the gesture-based fabric on the sleeve. You have different gestures on it, so if I'm riding a bike I can swipe and get turn-by-turn directions or control my music. But you could put that stuff anywhere: you could put it in a shirt, or another jacket, or a smart tie. Hashtag smart ties.
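As a rough sketch of the idea being described here, the fabric only has to report a small set of gestures and the phone maps them to actions. Everything below is hypothetical: the gesture names, the actions, and the function are made up for illustration and are not Jacquard's actual API.

```python
# Hypothetical sketch: how a Jacquard-style "gesture on the sleeve" could map
# touch events to phone actions. Gesture names and actions are assumptions
# for illustration, not Google's real interface.

GESTURE_ACTIONS = {
    "swipe_forward": "next_track",
    "swipe_back": "previous_track",
    "double_tap": "speak_next_direction",   # e.g. turn-by-turn while cycling
    "cover": "mute_notifications",
}

def handle_gesture(gesture: str) -> str:
    """Translate a fabric gesture into a phone command."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return "ignored"
    # In a real system this would go over Bluetooth to a companion app.
    return action

if __name__ == "__main__":
    for g in ["double_tap", "swipe_forward", "pinch"]:
        print(g, "->", handle_gesture(g))
```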
I mean, that makes a lot of sense with the Android Wear name; it shouldn't just be limited to watches in general. But Google has still avoided making their own watch for so long, I don't know if they're ready yet.
Well, hashtag Pixel Wear or hashtag smart tie. But we saw last year, I saw last year, a health product come out called the Study Watch, from Verily. Yeah, Verily Labs, which is part of Alphabet, formerly Google's life sciences group. All that being said, everyone's reaction was like, oh my god, I want this, even though it wasn't a consumer-facing product. But what was neat about it was it was designed simple, it had an e-ink display, and, the thing that kills me about Android Wear right now, it actually had real battery life, whereas Android Wear watches have hardly any battery life. So, man, I'm all about AI.
What I want to see, I think, is the AI, the artificial intelligence Google uses. Those are the things that are gonna be the recipes that get baked into products later. For example, some of the ones we saw last year were stuff like Smart Reply in Gmail, where it can come up with instant replies in your own vernacular. We also see the AI they use in other camera stuff: we got to review the Google Clips camera, which doesn't have a shutter button on it; it decides when to take the picture or the video for you using artificial intelligence. It's kind of exciting, maybe a little creepy, yes. I like stuff like their Google VR180, which is a newer platform that they're using. All this AI, if anything, means maybe you won't see hardware products today, but it might be stuff that will make its way into hardware.
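To make the "no shutter button" idea concrete, here's a minimal, hypothetical sketch of how a Clips-style camera could decide on its own which moments to keep: score incoming frames with some interestingness model and save a short rolling clip when the score crosses a threshold. The scoring function and threshold are placeholders, not Google's actual on-device model.

```python
# Hypothetical sketch of shutter-free capture: score each frame and keep a
# clip when the score crosses a threshold. score_frame() is a stand-in; the
# real Clips camera uses an on-device neural network we don't model here.

from collections import deque

SAVE_THRESHOLD = 0.8   # assumed value for illustration
BUFFER_SECONDS = 7     # rolling buffer so the clip starts before the peak

def score_frame(frame) -> float:
    """Placeholder for a model that rates faces, pets, smiles, etc."""
    return frame.get("interest", 0.0)

def run_capture_loop(frames, fps=15):
    buffer = deque(maxlen=BUFFER_SECONDS * fps)
    saved_clips = []
    for frame in frames:
        buffer.append(frame)
        if score_frame(frame) >= SAVE_THRESHOLD:
            saved_clips.append(list(buffer))  # the camera, not the user, decides
            buffer.clear()
    return saved_clips

# Toy usage: only the third "frame" is interesting enough to keep.
clips = run_capture_loop([{"interest": 0.1}, {"interest": 0.3}, {"interest": 0.9}])
print(len(clips), "clip(s) saved")
```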
And that makes a lot of sense. I/O is all about the developers; they have to come up with different ways to use Google's technologies in their products in the future, and AI is a huge back end for all these products to work together in a way that just makes it feel like the future. Everything should be able to figure things out, but it seems like they have issues with privacy in general and how much data you want shared between devices. So, all right: Wear OS, AI. What about you?
I would like the Assistant to be consistent between devices. I want Google Home and my phone and my watch to all work the exact same way, and maybe that's where AI comes in. Maybe that could tie everything together, where I can just talk to something and it always works, as opposed to, well, Google Home knows this, and the phone knows this, and the watch has very little idea of what's going on, although that got a lot better recently.
Now we've got a live shot of Lexy Savvides, who's at Google I/O right now, so we can get a much better look at the big event. Lexy, let's find out what it's like over at I/O right now. How is it over there? Hi everybody, and welcome to Google I/O 2018. I'm Lexy Savvides. I am here in Mountain View, California.
it is a bright sunny day this is the
shoreline amphitheater I am standing
just outside the keynote is going to
take place right in there as you can see
it is nice bright sunny it looks kind of
like a music festival here. Now, Google I/O over the past few years has definitely gotten bigger than ever; this year is no
exception it is huge there are thousands
of people here there's lots of signage
to show that we are at i/o and also a
little bit of the iconography there of
the Bay Area with the Bay Bridge but all
attendees get this little swag bag. I want to do a little quick swag unboxing for you, to show you what I've got. Every attendee gets this, so they get
the swag bag then they get a water
bottle very useful on bright sunny hot
days like this they get a t-shirt whoo
look at this yeah there we go there's a
little IO signage there too
inside there are also some sunnies, and there is also some sunscreen, very useful. All right, let's take you inside.
I'm gonna walk and put the swag bag in
the back so let's keep going and walk
inside into the keynote in Shoreline
Amphitheatre now it's super super loud
in there right now I want to take you
inside and if it gets a little noisy
it's because they're running this really
cool demo at the moment they're actually
DJing using a couple of synths and a
couple of machines and doing a
machine-learning demo. Sorry, there are a couple of stairs leading into the
amphitheater lots of people filing in
now let's go down here lots of
developers and attendees running around
here we go give me a little bit of the
vibe here in Mountain View it's getting
noisier I don't know if you can hear it
but the volume is cranking here we go
alright just take a look at how many
people are in here just have a look at
this as we walk inside this is a lot of
developers ladies and gentlemen have a
look at this it's like a music concert
seriously
lots of people all over the place from
all walks of life all over the world
have come to see this as I was
mentioning before there's that cool demo
down on the stage at the moment; they're using a couple of DJ tools
and some machine learning to kind of
change the instruments and change the
effects it all sounds really interesting
it's a little quieter than it was before
thankfully because it was kind of like
blaringly loud but there you go so the
keynote it's gonna kick off in around
about 20 minutes or so back to you guys
in the studio I'll catch you later
Thanks, Lexy, that looks great. There are so many people at this event, which means we'll be seeing some really cool software in the future, or maybe some hardware. I'm just impressed at how much it looks like a concert. And yeah, I'm glad she went through the swag bag; I was very curious to see what they get. Definitely not the VR headset underneath the seat that I was expecting. We'll have to find out what else they're actually gonna be showing off at I/O. But
on a quick programming note, we're launching a new show. It's called Alphabet City and I'll be the host. We're gonna cover everything to do with Alphabet from A to Z, and we will stop at G a lot, since Alphabet owns Google. Alphabet owns a lot of companies. It's coming to CNET and YouTube very soon, so keep an eye out for that. So let's
talk a little bit about some of the
recent Google News and maybe they'll
give us an idea what we can expect at
I/O. Because back in March, Android P was released, or at least announced: they showed off a preview version of it, and it has notch support and it has rounded corners and not a heck of a lot of super shiny amazing stuff that'll face consumers.
Is anyone here excited by Android P from what we've seen so far? Okay. I think, like you mentioned, from what we've seen so far, if notch support is something that you're excited about or you're touting, then that kind of gives us a preview of how exciting this new launch is gonna be. But I'm hoping that was just a preview and that we're gonna see more features. I'm excited about more privacy features, because I think that's a hot topic this year, especially after F8 and what we saw of what Facebook is doing. I think with AI coming, they have to do a lot to preemptively make sure that users' data is gonna be taken care of, so I'm interested to see that. So, Android P and their privacy.
Maybe. You know, we saw a person by the name of Gabriel Byrne put up an album on Google Photos with images claiming to be the Android P beta, and this is around the right time when Google would release a beta version of it. There's a video showing off how these other features could work, and what was really noticeable is that there's a lot more gesture support, similar to the iPhone X, where you can swipe up from the home bar. So it seems like, maybe, if this is real, the software is adjusting to the way the hardware is changing. Yeah, we're not just gonna have notches, we're gonna have fewer buttons. What do you guys think of this video? Do you think it's legit? Do you think this is where Android's going?
Yeah, I mean, I think it is. But also I think it's fair to say that before it was on the iPhone, it was also on the Palm, and Palm and Android have a little bit of a history together too, with developers from Palm slowly working over. That card system's been around since, like, 2008 or 2009. That being said, I think it does say something that we're talking about notches and videos that may have been posted by Gabriel Byrne. I don't know if that's the actor. Very good actor. He might be kind of lonely. Yeah, exactly. But on the flip side, I think there's a lot that we don't know from Google about Android P, and I think that's what I'm more interested in. Last time, Android Oreo brought a lot of stability fixes, so I'm curious to see. I hope it's more than just notches and stuff like that, because I'm not very excited right now about Android P.
Yeah, I mean, I've seen some of the privacy stuff Vanessa was talking about, how these apps will be more sandboxed so they can't really take data from app to app; that's gonna be more helpful with Android P. And again, that's really useful for day-to-day stuff, but I don't know if that's gonna wow consumers, like, oh yeah, I really want Android P. But that's not really what these upgrades are all about, right? It's about the user experience getting better and better as you go. But by the time that actually rolls out to users, I mean,
that's one of the big problems of
Android right now is that fragmentation
that we've kind of talked about before
and so, you know, even when these privacy features come along, how long is it going to take for us to finally be able to get them onto our devices? And something I should say about the video we watched on YouTube: some of those things we've seen on other Android phones, maybe not on the pure version of Android Oreo or the Android before that, but the idea of using your finger to navigate around, we've seen that in the last few Motorola phones, and it's a really neat interface. So to have that option built into the operating system will make it faster and maybe more reliable. It's interesting
to see. I'm excited, 'cause I think that could be really cool. And also, as we get rid of things like home buttons, or they go onto the back, how do we navigate our phones? Right now a lot of it is based around those notches and Face ID or something like that, but I don't know, maybe there's a new way that we haven't even seen yet, and we'll see some of that today. They'll have to educate the consumers on that too, because that's a big change: there was a button, now there's no button, how do you interact with this? It's got to be intuitive, so that's a lot of the education side that Google would have to figure out how to do, or the phone manufacturers themselves using Android P. Then there's a whole
bunch of watch stuff. We talked a little bit about that, but back in March they rebranded Android Wear to Wear OS, because it works with iOS too, it works with everything; it's not just all about Android when it comes to Wear OS. It also had a huge upgrade with Google Assistant on the wearable side, so you can get contextual answers: if you ask for the weather, it will ask whether you want to know about tomorrow or not, so it gives you a lot more information. Plus watches that have speakers can actually talk back to you, as opposed to you just reading a readout or keeping Bluetooth headphones
connected. Do you guys think there's a particular reason why Google still doesn't make a watch? Is there a reason why they have partners like LG and, what was that Kors guy again? Michael Kors, again, Fossil. I don't get it; it's still such a small portion of the market,
wearables I mean it may be the future it
definitely is something that companies
should be looking at right now but it's
such a small slice of the pie that I think they haven't been wanting to invest yet, when it still hasn't taken off. I mean, we see Fitbit and Apple Watch starting to creep in, but if you look at the pie as a whole, it's still a tiny sliver, and so I don't think it's worth it for them to invest in it quite yet. So maybe they're waiting for the right time to invest, and the right kind of product. The right time to invest in the smartwatch, I like that.
Well, I think there are some basic things, like battery life. Unfortunately, with the Android Wear stuff I've tried, which is now Wear OS, the fact is it doesn't last long; you can barely get through a day, and that was even true a little bit of the Apple Watch. So hopefully some of the updates we see can be about power management. Once you figure that out, once I can get through a day with my fancy Michael Kors watch, that would be awesome. And if it could talk back to me, that would be awesome. And if it had that contextual stuff that Google Assistant brings to my phone and my other devices, like my smart speakers, that would be awesome.
And I think that the Google Assistant portion is really what's going to set apart the, sorry, the Wear OS devices, because that's something that nobody's been able to beat Google at. They have all that data, they have all the smarts behind it, and if they can incorporate that into wearables, I think that would definitely set them apart in the market. But I think, you know, the conversation around AI is a huge topic here, and I think it's going to be hot at I/O. They've been leading the charge in AI, you know, in terms of consumer-facing stuff and also for businesses, so I'm really curious to see what they're gonna announce, how far they're taking AI right now, and hopefully we'll see a little bit of actual demo. There's also been some more
Assistant news: Google is investing in startups. They put up a blog post saying that Google is opening a new investment program for early-stage startups that share its passion for the digital assistant ecosystem. So Google itself is investing in companies to get them moving and help this whole digital assistant ecosystem, which ties into the watch and Wear OS, excuse me, because Wear OS could mean anything: a helmet, you know, glasses, or whatever, a jacket, a smart tie, like you said, one that could keep track of calories or constantly check your pulse. But I
think that is a brilliant strategy by Google, investing in that, because we already see how much people like Google Assistant. In fact, the latest LG phone actually has a Google Assistant hardware button; Pixel phones don't even have a hardware button for that. I don't think they should, by the way. But I think this says that, at the least, Google Assistant is breaking into the mainstream, that it's legitimate and it works really well. I think part of it is marketing and some of it's actually true: it just works fantastically; even on an iPhone it works very well. I think one of the biggest issues with Amazon Alexa and the other ones is that they don't have a search function as good as Google's: you can't ask random questions, and follow-ups aren't as intuitive as with Google Home and Google Assistant in general. And if they can
actually put enough money behind enough companies, which is kind of strange considering they could build their own company and do whatever they want, they could have another spin-off company, they could have just an internal project. But this is showing that if you're going to be at I/O and you're a developer, yeah, we're also gonna back companies, so you have something to rely on, which is really helpful. Well, and
I like how you're spinning it in a positive note, that they're investing in companies. To me it also seems like they're kind of taking ownership of this huge market that's gonna be a huge deal in the coming years, so you could almost say they're trying to monopolize it. Because, I mean, Sundar Pichai even says that AI is going to be even more revolutionary than the internet, than electricity. Now, that remains to be seen, but they're making sure that they are set up to be the AI experts in the market, and it may not seem like it right now, but they're gonna expand this and maybe have that monopoly in the future. I'm just thinking about
the whole "our interests are aligned, our money is in you" angle: we're not gonna stop developing on our side, you guys keep developing on the other side, we'll keep going together. But you're right, there's the question of control; how much control and how much investment they're doing, I don't actually know. Maybe it's a small percentage, so maybe they can't be like, oh, by the way, company Y, you will now only make things in the color green. Right, well, there's that, but
then on the flip side, maybe without that, company Y would be developing on their own, and then usually they get bought by someone like Google or Apple. So the fact that they're working together early on, I think, breaks a little bit of that Silicon Valley business model of, hey, we like what you did, we're gonna buy you and absorb you into our product, versus working together off the bat. That way it can be a little more neutral; there might be innovation that happens outside the walls of Google that would not normally happen, even though they're kind of known for that. But maybe also a unifying factor too, because one of the things over the years at Google is there are just all these various things going on, and being able to unite them all in one place is pretty neat. Well, we have Scott Stein on the line from Google I/O as we speak; let's see what's going on there. Hey. Good, good, we are getting
ready to do the Google I/O thing, and we're seeing the setup here. We're standing in front of a big group experiment; I think this happens, like, every year, where now there are 16,000 people simultaneously drawing something, some sort of little, looks like a Nintendo-type city, if you've been here in years before. So what's different, that you can tell us just by walking in, what's different this year? Sure. Well, the last couple of years, I guess it's like a third year now that it's been at Shoreline Amphitheatre, so it feels like going into a big theme park, you know: you check in and wander through. The sun is really nice today. There's no big difference in the way of vibes; they've actually settled into this pattern where beforehand you have this musical thing and then you have the AI group experiments thing to get everyone going. I guess this year it feels a lot more chill; the music feels trancey and chill. Other years, I remember there were performances up in the top boxes that were pretty intense. Maybe they want us all to relax.
Yeah, so do you think that the tone of the music and the show will reflect the announcements we're gonna see? Because if it was super intense, it'd be like, oh, we have Google Home, so be pumped. Should everybody be relaxed? Is that what you're getting from this vibe? Yeah, maybe the idea is to get us all to work well together. This is the era of connectivity and peace, so in that cloud, I don't know, maybe that's what the P in Android P means: Android Peace. Great, world peace, that's what it is, bringing everything together. I like it. It sounds like, what was yours, Android connectivity? And what was mine, assistant consistency? Consistency. I need some of that.
I'll point out, above me is the site where two years ago birds were pooping on that very same awning, so I'm keeping my eye on it. So far, poop free, which is great. Maybe there's like an ultrasonic repeller or something; I bet that's what they installed here. Knowing Google, they probably took care of that one way or another, or they just put their favorite media people underneath it. Sorry, Scott Stein. The birds are too busy tweeting. That's an excuse. So what are we
expecting? I mean, obviously Android P. I'm interested in seeing what other news happens with VR. You know, there were actually a lot of products and announcements right before I/O: the standalone VR headset that was announced last year, the Mirage Solo, went up for sale on Friday, and Google also has a VR180 platform for streaming 3D and 180 degrees, and the first camera, the Mirage Camera, is now available. I have it with me to try shooting some photos; maybe they'll talk more about building off that. Same thing with Android Wear, or Wear OS now: it has extra Assistant features. I mean, we'll hear a ton about Assistant, we'll hear a ton about AI and cloud, but I wonder if there'll be any really wild surprises that come out of left field. That's what we're hoping, I
think. And who else from CNET is there with you, Scott? Give us, paint us a picture, show us. Over here, do you want to say hi? We want to say hi. We're in a little box together. There's Jessica, there's Rich, there's Andrew. Hey. And we are all together in this little Google box, ready to start writing away with new announcements.
The internet's been okay so far. We have a little... no, it's not, it's actually just cut out on us. It's been good for me, because I have the Ethernet that's working, but it's like a coin flip, so we'll see how that goes. So if we don't hear from you and we don't see you tweeting, it's because the internet went down at I/O. Yeah, we're enjoying the bits of internet while we can, or a bird got us. But you'd always expect it would work better. So, Scott, you
know, usually October's when Google has a bunch of hardware. Do you have any odds when it comes to hardware at this event? Because I/O is not exactly known for that. Yeah, well, I mean, it's a good question. Google now has a more active hardware division, with Pixel and Daydream, and last year there were Pixel Buds and the Clips camera; they're really building a collection of hardware products, more than they have in a while. So you'd think that there would be something else in that family, maybe something new. The form for the last few years has been to announce plans for that year, but then we wouldn't actually see it until later on in the year. Those have been mostly kind of new product ideas, and I wonder whether there'll be something new, a new area to explore, or whether it'll be an updated version, because that sounds a little less exciting. But when it comes to Assistant, there are certainly ambitions to put that in everything and keep getting better. So we were just
talking, Scott, we were just talking about Assistant and where that's going. What are the chances that we're gonna see a Google-branded wearable this time, maybe a Pixel watch, ear wear, whatever? I think it's a very good idea, I think it's high time it happened, mainly because right now they're not pushing the territory when it comes to fitness and sensors. You look at Fitbit, Apple, Samsung: you're looking at things like what you can do with heart rate to check for things like atrial fibrillation or sleep apnea. There's not been a lot of talk about that as far as sensors and Google, so I feel like they need cutting-edge fitness tracking on a new bleeding-edge device.
A lot of those fashion-brand watches aren't even putting heart rate on them. So, I mean, a fitness thing with Google's cloud makes sense; I would expect them to talk more about that. We'll see, we'll see. Hopefully maybe it'll be that Pixel-branded watch, but probably just the inner workings, is what you're saying. A watch that's set up with a long battery life, that looks okay and works fine and isn't too expensive; actually, I'd find a use for all those things. Scott, do you think they'll actually come up with a use for the smartwatch? Because currently there's none, there's no killer app for it yet, there just isn't. Yeah, well, I'd find a use for it, but you're right, it's hard to justify. I think in the end, well, that's where Google, I think,
has just emphasized Assistant, maybe making the watch more of a little Google Home on your wrist. But do you really need anything like that? Actually, I just got a tweet with my own photo in it from this thing; it's kind of like, they're showing it off behind there. So yeah, I think a good use case would be great. I would imagine, though, Google may not try to push too far with that right now and might focus on existing known elements of fitness and Assistant if they talk about any big directions for the watch.
Have you checked under your seat yet for any other swag? Oh my god, no. I'll have a look under the seat. Come on. There's probably some more cables; we're sneaking cables, there are cables everywhere, for power and Ethernet and more power. So basically what you're saying is this announcement should be wireless everything; we should finally be wireless. Why are we still tied to wires? Everything is tethered. I'm dreaming, come on. We had a good breakfast this morning, I don't know, what am I... yeah, the bathrooms were fine. All right, thanks a lot, Scott, we're gonna check back with you. Yeah, sure, okay, thanks guys very much, enjoy the show.
You know, we actually got the first comments coming in with the hashtag #CNETLive; I'm bringing one up right now. Here it is, from Joseph. He says: I want to see at Google I/O 2018 Chrome OS be more integrated into Android OS for better cross-compatibility, so both operating systems can talk to each other. You know, we didn't talk about this at all, Android and Chrome. You guys know about that whole Fuchsia thing? Google Fuchsia, this experimental operating system that can run on phones and on Pixelbooks, and it can run Android apps. It seems like it's the future, because it's a new design type thing, it's built on a different core, its kernel is different. So I was thinking this was the future for their devices, because of all the legal issues Google's had in the past with Android and Oracle. Maybe if they just cut ties entirely and create a whole new thing, they can just say, all right, you know what, screw your nine-billion-dollar deal, we're doing this. What do you guys think? Would you want to see Android and Chrome coming together?
I think it's natural, I mean, like it will eventually happen; whether it's something we'll see now remains a question, not only with legal things but also just the fact that, you know, form factors are changing. We had the Chromebook, or the Pixelbook, we saw at the end of last year, and that runs on Chrome, but at the same time you can run some Android apps on there. But there's still the fact that Android apps work great on phones, but they don't always work great on anything else. Yeah, it doesn't work well on television, and Google has not figured out how to make a true television product. It sort of works on watches, sort of on tablets. I mean, what's the deal with tablets? Because now there are Chrome OS tablets. It seems like Android arguably is best suited to a phone, and that's not what Google wants; they want Android in everything: there's Android Auto, there's Android this. I
think we see that in Apple's model too, where you have iOS and macOS kind of merging. But then if you look to our friends at Microsoft, they tried something like that, what, five years ago, whatever, and yeah, it did not go so well, that version of Windows that was supposed to be for everything. So I still think it's gonna be a while before things merge; whether it happens naturally or whether it's forced on us will be the question. But at the same time, someone like Google could do it, and they have a big opportunity too, because something like that would make me hop on board with stuff more, like Chrome on TV and Chrome on my laptop. I would be really excited about that. I think they're set to be the first to do it; now, whether or not they do it here, whether or not they do it now, that remains to be seen, but I think it's a great topic of conversation, and thanks to Joseph for the tweet. Yeah, I'm still trying
to figure out what Google's strategy is gonna be on this, because they need to have one, just have a strategy. And the biggest thing is, this is for the developers: making an app that works from one device to the next and the next, across different form factors, is like the holy grail. You don't want to have to create an app particular to one device or the other, because then your apps sort of lag behind; these apps don't need to be, you know, behind iOS on certain things. So that's where this push really should be from. Well, I will say it's
something that we have started seeing
things being a little more unified at Google, which is not something they're known for, but even in their hardware design, like that Pixelbook I alluded to earlier: the hardware matches the Pixel
phone Scott was talking about the Google
Clips camera and VR 180 the apps for
that look identical when you look at
them on your phone it's kind of the same
interface. And I got to talk to one of the developers behind the app, or behind the VR180 platform, and they were saying it's intentional: Google is trying to have a unified look and feel. So it is
natural to think that chrome and Android
will at some time merge whether again
it's out of a legal necessity, but I think also out of a user necessity. And they don't have the legacy that Apple or Microsoft has with operating systems on desktops or on laptops, so they have a chance to just go and do it. Whether they do it here today, I don't know. Now, if we
talk a little bit about television
because I'm always obsessed with television. My whole life is about being super lazy, and I've worked really hard to make sure all my tech is really great at home so I can be super lazy. I want, you know, the ability to find anything at any time, and Google has tried to do this, where you can ask it, hey, I want to watch this show. Yes, you can do that with a Roku, and yes, you can do that with Siri on Apple TV, and Android TV does have a search function, but Android TV has really not taken off in any big way. There was... oh, we'll talk about that later. It looks like the keynote's about to begin.
Let's take you there live. And don't forget, we'll be back with our post-show to unpack all the news, and remember to tweet your thoughts and questions with the hashtag #CNETLive. Let's see what Sundar has to say.
This is a true love story, a triumphant song, with only one small caveat: this one hasn't happened yet. It's not the strongest...
good morning welcome to Google i/o it's
a beautiful day I think warmer than last
year
hope you're all enjoying it thank you
for joining us I think we have over
7,000 people here today, as well as many, many people joining via live stream from many locations around the world. So thank you all for joining us today. We have a lot to cover, but before we get started, you know, I had one important piece of business which I wanted to get out of the way. Towards the end of last year it came to my attention that we had a major bug in one of our core products: it turns out we got the cheese wrong in our burger emoji. Anyway, we went hard to work; I never knew so many people cared about where the cheese is. We fixed it. You know, the irony of the whole thing is I'm a vegetarian in the first place. So we fixed it, and hopefully we got the cheese right. But as we were working on this, this came to my attention. I don't even want to tell you the explanation the team gave me as to why the foam is floating above the beer, but we restored the natural laws of physics. So all is well; we can get back to
business we can talk about all the
progress since last year's i/o I'm sure
all of you would agree it's been an
extraordinary year on many fronts
I'm sure you've all felt it we're at an
important inflection point in computing
and it's exciting to be driving
technology forward and it's made us even
more reflective about our
responsibilities expectations for
technology vary greatly depending on
where you are in the world or what
opportunities are available to you for
someone like me who grew up without a
phone I can distinctly remember how
gaining access to technology can make a
difference in your lives and we see this
in the work we do around the world you
see it when someone gets access to a
smartphone for the first time and you
can feel it in the huge demand for
digital skills we see that's why we've
been so focused on bringing digital
skills to communities around the world
so far we have trained over 25 million
people and we expect that number to rise
over 60 million in the next five years
it's clear technology can be a positive
force but it's equally clear that we
just can't be wide eyed about the
innovations technology creates there are
very real and important questions being
raised about the impact of these
advances and the role they will play in
our lives so we know the path ahead
needs to be navigated carefully and deliberately, and we feel a
deep sense of responsibility to get this
right
That's the spirit with which we are approaching our core mission: to make information more useful, accessible, and beneficial to society. I've always felt that we were fortunate as a company to have a timeless mission that feels as relevant today as when we started, and we're very excited about how we can approach our mission with renewed vigor thanks to the progress we've seen in AI. AI is enabling us to do this in new ways, solving problems for our users around the world. Last year at Google I/O we announced Google AI; it's a collection of our teams and efforts to bring the benefits of AI to everyone, and we want this to work globally, so we are opening AI centers around the world.
AI is going to impact many, many fields, and I want to give you a couple of examples today. Healthcare is one of the most important fields AI is going to transform. Last year we announced our work on diabetic retinopathy; it's a leading cause of blindness, and we use deep learning to help doctors diagnose it earlier. We've been running field trials since then at Aravind and Sankara hospitals in India, and the field trials are going really well; we are bringing expert diagnosis to places where trained doctors are scarce. It turned out that, using the same retinal scans, there were things which humans didn't quite know to look for, but our AI systems offered more insight: the same eye scan, it turns out, holds information with which we can predict the five-year risk of you having an adverse cardiovascular event, a heart attack or stroke. So to me the interesting thing is that, more than what doctors could find in these eye scans, the machine learning systems offered newer insights. This could be the basis for a new, non-invasive way to assess cardiovascular risk. We just published the research, and we are going to be working to bring this to field trials with our partners. Another
area where AI can help is to actually help doctors predict medical events. It turns out doctors have a lot of difficult decisions to make, and for them, getting advance notice, say 24 to 48 hours before a patient is likely to get very sick, makes a tremendous difference in the outcome. So we put our machine learning systems to work. We've been working with our partners using de-identified medical records, and it turns out that if you go and analyze over a hundred thousand data points per patient, more than any single doctor could analyze, we can actually quantitatively predict the chance of readmission 24 to 48 hours earlier than traditional methods. It gives doctors more time to act. We are publishing our paper on this later today, and we are looking forward to partnering with hospitals and medical institutions.
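As a toy illustration only (Google's published approach uses deep learning over complete de-identified records, not this), here is the general shape of turning many per-patient data points into a single readmission risk score with a simple classifier:

```python
# Toy sketch, not Google's model: learn a readmission risk score from many
# de-identified features per patient, then flag high-risk patients early.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_patients, n_features = 1000, 200          # stand-in for "100,000+ data points"
X = rng.normal(size=(n_patients, n_features))
# Synthetic label: risk driven by a few features plus noise (fabricated data).
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_patients) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X[:800], y[:800])
risk = model.predict_proba(X[800:])[:, 1]    # probability of readmission
flagged = (risk > 0.7).sum()
print(f"{flagged} of 200 held-out patients flagged for early follow-up")
```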
Another area where AI can help is accessibility. You know, we can make day-to-day use cases much easier for people. Let's take a common use case: you come back home at night and you turn your TV on, and it's not that uncommon to see two or more people passionately talking over each other. Imagine if you are hearing impaired and you're relying on closed captioning to understand what's going on; this is how it looks to you. As you can see, it's gibberish, you can't make sense of what's going on. So we have machine learning technology called Looking to Listen: it not only looks for audio cues but combines them with visual cues to clearly disambiguate the two voices. Let's see how that can work, maybe
in YouTube
level but he's a father Colangelo level
in other words he understands enough
just you accept use that as our right to
lose on purpose you said it's alright to
lose on purpose and advertise that to
defend it's perfectly okay you said it's
okay we have nothing else to talk about
we have a lot to talk about
But you can see how we can put technology to work to make an important day-to-day use case profoundly better.
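A heavily simplified way to picture the Looking to Listen idea is "one mask per on-screen speaker" applied to the mixed audio: a model looks at each face track together with the audio and keeps only that speaker's time-frequency bins. The sketch below fakes the model entirely; it only shows the masking step, not Google's actual network.

```python
# Toy illustration of the "mask per speaker" idea behind audio-visual speech
# separation. fake_mask_from_face() stands in for the neural network; this is
# not Google's Looking to Listen model.

import numpy as np

def fake_mask_from_face(face_id: int, shape) -> np.ndarray:
    """Pretend model: speaker 0 'owns' the low bins, speaker 1 the high bins."""
    mask = np.zeros(shape)
    half = shape[0] // 2
    mask[:half] = 1.0 if face_id == 0 else 0.0
    mask[half:] = 1.0 if face_id == 1 else 0.0
    return mask

mixed_spectrogram = np.random.rand(64, 100)   # freq bins x time frames (made up)

separated = {
    face_id: fake_mask_from_face(face_id, mixed_spectrogram.shape) * mixed_spectrogram
    for face_id in (0, 1)
}
for face_id, spec in separated.items():
    print(f"speaker {face_id}: kept {np.count_nonzero(spec)} time-frequency bins")
```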
You know, the great thing about technology is it's constantly evolving; in fact, we can even apply machine learning to a 200-year-old technology, Morse code, and make an impact in someone's quality of life. Let's take a look. Hi, I am Tania. This is my voice. I
use Morse code by putting dots and
dashes with switches mounted near my
head. As a very young child I used a communication word board; I used a head stick to point to the words. It was very attractive, to say the least. Once Morse
code was incorporated into my life it
was a feeling of pure liberation and
freedom
I think that is why I like skydiving so
much it is the same kind of feeling
through skydiving I met Ken the love of
my life and partner in crime it's always
been very, very difficult just to find Morse code devices to try Morse code; this is why I had to create my own, with
the help from Ken I have a voice and
more independence in my daily life but
most people don't have a Ken. It is our hope that we can collaborate with the Gboard team to help people who want to tap into the freedom of using Morse code. Gboard is the Google keyboard. What we have discovered working on Gboard is that there are entire pockets of population in the world, and when I say pockets it's like tens of millions of people, who have never had access to a keyboard that works in their own language. With Tania we've built support in Gboard for Morse code; it's an input modality that allows you to type in Morse code and get text out, with predictions and suggestions. I think it's a beautiful example where machine learning can really assist someone in a way that a normal keyboard without artificial intelligence wouldn't be able to. I am
very excited to continue on this journey
many many people will benefit from this
and that thrills me to no end
It's a very inspiring story. We are very, very excited to have Tania and Ken join us today. Tania and Ken are actually developers; they worked with our team to harness the power of predictive suggestions in Gboard in the context of Morse code. I'm really excited that Gboard with Morse code is available in beta later today.
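The core of Morse input is small enough to sketch: two switches produce dots and dashes, a table maps each sequence to a letter, and predictions sit on top. The toy below shows that pipeline with a made-up three-word vocabulary; Gboard's real suggestions come from its machine-learning language models, not a lookup like this.

```python
# Minimal sketch of Morse input: decode dots and dashes into letters, then run
# a trivial word suggester over the result. Purely illustrative.

MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E", "..-.": "F",
    "--.": "G", "....": "H", "..": "I", ".---": "J", "-.-": "K", ".-..": "L",
    "--": "M", "-.": "N", "---": "O", ".--.": "P", "--.-": "Q", ".-.": "R",
    "...": "S", "-": "T", "..-": "U", "...-": "V", ".--": "W", "-..-": "X",
    "-.--": "Y", "--..": "Z",
}

WORDS = ["HELLO", "HELP", "HELMET"]  # placeholder vocabulary

def decode(symbols: str) -> str:
    """Decode space-separated Morse letters; '/' marks a word gap."""
    words = []
    for word in symbols.split("/"):
        words.append("".join(MORSE.get(ch, "?") for ch in word.split()))
    return " ".join(words)

def suggest(prefix: str):
    return [w for w in WORDS if w.startswith(prefix)]

typed = decode(".... . .-.. .-..")
print(typed, "->", suggest(typed))   # HELL -> ['HELLO']
```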
It's great to reinvent products with AI. Gboard is actually a great example of it: every single day we offer, and users choose, over 8 billion autocorrections, each and every day. Another example of one of our core products which we are redesigning with AI is Gmail. We just launched a fresh new look for Gmail, a recent redesign; hope you're all enjoying using it. We are bringing another feature to Gmail; we call it Smart Compose. As the name suggests, we use machine learning to start suggesting phrases for you as you type: all you need to do is hit tab and keep auto-completing.
In this case, it understands the subject is Taco Tuesday; it suggests chips, salsa, guacamole. It takes care of mundane things like addresses so that you don't need to worry about them; you can actually focus on what you want to type. I've been loving using it; I've been sending a lot more emails to the company, not sure what the company thinks of it, but it's been great. We are rolling out Smart Compose to all our users this month and hope you enjoy using it as well.
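As a toy sketch of the tab-to-accept interaction only: Smart Compose itself uses a neural language model conditioned on the subject line and what you've typed, but even a canned lookup shows the shape of the feature. The phrases and function below are invented for illustration.

```python
# Toy stand-in for phrase suggestion: look up a canned continuation for a few
# (subject, prefix) pairs. The real system predicts with a language model.

SUGGESTIONS = {
    ("taco tuesday", "i'll bring the"): " chips, salsa and guacamole",
    ("taco tuesday", "see you"): " at noon",
}

def complete(subject: str, typed: str) -> str:
    for (subj, prefix), continuation in SUGGESTIONS.items():
        if subj in subject.lower() and typed.lower().endswith(prefix):
            return continuation
    return ""

draft = "Hey all, I'll bring the"
suggestion = complete("Taco Tuesday!", draft)
print(draft + "[" + suggestion + "]")   # pressing Tab would accept the bracketed text
```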
Another product which we built from the ground up using AI is Google Photos. It works amazingly well and it scales. You know, if you click on one of these photos, you get what we call the photo viewer experience, where you're looking at one photo at a time. So that you understand the scale: every single day there are over five billion photos viewed by our users, each and every day. So we want to use AI to help in those moments, so we are
bringing a new feature called suggested actions, essentially suggesting smart actions right in context for you to act on. Say, for example, you went to a wedding and you're looking through those pictures: we understand your friend Lisa is in the picture, and we offer to share the three photos with Lisa, and with one click those photos can be sent to her. You know the anxiety where everyone is trying to get the pictures onto their phone; I think we can make that better. Say, for example, if the photos from the same wedding are underexposed, our AI systems offer a solution to fix the brightness right there: one tap and we can fix the brightness for you. Or if you took a picture of a document which you want to save for later, we can recognize it, convert the document to PDF, and make it much easier for you to use later. You know, we want to make all these simple cases delightful. By the way, AI can also deliver unexpected moments. For example, if you have this cute picture of your kid, we can make it better: we can drop the background color, pop the color, and make the kid even cuter. Or if you happen to have a very special memory, something in black and white, maybe of your mother and grandmother, we can recreate that moment in color and make that moment even more real and special. All these features are going to be rolling out to Google Photos users in the next couple of months.
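The "suggested actions" idea can be pictured as per-photo signals mapping to one-tap suggestions. In the hypothetical sketch below, the signals (recognized people, brightness, document flag) are pretend outputs of vision models; only the dispatch logic is shown.

```python
# Sketch of "suggested actions": pretend vision-model outputs per photo map to
# one-tap suggestions. Google Photos computes these signals with its own ML;
# this only illustrates the dispatch step.

def suggested_actions(photo: dict) -> list:
    actions = []
    if photo.get("recognized_people"):
        names = ", ".join(photo["recognized_people"])
        actions.append(f"Share with {names}")
    if photo.get("brightness", 1.0) < 0.4:        # assumed underexposure cutoff
        actions.append("Fix brightness")
    if photo.get("is_document"):
        actions.append("Save as PDF")
    if photo.get("is_black_and_white"):
        actions.append("Colorize")
    return actions

wedding_shot = {"recognized_people": ["Lisa"], "brightness": 0.3}
print(suggested_actions(wedding_shot))   # ['Share with Lisa', 'Fix brightness']
```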
The reason we are able to do this is because for a while we've been investing in the scale of our computational architecture. This is why last year we talked about our Tensor Processing Units; these are special-purpose machine learning chips, these are driving all the product improvements you're seeing today, and we've made them available to our cloud customers. Since last year we've been hard at work, and today I'm excited to announce our next generation: TPU 3.0. These chips are so powerful that for the first time we've had to introduce liquid cooling in our data centers, and we put these chips together in the form of giant pods. Each of these pods is now 8x more powerful than last year's, well over 100 petaflops, and this is what allows us to develop better models, larger models, more accurate models, and helps us tackle even bigger problems.
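A quick back-of-the-envelope check using only the two figures quoted on stage: if a TPU 3.0 pod is 8x last year's pod and delivers over 100 petaflops, last year's pod works out to somewhere above 12.5 petaflops.

```python
# Arithmetic using only the keynote's own figures (lower bounds, not specs).
tpu_v3_pod_pflops = 100       # "well over 100 petaflops"
speedup = 8                   # "8x more powerful than last year's"
print("implied last-year pod:", tpu_v3_pod_pflops / speedup, "petaflops or more")
```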
One of the biggest problems we are tackling with AI is the Google Assistant. Our vision for the perfect assistant is that it's naturally conversational, it's there when you need it, so that you can get things done in the real world, and we are working to make it even better. We want the Assistant to be something that's natural and comfortable to talk to, and to do that we need to start with the foundation of the Google Assistant: the voice. Today, that's how most users interact with our Assistant. Our current voice is codenamed Holly; she was a real person. She spent months in our studio, and then we stitched those recordings together to create the voice. But 18 months ago we
announced a breakthrough from our DeepMind team called WaveNet. Unlike current systems, WaveNet actually models the underlying raw audio to create a more natural voice; it's closer to how humans speak. The pitch, the pace, even all the pauses that convey meaning: we want to get all of that right. So we've worked hard on WaveNet, and as of today we are adding six new voices to the Google Assistant.
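To contrast the two approaches Sundar describes, here's a toy sketch: concatenative synthesis stitches prerecorded snippets together, while an autoregressive generator produces raw audio one sample at a time from its recent context. The "model" below is a trivial moving-average stand-in, nothing like the actual WaveNet network.

```python
# Toy contrast only: stitched prerecorded units vs. sample-by-sample generation.

import numpy as np

def concatenative_tts(units: dict, text: str) -> np.ndarray:
    """Stitch prerecorded unit waveforms together (joins can sound unnatural)."""
    return np.concatenate([units[word] for word in text.split()])

def autoregressive_tts(n_samples: int, context: int = 16) -> np.ndarray:
    """Generate one sample at a time, each conditioned on the previous ones."""
    audio = list(np.random.randn(context) * 0.01)   # seed context
    for _ in range(n_samples):
        prediction = 0.9 * np.mean(audio[-context:])  # stand-in for the network
        audio.append(prediction + np.random.randn() * 0.001)
    return np.array(audio[context:])

units = {"good": np.random.randn(800), "morning": np.random.randn(1200)}
print(concatenative_tts(units, "good morning").shape, autoregressive_tts(2000).shape)
```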
Let's have them say hello. Good morning everyone, I'm your Google Assistant; welcome to Shoreline Amphitheatre, we hope you'll enjoy Google I/O. Back to you, Sundar. You know, our goal is to one day get the right accents, languages, and dialects right, globally, and WaveNet can make this much easier. With this technology, we started wondering who we could get into the studio with an amazing voice. Take a look.
couscous a type of North African
semolina in granules made from crushed
durum wheat I want a puppy with sweet
eyes and a fluffy tail who likes my
haikus don't we all happy birthday to
the person whose birthday it is happy
birthday to you John Legend he would
probably tell you he don't want to brag
but he'll be the best assistant you ever
had can you tell me where you live you
can find me on all kinds of devices
phones Google homes and if I'm lucky in
your heart
that's right John Legend's voice is
coming to the assistant clearly he
didn't spend all the time in the studio
answering every possible question that
you could ask but wavenet allowed us to
shorten the studio time and the model
can actually capture the richness of his
voice his voice will be coming later
this year in certain contexts so that
you can get responses like this right
now in Mountain View it's sixty five
with clear skies today it's predicted to
be 75 degrees and sunny at 10 a.m. you
have an event called Google i/o keynote
then at 1:00 p.m. you have margaritas
have a wonderful day I'm looking forward
to 1 p.m. so John's voice is coming
later this year I'm really excited we
can drive advances like this with AI we
are doing a lot more with the Google
Assistant, and to talk to you a little bit more about it, let me invite Scott onto the stage. Hey, call Maddy.
Okay, dialing now. Hey Google, book a table for four. Sounds good. Hey Google, call my brother. Hey Google, call my brother. Can you text Carol for me too? Hey Kevin, that was great; we haven't made "hey Google" work yet, so you have to say hey, hey Google. Hey Google, play Sincere. Hey Google, play the next episode of The Crown on Netflix. All Channing Tatum movies. Okay, no, Google... that was great. Um, Chris, get one where you say hey Google. Google, find my phone. Finding now. Whoa. Hey Google... no. Google, lock the front door. Okay, let's just go with hey Google then; I'm sure the engineers would love to update everything.
two years ago we announced the Google
assistant right here at i/o today the
assistant is available on over 500
million devices including phones
speakers headphones TVs watches and more
it's available in cars from more than 40
auto brands and it works with over 5,000
connected home devices from dishwashers
to doorbells and people around the world
are using it every single day for
example we launched the assistant in
India last year and the response has
been incredible daily usage there has
tripled since the beginning of the year
now by the end of this year the
assistant will support 30 languages and
be available in 80 countries so we've
made great progress but we're just
getting started
today we're going to share with you some
important ways that the assistant is
becoming more naturally conversational
and visually assistive in order to help
you do more and get time back. Now, as you heard from Sundar,
new voices that you can choose from to
make the Google assistant your own are
an important aspect of making a
conversation with your assistant more
natural but to be a great conversation
partner the assistant needs to deeply
understand the social dynamics of
conversation for example let's be honest
it gets a little annoying to say hey
Google every time I want to get my
assistant's attention. This grandma, who you might have seen on YouTube, was definitely feeling that way. Tomorrow in Flagler Beach: tomorrow there will be showers, with a high of 65 and a low of 56. Well, the assistant
eventually worked for her but it
shouldn't be so hard now you won't have
to say hey Google every time check this
out hey Google did the Warriors win yes
the Warriors won 118 to 92 last Sunday
against the pelicans nice when's their
next game the Warriors next game is
today at 7:30 p.m. where they will be
playing the pelicans great it's gonna be
fun to watch tonight. Hey, remind me to find my Kevin Durant jersey when I get home. Sure, I'll remind you when you get home. Now, you'll notice that I was able to have a natural back-and-forth conversation with my assistant without having to repeat hey Google for every follow-up request. Even more helpful, my assistant could understand when I was talking to it versus when I was talking to all of you. We call this continued conversation, and it's been a top feature request. You'll be able to turn it on in the coming weeks. Yeah, all right.
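One way to picture continued conversation is a short follow-up window that stays open after each answer, so speech inside the window doesn't need the hotword. The sketch below assumes an 8-second window and skips the hard part, deciding whether speech is addressed to the device at all.

```python
# Sketch of the "no repeated hotword" idea: keep a short follow-up window open
# after each answer. The window length is an assumption for illustration.

import time

FOLLOW_UP_WINDOW_S = 8.0

class ContinuedConversation:
    def __init__(self):
        self.window_open_until = 0.0

    def hears(self, utterance: str, now: float) -> str:
        is_hotword = utterance.lower().startswith("hey google")
        in_window = now < self.window_open_until
        if not (is_hotword or in_window):
            return "(ignored: not addressed to assistant)"
        self.window_open_until = now + FOLLOW_UP_WINDOW_S  # answering reopens window
        return f"answering: {utterance}"

assistant = ContinuedConversation()
t = time.time()
print(assistant.hears("Hey Google, did the Warriors win?", t))
print(assistant.hears("When's their next game?", t + 3))    # follow-up, no hotword
print(assistant.hears("When's their next game?", t + 60))   # window already closed
```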
so let's say it's almost game time hey
Google turn on the Warriors game and
start the popcorn maker sure here's the
Golden State Warriors game on YouTube TV
and turning the popcorn maker on oh and
can you dim the lights in the family
room and in the kitchen
sure dimming the lights in the family
room and in the kitchen now you'll
notice that in both of these examples I
asked for multiple things at once
something that feels really natural for
us but it's very difficult for computers
to understand it's something we call
multiple actions, and we're rolling out this capability in the Assistant now.
Now, you might think, come on, it's easy, just look for the "and." But it's not always so straightforward: in this example, even though it has the word "and," it's not two requests, it's actually just one. But here, even though it looks very similar, the assistant has to break it apart to form the two requests; in linguistics it's called coordination reduction. Now, multiple actions can get way more complicated, so just for fun: hey Google, who was the Governor of California when Kevin Durant was drafted, and what team drafted him? Arnold Schwarzenegger was the governor in 2007; Kevin Durant was drafted by the Seattle SuperSonics.
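Here's a deliberately naive sketch of why "just split on and" isn't enough: this toy only splits when both halves look like complete commands (they start with a known action verb), which is a crude stand-in for the coordination analysis a real parser would do.

```python
# Deliberately naive splitter: only treat "and" as joining two requests when
# both sides start with a known action verb. Real systems do proper linguistic
# analysis; this is a toy.

ACTION_VERBS = {"turn", "start", "dim", "play", "set", "remind"}

def split_requests(utterance: str) -> list:
    parts = [p.strip() for p in utterance.lower().split(" and ")]
    if len(parts) == 2 and all(p.split()[0] in ACTION_VERBS for p in parts):
        return parts
    return [utterance.lower()]   # treat as a single request

print(split_requests("turn on the Warriors game and start the popcorn maker"))
# -> two requests
print(split_requests("turn on the lights in the hallway and the kitchen"))
# -> one request: the second half isn't a command on its own
```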
alright just in case you were wondering
alright so next we've also been working
on improving the conversation with the
Google assistant for families last fall
we launched our family experience for
the Google assistant it provides family
friendly games activities and stories
we've continued to grow our library and
families have listened to over a hundred
and thirty thousand hours of children's
stories in the last two months alone now
as we continue to improve the experience
for families a concern that we've heard
from many parents including people on
the team who have children, is: are our kids learning to be bossy and demanding when
they can just say hey Google to ask for
anything they need it's not a simple
area but one step that we've been
working on is something we call pretty
please some of the parents on the team
have been testing out with their
families take a look hey Google Talk -
voice thanks for saying please please
what a nice way to ask me thanks for
asking so nicely once upon a time there
was a wacky walrus please help
you're very polite
So the assistant understands and responds to positive conversation with polite reinforcement. Now, we've been consulting with families and child development experts, and we plan to offer Pretty Please as an option for families later this year.
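A toy version of the politeness reinforcement could be as simple as checking for polite phrasing and prepending a positive acknowledgment; the real Pretty Please feature, built with child-development experts, is of course more nuanced than this sketch.

```python
# Toy sketch of politeness reinforcement: acknowledge polite requests before
# giving the normal response. Purely illustrative.

POLITE_MARKERS = ("please", "thank you", "thanks")

def respond(request: str, answer: str) -> str:
    if any(marker in request.lower() for marker in POLITE_MARKERS):
        return "Thanks for asking so nicely! " + answer
    return answer

print(respond("Hey Google, please tell me a story", "Once upon a time..."))
print(respond("Tell me a story", "Once upon a time..."))
```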
So with new voices for your assistant, continued conversation, multiple actions, and Pretty Please, AI is helping us make big strides so everyone can have a more natural conversation with their assistant. And now I'd like to introduce Lilian, who's going to share some exciting things we're doing to bring voice and visual assistance together.
well thanks Scott and good morning
everyone over the last couple of years
the assistant has been focused on the
verbal conversation that you can have
with Google today we're going to unveil
a new visual canvas for the Google
assistant across screens this will bring
the simplicity of voice together with a
rich visual experience now I'm gonna
invite Maggie to come out because we're
gonna be switching to a lot of live
demos we gave you an early look at our
new smart displays at CES in January
we're working with some of the best
consumer electronic brands and today I'm
excited to announce that the first smart
displays will go on sale in July today
I'll show you some of the ways that this
new device can make your day easier by
bringing the simplicity of voice together with the glanceability of a touchscreen. So let's
switch over to the live demos now this
is one of the Lenovo smart displays the
ambient screen integrates with Google
photos and greets me with pictures of my kids, Bella and Hudson. Those are really my kids, and it's the best way to start my day every
morning now because the device is
controlled by voice I can watch videos
or live TV with just a simple command
this makes it so easy to enjoy my
favorite shows while multitasking around
the house hey Google let's watch Jimmy
Kimmel Live okay playing Jimmy Kimmel
Live on YouTube TV here's something from
my life: I was driving my daughter to school this morning. So, that's right: on
YouTube TV you will be able to watch all
of these amazing shows from local news
live sports and much more and they will
be available on smart displays now of
course you can also enjoy all the normal
content from YouTube including how-to
videos music and original shows like the
brand new series Cobra Kai which we
started binge watching this week because
it's so good now cooking is another
instance where the blend of voice and
visuals is incredibly useful Nick and I
are always looking for simple
family-friendly recipes hey Google show
me recipes for Pizza bombs
sure here are some recipes so we can
choose the first one from tasty that one
looks good you see all the recipe
details come right up and we can just
tap to start cooking
Sure, here's Tasty. So, seeing a video demonstration along with the spoken instructions is a total game changer for cooking, especially when you have your hands full. Thanks, Maggie.
So we showed you a couple of ways that smart displays can make life at home easier, but there are so many more: from staying in touch with family with broadcast and Duo video calling, to keeping an eye on your home with all of our other smart home partners, to seeing in advance what the morning commute's like with Google Maps.
we're thoughtfully integrating the best
of Google and working with developers
and partners all around the world to
bring voice and visuals together in a
completely new way for the home now
inspired by the smart display
experiences we've also been working to
reimagine the assistant experience on
the screen that's with us all the time
our mobile phones so I'm gonna give you
a sneak peek into how the assistant on
the phone is becoming more immersive
interactive and proactive so we're gonna
switch to another live demo hey Google
tell me about Camila Cabello. According to Wikipedia, Karla Camila Cabello Estrabao is an American singer and songwriter. So as you can see, we're taking full advantage of
the screen to give you a rich and
immersive response
here's another turn down the heat
sure cooling the living room down and
for a smart home request what you can
see here is we're bringing the controls
right into your fingertips and here's
one of my favorites hey Google order my
usual from Starbucks hello welcome back
to Starbucks
that's one tall nonfat latte with
caramel drizzle anything else so no
thanks and are you picking that up at
the usual place I'm gonna tap yes okay
your orders in see you soon
yeah so we're excited to share that we
are working with Starbucks, Dunkin' Donuts, Doordash, Domino's and many other
partners on a new food pickup and
delivery experience for the Google
assistant we have already started
rolling some of these out with many more
partners coming soon now rich and
interactive responses to my requests are
really helpful but my ideal assistant
should also be able to help in a
proactive way so when I'm in the
assistant now and swipe up I now get a
visual snapshot of my day
I see helpful suggestions based on the
time my location and even my recent
interactions with the assistant I also
have my reminders packages and even
notes a list organized and accessible
right here I love the convenience of
having all these details helpfully
curated and so easy to get to now this
new visual experience for the phone is
thoughtfully designed with AI at the
core it will launch on Android this
summer and on iOS later this year
now sometimes the assistant can actually
be more helpful by having a lower visual
profile so like when you're in the car
let's say you should stay focused on
driving so let's say I'm heading home
from work I have Google Maps showing me
the fastest route during rush hour
traffic. hey Google, send Nick my ETA and play some hip-hop. okay,
letting Nick know you're 20 minutes away
and check out this hip-hop music station
on YouTube so it's so convenient to
share my ETA with my husband with just a
simple voice command I'm excited to
share that the assistant will come to
navigation and Google Maps
this summer so across smart displays
phones and in Maps, this gives you a
sense of how we're making the google
assistant more visually assistive
sensing when to respond with voice and
when to show a more immersive and
interactive experience and with that
I'll turn it back to sundar thank you
thanks, Lilian. it's great to see the progress with the assistant. as I said earlier, our vision for the assistant is to
help you get things done it turns out a
big part of getting things done is making a phone call. you may want to get an oil change scheduled, maybe call a
plumber in the middle of the week or
even schedule a haircut appointment you
know we are working hard to help users
through those moments we want to connect
users to businesses in a good way
businesses actually rely a lot on this
but even in the u.s. 60% of small
businesses don't have an online booking
system. we think AI can help with this problem. so let's go back to
this example let's say you want to ask
Google to make you a haircut appointment
on Tuesday between 10:00 and noon what
happens is the Google assistant makes
the call seamlessly in the background
for you so what you're going to hear is
the Google assistant actually calling a
real salon to schedule the appointment
for you. let's listen. for what time are you looking for? around 12 p.m. we do not have a 12 p.m. available, the
closest we have to that is a 1:15 do you
have anything between 10:00 a.m. and
12:00 p.m. depending on what service she
would like what service is she looking
for just a woman's haircut for now ok we
have a 10 o'clock 10:00 a.m. is fine
okay what's her first name the first
name is Lisa. okay, perfect, so I will see Lisa at 10 o'clock on May 3rd. okay
great have a great day bye
that was a real call you just heard. the
amazing thing is the assistant can
actually understand the nuances of
conversation we've been working on this
technology for many years it's called
Google duplex it brings together all our
investments over the years in natural language understanding, deep learning and text-to-speech. by the way, when we are
done the assistant can give you a
confirmation notification saying your
appointment has been taken care of let
me give you another example let's say
you want to call a restaurant but maybe
it's a small restaurant which is not
easily available to book online the call
actually goes a bit differently than
expected, so take a listen. it's for four people. for five people? four people. okay, for four people you can come. how long is the wait usually to be seated? for next Wednesday the 7th? oh no, it's not too busy, you can come. okay, oh I gotcha, thanks
again, that was a real call. in many of these examples the calls don't go quite as expected, but the assistant
understands the context the nuance it
knew to ask for wait times in this case
and handle the interaction gracefully
but we are still developing this
technology and we actually want to work
hard to get this right get the user
experience and the expectation right for
both businesses and users, but done
correctly it'll save time for people and
generate a lot of value for businesses
we really want it to work in cases, say, where you're a busy parent in the morning and your kid is sick and you want to call for a doctor's appointment, so we're gonna work hard to get this right. there
is a more straightforward case where we
can roll this out sooner where for
example every single day we get a lot of
queries into Google where people are
wondering on the opening and closing
hours of businesses but it gets tricky
during holidays and businesses get a lot
of calls so we as Google can make just
that one phone call and then update the
information for millions of users and
it'll save a small business countless
number of calls so we're gonna get
moments like this right and make the
experience better for users this is
going to be rolling out as an experiment
in the coming weeks and so stay tuned
in a common theme across all this is we
are working hard to give users back time
we've always been obsessed about that at
Google search is obsessed about getting
users to answers quickly and giving them
what they want which brings me to
another area digital wellbeing now based
on our research we know that people feel
tethered to their devices, and I'm sure it resonates with all of you. there is
increasing social pressure to respond to
anything you get right away people are
anxious to stay up-to-date with
all the information out there they have
FOMO, fear of missing out. we think there's a chance for us to do
better we've been talking to people and
some people introduced to us the concept
of JOMO, the actual joy of missing out.
so we think we can really help users
with digital wellbeing this is going to
be a deep ongoing effort across all our
products and platforms and we need all
your help we think we can help users
with their digital well-being in four
ways we want to help you understand your
habits focus on what matters
switch off when you need to and above
all find balance with your family so let
me give a couple of examples you're
going to hear about this from Android a
bit later in their upcoming release but
one of my favorite features is dashboard
in Android. we're going to give you a full picture of how you're spending your time: the apps where you're spending your time, the number of times you unlock your phone on a given day, the number of notifications you got, and
we're going to really help you deal with
this better you know apps can also help
YouTube is going to take the lead and if
you choose to do so it'll actually
remind you to take a break so for
example, if you've been watching for a while, maybe it'll show up and say hey, it's time to take a break.
YouTube is also going to let users, if they choose, combine all their notifications in the form of a daily digest, so that if you have four notifications they come to you once during the day. YouTube is going to roll out all
these features this week you know we've
been doing a lot of work in this area
family link is a great example where we
provide parents tools to help manage
kids screen time I think this is an
important part of it we want to do more
here. we want to equip kids to make
smart decisions. so we have a new approach at Google, it's called Be Internet Awesome, to help kids become safe explorers of the digital world. we want kids to be secure, kind and mindful online, and we are pledging
to train an additional 5 million kids
this coming year. all these tools you're seeing are launching with our digital
wellbeing site later today another area
where we feel tremendous responsibility
is news. news is core to our mission, and at times like this it's more important than
ever to support quality journalism it's
foundational to how democracies work
I've always been fond of news growing up
in India, I have a distinct memory of waiting for the physical newspaper. my grandfather used to stay right next to us, and there was a
clear hierarchy he got his hands on the
newspaper first then my dad and then my
brother and I would go at it you know I
was mainly interested in the sports
section at that time but over time I
developed a fondness for news and it
stayed with me even till today. it is a challenging time for the news industry.
recently we launched Google News
initiative and we committed 300 million
dollars over the next three years we
want to work with organizations and
journalists to help develop innovative
products and programs
that help the industry. we've also had a
product here for a long time Google News
it was actually built right after 9/11
it was a 20% project by one of our
engineers who wanted to see news from a
variety of sources to better understand
what happened since then if anything the
volume and diversity of content has only
grown I think there is more great
journalism being produced today than
ever before
it's also true that people turn to
Google in times of need and we have a
responsibility to provide that
information this is why we have
reimagined our news product we are using
AI to bring forward the best of what
journalism has to offer we want to give
users quality sources that they trust
but we want to build a product that
works for publishers above all we want
to make sure we are giving them deeper
insight and a fuller perspective about
any topic they're interested in I'm
really excited to announce the new
Google News and here's Tristan to tell
you more
Thank You sundar with the new Google
News we set out to help you do three
things first keep up with the news you
care about second understand the full
story and finally enjoy and support the
sources you love after all without news
publishers and the quality journalism
that they produce we'd have nothing to
show you here today so let's start with
how we make it easy for you to keep up with
the news you care about as soon as I
open Google News right at the top I get
a briefing with the top five stories I
need to know right now as I move past my
briefing there are more stories selected just for me. our AI constantly reads the firehose of the web for you, the millions
of articles videos podcasts and comments
being published every minute and
assembles the key things you need to
know Google News also pulls in local
voices and news about events in my area
it's this kind of information that makes
me feel connected to my community this
article from The Chronicle makes me
wonder how long it would take to ride
across this new Bay Bridge what's cool
is I didn't have to tell the app that I
follow politics love to bike or want
information about the Bay Area it works
right out of the box and because we've
applied techniques like reinforcement
learning throughout the app the more I
use it the better it gets and at any
point I can jump in and say whether I
want to see less or more of a given
publisher or topic and whenever I want
to see what the rest of the world is
reading I can switch over to headlines
to see the top stories that are
generating the most coverage right now
around the world so let's keep going you
can see there are lots of big gorgeous
images that make this app super
engaging and a truly great video
experience let's take a look
this brings you all the latest videos
from YouTube and around the web. all of
our design choices focus on keeping the
app light easy fast and fun our guiding
principle is to let the stories speak for
themselves so it's pretty cool right
what we're seeing here throughout the
app is the new Google material theme the
entire app is built using material
design our adaptable unified design
system that's been uniquely tailored by
Google later today you'll hear more
about this and how you can use material
themes in your products we're also
excited to introduce a new visual format
we call newscasts you are not going to
see these in any other news app
newscasts are kind of like a preview of
the story, and they make it easier for you to get a feel for what's going on. check out
this one on the Star Wars movie here
we're using the latest developments in
natural language understanding to bring
together everything from the Solo movie trailer to news articles to quotes from the cast and more, in a fresh
presentation that looks absolutely great
on your phone
newscast gives me an easy way to get the
basics and decide where I want to dive
in more deeply and sometimes I even
discover things I never would have found
out otherwise for the stories I care
about most or the ones that are really
complex I want to be able to jump in and
see many different perspectives so let's
talk about our second goal for Google
News understanding the full story today
it takes a lot of work to broaden your
point of view and understand a news
story in depth with Google News we set
out to make that effortless full
coverage is an invitation to learn more
it gives a complete picture of a story
in terms of how it's being reported from
a variety of sources and in a variety of
formats we assemble full coverage using
a technique we call temporal co-locality. this technique enables us to map relationships between entities and understand the people
places and things in a story right as it
evolves we applied this to the deluge of
information published to the web at any
given moment and then organize it around
storylines all in real time this is by
far the most powerful feature of the app
and provides a whole new way to dig into
the news take a look at how full
coverage works with the recent power
outage in Puerto Rico there are so many
questions I had about this story like
how did we get here could it have been
prevented and are things actually
getting better
we built full coverage to help make
sense of it all all in one place we
start out with the set of top headlines
that tell me what happened and then
start to organize around the key story
aspects using our real-time event
understanding for news events that have
played out like this one over weeks and
months, you can understand the origin and arc of developments by looking at
our timeline of the key moments and
while the recovery has begun we can
clearly see there's still a long way to
go there are also certain questions
we're all asking about a story and we
pull those out so you don't have to hunt
for the answers we know context and
perspective come from many places so we
show you tweets from relevant voices and
opinions analysis and fact checks to
help you understand the story that one
level deeper in each case our AI is
highlighting why this is an important
piece of information and what unique
value it brings now when I use full
coverage I find that I can build a huge
amount of knowledge on the topics I care
about. it's a true 360-degree view that
goes well beyond what I get from just
scanning a few headlines on top of this
our research shows that having a
productive conversation or debate
requires everyone to have access to the
same information which is why everyone
sees the same content in full coverage
for a topic
it's an unfiltered view of events from a
range of trusted news sources
thank you so I gotta say I love these
new features and these are just a few of
the things we think make the new Google
News so exciting but as we mentioned
before none of this would exist without
the great journalism newsrooms produce
every day which brings us to our final
goal helping you enjoy and support the
news sources you love
we've put publishers front and center
throughout the app and here in the
newsstand section it's easy to find and
follow the sources I already love and
browse and discover new ones including
over 1,000 magazine titles like Wired
National Geographic and People, which all
look great on my phone I can follow
publications like USA Today by directly
tapping the star icon and if there's a
publication I want to subscribe to say
The Washington Post
we make it dead simple no more forms
credit card numbers or new passwords
because you're signed in with your
Google account you're set when you
subscribe to a publisher we think you
should have easy access to your content
everywhere and this is why we developed
subscribe with Google subscribe with
Google enables you to use your Google
account to access your paid content
everywhere across all platforms and
devices on Google search Google News and
publishers' own sites. we built this in
collaboration with over 60 publishers
around the world and it will be rolling
out in the coming weeks
thank you and this is one of the many
steps we're taking to make it easier to
access dependable high-quality
information when and where it matters
most so that's the new Google News it
helps you keep up with the news you care
about with your briefing and newscasts
understand the full story using full
coverage, and enjoy and support the news sources you love by reading, following
and subscribing and now for the best
news of all we're rolling out on Android
iOS and the web in 127 countries starting today. I think so too, pretty cool. it will be available
to everyone next week at Google we know
that getting accurate and timely
information into people's hands and
building and supporting high-quality
journalism is more important than it
ever has been right now and we are
totally committed to doing our part we
can't wait to continue on this journey
with you and now I'm excited to
introduce Dave to tell you more about
what's going on in Android Android
started with a simple goal of bringing
open standards to the mobile industry
today it is the most popular mobile
operating system in the world
if you believe in openness if you
believe in choice if you believe in
innovation from everyone then welcome to
Android
hi everyone it's great to be here at
Google i/o 2018
I think clearly all developer
conferences should be held outside it's
pretty damn nice out here. it was 10 years ago that we launched the first Android phone, the T-Mobile G1. it was
with a simple but bold idea to build a
mobile platform that was free and open
to everyone and today that idea is
thriving our partners have launched tens
of thousands of smartphones used by
billions of people all around the world
and through this journey we've seen
Android become more than just a
smartphone operating system powering new
categories of computing including
wearables, TV, auto, AR, VR, IoT, and the
growth of Android over the last 10 years
has helped fuel the shift in computing
from desktop to mobile, and as Sundar mentioned, the world is now at the precipice of another shift. AI is gonna profoundly
change industries like healthcare and
transport and is already starting to
change ours and this brings me to the
new version of Android we're working on: Android P. Android P is an important
first step towards this vision of AI at
the core of the operating system
in fact AI underpins the first of three
themes in this release which are
intelligence simplicity and digital
well-being so starting with intelligence
we believe smartphones should be smarter
they should learn from you and they
should adapt to you. technologies such as on
device machine learning can learn your
usage patterns and automatically
anticipate your next actions saving you
time and because it runs on device the
data is kept private to your phone so
let's take a look at some examples of
how we're applying these technologies to
Android to build a smarter operating
system in pretty much every survey of
smartphone users you'll see battery life
as the top concern and I don't know
about you but this is my version of
Maslow's hierarchy of needs
and we've all been there: your battery's been doing okay, but then you have one of those outlier days where it's draining faster than normal, leaving you running to the charger. with Android P we partnered with DeepMind to
work on a new feature we call adaptive
battery it's designed to give you a more
consistent battery experience. adaptive battery uses on-device machine learning to figure out which apps you'll use in the next few hours and which you won't use until later today, if at all. and then
with this understanding the operating
system adapts to your usage patterns so
that it spends battery only on the
apps and services that you care about, and
the results are really promising we're
seeing a 30% reduction in CPU wake-ups
for apps in general and this combined
with other performance improvements
including running background processes
on the small CPU cores is resulting in
an increase in battery life for many users, which is pretty cool.
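for developers who want a feel for the machinery here: adaptive battery is built on top of app standby buckets, and Android P lets an app ask which bucket the system has placed it in. the snippet below is a minimal sketch of that check, with descriptions that are ours, not Google's wording.

```kotlin
// Minimal sketch: querying the standby bucket adaptive battery has assigned to your app.
// Requires API 28 (Android P); the descriptions are illustrative, not official.
import android.app.usage.UsageStatsManager
import android.content.Context

fun describeStandbyBucket(context: Context): String {
    val usm = context.getSystemService(Context.USAGE_STATS_SERVICE) as UsageStatsManager
    return when (usm.appStandbyBucket) {
        UsageStatsManager.STANDBY_BUCKET_ACTIVE -> "active: in use now, no restrictions"
        UsageStatsManager.STANDBY_BUCKET_WORKING_SET -> "working set: used regularly"
        UsageStatsManager.STANDBY_BUCKET_FREQUENT -> "frequent: used often, jobs may be deferred"
        UsageStatsManager.STANDBY_BUCKET_RARE -> "rare: background work heavily restricted"
        else -> "unknown bucket"
    }
}
```

checking this is a reasonable way to confirm your own app isn't being penalized by the predictions described above.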
another example of how the OS is adapting to the user is auto
brightness now most modern smartphones
will automatically adjust the brightness
given the current lighting conditions
but it's one-size-fits-all: it doesn't
take into account your personal
preferences and environment so often
what happens is you then need to
manually adjust the brightness slider, resulting in the screen later becoming too bright or too dim. with Android P we're introducing a new on-device machine learning feature we call adaptive brightness. adaptive brightness learns how
you like to set the brightness slider
given the ambient lighting and then does
it for you in a power efficient way so
you'll literally see the brightness
slider move as the phone adapts to your
preferences and it's extremely effective
in fact we're seeing almost half of our
test users now make fewer manual
brightness adjustments compared to any
previous version of Android we're also
making the UI more intelligent last year
we introduced the concept of predicted
apps a feature that places the next apps
the OS anticipates you need on the path
you'd normally follow to launch that app
and it's very effective with an almost
60 percent prediction rate
with Android P we're going beyond simply predicting the next app to launch to predicting the next action you want to take. we call this feature app actions. so let's take a look at how it
works at the top of the launcher you can
see two actions one to call my sister
Fiona and another to start a workout on
Strava for my evening run so what's
happening here is that the actions are
being predicted based on my usage
patterns the phone is adapting to me and
trying to help me get to my next task
more quickly as another example if I
connect my headphones Android will
surface an action to resume the album I
was listening to. to support app actions, developers just need to add an actions.xml file to their app, and then actions surface not just in the launcher but in smart text selection, the Play Store, Google Search and the Assistant.
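the actions.xml file itself is declarative, but on the app side an action ultimately arrives as a plain deep-link intent. here is a hedged sketch of the receiving Activity; the URI and the "type" parameter are hypothetical, not from the keynote.

```kotlin
// Hedged sketch: the Activity that fulfills an app action just reads the deep-link
// URI it was launched with. The path and query parameter below are made up.
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

class WorkoutActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // e.g. https://example.com/workout?type=evening_run
        val workoutType = intent?.data?.getQueryParameter("type") ?: "run"
        startWorkout(workoutType)
    }

    private fun startWorkout(type: String) {
        // Navigate to the workout screen for the requested type.
    }
}
```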
take Google Search: we're experimenting with
different ways to surface actions for
apps you've installed and use a lot for
example, I'm a big Fandango user, so when I search for the new Avengers movie, Infinity War, in addition to regular suggestions I get an action to open the Fandango app to buy tickets. pretty cool. actions are a simple but powerful idea for providing deep links into the app given your context, but even more
powerful is bringing part of the app UI
to the user right there, and we call this feature slices. slices are a new API
for developers to define interactive
snippets of their app UI that can be
surfaced in different places in the OS
in Android P we're laying the
groundwork by showing slices first in
search so let's take a look let's say
I'm out and about and I need to get a
ride to work. if I type Lyft into the Google Search app, I now see a slice from the Lyft app installed on my phone. Lyft is using the Slice API's rich array of UI templates to render a slice of their app in the context of Search, and Lyft is able to give me the price for my trip to work. the slice is interactive, so I can order the ride directly from it. pretty nice.
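under the hood a slice is served by a SliceProvider in the app. the sketch below uses the androidx.slice builders with made-up ride values and is simplified: a real slice like Lyft's would also wire up a primary SliceAction backed by a PendingIntent so the tap can actually order the ride.

```kotlin
// Rough, simplified sketch of a SliceProvider; the title, subtitle and values are hypothetical.
import android.net.Uri
import androidx.slice.Slice
import androidx.slice.SliceProvider
import androidx.slice.builders.ListBuilder

class RideSliceProvider : SliceProvider() {

    override fun onCreateSliceProvider(): Boolean = true

    override fun onBindSlice(sliceUri: Uri): Slice? {
        val ctx = context ?: return null
        return ListBuilder(ctx, sliceUri, ListBuilder.INFINITY)
            .addRow(
                ListBuilder.RowBuilder()
                    .setTitle("Ride to work")            // hypothetical destination
                    .setSubtitle("18 min away · $10.14") // hypothetical ETA and price
            )
            .build()
    }
}
```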
the slice templates are versatile, so
developers can offer everything from
playing a video to say checking into a
hotel. as another example, if I search for Hawaii, I'll see a slice from Google Photos with my vacation pictures. and we're
working with some amazing partners on
app actions and slices and we'll be
opening an early access program to
developers more broadly next month so
we're excited to see how actions and in
particular slices will enable a dynamic
two-way experience where the app's UI can
intelligently show up in context so
that's some of the ways that we're
making Android more intelligent by
teaching the operating system to adapt
to the user machine learning is a
powerful tool but it can also be
intimidating and costly for developers
to learn and apply and we want to make
these tools accessible and easy to use
to those who have little or no expertise
in machine learning so today I'm really
excited to announce ML Kit, a new set of APIs available through Firebase. with
ML Kit you get on-device APIs for text recognition, face detection, image labeling and a lot more, and ML Kit also supports the ability to tap into Google's cloud-based ML technologies. architecturally, you can think of ML Kit
as providing ready-to-use models built on TensorFlow Lite and optimized for mobile. and best of all, ML Kit is
cross-platform so it runs on both
Android and iOS
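as a concrete sketch of what "on-device APIs through Firebase" meant at launch, here is roughly what the text recognition call looked like in the 2018 firebase-ml-vision library; the bitmap source and callback are placeholders of ours, and the API has since moved into the standalone ML Kit SDK.

```kotlin
// Sketch of ML Kit's on-device text recognition as exposed through Firebase in 2018.
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

fun recognizeText(bitmap: Bitmap, onResult: (String) -> Unit) {
    val image = FirebaseVisionImage.fromBitmap(bitmap)
    val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer
    recognizer.processImage(image)
        .addOnSuccessListener { visionText -> onResult(visionText.text) }        // full recognized text
        .addOnFailureListener { e -> onResult("recognition failed: ${e.message}") }
}
```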
we're working with an early set of partners on ML Kit, with some really great results. for example, the popular calorie counting app Lose It! is
using our text recognition model to scan
nutritional information, and ML Kit's custom model APIs to automatically classify 200 different foods through the camera. you'll hear more about ML Kit at the developer keynote later today
so we're excited about making your
smartphone more intelligent but it's
also important to us that the technology fades to the background when needed. one of our key goals over the last few years has been to evolve Android's UI to be simpler and more approachable, both for the current set of users and the next billion Android users. with Android P we put a special emphasis on simplicity by addressing many pain points where we thought, and you told us, the experience was more complicated than it ought to be,
and you'll find these improvements on
any device that adopts Google's version
of the Android UI such as Google pixel
and Android one devices so let me walk
you through a few live demos on my phone
what could possibly go wrong
in front of 7,000 people in an amphitheater, okay.
as part of Android P we're introducing a
new system navigation that we've been
working on for more than a year now and
the new design makes androids
multitasking more approachable and
easier to understand and the first
striking thing you'll notice is the
single clean home button and the design
recognizes a trend towards smaller
screen bezels and places an emphasis on
gestures over multiple buttons at the
edge of the screen so when I swipe up
I'm immediately brought to the overview
where I can resume apps I've recently
used I also get five predicted apps at
the bottom of the screen to save me time
now if I continue to swipe up, or I swipe up a second time, I get to all apps. so architecturally, what we've done is
combine the all apps and overview spaces
into one, and the swipe-up gesture works
from anywhere no matter what app I'm in
so that I can quickly get back to all
apps an overview without losing the
context I'm in
and if you prefer you can also use the
quick scrub gesture by sliding the home
button sideways to scroll through your
recent set of apps like so now one of
the nice things about the larger
horizontal overview is that the app content is now glanceable, so you can easily refer back to information in a previous app. even better, we've extended
smart text selection to work in overview
so for example if I tap anywhere on the
phrase The Killers all of the phrase
will be selected for me then I get an
action to listen to it on Spotify like
so. and we've extended smart text selection's neural network to recognize
more entities like sports teams and
music artists and flight codes and more
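the developer-facing piece of smart text selection is the TextClassifier API. a hedged sketch of asking it what kind of entity a selected span contains follows; the helper name is ours, and the set of recognized entities has grown over releases.

```kotlin
// Hedged sketch: ask the system TextClassifier what kind of entity a text span is.
// The 4-argument classifyText overload is available since API 26.
import android.content.Context
import android.os.LocaleList
import android.view.textclassifier.TextClassificationManager

fun classifySelection(context: Context, text: CharSequence, start: Int, end: Int): String {
    val tcm = context.getSystemService(TextClassificationManager::class.java)
    val result = tcm.textClassifier.classifyText(text, start, end, LocaleList.getDefault())
    return if (result.entityCount > 0) result.getEntity(0) else "unknown"  // e.g. "address", "email"
}
```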
I've been using this new navigation
system for the last month and I
absolutely love it it's a much faster
more powerful way to multitask on the go
so changing how our navigation works is a
pretty big deal but sometimes small
changes can make a big difference too so
take volume control and we've all been
there you try to turn down the volume
before a video starts but instead you
turn down the ringer volume and then the
video blasts everyone around you so how
are we fixing it well you can see the
new simplified volume controls here
they're vertical and located beside the
hardware buttons so they're intuitive
but the key difference is that the
slider now adjusts the media volume by
default because that's the thing you
want to change most often and for the
ringer volume all you really care about
is on silent and off like so
okay we've also greatly simplified
rotation and if you're like me and hate
your device rotating at the wrong time
you'll love this feature so right now
I'm in the locked rotation mode and let
me launch an app and you'll notice that
when I rotate the device a new rotation
button appears on the nav bar and then I
can just tap on it and rotate under my
own control we go
all right so that's a quick tour of some
of the ways that we simplified the user
experience in Android P, and there's lots more: everything from a redesigned work profile to better screenshots to improved notifications management and
more speaking of notifications
management we want to give you more
control over demands on your attention
and this highlights a concept that
sundar alluded to earlier making it
easier to move between your digital life
and your real life to learn more about
this important area and our third theme
let me hand over to Sameer thanks
hi everyone
on a recent family vacation my partner
asked if she could see my phone right
after we got to our hotel room
she took it from me walked over to the
hotel safe locked it inside and turned
and looked me right in the eye and said
you get this back in seven days when we
leave whoa
I was shocked I was kind of angry but
after a few hours something pretty cool
happened without all the distractions
from my phone I was actually able to
disconnect be fully present and I ended
up having a wonderful family vacation
but it's not just me
our team has heard so many stories from
people who are trying to find the right
balance with technology as you heard
from sundar helping people with their
digital wellbeing is more important to
us than ever people tell us a lot of the
time they spend on their phone is really
useful but some of it they wish they'd
spent on other things in fact we found
over 70% of people want more help
striking this balance so we've been
working hard to add key capabilities
right into Android to help people find
the balance with technology that they're
looking for one of the first things we
focused on was helping you understand
your habits
Android P will show you a dashboard of how you're spending time on your device. as you saw earlier, you can see how much time you spent in apps, how many
times you've unlocked your device today
and how many notifications you've
received and you can drill down on any
of these things for example here's my
Gmail data from Saturday and when I saw
this it did make me wonder whether I
should have been on my email all weekend
but that's kind of the point of the dashboard.
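the per-app numbers behind a dashboard like this have long been queryable by apps as well, via UsageStatsManager, once the user grants usage access in settings. a minimal sketch, with a helper name and a five-app cutoff that are ours:

```kotlin
// Minimal sketch: top apps by foreground time over the last day, from UsageStatsManager.
// Requires the user to grant usage access (Settings > Special app access > Usage access).
import android.app.usage.UsageStatsManager
import android.content.Context
import java.util.concurrent.TimeUnit

fun topAppsToday(context: Context): List<Pair<String, Long>> {
    val usm = context.getSystemService(Context.USAGE_STATS_SERVICE) as UsageStatsManager
    val end = System.currentTimeMillis()
    val start = end - TimeUnit.DAYS.toMillis(1)
    return usm.queryUsageStats(UsageStatsManager.INTERVAL_DAILY, start, end)
        .sortedByDescending { it.totalTimeInForeground }
        .take(5)
        .map { it.packageName to TimeUnit.MILLISECONDS.toMinutes(it.totalTimeInForeground) }
}
```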
now, when you're engaging is one part of understanding, but what
you're engaging with in apps is equally
important it's like watching TV catching
up on your favorite shows at the end of
a long day can feel pretty good but
watching an infomercial might leave you
wondering why you didn't do something
else instead
many developers call this concept
meaningful engagement and we've been
working closely with many of our
developer partners who share the goal of
helping people use technology in healthy
ways so in Android P developers can link
to more detailed breakdowns of how
you're spending time in their app from
this new dashboard for example YouTube
will be adding a deep link where you can
see total watch time across mobile and
desktop and access many of the helpful
tools that Sundar shared
earlier now understanding is a good
start but Android P also gives you
controls to help you manage how and when
you spend time on your phone maybe you
have an app that you love but you're
spending more time in it than you
realized
Android P lets you set time limits on
apps and will nudge you when you're
close to your limit that it's time to do
something else
and for the rest of the day that app
icon is greyed out to remind you of your
goal people have also told us they
struggle to be fully present for the
dinner that they're at or the meeting
that they're attending because the
notifications they get on their device
can be distracting and too tempting to
resist and come on we've all been there
so we're making improvements to do not
disturb mode to silence not just the
phone calls and texts but also the
visual interruptions that pop up on your
screen to make do not disturb even
easier to use we've created a new
gesture that we've affectionately
codenamed shush if you turn your phone
over on the table it automatically
enters do not disturb so you can focus
on being present
no pings vibrations or other
distractions
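the shush gesture itself lives in the OS, but the do-not-disturb state it flips is reachable through a public API. a hedged sketch of an app doing the same thing, assuming the user has granted it notification-policy access:

```kotlin
// Hedged sketch: programmatically entering Do Not Disturb, allowing only priority
// interruptions (which is where the starred-contact exception described next lives).
import android.app.NotificationManager
import android.content.Context

fun enterDoNotDisturb(context: Context) {
    val nm = context.getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
    if (nm.isNotificationPolicyAccessGranted) {
        nm.setInterruptionFilter(NotificationManager.INTERRUPTION_FILTER_PRIORITY)
    }
}
```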
of course in an emergency we all want to
make sure we're still reachable by the
key people in our lives like your
partner or your child's school Android P
will help you set up a list of contacts
that can always get through to you with
a phone call even if Do Not Disturb is
turned on finally we heard from people
that they often check their phone right
before going to bed and before you know
it an hour two has slipped by and
honestly this happens to me at least
once a week getting a good night's sleep
is critical and technology should help
you with this not prevent it from
happening. so we created Wind Down mode: you
can tell the Google assistant what time
you aim to go to bed and when that time
arrives it will switch on do not disturb
and fade the screen to grayscale which
is far less stimulating for the brain
and can help you set the phone down it's
such a simple idea but I found it's
amazing how quickly I put my phone away
when all my apps go back to the days
before color TV don't worry all the
colors return in the morning when you
wake up okay that was a quick tour of
some of the digital well-being features
we're bringing to Android P this fall
starting with Google pixel digital well
being is gonna be a long-term theme for
us so look for much more to come in the
future beyond the three themes of
intelligence simplicity and digital
well-being that Dave and I talked about
there are literally hundreds of other
improvements coming in Android P I'm
especially excited about the security
advancements we've added to the platform
and you can learn more about them at the
Android security session on Thursday but
your big question is that's all great
how do I try some of this stuff well
today we're announcing Android P beta
and thanks to efforts in Android Oreo to make OS upgrades easier, Android P beta
is available on Google pixel and seven
more manufacturer flagship devices today
you can head over to this link to find
out how to receive the beta on your
device and please do let us know what
you think okay that's a wrap on what's
new in Android and now I'd like to
introduce Jen to talk about Maps thank
you
it has changed I'm sure so much and you
can actually be part of it being able to
be armed with the knowledge of where
you're going that you're gonna build
together like anybody else can two
consecutive earthquakes hit Mexico City, and Google Maps helped us respond to emergency crises like this. the hurricane
had turned Houston into islands and the
roads were changing constantly we kept
saying thank God for Google like what
would we have done it's really cool that
this is helping people to keep doing
what they love doing and keep doing what
they need to do
building technology to help people in
the real world every day has been core to who we are and what we've focused on at
Google from the very start recent
advancements in AI and computer vision
have allowed us to dramatically improve
long-standing products like Google Maps
and have also made possible brand-new
products like google lens let's start
with Google Maps Maps was built to
assist everyone wherever they are in the
world we've mapped over 220 countries
and territories and put hundreds of
millions of businesses and places on the
map and in doing so we've given more
than a billion people the ability to
travel the world with the confidence
that they won't get lost along the way
but we're far from done we've been
making maps smarter and more detailed as
advancements in AI have accelerated
we're now able to automatically add new
addresses businesses and buildings that
we extract from Street View and
satellite imagery directly to the map
this is critical in rural areas in
places without formal addresses and in
fast changing cities like Lagos here
where we literally changed the face of
the map in the last few years
hello Nigeria
we can also tell you if the business
you're looking for is open how busy it
is what the wait time is and even how
long people usually spend there we can
tell you before you leave
whether parking is going to be easy or
difficult and we can help you find it
and we can now give you different routes
based on your mode of transportation
whether you're riding a motorbike or
driving a car and by understanding how
different types of vehicles move at
different speeds we can make more
accurate traffic predictions for
everyone but we've only scratched the
surface of what maps can do we
originally designed maps to help you
understand where you are and to help you
get from here to there but over the past
few years we've seen our users demand
more and more of maps they're bringing
us harder and more complex questions
about the world around them and they're
trying to get more done today users
aren't just asking for the fastest route
to a place they also want to know what's
happening around them what the new
places to try are and what locals love
in their neighborhood the world is
filled with amazing experiences like
cheering for your favorite team at a
sports bar or a night out with friends
or family at a cozy neighborhood Bistro
we want to make it easy for you to
explore and experience more of what the
world has to offer we've been working
hard on an updated version of Google
Maps that keeps you in the know on
what's new and trending in the areas you
care about it helps you find the best
place for you based on your context and
interests let me give you a few examples
of what this is gonna look like with
some help from Sophia first we're adding
a new tab to maps called for you it's
designed to tell you what you need to
know about the neighborhood's you care
about new places that are opening what's
trending now and personal
recommendations here I'm being told
about a cafe that just opened in my area
if we scroll down I see a list of the
restaurants that are trending this week
this is super useful because with zero
work maps is giving me ideas to kick me
out of my rut and inspire me to try
something new but how do I know if a
place is really right for me
have you ever had the experience at
looking at lots of places all with four
star ratings and you're pretty sure
there's some you're gonna like a lot and
others that maybe aren't quite so great
but you're not sure how to tell which
ones we've created a score called your
match to help you find more places that
you'll love your match uses machine
learning to combine what Google knows
about hundreds of millions of places
with the information that I've added
restaurants I've rated cuisines I've
liked and places that I've been to if
you tap into the match number you'll see reasons explaining why it's recommended just for you. it's your personal score for places, and our early
testers are telling us that they love it
now you can confidently pick the places
that are best for you whether you're
planning ahead or are on the go and need
to make a quick decision right now
thanks so much, Sophia. the For You tab and the your match score are great examples
of how we can help you stay in the know
and choose places with confidence now
another pain point we often hear from
our users is that planning with others
can be a real challenge so we wanted to
make it easier to pick a place together
here's how: long press on any place to
add it to a shortlist now I'm always up
for ramen but I know my friends have
lots of opinions of their own so I can
add some more options to give them some
choices when you've collected enough
places that you like share the list with
your friends to get their input too you
can easily share with just a couple of
taps on any platform that you prefer
then my friends can add more places that
they want to or just vote with one
simple click so we can quickly choose a
group favorite so now instead of copying
and pasting a bunch of links and sending
text back and forth decisions can be
quick, easy and fun. this is just a glimpse of some of what's coming to Maps
on both Android and iOS later this
summer and we see this as just the
beginning of what Maps can do to help
you make better decisions on-the-go and
to experience the world in new ways from
your local neighborhood to the far-flung
corners of the world
this discovery experience wouldn't be
possible
without small businesses because when we
help people discover new places we're
also helping local businesses be
discovered by new customers these are
businesses like the bakery in your
neighborhood or the barber shop around
the corner these businesses are the
fabric of our communities and we're
deeply committed to helping them succeed
with Google every month we connect users
to businesses nearby more than nine
billion times including over a billion
phone calls and three billion Direction
requests to their stores in the last few
months we've been adding even more tools
for local businesses to communicate and
engage with their customers in
meaningful ways you can now see daily
posts on events or offers from many of
your favorite businesses and soon you'll
be able to get updates from them in the new For You stream too, and when you're ready
you can easily book an appointment or
place an order with just one click
we're always inspired to see how
technology brings opportunities to
everyone the reason we've invested over
the last 13 years in mapping every road
every building and every business is
because it matters when we map the world
communities come alive and
opportunities arise in places we never
would have thought possible and as
computing evolves
we're gonna keep challenging ourselves
to think about new ways that we can help
you get things done in the real world
I'd like to invite Aparna to the stage to
share how we're doing this both in
Google Maps and beyond
the cameras in our smartphones they
connect us to the world around us in a
very immediate way they help us save a
moment capture memories and communicate
but with advances in AI and computer
vision that you heard sundar talk about
we said what if the cameras can do more
what if the cameras can help us answer
questions questions like where am I
going
or what's that in front of me let me
paint the familiar picture you exit the
subway you're already running late for
an appointment or a tech company
conference, that happens. and then your phone says head south on Market
Street so what do you do one problem you
have no idea which way is south so you
look down at the phone you're looking at
that blue dot on the map and just
starting to walk to see if it's moving
in the same direction if it's not you're
turning around
we've all been there. so we asked
ourselves well what if the camera can
help us here our teams have been working
really hard to combine the power of the
camera the computer vision with Street
View and maps to reimagine walking
navigation. so here's how it could look in Google Maps. let's take a look.
you open the camera
you instantly know where you are, no futzing with the phone. you see all the information on the map, the street names,
the directions right there in front of
you notice that you also see the map so
that way you stay oriented you can start
to see nearby places so you see what's
around you and just for fun our team's
been playing with an idea of adding a
helpful guide like that there so that it
can show you the way oh there she goes
pretty cool
now, enabling these kinds of experiences, GPS alone doesn't cut it, so that's why we've been working on what we call VPS, visual positioning system, which can estimate precise position and orientation. one way to think about
the key insight here is just like you
and I when we are in an unfamiliar place
you're looking for visual landmarks
looking for the storefront the building
facades etc and it's the same idea
VPS uses the visual features in the
environment to do the same so that way
we help you figure out exactly where you
are and get you exactly where you need
to go pretty cool so that's an example
how we're using the camera to help you
in maps but we think the camera can also
help you do more with what you see
that's why we started working on Google
Lens. now people are already using it
for all sorts of answers and especially
when the questions are difficult to
describe in words answers like oh that
cute dog in the park
that's a labradoodle or this building in
Chicago is the Wrigley building and it's
425 feet tall or as my nine-year-old son
says these days that's more than 60
Kevin Durants. now today Lens is a capability in Google products like
photos and the assistant but we're very
excited that starting next week lens
will be integrated right inside the
camera app on the Pixel, the new LG G7 and a lot more devices. this makes it super easy for you to use Lens on
things right in front of you already in
the camera
very excited to see this now like voice
vision is a fundamental shift in
computing for us and it's a multi-year
journey but we're already making a lot
of progress so today I thought I'd show
you three new features in google lens
that can you give you more answers to
more types of questions more quickly
shall we take a look all right
okay first lens can now recognize and
understand words words are everywhere if
you think about it traffic signs posters
restaurant menus business cards but now
with smart text selection you can now
connect the words you see with the
answers and actions you need so you can
do things like copy and paste from the
real world directly into your phone just
like that
or you can turn a page of words into a page of answers. so for example, you're looking at a restaurant menu: you can quickly tap around and figure out, for every dish, what it looks like, what all the ingredients are, etcetera. by the way, it's vegetarian, good to know. ratatouille, the zucchini and tomatoes,
really cool now in these examples lens
is not just understanding the shape of
characters and the letters visually, it's actually trying to get at the meaning and the context behind these
words and that's where all the language
understanding that you heard Scott talk
about really comes in handy okay
the next feature I want to talk about is
called style match and the idea is this
sometimes your question is not what's
that exact thing instead your question
is what are things like it you're at
your friend's place you check out this
trendy looking lamp and you want to know
things that match that style and now
lens can help you or if you see an
outfit that catches your eye you can
simply open the camera tap on any item
and find, of course, specific information like reviews for that specific item, but you can also see and browse around all the things that match that style. now
there's two parts to it of course lens
has to search through millions and
millions of items but we kind of know
how to do that search but the other part
actually complicates things, which is that there can be different textures, shapes,
sizes angles lighting conditions
etcetera so it's a tough technical
problem but we're making a lot of
progress here and really excited about
it
so the last thing I want to tell you
about today is how we're making lens
work in real time so as you saw in the
style match example, you open the camera and you start to see Lens proactively surface all the
information instantly and it even
anchors that information to the things
that you see now this kind of thing
where it's sifting through billions of
words phrases places things just in
real-time to give you what you need is not possible without machine learning. so we are using both on-device intelligence and also tapping into the power of Cloud TPUs, which we announced last year at I/O, to get this done. really excited. and over time, what we want to do is
actually overlay the live results
directly on top of things like
storefronts street signs or a concert
poster so you can simply point your
phone at a concert poster of Charlie Puth and the music video just starts to
play just like that this is an example
of how the camera is not just answering
questions but it is putting the answers
right where the questions are and it's
very exciting so smart text selection
style match real-time results all coming
to lens in the next few weeks please
check them out
so those are some examples of how Google
is applying AI and the camera to get things
done in the world around you when it
comes to applying AI mapping and
computer vision to solving problems in
the real world well it doesn't get more
real than self-driving cars so to tell
you all about it please join me in
welcoming the CEO of Waymo, John Krafcik.
thank you
hello everyone we're so delighted to
join our friends at Google onstage here
today and while this is my first time at
Shoreline it actually isn't the first
time for our self-driving cars you see
back in 2009, in the parking lot just
outside this theater some of the very
first tests of self-driving technology
took place. it was right here where a group of Google engineers, roboticists
and researchers set out on a crazy
mission to prove that cars could
actually drive themselves now back then
most people thought self-driving cars
were nothing more than science fiction
but this dedicated team of dreamers
believe that self-driving vehicles could
make transportation safer easier and
more accessible for everyone and so the
Google self-driving car project was born
now fast forward to 2018, and the Google self-driving car project is now its own independent Alphabet company called Waymo, and we've moved well beyond
tinkering and research today
Waymo is the only company in the world
with a fleet of fully self-driving cars
with no one in the driver's seat on
public roads now members of the public
in Phoenix Arizona have already started
to experience some of these fully
self-driving rides - let's have a look
hey Denny one of self-driving are you
ready
it's pretty cool all of these people are
part of what we call the Waymo early rider program, where members of the
public use our self-driving cars in
their daily lives over the last year
I've had a chance to talk to some of
these early riders and their stories
are actually pretty inspiring one of our
early riders, Neha, witnessed a tragic
accident when she was just a young teen
which scared her into never getting her
driver's license, but now she takes Waymo to work every day. and there's Jim
and Barbara who no longer have to worry
about losing their ability to get around
as they grow older
then there's the Jackson family: Waymo helps them all navigate their jam-packed
schedules
taking Kyla and Joseph to and from
school practices and meet ups with
friends so it's not about science
fiction when we talk about building
self-driving technology, these are the people we're building it for. in 2018
self-driving cars are already
transforming the way they live and move
so Phoenix will be the first stop for
Waymo's driverless transportation
service which is launching later this
year. soon everyone will be able to call Waymo using our app, and a fully self-driving car will pull up with no one in the driver's seat to whisk them away to their destination. and that's just the beginning, because at Waymo
we're not just building a better car
we're building a better driver and that
driver can be used in all kinds of
applications: ride hailing, personal cars, connecting people to
public transportation and we see our
technology as an enabler for all of
these different industries and we intend
to partner with lots of different
companies to make this self-driving
future a reality for everyone now we can
enable this future because of the
breakthroughs and investments we've made
in AI back in those early days
Google was perhaps the only company in
the world investing in both AI and
self-driving technology at the same time
so when Google started making major
advances in machine learning with speech
recognition computer vision image search
and more, Waymo was in a unique position
to benefit for example back in 2013 we
were looking for a breakthrough
technology to help us with pedestrian
detection luckily for us Google was
already deploying a new technique called
deep learning a type of machine learning
that allows you to create neural
networks with multiple layers to solve
more complex problems
so our self-driving engineers teamed up
with researchers from the Google brain
team and within a matter of months
we reduced the error rate for detecting
pedestrians by 100x that's right
not a hundred percent about a hundred
times and today
today AI plays an even greater role in
our self-driving system unlocking our
ability to go truly self-driving now to
tell you more about how machine learning
makes way mo the safe and skilled driver
that you see on the road today I'd like
to introduce you to Demetri
good morning everyone
it's great to be here. at Waymo, AI
touches every part of our system from
perception to prediction to
decision-making to mapping and so much
more now to be a capable and safe driver
our cars need a deep semantic
understanding of the world around them
our vehicles need to understand and
classify objects interpret their
movements reason about intent and
predict what they will do in the future
they need to understand how each object
interacts with everything else
and finally our cars need to use all
that information to act in a safe and
predictable manner
so needless to say there's a lot that
goes into building a self-driving car
and today I want to tell you about two
areas where AI has made a huge impact
perception and prediction the first
perception detecting and classifying
objects is a key part of driving
pedestrians in particular pose a unique challenge because they come in all kinds of shapes, postures and sizes. so for example, here's a construction worker peeking out of a manhole with most of his body obscured, here's a pedestrian crossing the street concealed by a plank of wood, and here we have pedestrians who are dressed in inflatable dinosaur
costumes and now we haven't taught our
cars about the Jurassic period, but we can still detect and classify these pedestrians correctly, because we apply deep nets to a combination of sensor data.
traditionally in computer vision neural
networks are used just on camera images
and video but our cars have a lot more
than just cameras we also have lasers to
measure distance and shapes of objects
and radars to measure their speed and by
applying machine learning to this
combination of sensor data we can
accurately detect pedestrians in all
forms in real time a second area where
machine learning has been incredibly
powerful for Waymo is predicting how
people will behave on the road now
sometimes people do exactly what you
expect them to and sometimes they don't
take this example of a car running a red
light
unfortunately we see this kind of thing
more than we'd like but let me break
this down from the cars point of view
our car is about to proceed straight
through an intersection we have a clear
green light and cross traffic is stopped
with a red light but just as we enter
the intersection all the way in the
right corner we see a vehicle coming
fast
our models understand that this is
unusual behavior for a vehicle that
should be decelerating we predict the
car will run the red light so we
preemptively slowed down which you can
see here with this red fence and this
gives the red light runner room to pass
in front of us while it barely avoids
hitting another vehicle. Now, we can detect this kind of anomaly because we've trained our ML models using lots of examples: today our fleet has self-driven more than 6 million miles on public roads, which means we've seen hundreds of millions of real-world interactions; to put that in perspective, we drive more miles each day than the average American drives in a year.
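The keynote describes learned prediction models trained on those miles; as a much simpler stand-in, the toy heuristic below flags a cross-traffic vehicle as a likely red-light runner when the deceleration it would need to stop in time exceeds a plausible braking limit. The threshold and the function and variable names are assumptions, not Waymo's logic.

```python
# Toy heuristic only -- Waymo describes learned models trained on millions of
# miles of driving; this just shows the kind of physical signal involved.
# The braking limit and the names here are assumptions.

COMFORTABLE_BRAKING_MPS2 = 3.0  # assumed typical deceleration a driver would use

def likely_red_light_runner(speed_mps: float, distance_to_stop_line_m: float) -> bool:
    """Flag a cross-traffic vehicle whose speed makes stopping in time implausible."""
    if distance_to_stop_line_m <= 0.0:
        return True  # already at/past the stop line while still moving
    # Constant deceleration needed to stop exactly at the line: v^2 / (2 * d)
    required_decel = speed_mps ** 2 / (2.0 * distance_to_stop_line_m)
    return required_decel > COMFORTABLE_BRAKING_MPS2

# If the flag fires, the planner can add a virtual constraint (the "red fence"
# in the visualization) and preemptively slow down to leave the runner room.
if likely_red_light_runner(speed_mps=15.0, distance_to_stop_line_m=12.0):
    print("predicted red-light runner -> slow down and yield")
```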
Now, it takes more than good algorithms to build a self-driving car; you also need really powerful infrastructure, and at Waymo we use the TensorFlow ecosystem and Google's data centers, including TPUs, to train our neural networks, and with TPUs we can now train our nets up to 15 times more
efficiently. We also use this powerful infrastructure to validate our models in simulation, and in this virtual world we're driving the equivalent of 25,000 cars all day, every day; all told, we've driven more than five billion miles in simulation, and with this kind of scale, both in training and validation of our models, we can quickly and efficiently teach our cars new skills.
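For readers curious what "using the TensorFlow ecosystem and TPUs" looks like in practice, here is the general public TF 2.x pattern for training a Keras model under a TPU distribution strategy. The TPU address, the dataset, and the stand-in model are placeholders; this only shows the API shape, not Waymo's actual training pipeline.

```python
# General TF 2.x pattern for training under a TPU strategy. The TPU address,
# the dataset, and make_model() are placeholders -- this shows the public
# TensorFlow API shape, not Waymo's pipeline.
import tensorflow as tf

def make_model():
    # Stand-in network; a real perception model would be far larger.
    return tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(336,)),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")  # "" = attached TPU
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():  # variables are created/replicated across the TPU cores
    model = make_model()
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# model.fit(train_dataset, epochs=10)  # feed a tf.data.Dataset; elided here
```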
And one skill we've started to tackle is self-driving in difficult weather such as snow, as you see here, and today, for the first time, I want to show you a behind-the-scenes look at what it's like for our cars to self-drive in snow. This is
what our car sees before we apply any
filtering
Now, driving in a snowstorm can be tough because snowflakes can create a lot of noise for our sensors, but when we apply machine learning to this data, this is what our car sees: we can clearly identify each of these vehicles even through all of the sensor noise.
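Waymo says it removes that snow noise with machine learning; as a simple classical stand-in, the sketch below drops lidar returns that have too few neighbors within a small radius, which tends to discard isolated snowflake hits while keeping dense object surfaces. The radius and neighbor threshold are assumptions, and this is not Waymo's method.

```python
# Classical stand-in for the learned de-noising described in the keynote:
# snowflake returns are usually isolated points, so drop lidar points that
# have too few neighbors within a small radius. Purely illustrative.
import numpy as np
from scipy.spatial import cKDTree

def remove_snow_noise(points, radius=0.3, min_neighbors=4):
    """points: (N, 3) array of lidar returns in meters; returns the kept subset."""
    tree = cKDTree(points)
    # Count neighbors within `radius` of each point (each point counts itself once).
    counts = np.array([len(tree.query_ball_point(p, r=radius)) for p in points])
    return points[counts >= min_neighbors]

# Tiny example: a dense cluster (a vehicle surface) survives, scattered flakes don't.
rng = np.random.default_rng(0)
vehicle = rng.normal(loc=[10.0, 0.0, 1.0], scale=0.05, size=(200, 3))
flakes = rng.uniform(low=-20.0, high=20.0, size=(50, 3))
cloud = np.vstack([vehicle, flakes])
print(cloud.shape, "->", remove_snow_noise(cloud).shape)
```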
And the quicker we can unlock these types of advanced capabilities, the quicker we can bring our self-driving cars to more cities around the world and to a city near
you now we can't wait to make our
self-driving cars available to more
people moving us closer to a future
where roads are safer easier and more
accessible for everyone thanks everyone
now please join me in welcoming back in
the morning session
thanks Dmitri, it's a great reminder of
how AI can play a role in helping people
in new ways all the time
I started at Google as an engineer as an
engineering intern almost 19 years ago
and what struck me from almost the very
first day was the commitment to push the
boundaries on what was possible with
technology combined with a deep focus on
building products that had a real impact
on people's lives and as the years have
passed I've seen time and again how
technology can play a really
transformative role from the earliest
days of things like search and maps to
new experiences like the Google
assistant as I look at the Google of
today I see those same early values
alive and well we continue to work hard
together with all of you to build
products for everyone and products that
matter
we constantly aspire to raise the bar
for ourselves even higher and to
contribute to the world and to society in
a responsible way now we know that to
truly build for everyone we need lots of
perspectives in the mix and so that's
why we've broadened I/O this year to
include an even wider range of voices
we've invited additional speakers over
the next three days to talk to you all
about the broader role that technology
can play in everything from promoting
digital well-being to empowering NGOs to
achieve their missions along with of
course the hundreds of technical talks
that you've come to expect from us at
i/o and that we hope you can enjoy and
learn from as well
welcome to i/o 2018 please enjoy and I
hope you all find some inspiration in
the next few days to keep building good
things for everyone thank you
Well, welcome back to CNET's live coverage of Google I/O, the developers conference, and if you're just joining us, I'm still Iyaz Akhtar, you're still Vanessa Hand Orellana, and you're still Patrick Holland, and we have seen so many things in this long, long presentation from Google. I'm going to do a quick rundown, a recap: we got uses for AI in the medical field, accessibility, Gmail Smart Compose, Google Photos with smart actions where Google offers suggestions on how to fix photos, and there's just a lot to go through. Patrick, what was your favorite
announcement out of the hours and hours
we just saw? There's actually a few, but the one that pops into my mind is the AR navigation in Google Maps, that they could have a little fox point me in the right direction of how to get to Starbucks or wherever I'm looking at. That was tremendous, that was truly crazy, the idea of what we were seeing. So this AR too, if you somehow missed it, this had a
visual of you hold your camera up to the
to the real world you'd have an overlay
of the actual map and you'd have
directions overlaid with the image that
the camera is seeing. So instead of you looking down, you're like, well, is that Jones and Beach? No, that's Jones and Beach, and it can tell you. I really thought that was pretty cool, and the fox being
your guide you're following somebody
Yeah, though that might cause car accidents, because I'm trying to follow the fox turning around the corner. Fox? Well, yeah, never mind. Yeah, it's like the South Park episode when they have the game to follow the dragon, I don't know, but it goes like, oh, I gotta follow the fox, and like, you just, you never catch the fox, could be the
problem I think actually it's really
neat but it is also weird cuz now does
that mean we're gonna see a lot of
people walking around with their phones
in front of their faces for that though
because at least you're aware of your
surroundings rather than just being
immersed in your phone because at least
yours you are seeing what's around you
through the through the camera itself
and you're also seeing a fox and you're
also seeing a fox so maybe might be
delusional as well but you know at least
you're aware of incoming traffic versus
just being looking looking facedown
looking at your phone and walking into
incoming traffic. But maybe... we see lots of gimmicks in AR and VR, and this is truly a neat use case for it, and if it works as advertised, that would definitely lend itself to a wearable glasses kind of thing, because that's really what you would really want to see, like the old Google Goggles concept. Yeah, that had turn-by-turn and
everything but this overlay could be
really useful if they can apply it to
another kind of wearable one day it was
ahead of its time. I know we've had it on our top five list of Google failures, but it was just ahead of its time, there wasn't the infrastructure built in to actually make that succeed. So now it's time to resurface the Google Glass, right? What was your favorite, Vanessa? So I was wowed at the
beginning I'm a sucker and at first when
they were talking about the medical field and the Google AI, or the AI translating Morse code, I was, you know, getting emotional there, as I'm sure they wanted me to, but I really liked how AI is gonna be used in the medical field, and I think that's kind of a boring but very practical way that it can be used right now, just to be able to prevent illnesses before they arise. That's great, not boring at all, because
it's supposed to be something somewhat
supplementing the doctors that are
working, because there's all these reports of doctors being overworked constantly, and that leads to bad decision making. If you have the help of an AI that can predict things and symptoms of a patient, that might be helpful to kind of let the doctors decompress a little bit, but then again, they're gonna use their best judgment, because AI is not infallible.
Well, I think about, like, right now if you go on WebMD and enter any kind of thing, it's always like, oh, you might have this little light ailment or you might have cancer; there's no in-between, it's the extremes. Yeah, and having better identification... the one that I was impressed with was that it could tell you your cardiovascular health based on a scan of your eye. And I mean, I
think what I liked about it is it's an
example where AI is not gonna take jobs
but it's gonna enhance the jobs are that
are already there like you said a lot of
the doctors are overworked and so it's
not gonna take the role of the doctor, but it's gonna analyze all that minute detail that the doctor's not gonna have to, so that he can focus on the big-picture stuff. And also, I think, if you're on a treatment with a doctor, it can help you stay on track better, let you know when you're getting off track, or let the doctor or the nurse practitioner know that, okay, you're sticking to this program or not, and
that could be really cool. A lot of this stuff, guys, I'm sorry, well, it reminds me of what IBM's Watson can do, basically taking all this data, putting it together and giving it back as useful information. I believe Watson could also be used for customer service and other kinds of harder, you know, thought-process stuff, not just, oh, a bot turning left; it's like, now, what does this guy have to come back to me for?
We talked about a lot of AI stuff and AR stuff; what were you wowed with? Google Lens, probably, it was pretty freaking awesome.
I was just wowed. We were watching, I was swearing a bit about how exciting it was, because it just looked amazing, the idea that you could hold up your phone and get all this real-time data. We were playing around with Bixby Vision to see how it compared; Bixby Vision, not super. And if Google Lens works the way the demos worked, it would be really intriguing to be able to find out data about pretty much anything you can see. Yes, what I liked about it was, like, last year we saw Google Lens,
you hold it up to it like a landmark it
might identify how tall the building is
or what the building is, but now I can hold it up to, like, a sheet of words and it would understand it; I think the demo was sushi, and it actually shows you what that sushi was, so you're like, oh, that's what I'm eating, that's great, you know. And the copy-paste feature where you could copy text from the real world into the phone, that was mind-blowing. And you were saying that a lot of the features are similar to things we've already seen, Bixby Vision, or the style match was similar to what Amazon has, yeah,
But sorry to cut you off; out in the real world we have Scott Stein, who is at Google I/O right now. He's in the real world, he's not wearing weird visors or anything. Scott, how's it going over there?
it's seven good developers can extend up
reuse elements of Microsoft how was the
crowd after the event worth wikibook
pumped up or was it kind of subdued the
obsess
they didn't want to get one are you how
are you feeling are you pumped or are
you kind of sad that there's no actual
hardware that you're gonna be looking at
after not a single mention of Wear OS
and AR did emerge please become Joe
Belfiore so it's pretty interesting
we meet a lot of things on but when it
comes we are packed you know stuff here
surprising way things and video we can
come back in just a second because if
you're hearing this and you're like hey
it's Joe Belfiore
from Microsoft showing up just randomly
at Google IO that's not happening that
would have been pretty freaking cool he
just walks out but stretch case of that
Fox that's not what's going on he might
be like hey wait a minute we had a lot
of that stuff on Windows Phone it was
super seamless we could just give you
the information you wanted and a lot of
the stuff with Android P and I was like
you know trying to come up with
something you know funny about Android P
but P sounds like it's really for
passive Android seems like it's just
falling way back kind of relaxing and
we'll get back to that in a second Scott
I'll be back
yes we're back with Scott no idea no
we're good we're good
yes we're good sorry continue
yeah so I don't know I don't know was
here what were you most excited about
from the presentation what stood out in
your mind I think I like what you guys
were saying I think the future of where
AR is going and Google Lens is
definitely driving that I think you need
to have real-time they are and that's
what Google lens is really focusing on
so to speak but the you know that's the
key to any sort of headset that's the
key to where computer vision is going
that's the key to where smart cameras
are going you know you transition from
that
into Waymo and cars, everything is using cameras to do smarter AI processing on the fly, so all these
things dovetail Maps is really
interesting because I use maps for
search and recommendations all the time
so I think the ability for that to live
recommend is great I think all of the
robocalls stuff involving assistant is
really creepy I think it's I think it's
problematic but I think it's definitely
good for Google insofar as they're gonna
keep researching and developing natural
language processing in AI and so that's
part of it right and you know as far as
the Android P developments for managing
your life and being and managing things
with machine learning I guess we'll see
right it's either going to be really
good at that or maybe not so great it's
very hard to tell; we've heard over the years many times how AI and machine learning are going to improve our lives, and the proof is in the pudding with that, as far as, like, how... you know, I'm skeptical, but I am skeptical because I think that part is a little harder to figure out. I think when you
look at AI providing more exciting
developments better language processing
things that can pluck out of your camera
that, that's really cool stuff. So, a lot of that AR stuff, I was just thinking about what kind of impact you think that's gonna have on battery life on these devices, because they claim that Android's going to be really battery efficient; do you think that it's possible to be using all this augmented reality stuff without sacrificing tons of battery life? No, I
think it's gonna use a lot of battery
life, and you know it's probably why it'll only be used sparingly; you bring it up
once in a while you know I think the way
I use they are right now it's like a
little bit at a time and that's also a
big reason it's a good point it's why
you don't see a our headsets that are
doing that because it's extremely
battery intensive and processing
intensive it's why Apple and Google have
had to do this more on phones I think
because you have better processors and
bigger batteries that can drive that but
down the road, I think the idea is that it becomes more efficient, something you could wear on you, something you could do like that,
but I think it's an exciting development
but yeah like how useful is it all going
to be
to be continued let me ask one more
question
Duplex: is Duplex creepy or not creepy, or helpful, which would it be? You know, it's a hundred percent creepy, this is a
completely creepy thing this is not
there's not something that a lot of
people are gonna want to get a phone
call from I don't think and while it
could be useful for you the person I
think the real the question to me is how
well does that work in the wild if it
does great but you don't want to like
increase the amount of robocalls in the
world I I think it would be great for
assistance for if you have a situation
where you're not easily able in your
life to make phone calls or to go do
things and that's that that could be
great
You know, if you could be making the phone call yourself, then make the phone call yourself; if it's a way of getting ahead in line to make the call so you don't have to deal with it, then is everyone going to use that, and then it's like getting tickets, where you're just gaming the system. It's weird, it's definitely weird, but I mean, eventually, bots talking to bots, natural language AI getting
you know that's what's gonna have to
happen both your phones for robotics for
anything that you're imagining you know
look, your little Home getting smarter at talking to you: the better it gets at talking to other people, the better it's gonna get at talking to you, which is creepy and awesome at the same time. But
the way, what you mentioned, exactly, we're gonna have bots communicating with bots, and we're used to seeing it on the business front where we get the bot calls, but now, I like the fact that with Duplex we could kind of turn the tables and robocall the businesses. I mean, obviously there's a lot of privacy issues, a lot of downsides to this, as you were saying, but the good thing, at least in my regard, is that now we can kind of pull what the businesses were doing to us with robocalls on them, and then just have robots talking to each other and not have to deal with it. Except for John Legend; John Legend would gladly be robocalling himself, John Legend phoning himself to make a restaurant reservation.
And, live, the era of celebrity voice assistants has begun; begun, the assistant voice wars have. I want to know when the first time is gonna be that you call a restaurant claiming to be that person, with that voice: hi, I'm John Legend, I'd like a reservation for five at five. Hi, I'm Weird Al Yankovic, I'd like to make lunch reservations for Scott Stein, just lots of polka
music. So I think the real Scott Stein's gonna check out some other stuff at I/O, out on the ground there. Thanks a lot for all your insights. That's fair, go get something to eat and then find something to play with, I mean really dive around, go check out the sandbox, everything's playground-themed there, find that teepee, go on a teepee hunt; the great I/O teepee hunt has begun. That makes you think of the bathroom actually, I probably need it. If possible, let's go back to P: why P still doesn't have a name.
You know, they don't really usually give out the name at Google I/O, which is disappointing, but they are giving out the beta, and lots of betas, as of today, so there's a ton of new features. Like that leak we saw earlier today, it looks like it was legitimate: we saw a bunch of gesture support at this point, some redesigns to that app drawer, which I thought was kind of neat.
yeah when you slide up you can have four
apps you might be using and actions you
might actually want to do at that time
or you're gonna work out now you're
gonna call your spouse that kind of
intuitiveness is coming to the operating
system I'm really kind of surprised that
it's not like bells and whistles and
it's amazing it's all about helping you
get information now whether that's
creepy or not because you've fed Google
all this information and paying
attention I kind of like the convenience
factor well yes and no because then okay
fine it is helping you make decisions
but at what point is it gonna start
influencing your decisions you know like
how do I know that I would have texted
this person was it because
Google suggests it or was it because I
wanted to actually text this person so
how do we draw the line are you afraid
that you will lose your free will like
this is my will power of suggestion it's
what we were dealing with with Facebook
right it's like at what point can they
predict your what you're about to do and
at what point are you doing that because
they predicted it so to me it's a little
bit freaky. Or it can go the other way, to where it's like predictive text on your keyboard, where it's like, no, I didn't mean to, I don't want that app right now, I want the other app, you know, where it might just be more of a speed bump until it learns your preferences and your habits. Yeah, sure, there'll be a learning curve for sure. And then how are people gonna, like, build on top of that too, because very few phone makers actually keep Android as a pure, clean version of Android; they lay stuff on top of that, like, I'm wondering
how some of that will be manipulated by
a third party
oh you somewhere horrible exactly
exactly yeah that's basically as your
preference is perhaps it was interesting
the contrast of all these new features
that are gonna make you engage with your
phone more and then they have the
digital well-being section where it's
helping you decide I think I thought it
was very cool it's just that you're
getting the contrast of this feature
that's gonna help you or make you engage
more with your phone and then their way
to disconnect so at least they're giving
us the option to get disconnect and I I
I thought it was cool I would I would
actually set limit I limit on myself for
the daily limits on the apps and I know
there's apps that do that for you
already but I think it's a good native
way to kind of control yourself and the
dashboard will tell you how how you're
using the different apps and you're
you're spending you know 40 percent of
your time in Gmail maybe that's a bad
thing. Do you think that's actually gonna influence... do you think I could go to my boss like, hey, I'm spending 50 percent of my time in Gmail, you've got to email me less at work, you know, look, I got proof
of it um I think that will because I
think it's kind of like we've seen with
battery life on apps for a while
now, like, which app is using most of your battery life, and that tells you a lot, and a lot of the time I look and I'm like, holy smokes, I didn't realize I was on Facebook that much, or, holy smokes, I didn't realize I was on Snapchat that much. Holy smokes. What about what we didn't get? I mean,
we talked a little bit about all the
things we got but
seems like there was a supreme lack of consumer hardware. Yes, the TPU 3.0 was introduced; you cannot wait to get that pod out to your children. No, I couldn't. No new Home versions; where, where was Wear, where was Wear OS, you have to ask like that. Nothing new
phone stuff no new camera stuff a lot of
these products they've they've been
launching outside of events you know
smaller stuff like the clips and I think
of like the VR 180 stuff but yeah we
didn't see any actual hardware or
upgrades to existing hardware yeah
when's the last time we got a Google Home update, besides the big ones? Well, I mean, it's services, they're upgrading the internals essentially, yeah, it's all through software. I've been also thinking that there was a rumor that YouTube Remix would show up and be this new music service and maybe be introduced today; that did not happen. It was supposed to eat up Google Play Music, which I'm all for, go ahead, just pick one,
I'm happy well I know before that we
were talking with stuff that we were
hoping for like how do you feel now with
AI and how that integrates into the
system and do you think it's going to be
that consistent assistant I don't know I
think with all the features they talked
about with AI and assistant all the
different versions of assistants because
you'll see it on a smart display you'll
see it on your watch you'll see it on
your phone and who knows where else I'm
hoping that the experience is consistent
but we've got to see how that plays out
over time because it's it's still
unknown I think they are really trying
and they're going all in. When they said Assistant launched two years ago, that was, it was absurd, like, wait, in two years you got this good? Yeah, that quickly, and Siri's just, you know, like, in the back corner crying because of that thing, you know. Thanks. Well, it makes sense: Google
has the information to make this happen
they have the the user data to make this
kind of artificial intelligence happen
so I'm glad they're doing something
about it but it really does seem like
the artificial intelligence was the
predominant theme of this conference
it's the it's the thread that weaves
everything together and whether or not
that's going to
equal a more consistent assistant I'm
not sure but it's definitely we're
definitely starting to see it
intertwined in every different sector
that Google touches, even Waymo, which is not part of Google, was showing how machine learning is helping the automated vehicles avoid crashes and drive in certain weather conditions. So that, to me, it seemed a little bit all over the place, but that was the unifying thing that joined everything today. You could have done a drinking game every time they said AI or machine learning; we would have been wasted in the first 20 minutes,
there was just so much of it, and I think that is Google's strength; one of the reasons is they have a lot of data on us, but you know, there's always the ethical dilemma: is that the right way to go, or is the way Apple's going, with protecting your data, good, but does that limit the functionality of something like Siri? I don't know, but we are all really excited about what we see with Google Assistant, so I think they're going the right way with it so far. There
was one minor hardware announcement I
did forget about this the smart displays
that were announced already at CES are
coming in July so if you really really
want a smart display with Google on it
you could have it I think it's strange
though that as far as I know you can't
use your chromecast the same way you
can't just say hey hey Google or yo
Google which is a new one you could say
yo Google put this on my television
other than getting YouTube videos; I asked for the weather once and I got a YouTube video of some random thing called "the weather," and I don't know what area in the world it was in, or what, but it wasn't relevant to me at the time. Was it nice weather, though? It was old weather, very old weather, an all-weather old weather though, I guess, jeez. What
about... we have Lexy on the phone, she's gonna kind of give us some insights into what's going on in the pen right now; she's in the sandbox. In the sandbox she is. Hey Lexy. Hey guys, can you hear me? We can hear you out here, tell us what's going on. Excellent.
Um, I am in the middle of this dome, I'm gonna call it the Thunderdome, because it's full of a lot of the things that were talked about in the keynote today. So it's just being opened up to media, analysts, and a couple of other journalists and press walking around. This is all
about Android and every single thing
that they're doing on the platform so
there's a lot of stuff here for
developers I've literally just walked
into this space so I'll kind of show you
and tell you what I know when I see it
so I'm gonna flip the camera around and
give you a bit of an idea of what we can
see all right here we go
let's start off by walking to some
Android key demos now this is showing
off the app actions this is what they
spoke about in the keynote I'll just
take a look and see if anything's
working hi how are you could you show me
some actions
I'm just getting a quick rundown on some of the live actions happening here in Android P. So the simple one, of course, is you open up the launcher and you see not just typical applications but you also see predicted actions, and this is suggesting that I should, you know, navigate to work, and this is suggesting that I listen to today's music because it's on. The music demo is: if I search for Lyft, not only do I see the search suggestion but I also see all of the slices that Lyft populates; it says, hey, do you want to take a Lyft home, do you want to take a Lyft to work, and so on.
Oh, I see, so this is exactly what we just saw in the keynote. Exactly. Can you show me some of the gestures as well in Android? Is this running Android P right now? We have to go to the other side. Okay guys, I got ahead of, I got so ahead of myself, I was like, I want to see actions; okay, I want to see gestures as well. All right, I'm gonna, I'm gonna try and flip around and find you guys, find you guys some gestures. All right, there's a lot of
developers stuff here talking about
action and contact centric stuff I don't
want to go too much into that I want to
try and find the gestures all right all
right, here we go, oh, this is looking promising, okay, this is, I don't know exactly, this is... hi, you're live on CNET, can you help me find Android P gestures? I really want to see how these work. Oh, you can't just show me any gestures on the phone? Oh, okay, I was denied, I can't see any gestures. Yes, I'm back, yeah, I'm
gonna find it guys I'm gonna find it I'm
determined all right here we go we are
going see this is what happens when you
do a live cross with me
you see people waving and you see me
discovering everything as you do okay so
there's just a lot of demo stations set up here, and then there's also a lot of Wear OS. Ah, exclusive press access. Yes, that is why, that is why there's a bunch of other journalists in the room here too. Indeed, indeed there is. So this is a bunch of different watches that Wear OS is on,
and we didn't really see anything in the
keynote about this so I'm not entirely
sure of what they're gonna be demoing
here it's not that much attention paid
here there's a beautiful looking display
but no other real kind of demos running
I mean, I can go and start playing with the phones, but from the keynote it doesn't sound like there's too much news, so I don't know if we want to spend too much time there; I think they're just showcasing how many watches it's running on. Okay, I see, exactly, from what I can see here. Yeah, that's true. This is the
Android TV stuff over here and then we
switch around and we see some of some
indie games over here as well it looks
like they're showcasing a lot of things
that they didn't mention in the keynote
well that's usually the case so if we
think about kind of who is here the
intended audience it's all about
developers it's all about people
building for Android as the platform so
not necessarily going to show every
single thing that's in the keynote
there's the stuff that's available right
now and as we heard lots of features are
coming soon they're obviously not able
to be rolled out just yet so I was
hoping we'd have a few more sneak peeks. I'm sure there's a lot of, I mean, we wanted to look at walls of code. I mean, we can do that, you're gonna see some walls of code; I'm trying to flip the camera now, let's take a look, you know, there we go, ha ha, that's what it's for, it's developers here. Well,
and outside there's also some Android
auto stuff too there's a couple of cars
and if you can see out here just quickly, lots of cars with Android Auto. Right, any of the screens, any of the screens with the Google Home, with the Google Assistant on them? Sorry, that's possibly, that's possibly here, this might not be the right dome; there are several domes set up, so I'm gonna go and try and find all the right ones. This is the one that we were shown into first, but there's undoubtedly so much more; this is, this is huge, it's like a music festival, there's so many options here and so many...
Alright, awesome, wow, thanks. Final thoughts, some last-minute reactions to this giant event: Vanessa?
well the one thing we didn't talk about
that was missing as well was the privacy
and all this and how the the Google
assistant and just AI in general for
Google how how we're gonna have control
over that whether or not we're gonna
have control over that is this opt-in is
this opt-out I think they vaguely
mentioned some things that were opt-in
but they didn't really mention any more
security features for Android P that's
for certain
and they didn't they didn't say how they
were gonna make sure that this kind of
technology is not going to be abused and
so I'm hoping that they are as they go
forward and moving into this direction
and garnering all this
information about us including you know
medical stuff that they do have a game
plan of how they're gonna keep it safe
as well so that's something that we
didn't see that may be even more
important than the where OS that we
didn't see so I just thought I'd mention
that; not necessarily my favorite, but that was missing. When Sundar started his presentation he did mention, like, the responsibility, there was a sense of gravitas when it came to this level of responsibility with all this data, and so far Google's been safe, there's been, like, no crazy breaches, not like, you know, the Yahoo-level messes that have happened in the past, and how long can that possibly continue,
especially as they get more and more
data what about you Patrick I think it's
just another iteration where years ago
we would see Google kind of being you
know very siloed with all the things
they did but here we see yet another
year like last year where they're
integrating their services they're
integrating the looks of things they're
integrating their AI they're integrating
the AR, and I think the things they're coming out with are very compelling features and software that I'm excited about. You know, I think the question of privacy is always there, but at the end of the day it's also just trying the stuff out in real life: it's trying that AR navigation out and seeing how it actually works, it's trying that Google Lens stuff and seeing if it can actually identify, like, an outfit and do that, and so I'm excited to
try all these things I'm just really
hoping all of it works as well as we saw
in the demos in this very well produced
you know keynote because you know if
you're holding up a screen and it doesn't work, it's not gonna be as fun, and the more frustration you get with these devices, the more pain points, the more you think, I don't want to use this technology, and you won't pick it up again if it doesn't work the first few times. Well, I mean, not to go back to Apple's Siri, but I think that was it: it didn't work really well out of the box and a lot of people have that bad taste in their mouth still today, and I think the same thing could be said about a lot of things we saw today, but it does look very promising. The Duplex thing, borderline creepy, but that's not a thing that's coming out quite yet. We saw a good mix of
things that we can look forward to in
the future and things that we're gonna
be available rather readily available to
users in the coming weeks so I think
they they played it well in terms of
offering something that we can use right
away and then giving us a look of what
they're planning planning ahead so
hopefully it all works when it does get
released as you mentioned I'm really
hoping this allows me to have the
laziest life possible yes it's creepy
yes they're following everything I'm
doing, but, well, so is the NSA. Well, you know what, the conversation doesn't really stop here, because you are going to continue this. I will never stop talking about this, on Alphabet City, a show that's coming soon; if you don't know about it, it's gonna cover everything to do with Alphabet, which, they own Google and they own Waymo, and Waymo was featured today at the keynote itself, and obviously Google owns tons of stuff like Android, YouTube, Google Home, Nest now, and all kinds of fun stuff. So thanks for letting me get that plug in.
Does that mean, like, you're kind of like the mayor, or is it like you're the host? The mayor, the guy down there on the tour bus, the favorite commerce guy, Chamber of Commerce. Uh, I don't have a sash, I don't have a sash. You totally need that. Okay, maybe it's not... so tune in, we'll see if I have a sash, and all that. We have now, right, smart, smart ties, hashtag, I
worked it in. I also just wanted to mention, before we go, thank you to everyone who tweeted at us with the hashtag CNET live. Charles Leroy Smith was wondering about Android P: we didn't discuss what the P stands for. We kind of did; he thinks, he sent an image of a pretzel. I guess we didn't say our own guesses. What would yours be? My real one would probably be pistachio; my fake one, well, it would be pterodactyl, because it's a silent P. Awesome. Mine would probably be pudding. Thank you guys. Oh, thanks, here at
CNET's global headquarters, thank you for joining us today. If you've got questions about I/O, let us know on Twitter. You're watching us on YouTube right now, so hit that subscribe button and ring that bell so you can watch more CNET news and reviews, and obviously keep an eye out for Alphabet City. You guys, this is so much fun, thanks everybody for having us here. Yeah, mister, mister mayor of Alphabet City, Chamber of Commerce thing, I'll be back with my sash. Nice.