So we're here at day one of Google I/O checking out new features for Google Lens, the company's AR and AI platform. It was built into Google Assistant, and now it's built right into the smartphone's camera. Google first introduced Google Lens last year, and at the time it was basically a way to look through the camera's viewfinder and identify objects and photos. Now Lens is much more sophisticated: it takes all of Google's understanding of natural language processing, object recognition, and image recognition, and combines it into one big platform, so that the smartphone can see and understand the world around it and parse human language. Prior to today, Google Lens was only available within Google Assistant; now it works right from the smartphone's camera, and it works on other devices. Right here we have an LG G7, and we have a whole wall of props behind us that we can use Google Lens to identify and get information on from Google Search. There are three ways to access Google Lens. The first is to just open the camera and tap the Google Lens button; from there, the phone starts looking and trying to identify objects that it sees through the viewfinder. The second way to access Google Lens is basically just by touching and holding the home button down here, launching Assistant, and tapping the Lens button.
And as you can see, right now Lens already sees and identifies objects, marking them with these little colored dots that say it knows what they are. Tapping on one of the dots will pull up Google Search results. So you see it understands that this is Woman, an album by Justice, and conveniently, Justice happens to be the artist performing at Google I/O tomorrow. The third way to access Google Lens will be a double tap on the camera button, but that only works on the LG G7. If you look at some of the clothing here, it doesn't quite identify the clothing, but it asks if I like the clothing; I guess it's trying to build a preference profile for me.
Let's try this one. Right there it goes: it pulled up shopping results from Macy's and from QVC, so it understands what this item of clothing is and then prompts you to buy it online. Now, as you scan Google Lens over other objects, it'll slowly start to recognize everything else that you pan it over.
So we have a piece of art right here. That is not correct; hold on, it's looking for results. Here we go: it went from "elbow," but now it knows that this is a painting by Pablo Picasso. Right here it sees a photo, and it knows that this is a Norwegian Lundehund. I don't think I pronounced that right, but it is a dog breed, and Google identified it. So Google Lens
isn't just for photos and objects; you can do a lot with text now. That includes text inside a book's jacket, and it includes text on menus at restaurants. You can point the camera at a whole list of food items, and you can pull up images of those food items, pull up YouTube videos of how to make them, and even translate those food items if they're in another language, into English or Spanish or any other language you want that Google Translate supports.
Now, if you're looking at a book, for instance, like the book Swing Time by Zadie Smith, you can look at huge passages of text. You can even grab that text using Google Lens and pull it out as if you had just copied and pasted it from a document. From there, you can translate the text into another language, and you can even do Google searches on it. Google Lens essentially takes text from anywhere out in the world, street signs, restaurant menus, even books, and it makes that text searchable. Now, the underlying technology behind Google Lens isn't just for looking through a smartphone viewfinder at products or trying to translate text.
What powers Google Lens is all the foundational AI work that lets Google do AR experiences. For instance, because Google's software, and the phones that power that software, can understand and see the world, you can create whole virtual 3D images. You can have paintings come to life right out in front of you, and you can walk around them. You can even see the reflections of objects behind you in those 3D images, if developers design them in the right way and know what environment you're standing in. That's pretty wild.
You can also point your camera lens at a podium and have an entire 3D image kind of come to life in front of you, grow up into the sky, and encompass the entire vertical area around you. Now
these Google Lens features are all coming later this month, and as Google said onstage at the I/O keynote, they're coming to more than just Pixel devices. You'll also be able to access some of this on iOS from within Assistant itself, but you have to use Assistant; you won't be able to access it from the iPhone's camera. Of course, for all the news and announcements from Google I/O 2018, check out theverge.com and subscribe to us on YouTube at youtube.com/TheVerge.