
Google Lens: Silent Revolution Nobody Foresaw

This week at Google’s I/O conference, CEO Sundar Pichai introduced a feature that will overhaul our online experience forever. Meet Google Lens – your door into augmented reality. This is the thing the IT crowd has been talking about for years, but when Google Glass didn’t take off, many observers swiped augmented reality left. It would never happen in our lifetime, they said. They couldn’t have been more wrong.

It took Google engineers two years to hone the technology and integrate it with Google Photos, Google Assistant and… your life. Point your camera at a flower, and Google will recognize it and tell you its name. Point it at a concert poster, and Google Assistant will add an entry to your to-visit list. Your camera will help you get info about shops, bars, cafés and restaurants, about parks and gyms, landmarks and routes. And no, we aren’t talking about QR codes here.

Google Lens is capable of recognizing objects in a human way. You wouldn’t expect the flower from the first example to have a QR code on its petals, would you? If you look at a picture of a rose, you see the flower: its color, the background, the setting. You can tell just by looking whether it’s late summer or early fall, and whether there’s a bee or a dew drop on the captured petal. With the help of the Google Lens algorithms, your smartphone can now do it too! More than that, if the picture you’ve taken is dark or noisy, Google Lens can lighten it. Or, if a twig that squeezed into the frame blocks the foreground, Google Lens can erase it. Not in a crude Photoshop way that leaves white space behind, but by simply removing the obstacle as if it had never been there in the first place!
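
For the curious, here is roughly what “recognizing objects in a human way” looks like in code. This is not Google’s Lens model – just a minimal sketch using the publicly available MobileNetV2 classifier that ships with TensorFlow, and the photo file name is a placeholder.

```python
# A minimal sketch of off-the-shelf object recognition (not Google Lens itself).
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, preprocess_input, decode_predictions)

model = MobileNetV2(weights="imagenet")  # pretrained general-purpose classifier

# "rose.jpg" is a placeholder for whatever photo you point the camera at.
img = tf.keras.utils.load_img("rose.jpg", target_size=(224, 224))
x = preprocess_input(np.expand_dims(tf.keras.utils.img_to_array(img), axis=0))

preds = model.predict(x)
for _, label, score in decode_predictions(preds, top=3)[0]:
    print(f"{label}: {score:.1%}")  # e.g. "daisy: 87.3%"
```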

Devoted web surfers may not see anything outstanding in this, as they may have used the ‘Search by image’ function for a long time. But this is another story. When asked to find an image, Google and other search engines used to simply compare the arrangement of color patches and the overall gamut. That’s why, looking for a red rose against a blue wall, you could get all kinds of red-and-blue pictures, the one with a fire hydrant included. It’s annoying, I know. But we used to shrug it off, saying: ‘Oh, it’s just a computer, it can’t do better.’ Now, it can!
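
To see why matching on color alone falls short, here is a toy sketch of that older approach: boil each picture down to a color histogram and compare the histograms. The file names are placeholders, and real search engines were more sophisticated than this, but the blind spot is the same – a rose and a hydrant can share a color profile.

```python
# Toy "search by color" comparison: two photos with similar red/blue
# distributions look alike to this metric, no matter what they depict.
import numpy as np
from PIL import Image

def color_histogram(path, bins=8):
    """Normalized RGB histogram: where the color mass sits, nothing more."""
    pixels = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
    hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=[(0, 256)] * 3)
    return hist.ravel() / hist.sum()

def similarity(a, b):
    """Histogram intersection: 1.0 means identical color distributions."""
    return float(np.minimum(a, b).sum())

rose = color_histogram("red_rose_blue_wall.jpg")       # placeholder file
hydrant = color_histogram("red_hydrant_blue_sky.jpg")  # placeholder file
print(f"color-only similarity: {similarity(rose, hydrant):.2f}")
```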

It can tell a rose from a fire hydrant and a concert poster from street art. Why? Because Google is now capable of processing all this big data. The company built the first ever AI-based data center. This is the silent revolution nobody foresaw or even heard of. You see, Google has been building warehouse-scale data centers to store and process all the needed information for years, ever since the launch of Google Assistant voice control. It took hundreds and hundreds of servers to process a request as simple as ‘Okay Google, pizza delivery phone’.

So Google faced a problem: either build more giant data centers and bear the expenses and electricity bills that come with them, or invent something completely new. And this is when the TPU arrived. Google developed the state-of-the-art Tensor Processing Unit. Tensor calculations are so involved that even some mathematicians have a hard time with them. They are used to describe complicated physical processes on Earth as well as in space, and they have become useful for processing big data too.
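
To give a feel for that workload, here is a toy illustration (not Google’s code; the sizes are arbitrary) of the kind of tensor arithmetic a TPU is built to accelerate: big batches of multiply-and-accumulate operations, the core of neural-network inference.

```python
# Toy illustration of the workload a TPU accelerates: one dense layer of a
# neural network boils down to a single large matrix multiplication.
import numpy as np

batch, features_in, features_out = 64, 1024, 1024
activations = np.random.rand(batch, features_in).astype(np.float32)
weights = np.random.rand(features_in, features_out).astype(np.float32)

# ~64 * 1024 * 1024 ≈ 67 million multiply-adds in this one line. A TPU's
# matrix unit performs tens of thousands of such multiply-adds per clock
# cycle, which is what makes it cheaper per query than racks of
# general-purpose servers.
output = activations @ weights
print(output.shape)  # (64, 1024)
```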

By installing TPUs in its servers, Google saves space and money. But this is just the beginning. The TPU is used to run neural networks as well. What is a neural network? It is an architecture that imitates the human brain and its neurons, hence the name – neural. It’s a fresh technology, and it has a great advantage over the old, conventional approaches.
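
For illustration only, here is a minimal sketch of such a network in TensorFlow/Keras. The layer sizes and the ten output classes are arbitrary choices for the example, not anything Google Lens actually uses.

```python
# A tiny neural network: layers of artificial "neurons", each combining its
# inputs and passing a signal on, loosely inspired by biological neurons.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),                     # e.g. a flattened 28x28 image
    tf.keras.layers.Dense(128, activation="relu"),    # 128 "neurons"
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # e.g. 10 object classes
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```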

You see, brain neurons cooperate with each other to process any information we perceive through our eyes, ears and skin. There are tens of billions of them, wired together by trillions of connections, and they all send signals to each other in a seemingly chaotic way. We humans don’t enumerate options and compare them one by one before acting. For instance, it takes a fraction of a second to remember that this very rose is like the one you saw last year in your aunt Augusta’s garden, but not quite the same – and that she would be delighted to see a new breed of it. Computer engineers all over the world are trying to imitate this intricate web of associations and memories in their products.

And Google is pioneering the industry. So, thanks to TPU-powered data centers, Google Lens can recognize the flower, search a floral wiki and bring back the answer. Or it can warn you: ‘NO! This is not a cute green creature, it’s poison ivy, RUN!’

Links

How to use Google Lens: The Coolest App You Aren’t Using [Video]

Video uploaded by Digital Trends on June 18, 2019
