“How do we take…rigid notions of computing and turn them into vibrant experiences that humans adore?” That was the question posed by Prabhakar Raghavan, a Senior Vice President at Google, in the opening of the company’s Search On 2020 event.
The answer to a question as far-reaching as that could fill hundreds of books, but in the context of recent technological innovations, it seems to lie primarily in artificial intelligence (AI). Want to see how AI is shaping the future of search? Learn how Google is leveraging its power.
- What Is Google’s Search On 2020 Event?
- Better Language Comprehension
- High-Quality COVID-19 Safety Information
- Improved Video Perception
- More Robust Data
- Support for Journalists
- Enhanced Image Recognition and Augmented Reality
- User-Friendly Song Identification
What Is Google’s Search On 2020 Event?
If you’re not familiar with the Search On event, you’re not alone—2020 is the first year Google has hosted an event under that name, perhaps as a response to its cancellation of Google I/O 2020.
Whatever the case, the Search On event took the form of a livestream on October 15, 2020. If you’d like to watch the full livestream, it’s available in its entirety on YouTube:
In addition to Prabhakar Raghavan, the event was presented by Cathy Edwards, Director of Engineering, and Aparna Chennapragada, Vice President and General Manager of Consumer Shopping.
Throughout the event, all three presenters outlined Google’s efforts to harness the power of AI to create a more intuitive search experience, provide users with accurate information and facilitate top-notch journalism.
Better Language Comprehension
In November 2018, Google introduced a new neural network-based technique for natural language processing (NLP) pre-training dubbed Bidirectional Encoder Representations from Transformers, more commonly known by the (significantly more catchy) name BERT.
As Google Fellow and Vice President of Search Pandu Nayak explained one year later, BERT was improving users’ search experience by understanding the intent behind queries rather than just the words they contained.
In the provided example, BERT allowed Google’s algorithm to understand that the query “2019 brazil traveler to usa need a visa” concerns a Brazilian citizen traveling to the U.S. and not vice versa:
Fast forward to the Search On 2020 event, where Google announced that BERT is now used in almost all English queries and is actively improving the algorithm’s understanding in several areas:
- Spelling errors: One out of every ten searches is misspelled, so Google’s ability to accurately decipher misspellings is crucial to the user experience.
- Specific passages: Google is now able to index not only whole web pages but also individual passages in order to return better results for queries concerning a specific portion of text.
- Relevant subtopics: Humans naturally associate broad topics with their related subtopics, and now Google’s algorithm does too. For example, it understands that the topic home exercise equipment covers subtopics like affordable exercise equipment and small space equipment.
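To get a rough feel for what “bidirectional” buys you, here is a minimal sketch using the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint. It illustrates the general technique of encoding a whole query in context, not Google’s production ranking system:

```python
# A toy illustration of BERT-style query encoding, not Google's production
# system. Requires the open-source transformers and torch packages.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(query: str) -> torch.Tensor:
    """Encode a query as one vector by mean-pooling BERT's
    contextual token embeddings."""
    inputs = tokenizer(query, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

# Because BERT reads the whole query bidirectionally, small function
# words like "to" shift every token's representation, so these two
# directionally opposite queries land on different vectors.
a = embed("brazil traveler to usa need a visa")
b = embed("usa traveler to brazil need a visa")
similarity = torch.nn.functional.cosine_similarity(a, b, dim=0)
print(f"cosine similarity: {similarity.item():.4f}")  # high, but below 1.0
```

In a bag-of-words world, those two queries contain identical terms and would look the same; a contextual encoder tells them apart.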
High-Quality COVID-19 Safety Information
The COVID-19 pandemic has made it more important than ever before for users to access verified, up-to-date information pertaining to their health and safety.
To facilitate that, Google is now providing users with current information about the businesses around them. This includes live busyness updates to help users maintain social distancing, as seen here in the Google My Business listing for Boston’s Gracenote Coffee cafe:
It also provides health information about individual businesses. For instance, Google now specifies whether a business requires visitors to wear masks and whether dining establishments offer takeout or no-contact delivery, as seen in the Google My Business listing for Los Angeles’ Perch restaurant:
Improved Video Perception
In the early days of online video sharing, search engines understood videos based only on their titles and descriptions. Now, the latest Google update is using AI to better comprehend videos and even identify key moments within them.
With this ability, Google can now divide videos into chapters. This makes it easier for users to navigate through videos and find exactly what they’re looking for, right on Google’s search engine results page (SERP) under a section titled “In this video”:
On the YouTube website itself, chapters show up as sections on a video’s progress bar and display their titles when moused over:
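On YouTube, creators can also define chapters manually by listing timestamps and titles in a video’s description, starting at 0:00. As a rough illustration of that format (separate from the AI-detected key moments described above), here is a minimal Python sketch that parses such a list; the description text is hypothetical:

```python
import re

# A hypothetical description using the timestamp-plus-title lines that
# YouTube creators add to define chapters manually.
description = """\
0:00 Introduction
1:45 Mixing the dough
12:30 Baking
25:10 Final results
"""

# Matches optional hours, then minutes:seconds, then the chapter title.
CHAPTER_RE = re.compile(r"^(?:(\d+):)?(\d{1,2}):(\d{2})\s+(.+)$")

def parse_chapters(text: str) -> list[tuple[int, str]]:
    """Return (start_second, title) pairs for each timestamped line."""
    chapters = []
    for line in text.splitlines():
        match = CHAPTER_RE.match(line.strip())
        if match:
            hours, minutes, seconds, title = match.groups()
            start = int(hours or 0) * 3600 + int(minutes) * 60 + int(seconds)
            chapters.append((start, title))
    return chapters

for start, title in parse_chapters(description):
    print(f"{start:>5}s  {title}")
```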
More Robust Data
Up to this point, Google’s ability to quickly display statistics has varied. While commonly searched-for statistics such as general population counts would typically be displayed at the top of the SERP in an easy-to-read graph, more obscure data would more likely be buried within individual pages.
Now, Google is leveraging information from its Data Commons Project to show users hard-to-find statistics in a handy visual format:
While this feature doesn’t yet work for all statistics, its scope will presumably grow as Google’s AI continues to learn.
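The same Data Commons statistics are available programmatically. As a minimal sketch, assuming the open-source datacommons Python client and its get_stat_value helper, the following looks up population counts for two places; the place and variable identifiers are examples from Data Commons’ public catalog:

```python
# A minimal sketch using the open-source datacommons Python client
# (pip install datacommons); IDs below come from Data Commons' catalog.
import datacommons as dc

# dc.set_api_key("YOUR_KEY")  # may be required depending on the endpoint

# DCIDs: geoId/06 is California, geoId/0644000 is Los Angeles.
places = {"geoId/06": "California", "geoId/0644000": "Los Angeles"}

for dcid, name in places.items():
    # Count_Person is Data Commons' statistical variable for population.
    population = dc.get_stat_value(dcid, "Count_Person")
    print(f"{name}: {population:,}")
```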
Support for Journalists
2020 has been nothing if not eventful, and the public’s demand for reliable journalism has put pressure on tech giants to ensure it’s as accessible as possible.
The latest Google update goes a step further by providing journalists themselves with a suite of tools in the new Journalist Studio:
It launched with two tools, Pinpoint and a beta preview of The Common Knowledge Project.
With Pinpoint, journalists can:
- search through and analyze thousands of documents, including email archives, PDFs and forms;
- filter documents by people, organizations and locations; and
- transcribe and search through audio files.
With the beta preview of The Common Knowledge Project, they can:
- examine, visualize and share data about issues affecting their local communities, such as crime, health and education; and
- create custom charts from billions of public data points gathered from sources like the U.S. Census Bureau, the Centers for Disease Control and Prevention (CDC) and the Federal Bureau of Investigation (FBI).
Enhanced Image Recognition and Augmented Reality
Google’s update also includes AI-powered improvements to Google Lens and augmented reality (AR).
Lens is Google’s visual search tool, which allows users to make queries based on their own photos. For instance, users can take a photo of an object like a shoe, lamp or backpack to find similar products. Or, they can scan text and translate it to their preferred language in real time:
Now, Lens also enables users to:
- get step-by-step homework help on a variety of subjects;
- identify more plants, animals and landmarks than ever before; and
- tap and hold a product image on the Google app (or on the Chrome app on Android devices) to see identical or similar products.
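Visual search features like similar-product matching are commonly built on image embeddings: photos are encoded into vectors, and visually similar items land close together. Here is a toy sketch of that general idea using a pretrained torchvision model; the file names are placeholders, and this is not Google Lens’s actual pipeline:

```python
# A toy version of embedding-based image matching, not Google Lens itself.
# Requires torch, torchvision and Pillow; file paths are placeholders.
import torch
from torchvision import models, transforms
from PIL import Image

# A pretrained ResNet-18 with its classification head removed acts as a
# generic image encoder.
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()
encoder.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def embed_image(path: str) -> torch.Tensor:
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return encoder(image).squeeze(0)

# Rank catalog images by cosine similarity to the user's photo.
query = embed_image("user_photo.jpg")  # placeholder path
catalog = {name: embed_image(name) for name in ["shoe_a.jpg", "shoe_b.jpg"]}
ranked = sorted(
    catalog,
    key=lambda name: torch.nn.functional.cosine_similarity(
        query, catalog[name], dim=0),
    reverse=True,
)
print("closest match:", ranked[0])
```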
Google has upgraded its AR capabilities too. AR is a technology that superimposes digital information and objects onto real-world environments in real time. Users can already use Google’s AR tool to do things like see 3D digital objects in their own space, view Google Maps directions and more.
Soon they’ll also be able to view educational materials such as a 3D model of the periodic table and explore photorealistic 3D models of cars in the space of their choice:
User-Friendly Song Identification
When the Shazam app was released in 2008, it permanently changed the way people identified music. Rather than performing a Google search for a snippet of lyrics, they could simply open the app, allow it to “listen” to a few seconds of music and instantly see which song it belonged to.
Google’s latest update is giving users the same capability in an even more flexible format. Now, users can simply hum a melody into their phone:
There’s no separate app required, either—users only need to open the most recent version of the Google app (or use their Google Search widget), tap the microphone button, say “Hey Google, what’s this song?” and hum away.
As of October 2020, the hum-to-search feature is available in upwards of 20 languages on Android. On iOS, it’s available in English only.
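Google has described the feature as using machine learning to transform the audio into a number-based sequence representing the song’s melody, which can then be matched against fingerprints of recorded songs. The sketch below illustrates that general idea with made-up pitch sequences; it’s a toy contour comparison, not Google’s model:

```python
# A toy illustration of melody-contour matching with numpy. The pitch
# sequences and song names are made up; a real system would extract
# pitch from audio and search millions of fingerprints.
import numpy as np

def contour(pitches: list[float]) -> np.ndarray:
    """Reduce a pitch sequence to a key-invariant shape by subtracting
    its mean, so humming off-key can still match."""
    arr = np.asarray(pitches, dtype=float)
    return arr - arr.mean()

def distance(hum: list[float], song: list[float]) -> float:
    """Mean squared difference between two equal-length contours."""
    return float(np.mean((contour(hum) - contour(song)) ** 2))

# Hypothetical pitch sequences (in semitones) for a hum and two candidates.
hummed = [60, 62, 64, 60, 60, 62, 64, 60]
candidates = {
    "Song A": [72, 74, 76, 72, 72, 74, 76, 72],  # same shape, higher key
    "Song B": [60, 59, 57, 55, 60, 59, 57, 55],  # different shape
}

best = min(candidates, key=lambda name: distance(hummed, candidates[name]))
print("best match:", best)  # Song A: contour matches despite the key change
```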
New Features, New Era of Search
On their own, each of the Google Search features unveiled at Search On 2020 provides a relatively minor improvement to the user experience. From users’ perspectives, the functionality they provide is convenient but not necessarily groundbreaking.
From an SEO practitioner’s perspective, though, these new features are greater than the sum of their parts. Together, they paint a picture of an AI-driven future of search that’s more intuitive, more user-friendly and more closely tailored to human behavior than ever before.
Image credits
Google / October 2019
Screenshots by author / November 2020
Google / October 2020