Actionable SEO Takeaways from Google I/O 2021

Google I/O is Google’s annual developer festival. This year there were several important SEO discussions. Let’s go through them and talk about how they’ll impact your business.

Faster crawling with HTTP/2

Google uses web crawlers to discover your webpages. Crawlers go from link to link, gather data about each page, and then carry that data back to Google’s servers. Google then takes the data – information about a page’s content and functionality – and adds it to its Search index.

That index is over 100,000,000 gigabytes in size, because a webpage is added to multiple entries in the index – one entry for every word the page contains.

Google uses that index to pull up search results when someone types in a search phrase. This is why it’s so important to optimize your pages for crawlers. Among other things, if you don’t use keywords well, or if your page isn’t included in a sitemap, you’re toast.
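
On the sitemap point, here’s a minimal sketch of a sitemap file (the URL and date are placeholders; the format follows the standard sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://example.com/blog/sample-post</loc>
    <lastmod>2021-05-18</lastmod>
  </url>
</urlset>
```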

So where does HTTP/2 come in?

HTTP/2 is a major revision of the HTTP web protocol. It uses the same methods, status codes, and semantics as HTTP/1.1, but adds features – most notably multiplexing several requests over a single connection – that improve end-user perceived latency by speeding up page loads and reducing round-trip time (RTT), especially for resource-heavy web pages. After HTTP/2 was released in 2015, Google announced that it would stop supporting its SPDY protocol in favor of the new HTTP version. That makes it worthwhile for online businesses to upgrade to HTTP/2 if they also want to reduce RTT and page load times.

While HTTP/2 has been around for six years, Google only started crawling over the protocol in November 2020. Today, over half of all URLs are crawled over HTTP/2.

What are the HTTP/2 crawling benefits?

The new practice allows Google to request more URLs without increasing the load on your server. The search engine can now request multiple files in parallel over a single connection, reducing the time it spends crawling.

If you have a medium or large site whose content changes daily or weekly, HTTP/2 crawling will benefit you most. This is because Google doesn’t want to overwhelm your servers, so it calculates a crawl capacity limit for each site. This limit determines how many simultaneous connections Google can use to crawl the site, as well as the time delay between fetches.

The crawl capacity limit is determined in part by how your site responds. If it responds quickly, the limit can go up; if it slows down or returns errors, Google will crawl less. Crawling less means it can take longer for new content to be indexed and to show up when someone searches on Google. HTTP/2 helps eliminate some of those issues.

If you want Google to use HTTP/2 on your website, all you need to do is enable HTTPS and HTTP/2 support on your web server. Google will take it from there as long as it determines that there’s a possible efficiency gain – again, if you are a larger site with at least weekly content changes, Google will probably make the switch.
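
Exactly how you enable it depends on your server. As a rough sketch, on nginx it can be as simple as the following (the domain and certificate paths are placeholders; assumes nginx 1.9.5 or later with a valid TLS certificate):

```nginx
server {
    # Serve TLS and HTTP/2 on the same listener – browsers and
    # crawlers only speak HTTP/2 over HTTPS.
    listen 443 ssl http2;
    server_name example.com;

    # Placeholder certificate paths – swap in your own.
    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;
}
```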

Use supported structured data for embedded videos

Videos now show up in Google search results, and while the Ruling Monarch of Search Engines does its best to automatically understand everything about your videos, there are a few things you can do to help it out. This includes marking up your video with VideoObject and providing the:

  • Description
  • Thumbnail URL
  • Upload date
  • Duration

In fact, if you want your content to be displayed as a video rich result, you must include the required VideoObject properties.
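
As a rough sketch, that markup can look like the following JSON-LD (all values are illustrative; note that Google also requires a name for the video):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How to knead sourdough",
  "description": "A three-minute walkthrough of basic kneading technique.",
  "thumbnailUrl": "https://example.com/thumbs/sourdough.jpg",
  "uploadDate": "2021-05-18",
  "duration": "PT3M12S"
}
</script>
```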

If you want to optimize your content further and make it even more Google and searcher-friendly, you can include recommended properties such as:

  • BroadcastEvent: this allows you to add a LIVE badge to your video
  • ItemList: mark up pages with a list of videos
  • Clip: mark important segments in your videos so that users can navigate to the points that best answer their questions
  • SeekToAction: tell Google how your URL structure works so that it can automatically identify key moments and share those with users (a sketch of these last two follows this list)
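
For those last two, here’s a hedged sketch of how Clip and SeekToAction extend the same VideoObject (offsets are in seconds; the URLs and names are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How to knead sourdough",
  "thumbnailUrl": "https://example.com/thumbs/sourdough.jpg",
  "uploadDate": "2021-05-18",
  "hasPart": [{
    "@type": "Clip",
    "name": "Shaping the loaf",
    "startOffset": 95,
    "endOffset": 150,
    "url": "https://example.com/sourdough?t=95"
  }],
  "potentialAction": {
    "@type": "SeekToAction",
    "target": "https://example.com/sourdough?t={seek_to_second_number}",
    "startOffset-input": "required name=seek_to_second_number"
  }
}
</script>
```

In practice you would usually pick one approach or the other: Clip when you want to mark the segments yourself, SeekToAction when you want Google to identify them automatically.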

Review your pages for Page Experience

In November 2020, Google announced that there would be a page experience update. The updates were rolled out starting in mid-June 2021 and are set to be complete by the end of August. This means that Google now uses page experience as part of its ranking system.

Google shared that page experience will not override great page content, so it’s still important for websites to focus on providing the best content possible. However, if your competitors’ content is just as strong, great page experience could be where you pull ahead in the search results.

Google looks at:

  • How fast does a page load? – the Largest Contentful Paint (LCP), the time it takes the page’s main content to render, should happen within the first 2.5 seconds
  • How quickly does a page respond to input? – aim for a First Input Delay (FID) of less than 100 milliseconds; this is the time between a user’s first interaction – a tap, click, or key press – and the browser being able to respond
  • How visually stable is a page? – aim for a Cumulative Layout Shift (CLS) score – a measure of how much content unexpectedly shifts – below 0.1 (a snippet for measuring all three follows this list)
  • Is a page mobile-friendly?
  • Is a page safe and secure?
  • Is a page served over HTTPS?
  • Is a page accessible to all users? – 15% of users around the world have some form of disability
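
If you want to see how real visitors experience those first three metrics, one option is Google’s open-source web-vitals library. This snippet assumes the v3 API loaded from the unpkg CDN, which is one common pattern rather than the only way to load it:

```html
<script type="module">
  // Load the web-vitals library (v3 API assumed) as an ES module.
  import {onCLS, onFID, onLCP} from 'https://unpkg.com/web-vitals@3?module';

  // Log each metric as it becomes available; in production you would
  // send these values to your analytics endpoint instead.
  onCLS(console.log);
  onFID(console.log);
  onLCP(console.log);
</script>
```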

If you’re wondering how your website is doing, you can check out your Page Experience report in Search Console. You can also visit Google Analytics and get detailed information on who is visiting your page, how they found you, and how they are interacting with your website.

More stories eligible for Google News Top Stories

Google is making changes so that all web pages are eligible to be included in the Top Stories carousel regardless of their Core Web Vitals (CWV) score or page experience status. The only requirement for inclusion will be that pages are compliant with the Google News content policies. 

For your content to be eligible:

  • Clearly disclose any sponsorships
  • Don’t let advertising or other paid promotional material exceed your story’s content
  • Avoid misleading titles or snippets
  • Use clear dates and bylines
  • Include information about the authors, publication, and publisher
  • Include information about the company or network behind the content
  • Include contact information

You must also follow all of Google’s search features policies, and should follow these best practices to make it easier for Google to find, understand, and rank your page:

  • Make sure every page on your site is reachable from at least one static text link
  • Minimize the number of redirects needed to follow a link from one page to another
  • Keep every page’s structured data up to date (see the article markup sketch after this list)
  • Provide a clear publication date and time
  • Place the title of your article above the article body and match the HTML <title> tag to the title 
  • Avoid using the article title, or a substring of it, as an active hyperlink in your article page – this makes it harder for Google’s crawler to determine the title
  • Do not include a date or time in your article title – this again makes it difficult for the crawler
  • Titles should be at least 10 characters and between 2 and 22 words
  • Use images that are relevant to your story, rather than logos or captions
  • Format all images as inline
  • Follow Google’s rules for good snippet creation so that the best text shows up under the title in Google’s search results
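
On the structured data point, here’s a minimal sketch of what article markup can look like in JSON-LD (the headline, dates, and names are illustrative; check Google’s article structured data documentation for the full property list):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "City council approves new bike lanes",
  "datePublished": "2021-06-01T08:00:00+00:00",
  "dateModified": "2021-06-01T09:30:00+00:00",
  "author": [{
    "@type": "Person",
    "name": "Jane Reporter"
  }],
  "publisher": {
    "@type": "Organization",
    "name": "Example Daily News"
  }
}
</script>
```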

Faster page rendering with prefetch

You can use signed exchanges (SXG) to allow Google Search to prefetch some of your content while still preserving user privacy. This means that after a user clicks on your search result, the page starts loading much faster, since key resources are already available. That makes users happier, and could improve your ranking overall.

Google publishes an in-depth guide to implementing SXG, along with a separate guide for SXG on AMP pages.
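
To make the mechanics a little more concrete, here’s a hedged sketch of the exchange at the HTTP level once SXG is set up (the path is illustrative; the content type comes from the SXG “b3” format, and Vary: Accept lets you keep serving plain HTML to clients that don’t request a signed exchange):

```http
GET /article HTTP/1.1
Host: example.com
Accept: application/signed-exchange;v=b3;q=0.9,*/*;q=0.8

HTTP/1.1 200 OK
Content-Type: application/signed-exchange;v=b3
X-Content-Type-Options: nosniff
Vary: Accept
```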

Lighthouse

Lighthouse is one of Google’s developer tools for improving overall site quality. You give Lighthouse a URL, and it returns a list of recommendations for improving things like page performance and accessibility. For each recommendation, Lighthouse explains why the audit matters and links to guidance on how to fix it.

The latest feature for the tool is called Lighthouse Treemap. It allows you to see JavaScript page size across different bundles, and see how you can make improvements based on resource sizes and execution coverage. For example, you could check to see how much code, if any, is unused and could be trimmed down.
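
If you prefer the command line to Chrome DevTools, the Lighthouse npm package runs the same audits (the URL is a placeholder; assumes Node.js and Chrome are installed):

```sh
# Install the Lighthouse CLI globally, then audit a page and
# open the generated HTML report in your browser.
npm install -g lighthouse
lighthouse https://example.com --view
```

Recent versions of the report include a “View Treemap” button in the performance section, which opens the treemap described above.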

Winning Websites

There you have it. Where will you start? Whatever you do, remember that SEO is an ongoing process. Not only do tools and algorithms change, but a website that never changes will slowly sink toward the bottom of the search results. If you need help boosting your online reputation or creating an SEO strategy, let us know!
