Google Plans to Improve their Web Rendering Service

At the Chrome Dev Summit 2018, Martin Splitt, a developer advocate at Google, shared some interesting plans about their Web Rendering Service.

Google wants to:

  • make crawling and rendering integrated,
  • make their Web Rendering Service evergreen.

What does that mean?  

The Google Web Rendering Service will always run on the newest version of Chrome

As of right now, Google uses Chrome 41 for web rendering, a three-year-old browser. It doesn’t support many modern features, which forces webmasters to rely on workarounds like transpiling to ES5, feature detection, and polyfills.
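Here’s a minimal sketch of what such a workaround can look like: detect whether a modern feature exists and fall back to an ES5-friendly path when it doesn’t. The getJSON helper and the feature being detected (the Fetch API, which Chrome 41 lacks) are made up for illustration; this isn’t something Google prescribes.

  // Feature detection with an ES5 fallback: modern browsers take the Fetch API
  // path, while older engines such as Chrome 41 fall back to XMLHttpRequest.
  function getJSON(url, onSuccess) {
    if (window.fetch) {
      // Modern path: the Fetch API (not available in Chrome 41).
      fetch(url)
        .then(function (response) { return response.json(); })
        .then(onSuccess);
    } else {
      // Legacy path: plain XMLHttpRequest, which Chrome 41 understands.
      var xhr = new XMLHttpRequest();
      xhr.open('GET', url);
      xhr.onload = function () {
        onSuccess(JSON.parse(xhr.responseText));
      };
      xhr.send();
    }
  }

  // Usage: getJSON('/api/listings.json', function (data) { console.log(data); });

Note that the snippet itself is written in ES5 syntax, so it parses even in older engines; newer syntax would additionally need to be transpiled with a tool like Babel.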

Learn more: Here you’ll find out which features are supported by Chrome 41.

It’s kind of a paradox that Chrome is a trendsetter in the “real world,” but Google, the owner of Chrome, uses an outdated version for web rendering.

However, as Martin Splitt stated, this is going to change.

Google plans to keep their Web Rendering Service always up to date and tied to the Chrome release schedule.

  • Google releases, let’s say, Chrome 80 -> the Web Rendering Service team updates their service to Chrome 80 as soon as possible.
  • Google releases Chrome 81 -> the Web Rendering Service is updated to Chrome 81, and so on.

Let me quote Martin Splitt:

“We basically work on a process that we hopefully gonna start very soon. So, I make no promises on when but we’re working on figuring out a process to stay up to date with Chrome. So that Googlebot does run with the Chrome Release schedule so that we are basically giving you an evergreen. Yeah, right – that would be fantastic if we could do that.”

 

Why is this important?

If your website uses modern features, it will work fine in modern browsers (like Chrome 70), but it may crash in older browsers like Chrome 41. For instance, if you use the popular “let” declaration outside of strict mode, it will crash for Googlebot. The same kind of problem can occur with other modern JavaScript features that Chrome 41 doesn’t fully support.
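To illustrate (the snippet below is a made-up example, not code taken from any real website):

  // In Chrome 41, "let" is only recognized in strict mode. If this script is
  // served without "use strict", the whole file fails to parse for Googlebot:
  let heading = document.querySelector('h1'); // SyntaxError in Chrome 41 outside strict mode
  heading.textContent = 'Rendered by JavaScript';

  // The ES5-safe equivalent (roughly what a transpiler such as Babel would output)
  // parses and runs everywhere, including Chrome 41:
  var headingEs5 = document.querySelector('h1');
  headingEs5.textContent = 'Rendered by JavaScript';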

Doesn’t sound serious? In the past, rendering errors caused Google to deindex some pages of Angular.io (the official website of the Angular framework). The website worked fine in modern browsers but crashed for Googlebot.

All of this could have been avoided if Google had used the most recent version of Chrome for web rendering. I covered this topic in depth in my Ultimate Guide to JavaScript SEO.

Google plans to make crawling and rendering integrated

For now, Google uses two-wave indexing (the second wave, which involves rendering, is delayed). What does that mean for JavaScript-rich websites?

Basically, in the first wave, Google discovers the initial HTML of your page (no rendering). Then, once resources become available, Google renders your page.

How long does it take for Google to render a page?

As John Mueller stated, usually it takes a few days and sometimes a few weeks.

Tom Greenway is slightly more optimistic. According to him, it can sometimes take up to a week before the render is complete.

For some of you, it may be surprising that it takes so long, but the web is really big. There are over 130 trillion documents on the web and it’s extremely difficult to crawl and render the content – even for giants like Google.

But for you, as the owner of a rapidly changing website, it’s kind of a big deal. Imagine you have a client-side rendered website with car listings. We know it’s rapidly changing because offers can become outdated within days.

So, by the time Google is finally able to index it, the offer may already be outdated.

In such cases, Google proposes dynamic rendering: serving users a fully-featured JavaScript website while serving bots a prerendered HTML snapshot.
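As a rough sketch of how that can be wired up (the Express setup, the bot list, and the renderSnapshot helper below are illustrative assumptions on my part, not Google’s reference implementation):

  // Dynamic rendering sketch: known crawlers get a prerendered HTML snapshot,
  // while regular users get the normal client-side rendered app.
  const express = require('express');
  const path = require('path');
  const renderSnapshot = require('./renderSnapshot'); // hypothetical helper that renders a URL in headless Chrome

  const app = express();
  const BOT_AGENTS = /googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit/i;

  app.get('*', async (req, res, next) => {
    if (BOT_AGENTS.test(req.headers['user-agent'] || '')) {
      // Crawlers receive the prerendered snapshot of the requested URL.
      const html = await renderSnapshot(req.originalUrl);
      return res.send(html);
    }
    next(); // Regular users fall through to the JavaScript app below.
  });

  // Regular users receive the client-side rendered single-page app.
  app.use(express.static(path.join(__dirname, 'public')));

  // Fall back to index.html so client-side routing works for deep links.
  app.use((req, res) => {
    res.sendFile(path.join(__dirname, 'public', 'index.html'));
  });

  app.listen(3000);

In practice, you would typically lean on an existing tool such as Rendertron or a prerendering service rather than maintaining the snapshot logic yourself.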

At 7:50 in the video, Martin Splitt and Tom Greenway discuss which solutions you can use for dynamic rendering.

However, prerendering is error-prone. Disqus.com is a perfect example: their prerendering failed. Three weeks ago, we informed them on Twitter that they were serving an empty page to Googlebot.

They still haven’t fixed it and they are losing their rankings.

Side note: at Onely, we noticed that for massive websites, dynamic rendering (serving bots a prerendered version of the website) is not always the optimal solution, and that using isomorphic JavaScript is sometimes better.

Bartosz Góralewicz explained it briefly during his presentation at SMX East (https://www.slideshare.net/goralewicz/javascript-tips-tricks-2018-smx-east), starting from slide 63.
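For a rough idea of what the isomorphic approach looks like (the App component, the bundle path, and the Express setup below are hypothetical placeholders):

  // Isomorphic (universal) rendering: the same React component is rendered to
  // HTML on the server and then hydrated in the browser, so users and bots
  // both receive meaningful markup in the first response.
  const express = require('express');
  const React = require('react');
  const { renderToString } = require('react-dom/server');
  const App = require('./App'); // hypothetical component shared with the client bundle

  const app = express();

  app.get('*', (req, res) => {
    const markup = renderToString(React.createElement(App, { url: req.url }));
    res.send(`<!DOCTYPE html>
  <html>
    <head><title>Isomorphic rendering sketch</title></head>
    <body>
      <div id="root">${markup}</div>
      <script src="/client-bundle.js"></script>
    </body>
  </html>`);
  });

  app.listen(3000);

With this setup there is no separate bot path to maintain, which is part of why it can scale better for massive websites.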

What are Google’s plans?

Google plans to integrate crawling and rendering, so that when Google crawls a page, it gets rendered right away.

If Google is successful at this, making JavaScript websites successful in Google Search will be much easier than before.

More insights from Martin Splitt and Tom Greenway’s presentation

I strongly recommend you watch the video of the Making Modern Web Content Discoverable for Search presentation by Martin Splitt and Tom Greenway. You can watch it here:

There are a few interesting insights that are not strictly related to JavaScript SEO.

For instance, Google will shortly allow sharing Google Search Console reports with people who don’t have access to Google Search Console.  

So if you’re an SEO working on an audit for a website, you will be able to share a particular report with developers easily, even if they don’t have a Google Search Console account.

Summary

Google’s plans to keep their Web Rendering Service up to date and to integrate crawling and rendering are definitely a long-awaited step in the right direction.

If Google overcomes these technical challenges, it should be easier to make JavaScript websites successful in Google Search.

But, of course, “we trust but verify.” There are a lot of open questions: will integrating crawling and rendering cause slower crawling? Will every website get rendered?

I will keep my fingers crossed for Google. There is a lot of hard work ahead of them.