JavaScript & SEO Considerations
How Search Engines Treat JavaScript
In 2019, Google updated the version of Googlebot that crawls the web to keep pace with the latest version of Chromium, making it far more capable of rendering JavaScript. But even though Google can now render and understand content delivered through complex JavaScript frameworks, JavaScript can still pose problems for organic search performance.
Rendering does not occur during the first crawl. Instead, Google fetches the raw code on the page and files whatever it finds there into its index. Only once rendering resources become available does Google render and process the JavaScript in a second wave, so it can understand the additional content, structure, and overall UX. This process can take weeks.
If critical content is not delivered until rendering, it may therefore take Google weeks to understand the content on a page. Worse, if links are generated via JavaScript rather than served in the initial HTML, Googlebot may struggle to discover your site's content, navigation, and hierarchy at all.
It is critical to ensure that Google can crawl content hidden under tabs and can discover links that are generated via JavaScript. Make sure Googlebot can navigate through all of your internal links, including pagination; this is especially important for JavaScript-driven infinite scroll, which gives a crawler no links to follow. Also be sure that no important JavaScript files are blocked in robots.txt, and that JavaScript does not remove any content from the page. The sketch below shows the link-discovery issue in practice.
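As a minimal sketch of that link-discovery issue (the URLs and the `loadPage` helper here are hypothetical), Googlebot can follow a JavaScript-generated link only when it ends up as a real anchor with an `href`:

```javascript
const nav = document.querySelector('nav');

// Crawlable: a real <a> element with an href attribute. Googlebot discovers
// links by extracting href values, even from anchors created in JavaScript.
const link = document.createElement('a');
link.href = '/category/page-2/';
link.textContent = 'Next page';
nav.appendChild(link);

// Not reliably crawlable: a click handler with no href gives the crawler
// nothing to extract and follow, a common pattern in infinite scroll UIs.
const pseudoLink = document.createElement('span');
pseudoLink.textContent = 'Next page';
pseudoLink.addEventListener('click', () => loadPage('/category/page-2/')); // hypothetical loader
nav.appendChild(pseudoLink);
```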
Another consideration is that other search engines may struggle even more than Google with executing and rendering advanced JavaScript, and this can impact organic traffic. Bing, for example, represents a sizable market share of search in the United States, and it renders and processes JavaScript differently than Google.
The way you handle JavaScript can also have massive implications for the way content appears on social networks. The crawlers used by Facebook, Twitter, and LinkedIn do not execute JavaScript. As a result, shared content that leans on JavaScript to populate Open Graph and Twitter Card information can suffer a drop in click-through rate, because the share preview won't include the title, description, or thumbnail of the page. Serving those tags in the initial HTML avoids the problem, as in the sketch below.
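As a hedged illustration of serving social tags server-side (the route, the `getArticle` lookup, and the field names are all hypothetical), a Node/Express handler might inject them directly into the HTML response:

```javascript
const express = require('express');
const app = express();

app.get('/articles/:slug', async (req, res) => {
  const article = await getArticle(req.params.slug); // hypothetical data lookup

  // Social crawlers read these tags from the raw HTML response; tags
  // injected later by client-side JavaScript are invisible to them.
  res.send(`<!DOCTYPE html>
<html>
<head>
  <title>${article.title}</title>
  <meta property="og:title" content="${article.title}">
  <meta property="og:description" content="${article.description}">
  <meta property="og:image" content="${article.thumbnailUrl}">
  <meta name="twitter:card" content="summary_large_image">
</head>
<body>
  <div id="app"></div>
  <script src="/bundle.js"></script>
</body>
</html>`);
});

app.listen(3000);
```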
Solutions for Consideration
For Google, we simply need to ensure Googlebot is able to see and understand critical content when it first crawls the site. We recommend the following rendering options, in order of preference:
Recommended: Hybrid Rendering: Hybrid rendering combines the speed of rendering items client-side with the comprehensiveness of server-side rendering. In practice it means rendering the first page, or at least the critical resources, server-side so that your content is included in the first wave of indexing. Then, once the critical content has been delivered server-side, JavaScript runs on top of it client-side to enhance the user experience.
For hybrid rendering, consider the following resources (a brief sketch follows the table):

| JavaScript Framework | Hybrid Rendering Solution |
| --- | --- |
| React | Next.js |
| Angular | Angular Universal |
| Vue | Nuxt.js |
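As a sketch of what the hybrid pattern looks like in Next.js (the `fetchProduct` call and prop shape are hypothetical), the server renders complete HTML for the first request, then React hydrates it in the browser:

```javascript
// pages/products/[id].js
// Runs on the server for each request, so crawlers and users alike
// receive complete HTML in the initial response (first wave of indexing).
export async function getServerSideProps({ params }) {
  const product = await fetchProduct(params.id); // hypothetical data fetch
  return { props: { product } };
}

// Once the server-rendered HTML is delivered, React hydrates this
// component in the browser and client-side JavaScript takes over.
export default function ProductPage({ product }) {
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}
```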
Recommended: Server-Side Rendering: Server-side rendering ensures that Googlebot can access all dynamic elements of a site in its initial wave of indexing, and makes content available straight away to search engine crawlers and users alike. The biggest concern with server-side rendering comes down to site speed and user experience: rendering every page on the server can consume a large amount of server resources and slow down the delivery of content. A minimal example follows.
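A minimal server-side rendering sketch with Express and React's `renderToString` (the `App` root component is assumed to exist):

```javascript
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App'); // assumed root component

const app = express();

app.get('*', (req, res) => {
  // Every request is rendered on the server, which is exactly why SSR
  // can consume significant server resources under load.
  const html = renderToString(React.createElement(App, { url: req.url }));
  res.send(`<!DOCTYPE html>
<html>
<head><title>Example</title></head>
<body>
  <div id="root">${html}</div>
  <script src="/bundle.js"></script>
</body>
</html>`);
});

app.listen(3000);
```

Caching the rendered output is a common way to offset the server cost noted above.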
Not Preferred: Dynamic Rendering: Dynamic rendering is purely a workaround for search engines, and includes services like http://prerender.io. Requests from search engine crawlers are routed to a dynamic renderer, which sends back the rendered content or a static HTML version of the page. Regular users still render the content client-side, but this at least ensures Googlebot and other search engines can see and understand the critical content.
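At its core, dynamic rendering is a user-agent split. Here is a hedged Express-middleware sketch (the bot list and renderer endpoint are illustrative, and the global `fetch` assumes Node 18+):

```javascript
const BOTS = /googlebot|bingbot|twitterbot|facebookexternalhit|linkedinbot/i;
const RENDERER = 'https://render.example.com/render'; // illustrative prerender endpoint

function dynamicRender(req, res, next) {
  // Regular users get the normal client-side app.
  if (!BOTS.test(req.headers['user-agent'] || '')) return next();

  // Crawlers are proxied to the renderer, which returns fully rendered HTML.
  const url = `https://www.example.com${req.originalUrl}`;
  fetch(`${RENDERER}?url=${encodeURIComponent(url)}`)
    .then((r) => r.text())
    .then((html) => res.send(html))
    .catch(next);
}

module.exports = dynamicRender;
```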
Not Recommended: Client-Side Rendering: If none of the options above is available, we simply have to lean on client-side rendering. This means it could take Googlebot time, potentially weeks, to come back to the indexed content, render all of its resources, and fully understand the page.