JavaScript Rendering and SEO: Can Search Engines See JS? 

Reviewing Website JavaScript

There has been a spirited debate in the SEO world for some time now: can search engines see JavaScript?

Over the years, many notable players in the SEO industry have run tests to try and determine whether Google and other search engines can crawl, render, and execute JavaScript resources. Since people began discussing this topic many years ago, search engines have become far more advanced, and today most of the major players can see JS.

That being said, there’s more to the story. Just because Google and other search engines have gotten more advanced in their ability to deal with JS doesn’t mean that your pages are necessarily being indexed as they should be.

In this article, we’ll look at how search engines respond to JS resources, how you can ensure that your pages are being indexed by Google, and best practices for JavaScript SEO.

Can Search Engines See JavaScript?

Back in the old days, search engines could see the content of most pages simply by downloading the HTML response.

Search Engine Able to See JavaScript

As JavaScript has become an increasingly critical part of the modern web, search engines can no longer solely rely on this tactic to understand what content is on a page in the way a user would see it.

Understanding How Google Crawls, Renders, and Indexes Sites

At Google, the system that deals with the JavaScript rendering process is known as the Web Rendering Service (WRS). This process crawls and stores the resources that are necessary to build pages, such as JavaScript files, CSS files, and HTML.

There are three main phases in which Google processes JavaScript: crawling, rendering, and indexing.

Googlebot (aka Google’s web crawler) will queue pages for both crawling and rendering and then crawl every URL that is in the queue. After Googlebot sends a GET request to the server, the server responds with the HTML document. At this point, the web crawler determines which resources are required in order for the content of the page to be rendered.

The URL Indexing Process

What this means is that Googlebot initially processes the HTML of the page rather than executing the CSS or JS files. The reason for this is that rendering JS takes a lot of resources.

The rendering of JavaScript is, therefore, deferred, and anything unexecuted gets queued to render later when there are more resources available.

At the point at which there are ample resources to process JS files, the page is rendered, and the JavaScript is executed. The rendered HTML is processed again for links, and the URLs that it discovers are queued for crawling. Finally, the rendered HTML is used by Google in order to index the page.
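
To make that distinction concrete, here is a minimal sketch of a client-rendered page; the /api/products endpoint and the markup are hypothetical. In the first, HTML-only pass, Googlebot sees little more than an empty div. The product headings only exist once the script has run during the deferred rendering phase.

<html>
  <head><title>Products</title></head>
  <body>
    <div id="app"></div>
    <script>
      // Hypothetical client-side rendering: the headings below only exist
      // after this script executes, so they are invisible to the initial
      // HTML-only crawl and only appear once the page has been rendered.
      fetch('/api/products')
        .then(function (response) { return response.json(); })
        .then(function (products) {
          document.getElementById('app').innerHTML = products
            .map(function (p) { return '<h2>' + p.name + '</h2>'; })
            .join('');
        });
    </script>
  </body>
</html>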

Are your product pages on Shopify not being indexed by Google? If so, take a look at this post that explains why your product pages aren’t being indexed and what you can do to fix the problem.

Can All Search Engines Render JavaScript?

In short, the answer is that Google and Bing are now able to render JavaScript when they are crawling sites. That being said, there is a long list of things that can end up going wrong as a part of that process. Using the inspection tool in the Google Search Console can help you understand whether Googlebot is able to see your content.

The world of crawling, rendering, and indexing sites is changing quickly, and you’ll find ample information online telling you that other search engines aren’t advanced enough at this point to index JavaScript websites. Considering that Google has the lion’s share of the search market, though, you might not be particularly concerned with whether or not search engines like Baidu or Yandex are able to crawl and index JS sites.

The truth is, though, that Yandex and Baidu are both able to process and render JavaScript sites.

A Website Crawler

It’s worth noting that there are separate queues for regular crawling and rendering, which means that sites that require rendering take longer than those that don’t.

Googlebot goes through the queue the first time to request and receive the server-supplied HTML and deals with the rendering later. Even though Google has stated that the delay between the first crawl and rendering is now only a matter of seconds, it can still take days or even weeks longer to index pages that need to be rendered compared with those that don’t.

Are there pages on your site that aren’t linked to anywhere else on your website? These are known as orphan pages, which can have both a negative impact on your SEO and your user experience. This guide looks at why you don’t want to leave orphan pages unattended and how to find and fix them.

Common JavaScript SEO Issues

There are a number of issues that frequently crop up when it comes to JavaScript and SEO.

Fixing JavaScript SEO Issues

Here are some of the common problems as well as the best practices you can use in order to avoid them:

  • Google is impatient when it comes to rendering JavaScript content and won’t wait long for it to render. It’s possible that a timeout error is leading to your content not being indexed.
  • Content that should be indexed shouldn’t be lazy loaded or delay loaded. Instead, reserve lazy loading largely for images rather than text content (see the lazy-loading sketch after this list).
  • If your robots.txt file has .js files blocked, it can mean that Googlebot isn’t able to crawl them. When Googlebot can’t crawl resources, it doesn’t have any ability to render or index them. You’ll want to make sure that you allow .js files to be crawled to ensure that they will be indexed.
  • You’ll want to make sure that you use internal links in order to help Googlebot be able to find the pages on your site. The reason for this is that search engines aren’t going to click buttons on your site.
  • You should make sure that static URLs are generated for your site’s pages, because Googlebot generally ignores the fragment portion of a URL. In other words, make sure that there are no “#” symbols in your URLs.
  • You’ll want to make sure you prevent soft 404 errors, which can be particularly difficult to do in single-page applications. You can either redirect to a URL where the server responds with a 404 status code or set the robots meta tag to noindex in order to avoid this type of error (see the soft-404 sketch after this list).
  • JS features that require user permission are likely to be declined by Googlebot. To solve this problem, you’ll want to make sure that your content is accessible to all users without requiring permission (for example, content that only appears after the user grants camera access will never be seen by Googlebot).
  • In order to avoid caching issues with Googlebot, you’ll want to use content fingerprinting. Caching headers can be ignored by WRS, meaning that it can end up using outdated CSS and JS resources. Content fingerprinting adds a fingerprint of the file’s content to its filename (for example, main.2bb85551.js), so a changed file gets a new URL and stale cached copies aren’t reused (see the build-config sketch after this list).
  • Googlebot retrieves content using HTTP requests, so you’ll want to make sure that your content loads correctly over plain HTTP connections rather than depending on other connection types.
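
To illustrate the lazy-loading point above, here is a minimal sketch (the file paths and copy are made up): text that needs to be indexed stays in the initial HTML, while below-the-fold images use the browser’s native lazy loading.

<article>
  <h1>Product guide</h1>
  <!-- Indexable text lives in the initial HTML, not injected on scroll -->
  <p>This description should be present before any JavaScript runs.</p>
  <!-- Images are a safe candidate for lazy loading -->
  <img src="/images/diagram.png" alt="Product diagram" loading="lazy">
</article>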
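
And here is one way to handle the soft-404 point in a single-page application, loosely following the two options described above. The /api/products endpoint, the renderProduct function, and the /not-found page (which should return a real 404 status code from the server) are all assumptions made for the sake of the sketch.

// Derive the product ID from the current URL (sketch only)
var productId = window.location.pathname.split('/').pop();

fetch('/api/products/' + productId)
  .then(function (response) { return response.json(); })
  .then(function (product) {
    if (product && product.exists) {
      renderProduct(product); // hypothetical function that draws the page
    } else {
      // Option 1: send users and crawlers to a URL that genuinely returns 404
      window.location.href = '/not-found';
      // Option 2 (alternative): keep the URL but tell search engines not to index it
      // var meta = document.createElement('meta');
      // meta.name = 'robots';
      // meta.content = 'noindex';
      // document.head.appendChild(meta);
    }
  });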
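
Finally, content fingerprinting is usually something your build tool handles for you. As one common example, assuming a webpack-based build, putting a content hash in the output filename means a changed file also gets a changed URL, so stale cached copies can’t be served:

// webpack.config.js (sketch of a contenthash-based filename)
module.exports = {
  output: {
    // Produces names like main.2bb85551.js; the hash changes
    // whenever the file's content changes.
    filename: '[name].[contenthash].js',
  },
};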

Are you wondering how you can boost the performance of your webpage without sacrificing your valuable JS resources? Check out this post about how to delay loading JS assets until the user’s about to interact with them.

Making Your Site’s JS SEO Friendly

Making sure that search engines can properly crawl, render, and index your JavaScript content is essential to make sure you aren’t losing out on potential traffic.

Making JavaScript Content SEO Friendly

Here are three steps you can take to make your site’s JavaScript SEO-friendly.

Find Errors in the Google Search Console

Just because your site loads just fine in Google Chrome doesn’t mean that Googlebot will be able to render its content. To find out whether your pages can be rendered by Googlebot, go to Google Search Console and open the URL Inspection Tool. Once you’re there, enter the URL of the page you’re testing into the bar at the top of the page and hit enter.

Once you’ve done that, select the button on the far right of the screen that says “Test Live URL.”

Google Testing Live URL

This process takes a minute or two, but soon enough, you’ll see a “Live Test” tab appear. Select “View Tested Page” to see the rendered HTML and a screenshot of the page.

In order to look for missing content or any discrepancies, take a look at the “More Info” tab.

One of the more common culprits for your JS pages not being rendered by Google is your site’s robots.txt file accidentally blocking the rendering process. You can make sure that no essential resources are being blocked from crawling by adding this to your robots.txt file:

User-agent: Googlebot
Allow: /*.js
Allow: /*.css

If you block these resources, it can mean that your content doesn’t get rendered and indexed.

Make Sure Google Is Indexing Your JS Content

Now that you have full confidence that your pages are rendering as they should, it’s time to make sure that they are being properly indexed. You can either continue to use Google Search Console, or you can just head to the regular old Google search engine.

Using Google Search Console, you’ll want to head to the URL Inspection Tool and select “View Crawled Page” to take a look at the HTML source code of the page. Look for snippets of JS content as you scan the code.

Google View Crawled Page

There are a number of reasons you might not see your JS content here: the page may be timing out while the content is being indexed, links that only exist after the JS runs may be keeping the URL from being discovered, or Googlebot may be unable to render the content.

To use the search engine, you’ll want to use the “site:” command.

For example, if your site was blue-pig-media.com and you wanted to check the “Hire Us” page to make sure it’s being indexed, you would enter the following into the search bar:

site:blue-pig-media.com/hire-us

Your page will show up as a result if the page is indexed. Your next step will be checking to see if the JS-generated content is indexed, which you can do using the site command with a snippet of the JS content from the page in question. Using the above example, this would look like:

site:blue-pig-media.com/hire-us/ “snippet of JS content”

If this specific part of your JS content has been indexed, it will show up in the snippet.

If your page doesn’t appear at all after doing the first basic site command, it means that your page isn’t indexed.

Perform a Site Audit

There are a number of site audit tools you can use out there to crawl your JS in the same way that Googlebot does. This can help you find any issues as well as provide potential solutions for remedying them.

Performing a Site Audit

If you’d rather not purchase SEO software and climb its learning curve, though, you can also hire a professional SEO agency to perform a site audit and fix any issues that are keeping your JS content from rendering.

Are you due for a full site audit? Use this checklist to make sure that your on-page SEO is in tip-top shape.

Is It Time to Bring in the Pros?

If you’re dealing with issues like whether or not your JS files are being indexed by Google, you might be wondering if you would be better off hiring professionals to take care of your SEO needs. After all, while it can be truly fascinating and rewarding to manage your own SEO, it also requires that you stay constantly tapped into the ever-changing landscape of SEO and search engines.

Team Fixing JS Issues

If you’ve been on the lookout for the right digital marketing agency to help drive traffic to your site and grow your business, you’re in the right place! Blue Pig Media is a full-service digital marketing and SEO agency that can help you with absolutely every aspect of your site– from designing and building your site to optimizing your site for search engines.

No matter what your business goals are, we’re here to help. If you’re ready to get started, reach out and contact us today.

David Curtis
David Curtis is the founder and CEO of Blue Pig Media. With twenty years of successful execution in sales, marketing, and operations, for both clients and vendors, he has a bottom-line, ROI-driven mentality rooted in metrics-driven performance across highly competitive global corporate initiatives.
