There has been a spirited debate in the SEO world for some time now: can search engines see JavaScript?
Over the years, many notable players in the SEO industry have run tests to try to determine whether Google and other search engines can crawl, render, and execute JavaScript resources. Since people began discussing this topic many years ago, search engines have become much more advanced, and today most of the major players can see JS.
That being said, there’s more to the story. Just because Google and other search engines have gotten more advanced in their ability to deal with JS doesn’t mean that your pages are necessarily being indexed as they should be.
In this article, we’ll look at how search engines respond to JS resources, how you can ensure that your pages are being indexed by Google, and best practices for JavaScript SEO.
Back in the old days, search engines could see the content of most pages simply by downloading the HTML response.
As JavaScript has become an increasingly critical part of the modern web, search engines can no longer rely solely on that downloaded HTML to understand what content is on a page the way a user would see it.
At Google, the system that deals with the JavaScript rendering process is known as the Web Rendering Service (WRS). This process crawls and stores the resources that are necessary to build pages, such as JavaScript files, CSS files, and HTML.
There are three main phases in which Google processes JavaScript: crawling, rendering, and indexing.
Googlebot (aka Google’s web crawler) queues pages for both crawling and rendering and then crawls every URL in the queue. When Googlebot sends a GET request to the server, the server responds with the HTML document. At this point, the crawler determines which resources are required to render the content of the page.
At this stage, Googlebot crawls the HTML of the page but not the CSS or JS files, because rendering JS takes a lot of resources. The rendering of JavaScript is therefore deferred, and anything unexecuted gets queued to render later, when more resources are available.
At the point at which there are ample resources to process JS files, the page is rendered, and the JavaScript is executed. The rendered HTML is processed again for links, and the URLs that it discovers are queued for crawling. Finally, the rendered HTML is used by Google in order to index the page.
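If you want to see this two-phase process for yourself, you can compare the HTML your server returns with the DOM after JavaScript runs. The sketch below is just an illustration of that idea (not something Google provides): it assumes Node 18 or later and the puppeteer npm package, and example.com is a placeholder for your own URL.

const puppeteer = require('puppeteer');

async function compareRawAndRendered(url) {
  // Phase 1: what a plain GET request returns, before any JavaScript runs.
  const rawHtml = await (await fetch(url)).text();

  // Phase 2: what the page looks like after a headless browser executes the JS.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const renderedHtml = await page.content();
  await browser.close();

  console.log(`Raw HTML length:      ${rawHtml.length}`);
  console.log(`Rendered HTML length: ${renderedHtml.length}`);
  // A large gap usually means important content only exists after rendering.
}

compareRawAndRendered('https://example.com/');

If key content or links only show up in the rendered version, you are relying on that second, deferred rendering phase, and indexing can take longer as a result.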
Are your product pages on Shopify not being indexed by Google? If so, take a look at this post that explains why your product pages aren’t being indexed and what you can do to fix the problem.
In short, the answer is that Google and Bing are now able to render JavaScript when they crawl sites. That being said, there is a long list of things that can go wrong as part of that process. Using the URL Inspection Tool in Google Search Console can help you understand whether Googlebot is able to see your content.
The world of crawling, rendering, and indexing sites is changing quickly, and you’ll find ample information online telling you that other search engines aren’t advanced enough at this point to index JavaScript websites. Considering that Google has the lion’s share of the search market, though, you might not be particularly concerned with whether or not search engines like Baidu or Yandex are able to crawl and index JS sites.
The truth is, though, that Yandex and Baidu are both able to process and render JavaScript sites.
It’s worth noting that there are separate queues for regular crawling and rendering, which means that sites that require rendering take longer than those that don’t.
Googlebot goes through the queue the first time to request and receive the server-supplied HTML and deals with the rendering later. Even though Google has stated that the delay between the first crawl and rendering is now only a matter of seconds, it often takes days or weeks more to index pages that need to be rendered versus those that don’t.
Are there pages on your site that aren’t linked to anywhere else on your website? These are known as orphan pages, which can have both a negative impact on your SEO and your user experience. This guide looks at why you don’t want to leave orphan pages unattended and how to find and fix them.
There are a number of issues that frequently crop up when it comes to JavaScript and SEO. The rest of this article covers the most common problems, along with the best practices you can use to avoid them.
Are you wondering how you can boost the performance of your webpage without sacrificing your valuable JS resources? Check out this post about how to delay loading JS assets until the user’s about to interact with them.
Making sure that search engines can properly crawl, render, and index your JavaScript content is essential if you don’t want to lose out on potential traffic.
Here are three steps you can take to make your site’s JavaScript SEO-friendly.
Just because your site loads just fine in Google Chrome doesn’t mean that Googlebot will be able to render its content. To find out whether your pages can be rendered by Googlebot, go to Google Search Console and open the URL Inspection Tool. Once you’re there, enter the URL of the page you are testing into the bar at the top of the page and hit enter.
Once you’ve done that, select the button on the far right of the screen that says “Test Live URL.”
This process takes a minute or two, but soon enough, you’ll see a “Live Test” tab appear. Select “View Tested Page” to see the rendered HTML along with a screenshot of the page.
To look for missing content or other discrepancies, check the “More Info” tab.
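If you’d rather run this kind of check across many URLs, Google’s Search Console URL Inspection API exposes much of the same data as the tool, although it reports Google’s indexed view of a URL rather than running a live test. The sketch below is a rough illustration only: it assumes Node 18 or later, an OAuth 2.0 access token with access to your Search Console property supplied via a GSC_ACCESS_TOKEN environment variable, and example.com as a stand-in for your own property.

const ACCESS_TOKEN = process.env.GSC_ACCESS_TOKEN;

async function inspectUrl(inspectionUrl, siteUrl) {
  const res = await fetch(
    'https://searchconsole.googleapis.com/v1/urlInspection/index:inspect',
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        'Content-Type': 'application/json',
      },
      // siteUrl must match the property as it appears in Search Console,
      // e.g. 'https://example.com/' or 'sc-domain:example.com'.
      body: JSON.stringify({ inspectionUrl, siteUrl }),
    }
  );
  const data = await res.json();
  // indexStatusResult summarizes coverage, e.g. whether the URL is on Google.
  console.log(data.inspectionResult?.indexStatusResult);
}

inspectUrl('https://example.com/hire-us', 'https://example.com/');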
One of the more common culprits for your JS pages not being rendered by Google is your site’s robots.txt file accidentally blocking the rendering process. You can make sure that no essential resources are being blocked from crawling by adding rules like these to your robots.txt file:
User-Agent: Googlebot
Allow: /*.js
Allow: /*.css
If you block these resources, it can mean that your content doesn’t get rendered and indexed.
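To double-check that your rules actually do what you expect, you can test specific resource URLs against your live robots.txt file. The following sketch is one way to do that (my own illustration, using the third-party robots-parser npm package rather than anything Google provides); it assumes Node 18 or later, and the domain and file paths are placeholders.

const robotsParser = require('robots-parser');

async function checkResources(origin, paths) {
  const robotsUrl = `${origin}/robots.txt`;
  const robotsTxt = await (await fetch(robotsUrl)).text();
  const robots = robotsParser(robotsUrl, robotsTxt);

  for (const path of paths) {
    // Ask whether Googlebot is allowed to fetch each JS/CSS resource.
    const allowed = robots.isAllowed(`${origin}${path}`, 'Googlebot');
    console.log(`${path}: ${allowed ? 'crawlable' : 'BLOCKED for Googlebot'}`);
  }
}

checkResources('https://example.com', ['/assets/app.js', '/assets/style.css']);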
Now that you have full confidence that your pages are rendering as they should, it’s time to make sure that they are being properly indexed. You can either continue to use Google Search Console, or you can just head to the regular old Google search engine.
Using Google Search Console, you’ll want to head to the URL Inspection Tool and select “View Crawled Page” to take a look at the HTML source code of the page. Look for snippets of JS content as you scan the code.
There are a number of reasons you might not see your JS content here: it could be that the page is timing out before the content can be indexed, that internal links only exist in the JS-generated markup so Google never discovered the URL, or that Googlebot can’t render the content at all.
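One quick way to spot the second of those problems is to compare the links in your raw server HTML with the links in the rendered DOM. The sketch below is just an illustration of that check, not an official tool; it assumes Node 18 or later and the puppeteer npm package, and example.com is a placeholder.

const puppeteer = require('puppeteer');

function extractHrefs(html) {
  // Crude href extraction from the static HTML; good enough for a spot check.
  return [...html.matchAll(/<a\s[^>]*href="([^"#]+)"/gi)].map((m) => m[1]);
}

async function compareLinkDiscovery(url) {
  const rawLinks = new Set(extractHrefs(await (await fetch(url)).text()));

  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const renderedLinks = await page.$$eval('a[href]', (anchors) =>
    anchors.map((a) => a.getAttribute('href'))
  );
  await browser.close();

  // Links that only appear after rendering are invisible to the first crawl.
  const jsOnly = [...new Set(renderedLinks.filter((href) => href && !rawLinks.has(href)))];
  console.log('Links only present after rendering:', jsOnly);
}

compareLinkDiscovery('https://example.com/');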
To use the search engine, you’ll want to use the “site:” command.
For example, if your site was blue-pig-media.com and you wanted to check the “Hire Us” page to make sure it’s being indexed, you would enter the following into the search bar:
site:blue-pig-media.com/hire-us
Your page will show up as a result if the page is indexed. Your next step will be checking to see if the JS-generated content is indexed, which you can do using the site command with a snippet of the JS content from the page in question. Using the above example, this would look like:
site:blue-pig-media.com/hire-us "snippet of JS content"
If this specific part of your JS content has been indexed, it will show up in the snippet.
If your page doesn’t appear at all after doing the first basic site command, it means that your page isn’t indexed.
There are a number of site audit tools out there that can crawl your JS in the same way that Googlebot does. These can help you find any issues as well as provide potential solutions for remedying them.
If you’re not interested in purchasing SEO software just to climb the learning curve, though, you can also hire a professional SEO agency to perform a site audit and fix any issues that are keeping your JS content from rendering.
Are you due for a full site audit? Use this checklist to make sure that your on-page SEO is in tip-top shape.
If you’re dealing with issues like whether or not your JS files are being indexed by Google, you might be wondering if you would be better off hiring professionals to take care of your SEO needs. After all, while it can be truly fascinating and rewarding to manage your own SEO, it also requires that you stay constantly tapped into the ever-changing landscape of SEO and search engines.
If you’ve been on the lookout for the right digital marketing agency to help drive traffic to your site and grow your business, you’re in the right place! Blue Pig Media is a full-service digital marketing and SEO agency that can help you with absolutely every aspect of your site, from designing and building it to optimizing it for search engines.
No matter what your business goals are, we’re here to help. If you’re ready to get started, reach out and contact us today.