Why Your On-Page JavaScript Should Be Optimized for SEO [+ 6 Optimization Tips]

Today, JavaScript is used in almost all web development projects. Its simplicity, speed, and compatibility have made it one of the top programming languages. Yet, conversations around JavaScript aren’t always positive, especially in SEO circles. You may be wondering: “Why do people hate JavaScript?”

The answer is quite simple: Google’s content indexing process doesn’t favor JavaScript-based websites. Consequently, many assume that JavaScript is not a good ally if you’re trying to position your site organically. But is that the case? And if it is, how can you identify and fix the JavaScript issues that affect technical SEO?

If you work with JAMstack sites (such as those running on React, Angular, or Vue.js) and want to know if your pages are correctly rendered and indexed, this post is for you.

We’ll explain:

  • Some JavaScript SEO basics
  • Why JavaScript should be optimized for SEO
  • How JavaScript affects SEO
  • How Google crawls and indexes JavaScript-based websites
  • How to optimize JavaScript for SEO

Let’s dive in!

An Introduction to JavaScript

JavaScript (often abbreviated as “JS”) is a programming language that, combined with HTML and CSS, allows web pages to contain interactivity and dynamic effects.

Basically, JS makes it possible for websites to include interactive features such as:

  • Complex micro-interactions
  • Slider controls
  • Forms
  • Maps
  • Web-based games


According to W3Techs, 97.9% of all websites use JavaScript. And it’s easy to see why. Sliders, carousels, tabs, accordions, and many of the elements that we take for granted as parts of our web experience are JavaScript-driven.

Additionally, JavaScript has become a go-to language for creating websites that are lightweight, scalable, secure, and easy to maintain. As of this writing, over 1% of websites are powered by frameworks such as React, Next.js, Angular, and Vue. This may look like a small percentage, but it has almost doubled since 2020.

While JAMstack websites still constitute a minority, almost all sites render some JavaScript. So, even if your website runs on WordPress, you should be aware of JavaScript’s potential SEO downsides. We’ll explore them in detail in the next section.

JavaScript SEO: Why JavaScript Should Be Optimized for SEO

So, is JavaScript bad for SEO? There’s no single answer: JavaScript’s potential pitfalls are different for JAMstack and non-JAMstack sites.

For instance, if you’re running a WordPress website, your JavaScript may only become problematic if it isn’t properly optimized. But if you’re running a JAMstack site, your problems may have less to do with your code’s weight than with how your website is rendered.

The main difference is that, in the first case (a WordPress site loading JavaScript), the content would still be accessible even when your JS files weren’t loading. In the second case (a JAMstack site), the content itself is dynamic and wouldn’t be visible without loading the JavaScript files.

JavaScript SEO is a technical SEO subdiscipline dedicated to making JavaScript-based websites easy to crawl and index.

Implementing some sort of JavaScript SEO is necessary because:

  • It can improve content rendering, boosting your rankings
  • It can help you implement sustainable interlinking practices
  • It can improve your loading speed

How Does JavaScript Affect SEO?

Even though SEO specialists and developers often work together, they prioritize different things. And that can be a challenge when it comes to ranking your site. Devs focus on making a website maintainable, reliable, performant, and functional. But sometimes they may overlook important SEO factors before shipping their code.

As simple as it sounds, a well-coded JS element could inadvertently hide content from Googlebot. For example, let’s say you have an accordion that contains very important content. When the user clicks its toggle, the accordion opens and the information is shown. But, depending on how that element is developed, the content may not be easily accessible to search engine bots.

A well-coded, fully functional JavaScript element can still hide content from Googlebot.

Depending on how it’s implemented, JavaScript could inadvertently hide content or internal links from crawlers. As a result, Google may take longer than usual to crawl your pages and may even overlook the most relevant ones. In other words, even technically correct JS (from a developer’s standpoint) can sometimes cause crawling and indexing obstacles.
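To make the accordion example concrete, here’s a minimal sketch (the markup and function names are hypothetical) contrasting a pattern that injects content only after a click with one that ships the content in the initial HTML and merely toggles its visibility:

```javascript
// Pattern A (risky for SEO): the panel ships empty, and the content is
// injected only when a click handler runs. A crawler that reads the
// initial HTML without executing the handler never sees the content.
function renderPanelOnClick() {
  return '<div class="panel"></div>'; // empty until the user clicks
}

function handleToggleClick(panelHtml) {
  // Simulates the click handler injecting the content into the panel.
  return panelHtml.replace('</div>', 'Important content</div>');
}

// Pattern B (safer): the content is present in the initial HTML, and
// JavaScript only needs to toggle a CSS class to show or hide it.
function renderPanelUpfront() {
  return '<div class="panel is-collapsed">Important content</div>';
}
```

With Pattern B, the content is in the page source from the start, so it remains indexable even if the toggle script never runs.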

If you’re running a JAMstack website, you may come across crawling and loading speed issues. These are often solved by switching from client-side to server-side rendering. We’ll take a deeper look in the next section.

How Does Google Crawl and Index JavaScript?

Before diving into how to make your JavaScript website easy to crawl and index, it’s necessary to take a look at how Google processes it. The process consists of three steps:

  • Crawling
  • Rendering
  • Indexing

This process is often understood as consisting of two “indexing/crawling waves”. Two years ago, Google’s Martin Splitt questioned this idea, stating that “the wave is an oversimplification”.


However, we’ll use the wave metaphor to bring some clarity to this very abstract process.

Indexing Static HTML: The First Wave

First, Google decides which elements are necessary to represent the content on the page. To do this, it crawls the static HTML files, leaving out any linked CSS or JS files.

How? Googlebot starts from your site’s HTML URLs. Before sending an HTTP request to the server, it reads the robots.txt file; if a URL is malformed or marked as disallowed, it skips the request. Otherwise, the server responds with the HTML document, which the crawler downloads and parses.

Indexing of Additional Content Presented via JavaScript: The Second Wave

The growing use of JavaScript forced search engines to redesign their indexing systems so they could see content as users see it. Thus, Googlebot crawls and stores all the resources needed to build the page.

These resources include:

  • HTML pages
  • JavaScript files
  • CSS files
  • XHR requests
  • API endpoints, and more

In 2019, Google made improvements to its JavaScript processing. Googlebot was upgraded to an “evergreen” version that always runs a recent release of Chromium, making it fully compatible with modern JavaScript, including ECMAScript 6 (ES6) and above.

Do these advances remove all worries around indexing? Not really.

JavaScript-rich pages can take days or even weeks to be properly indexed. Google processes all pages but tends to crawl JS-heavy pages less frequently. The reason is simple: rendering is a very resource-intensive process that involves downloading, parsing, and executing large, demanding files. Thus, Google’s Web Rendering Service (WRS) takes over only when Googlebot’s resources allow it.

In conclusion, indexing remains a challenge for a couple of reasons:

  • Google takes a long time to index and render JS content due to limited computing resources.
  • Google has difficulty executing web rendering code in its entirety.
  • Google eventually stops indexing slow-loading sites – and a good portion of JavaScript-heavy sites fall into this category.

How to Optimize JavaScript for SEO

One thing’s for sure: there’s no real alternative to JavaScript. And there are clear benefits to running a JAMstack site, such as security and speed. Thanks to the subdiscipline of JavaScript SEO, you won’t have to compromise your rankings either.

When approaching a new JavaScript SEO project, begin with three questions:

  • How does Google crawl your web content?
  • How does Google render your web content?
  • How does Google index your web content?

You’ll achieve the best results by following some basic optimization guidelines.

We recommend you:

  • Understand JavaScript rendering and use it in your favor
  • Ensure that Google is indexing your JavaScript content
  • Allow resource crawling
  • Prevent duplicate content
  • Keep your SEO basics in check
  • Monitor your loading speed

Understand JavaScript Rendering Types and Use Them in Your Favor

Server-side rendering vs. Client-side rendering.

There are three types of JavaScript rendering:

  • Server-side rendering
  • Client-side rendering
  • Dynamic rendering

In this section, we’ll take a look at each of them and share some SEO pros and cons.

Server-side Rendering (SSR)

SSR is the process of serving a webpage to a client with its HTML structure and dynamic content fully integrated. This type of rendering does the job of combining the structure and the content on the server, and delivering it to users (and search bots) “in one piece”.

Websites that are rendered server-side:

  • Load fast
  • Are unlikely to experience partial indexing
  • Are more resource-intensive than their client-side counterparts
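The core idea of SSR can be sketched in a few lines of plain JavaScript (the template and data below are hypothetical; frameworks like Next.js or Nuxt automate this pattern):

```javascript
// Server-side rendering in miniature: the server merges dynamic data
// into the HTML template *before* responding, so crawlers receive the
// full content in one piece.
function renderProductPage(product) {
  return [
    '<html><head><title>' + product.name + '</title></head>',
    '<body><h1>' + product.name + '</h1>',
    '<p>' + product.description + '</p>',
    '</body></html>',
  ].join('\n');
}

// The server would send this string as the HTTP response body:
const html = renderProductPage({
  name: 'Example Widget',
  description: 'Rendered in full before it leaves the server.',
});
```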

Client-side Rendering

JAMstack websites that render client-side “assemble” the HTML structure and the dynamic content directly in the browser when the user (or bot) requests them.
This type of rendering isn’t as heavy on the server as the alternatives, but it’s far more likely to cause SEO issues.

For example:

  • High cumulative layout shift
  • Slow loading
  • Partial indexing
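For contrast, here is the same hypothetical product page rendered client-side: the server ships an almost empty shell, and the browser builds the markup after the page loads. (In a real app the data would come from an API call; here it’s passed in directly to keep the sketch self-contained.)

```javascript
// The HTML shell the server sends: no product content at all.
const shell = '<html><body><div id="root"></div></body></html>';

// What the browser's JavaScript does after the page loads: build the
// markup and inject it into the empty root element.
function renderClientSide(shellHtml, product) {
  const markup =
    '<h1>' + product.name + '</h1><p>' + product.description + '</p>';
  return shellHtml.replace(
    '<div id="root"></div>',
    '<div id="root">' + markup + '</div>'
  );
}
```

A crawler that indexes the shell before executing the script sees no content at all, which is exactly how partial indexing happens.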

Dynamic Rendering

Dynamic rendering combines the best of client-side and server-side rendering. It delivers a server-side version of the page to Googlebot and a client-side version to users. That way, you will:

  • Prevent partial indexing issues
  • Save on server resources

This solution has been promoted by Google, and it’s great for preventing crawling and indexing issues. However, it doesn’t solve potential page speed problems on the user’s side.

At this point, you may be wondering if this constitutes cloaking. As long as both versions show the same content, search engines do not consider it cloaking. In other words, the content that users and search engines see should be the same, only the rendering type should change.
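In practice, dynamic rendering boils down to routing responses by user agent. Here’s a simplified sketch (the bot patterns and response strings are illustrative only; real setups typically rely on a prerendering service or proxy):

```javascript
// A few common crawler user-agent patterns (illustrative, not exhaustive).
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /duckduckbot/i];

function isBot(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ''));
}

// Bots get the pre-rendered HTML; human visitors get the client-side shell.
function chooseResponse(userAgent, prerenderedHtml, clientShellHtml) {
  return isBot(userAgent) ? prerenderedHtml : clientShellHtml;
}
```

Since both responses carry the same content, this kind of routing is not considered cloaking.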

Google Search Console real time test.

In any case, you can check if your content’s been rendered correctly on Google Search Console. Here’s how:

  1. Enter the URL of the page you intend to inspect.
  2. Click the “Test Live URL” button at the top right of the page; after a few seconds, you’ll see a “Live Test” tab.
  3. Click “View Tested Page” in the tab to see how Google rendered it.
  4. Click “More Info” to find out if there have been crashes, errors, or timeouts.

Ensure That Google Is Indexing Your JavaScript Content

After determining if your website is rendering correctly, it’s time to see how it’s being indexed. You can do this through Google Search Console or through the search engine itself.

Checking how your content is being indexed on Google’s search engine.

Using the browser:

  1. Use the site: command on Google with the URL of the page you want to check. An example would be “site:exampledomain.com/page-URL”.
  2. If the page is found in Google’s index, it’ll appear in the search results.
  3. To determine whether a piece of JavaScript-generated content is indexed, use the site: command together with a snippet of the content you’re displaying dynamically. For example: site:exampledomain.com/page-URL “snippet of JS content”. That way, you can check whether the content has been indexed and is displayed correctly.

Using Google Search Console:

  1. Open the URL Inspection tab. You’ll find it in the sidebar, under the Performance label.
  2. Enter the URL you want to examine in the search bar and press Enter.
  3. If the page is indexed on Google, you’ll be able to view the source code of its indexed version by clicking View Crawled Page.
  4. Check whether your content appears in the source code.

Allow Crawling

Check that your robots.txt file doesn’t disallow any of the resources Googlebot needs. Adding the following to your robots.txt file will allow Googlebot to crawl all CSS and JavaScript files on your site:

User-agent: Googlebot
Allow: /*.js$
Allow: /*.css$

Prevent Duplicate Content

Oftentimes, a website displays the same content across different URLs. In Google’s eyes, this is duplicate content. There are many ways to handle it. In some cases, the best solution is to select the URL you want indexed (i.e., the one left visible to search engines) and point the duplicate versions to it with a canonical tag.

However, in some instances, you may want to delete one of the instances of duplicated content or set a 301 redirect.
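For example, a canonical tag is a single line in the head of each duplicate variant, pointing to the version you want indexed (the domain below is a placeholder):

```html
<!-- Placed in the <head> of every duplicate URL -->
<link rel="canonical" href="https://exampledomain.com/preferred-page/" />
```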

Don’t Lose Sight of Your On-Page SEO Basics

On-page SEO is extremely important. Implement on-page SEO best practices to make sure that users and search engines can easily digest your content. Always make sure to:

  • Set and optimize your pages’ title tags and meta descriptions
  • Implement image alt tags
  • Maintain your sitemaps
  • Create a valid robots.txt file

Keep an Eye on Speed Performance

Page speed is a key ranking factor. JavaScript, as you already know, is notorious for slowing down websites when it’s overused or left unoptimized.

Therefore, you should work hand-in-hand with your developers to:

  • Minimize JavaScript and incorporate only the most critical elements.
  • Use JavaScript moderately and, if running a JAMstack site, render it server-side.
  • Use Google’s PageSpeed Insights to monitor page speed and find new ways to optimize your website.
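One simple, widely supported way to keep non-critical JavaScript from blocking rendering is the defer attribute (the file path below is a placeholder):

```html
<!-- A deferred script downloads in parallel but executes only after
     the document has been parsed, so it doesn't block first paint -->
<script defer src="/js/non-critical.js"></script>
```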

Keep an Eye on Your Site’s Code with SEORadar

In this post, we examined how to implement SEO best practices on a JavaScript-based site. At this point, you may be excited to get to work on your site’s optimization. But can you guarantee that your platform will remain optimized over the next couple of months? That’s exactly what SEORadar can do for you.

We constantly monitor your code and let you know of any changes that may affect your SERP positioning. If we notice something, we’ll let you know through email, Slack, or SMS. If some potential changes worry you more than others, simply customize the priority of those change alerts. In short, we stay on top of your site’s code, so you don’t have to.

Curious? Request a free trial or schedule a demo today.
