7.2.18

SEO Considerations for Single Page Applications

The emergence of JavaScript frameworks such as AngularJS, React and Vue.js has rapidly increased the adoption of single page applications (SPAs). Facebook, Twitter and YouTube are all examples of websites which have adopted SPAs. The growing adoption of SPAs is driven by the fact that such frameworks have the potential to offer both a better user experience and greater development efficiency.

What is a Single Page Application?

First, let’s define exactly what an SPA is. An SPA is a web app that loads a single HTML page and dynamically updates that page as the user interacts with the app. This means all images, files, JavaScript and external resources are loaded when the user initially loads the app. In contrast, a traditional site loads each HTML page and all its respective resources individually as the user navigates to it.

Figure: Traditional Page Lifecycle vs. SPA (Source: Microsoft)

Once loaded, SPAs deliver pages to users almost instantaneously, promoting a highly engaging browsing experience. From a developer’s perspective, SPAs allow efficiencies to be achieved through a template-driven approach to development.

SEO Challenges

While SPAs present a wealth of benefits to both users and developers, such frameworks can often create a mild sense of panic among even seasoned SEOs. This is because SPAs present a host of challenges when attempting to implement traditional SEO techniques.

These challenges stem from the fact that JavaScript frameworks, i.e. SPAs, inject a layer of complexity into traditional website crawling and indexation. This complexity can easily be observed by viewing the source of an SPA, or more commonly by noting the lack thereof.

Traditionally, crawlers such as Googlebot discover a website’s pages, and a secondary process is then initiated by an indexer, which evaluates the discovered pages. For websites constructed purely in HTML this is a relatively simple process. For websites with a high reliance on JavaScript, i.e. where a high proportion of links are not part of the HTML source code, a crawler may initially only find a limited set of URLs. For deeper URLs to be discovered, the indexer must render the pages found during the initial crawl, extract all supplementary URLs, then pass these back to the crawler so that deeper content can be discovered.

As the complexity of a JavaScript website increases, crawling and indexing become increasingly slow and inefficient. This reduced efficiency can not only increase the time required for pages to appear within search results but also introduce inaccuracies into the evaluation of a site’s internal link graph. This could result in key pages failing to be rewarded with the authority they deserve, and lower-value pages appearing higher in the search results.

A further key consideration is that an SPA won’t be viewable to the user in full until all requisite JavaScript has been successfully downloaded to the browser. While an SPA is notably faster at loading content once the initial load is complete, for users on a slower connection the initial load can be significantly slower than on a traditional website.

Finally, while for the majority of websites Google will be by far the most dominant driver of organic traffic, it is still important to consider the other major search engines. Even from a purely UK perspective, Bing and Yahoo trail notably behind Google in terms of their JavaScript rendering capabilities.

SEO Best Practice

The advantages of SPAs to both users and developers are undeniable; to exploit these benefits, it is crucial that best practice SEO guidelines are followed. Google has previously stated that it can now successfully crawl JavaScript content as long as the requisite resources are not blocked. However, such announcements have also contained the following admissions:

  • “Sometimes things don’t go perfectly during rendering, which may negatively impact search results for your site.”
  • “It’s always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn’t have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can’t execute JavaScript yet.”
  • “Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.”

To maximise the SEO opportunities of SPAs it is recommended the following optimisation practices are implemented:

Utilisation of Server-side Rendering

If an SPA features server-side rendering, crawling and indexation issues can effectively be resolved before they arise.

  • Angular 1.0 – requires third-party tools or custom in-house scripts to enable server-side rendering
  • Angular 2.0 – compatible with Angular Universal, a bespoke rendering service for Angular apps
  • React – utilises renderToString, allowing HTML strings to be returned directly from the server
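To illustrate the concept, the following is a minimal, framework-agnostic sketch of what server-side rendering achieves: the server returns fully formed HTML, so a crawler sees the content and links without executing any JavaScript. The function, page and product names here are hypothetical, not part of any specific framework’s API.

```javascript
// Minimal framework-agnostic sketch of the server-side rendering idea:
// the server responds with complete HTML so content and links are
// visible to crawlers on the initial request. All names are hypothetical.
function renderProductPage(product) {
  return [
    '<!DOCTYPE html>',
    '<html lang="en">',
    `<head><title>${product.name} | Example Store</title></head>`,
    '<body>',
    `<h1>${product.name}</h1>`,
    `<p>${product.description}</p>`,
    // The link exists in the initial HTML, so a crawler can follow it
    // without rendering any JavaScript.
    '<a href="/products">Back to all products</a>',
    '</body>',
    '</html>',
  ].join('\n');
}

const html = renderProductPage({
  name: 'Blue Widget',
  description: 'A widget, in blue.',
});
```

A framework such as React performs the same job via renderToString, with the server then hydrating the page into a live SPA in the browser.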

Search Engine Friendly URLs

A unique URL should be served for every page, returning a 200 status. URLs should be flat, clean and free of fragments (#). Following the deprecation of Google’s AJAX crawling scheme, hashbangs (#!) are no longer recommended.
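In the browser, SPAs typically achieve clean URLs via the History API rather than fragments. As an illustration of the migration itself, a hypothetical helper for rewriting legacy hashbang URLs to their clean equivalents (e.g. when issuing redirects) might look like this:

```javascript
// Hypothetical helper: rewrites a legacy hashbang URL to the clean,
// crawlable equivalent. Illustrative only.
function toCleanUrl(url) {
  // "https://example.com/#!/products/42" -> "https://example.com/products/42"
  return url.replace('/#!/', '/');
}
```

URLs without a hashbang pass through unchanged, so the same helper can be applied safely across a mixed set of inbound URLs.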

Meta Data

It is recommended that all key meta data is rendered directly within the source code of the page, in addition to any server-side rendering functionality. Key meta data should include the following tags (where applicable):

  • Page Title
  • Meta description
  • Canonical tag
  • Meta robots
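For example, with illustrative URLs, these tags might appear in the rendered source of a page as:

```html
<head>
  <title>Blue Widgets | Example Store</title>
  <meta name="description" content="Browse our full range of blue widgets.">
  <link rel="canonical" href="https://www.example.com/products/blue-widgets">
  <meta name="robots" content="index, follow">
</head>
```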

Load Relevant Content Immediately

When rendering a page, search engines struggle to interact with buttons that trigger the loading of further data or content. As a result, it is key that all high-value content requiring crawling and indexation is rendered in full on page load.

Internal Links

To promote the efficient crawling of all website content, it is recommended that internal links are embedded using standard <a> tags as opposed to JavaScript onclick events.

In addition to ensuring all links feature <a> tags, it is strongly recommended that core navigational elements, such as the primary navigation, are output directly within the source code.
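To illustrate the difference (URLs and router names hypothetical):

```html
<!-- Crawlable: a standard anchor with an href a crawler can follow -->
<a href="/products/blue-widgets">Blue widgets</a>

<!-- Not reliably crawlable: navigation driven purely by an onclick event -->
<span onclick="router.navigate('/products/blue-widgets')">Blue widgets</span>
```

SPA routers can intercept clicks on real <a> tags to handle navigation client-side, so using proper anchors does not sacrifice the SPA experience.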

Error Pages

The correct header status should be returned in response to encountered errors. For example, a 404 status should be returned if a user attempts to access a missing page; similarly, a 500 status should be returned for a server error.
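The logic can be sketched as a simple mapping from the outcome of a page lookup to a status code; the shape of the result object here is an assumption for illustration:

```javascript
// Hypothetical mapping from the outcome of a page lookup to the
// correct HTTP status code to return.
function statusFor(result) {
  if (result.error) return 500; // server error
  if (!result.page) return 404; // missing page
  return 200;                   // page found and served
}
```

The key point is that the server itself must return these statuses; an SPA that serves a 200 alongside a client-rendered "not found" message (a soft 404) can cause missing pages to be indexed.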

XML Sitemap

Given the inherent complications with crawling and indexation of SPAs, it is highly recommended that an XML sitemap is published to provide supplementary access to all deeper-level website content.
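Generating a basic sitemap is straightforward; the following sketch builds one from a list of page URLs (the URLs themselves are illustrative):

```javascript
// Sketch: build a minimal XML sitemap from a list of URLs, following
// the sitemaps.org protocol's <urlset>/<url>/<loc> structure.
function buildSitemap(urls) {
  const entries = urls
    .map((url) => `  <url><loc>${url}</loc></url>`)
    .join('\n');
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    '</urlset>',
  ].join('\n');
}

const xml = buildSitemap([
  'https://www.example.com/',
  'https://www.example.com/products/blue-widgets',
]);
```

For an SPA, this would typically be generated server-side from the route definitions or a database of pages, so the sitemap stays in step with the content the crawler may otherwise struggle to discover.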

Implementing the above best practice guidelines will help ensure the significant benefits of SPAs to both users and developers can be realised, with the reassurance that potential SEO constraints will not arise.

Chris Mann

Chris Mann is our Technical SEO Manager and heads up our growing team of SEO analysts. With over 10 years’ professional experience, Chris delivers expert SEO knowledge with a strong technical focus.
