What is server side rendering?
Server side rendering is when each page is requested from the server on which a site is hosted, and the required HTML is rendered and returned to the client.
What is client side rendering?
Client side rendering is when the HTML is rendered directly in the browser using JavaScript, rather than a request being made to the server for each page.
What is dynamic rendering?
Dynamic rendering is when an intermediate party sits between a visitor and your server to detect if requests are from a crawler (such as Googlebot or Bingbot) or from a user. If the request is from a bot it will serve a fully rendered DOM from a cache. If it detects a human visitor, it will serve the normal SPA.
This is a great stepping stone if full server side rendering cannot be achieved straight away; it can be implemented with services such as prerender.io, Puppeteer, or Rendertron.
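The routing logic behind dynamic rendering can be sketched in a few lines. This is a minimal, illustrative sketch only (in practice a service like prerender.io or Rendertron handles this), and the bot user-agent list and function names here are assumptions, not a definitive implementation:

```typescript
// Common crawler user-agent substrings (illustrative, non-exhaustive list).
const BOT_PATTERNS = ["googlebot", "bingbot", "yandex", "duckduckbot", "baiduspider"];

// Detect whether a request's User-Agent string looks like a known crawler.
function isBot(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return BOT_PATTERNS.some((pattern) => ua.includes(pattern));
}

// Decide which version of the page to serve for a given request:
// bots get the fully rendered DOM from a cache; humans get the normal SPA.
function chooseResponse(userAgent: string): "prerendered-dom" | "spa-shell" {
  return isBot(userAgent) ? "prerendered-dom" : "spa-shell";
}

console.log(chooseResponse("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"));
// "prerendered-dom"
console.log(chooseResponse("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"));
// "spa-shell"
```

Real services also verify crawlers by reverse DNS rather than trusting the User-Agent string alone, since the header is trivially spoofed.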
Is dynamic rendering cloaking?
Despite dynamic rendering logically seeming like it would fit the bill for cloaking, it is not considered cloaking by Google provided you are serving both Google and the user similar content. If you use dynamic rendering to serve Googlebot completely different content to users, this can still be considered cloaking.
How can you check if your website is server side rendered?
Both server side and client side rendered websites change the URL in the browser, so to check if it’s server side rendered you need to check the header response. The easiest way to do this is to use a plugin such as Redirect Path, or Link Redirect Trace.
When you first visit your site, you will see the URL return a header response in the browser.
If your site is client side rendered, however, when you navigate to a different page you will see a different URL in the browser but not in the header response, which will still show the first URL you visited.
This is because the page is loaded on the client side, so another request to the server is not made and the header response does not change.
To check how accessible your JavaScript site is to search engines:
- Turn JS off in your browser (there are loads of plugins available to do this easily) to see which areas of your site/links can and can’t be loaded.
- Crawl the site using Screaming Frog to see what is and isn’t crawled. Initially you should use the default Text Only crawling. You may also want to crawl with Googlebot as the User Agent, as this may change how your website responds if you’re using something such as pre-rendering.
- Ensure every page has an individual URL by utilising the History API.
- Ensure your site is server side rendered or pre-rendered so search engine crawlers can access all URLs.
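The History API point above can be sketched as follows. This is a hedged, browser-oriented example (the browser-only `history.pushState` call is guarded so the helper also runs elsewhere), and `pathForProduct`/`navigateToProduct` are hypothetical names for illustration:

```typescript
// Build a crawlable, unique path for a product page.
function pathForProduct(category: string, slug: string): string {
  return `/${encodeURIComponent(category)}/${encodeURIComponent(slug)}`;
}

// When the SPA swaps views, push a real URL so each page is individually
// addressable and linkable, rather than every view sharing the initial URL.
function navigateToProduct(category: string, slug: string): string {
  const path = pathForProduct(category, slug);
  if (typeof history !== "undefined" && typeof history.pushState === "function") {
    history.pushState({ category, slug }, "", path); // updates the address bar
  }
  return path;
}

console.log(navigateToProduct("shoes", "red trainers"));
// "/shoes/red%20trainers"
```

A full setup would also listen for the `popstate` event so the back button restores the matching view.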
The aim is to make your website as easy as possible for search engines to find and crawl. The less effort search engines have to put in to find and see your content, the more likely it is to rank well.
Just because search engines can render JavaScript (or say they can), it doesn’t mean they will, or that you should rely on it.
Snipcart have a fantastic guide on how to optimise SEO for SPAs including Angular.
How does Google deal with infinite scroll?
Infinite scroll requires a user to interact with a website to load more content; the products loaded initially in the rendered DOM do not include those loaded later via interaction. Therefore, if using infinite scroll on your site, it’s best to also include pagination in your HTML.
This means that Google can still access all products that are related to a particular category. For more information about pagination and SEO, check out our blog post on pagination for e-commerce sites.
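One way to provide that pagination is to render plain `<a href>` links alongside the infinite scroll, so crawlers can reach every page of a category without user interaction. A minimal sketch, assuming a hypothetical `/category?page=N` URL structure:

```typescript
// Generate plain anchor links for each page of a category, so the paginated
// URLs exist in the HTML even if users only ever see the infinite scroll.
function paginationLinks(categoryPath: string, totalPages: number): string[] {
  const links: string[] = [];
  for (let page = 1; page <= totalPages; page++) {
    links.push(`<a href="${categoryPath}?page=${page}">Page ${page}</a>`);
  }
  return links;
}

console.log(paginationLinks("/shoes", 3).join("\n"));
// <a href="/shoes?page=1">Page 1</a>
// <a href="/shoes?page=2">Page 2</a>
// <a href="/shoes?page=3">Page 3</a>
```

The infinite-scroll script can then enhance these links on the client while crawlers follow them as normal paginated pages.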
What is a SPA?
A SPA is a Single Page Application (unfortunately not a sauna and steam room); it is a website that loads all of its resources on the initial page load, so it doesn’t have to make additional requests to the server when a user interacts with the site.
This makes for a smooth, seamless experience; however, it can have implications for SEO because, depending on the setup, it can make it difficult for search engine crawlers to access the site.
Google first crawls the raw HTML of a website using Googlebot, and then renders the content in a second wave of indexing using its Web Rendering Service.
What is the DOM?
Dan is the Delivery Director at NOVOS. A former neuroscientist, Dan entered the world of SEO by working with one of the top SEO agencies in the country and has architectured several award-winning SEO campaigns since then. Dan joined NOVOS as an SEO manager and his vast knowledge of tech SEO has helped NOVOS execute even the most complicated eCommerce SEO plans. He is phenomenal at solving client problems and knowing exactly the high standards we want to deliver which is why he is now our Delivery Director.
We're eCommerce specialists for a reason, get in touch with us today and find out more.