SEO: Optimizing Single Page Sites For Search Engines


Search engines such as Google index websites without executing JavaScript. As a result, single-page sites are at a disadvantage compared to their traditional counterparts, especially when they rely heavily on JavaScript to render content.

In today’s competitive business environment, not appearing on Google can be fatal for any business.

Many frustrated site owners are tempted to abandon single-page sites altogether. However, all is not lost: Google and other search engines have recognized this challenge and created a method by which single-page sites can have their dynamic pages indexed and optimized for crawlers.

The following are ways to optimize your single-page site for better Melbourne, FL SEO results.

How Google Crawls a Single-Page Site

Typically, when Google indexes a traditional website, Googlebot – Google’s web crawler – first scans and indexes the content at the top-level URL of the domain, then follows the links on that page and does the same for each of them. It then follows the links on those subsequent pages and repeats the process until it has indexed all the content on the site, including any associated domains.
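The crawl-and-follow process described above is essentially a breadth-first traversal of the site’s link graph. Here is a toy sketch; the site map is a hypothetical static dictionary standing in for real link extraction:

```python
# Toy sketch of the crawl-and-follow process: start at the top-level URL,
# "index" it, then follow every newly discovered link until the whole site
# has been visited. Link extraction is stubbed with a static map.
from collections import deque

# Hypothetical site graph: page -> links found on that page.
SITE = {
    "/": ["/about", "/products"],
    "/about": ["/"],
    "/products": ["/products/widget"],
    "/products/widget": [],
}

def crawl(start: str) -> list[str]:
    """Breadth-first crawl; returns pages in the order they were indexed."""
    seen, queue, indexed = {start}, deque([start]), []
    while queue:
        page = queue.popleft()
        indexed.append(page)          # "index" this page's content
        for link in SITE.get(page, []):
            if link not in seen:      # follow each new link exactly once
                seen.add(link)
                queue.append(link)
    return indexed

print(crawl("/"))  # ['/', '/about', '/products', '/products/widget']
```

On a single-page site, this traversal stalls immediately: the start page contains no plain HTML links for the crawler to follow.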

However, when the same Googlebot tries to index a single-page website, all it can see in the HTML is an empty container with empty body and div tags, so it moves on because there is nothing to crawl. The good news for single-page sites is that Google has come up with a mechanism that enables developers to provide search information to the crawler, in a way that can be even more beneficial than a traditional website.

What Makes a Single-Page Site Crawlable

The most important step in making a single-page site crawlable is using a server that can tell whether a request comes from a crawler or from a person using a web browser, and respond accordingly. Such a server has clear instructions to respond normally to a person’s request through a web browser, but to return a page optimized for crawlers, in a form the crawler can read, when the request comes from a crawler.

A basic crawler-optimized page includes the logo you would like to appear on search engine results pages (SERPs), brief SEO-optimized text explaining what the site does, and plain HTML links to only those pages that Google should index. This page, however, should not contain anything meant to be hidden from the crawler, such as legal disclaimer pages or other private pages that web users should not find through Google search.
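A snapshot page like this can be generated server-side from a handful of inputs. The helper below is hypothetical, and its markup is only illustrative of the three elements listed above (logo, description, indexable links):

```python
# Hypothetical helper that assembles the crawler-optimized snapshot described
# above: a logo, a short SEO description, and plain HTML links to only the
# pages that should be indexed. Names and markup are illustrative.
from html import escape

def render_crawler_page(title: str, description: str, links: dict[str, str]) -> str:
    """Build a minimal static HTML page for search-engine crawlers."""
    anchors = "\n".join(
        f'    <a href="{escape(url)}">{escape(text)}</a>'
        for text, url in links.items()
    )
    return f"""<!DOCTYPE html>
<html>
<head><title>{escape(title)}</title></head>
<body>
  <img src="/logo.png" alt="{escape(title)}">
  <p>{escape(description)}</p>
  <nav>
{anchors}
  </nav>
</body>
</html>"""
```

Only the pages passed in `links` are exposed to the crawler; private pages are simply left out of the dictionary.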

Customizing Content for Google’s Web Crawler

Single-page sites use a hash bang (#!) in their URLs to link to different content. People and crawlers, however, do not follow these links the same way: a browser never sends the fragment to the server, while Googlebot rewrites the hash bang into a query parameter the server can see. The server is programmed to respond to crawler requests because it knows exactly the pattern the crawler follows. As part of Google’s webmaster tools, Googlebot can announce itself to the server as a crawler. You can request SEO services from Tight Line Productions to optimize a single-page website and generate traffic to your site, and you can take advantage of Google’s tools that allow developers to customize their site for both users and Google.
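The rewriting pattern the crawler follows can be sketched as a simple URL transformation. This reflects Google’s (since-deprecated) AJAX-crawling scheme, in which `#!page` is requested from the server as `?_escaped_fragment_=page`; the function name is illustrative:

```python
# Sketch of the URL rewriting performed under Google's AJAX-crawling scheme:
# a #! URL is requested from the server as an _escaped_fragment_ query
# parameter, which the server can actually see and respond to.
from urllib.parse import quote

def hashbang_to_crawler_url(url: str) -> str:
    """Rewrite https://site/#!page into https://site/?_escaped_fragment_=page."""
    if "#!" not in url:
        return url  # no hash bang: nothing to rewrite
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"  # append to any existing query string
    return f"{base}{sep}_escaped_fragment_={quote(fragment)}"

print(hashbang_to_crawler_url("https://example.com/#!about"))
# https://example.com/?_escaped_fragment_=about
```

The server then maps each `_escaped_fragment_` value to the corresponding crawler-optimized snapshot, mirroring what a browser user would see at the `#!` URL.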
