Common load speed improvements.

Hosting your site geographically close to your customers, ideally in the same city, reduces the distance data must travel and improves load speeds.

You can also use a CDN (content delivery network) to host copies of your website on servers around the globe, so visitors get fast load times no matter where they are located. Cloudflare, MaxCDN, and Amazon CloudFront are all popular CDN services.

– Enable load-speed technologies such as compression, minification, and HTTP/2. Many platforms provide plugins for this; on WordPress, the popular W3 Total Cache plugin offers most of these features.

– Locate large files on your website and shrink them. Adobe Photoshop, for example, can compress a 3MB image down to around 250KB with no visible loss of quality. This is a great fix for image-heavy websites.
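To see why the compression mentioned above pays off, here is a minimal Python sketch, with an invented HTML snippet standing in for a real page, that measures how much gzip shrinks a text asset:

```python
import gzip

# A stand-in for a typical HTML page (invented, repetitive content).
html = b"<html><body>" + b"<p>Welcome to our store.</p>" * 300 + b"</body></html>"

compressed = gzip.compress(html)

print(f"Original: {len(html)} bytes")
print(f"Gzipped:  {len(compressed)} bytes "
      f"({len(compressed) / len(html):.0%} of original)")
```

Real pages compress less dramatically than repeated markup, but large savings on HTML, CSS, and JavaScript are common, which is why most servers enable gzip (or Brotli) for text assets.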

These are just a few of the many ways to improve a site’s loading speed. There are also plenty of tools that can help you identify speed bottlenecks and potential improvements.

Load speed analysis tools

1. Google PageSpeed Insights

Google’s free PageSpeed Insights tool calculates a load score for your page on a 100-point scale, showing how quickly your site loads compared to other sites, on both desktop and mobile. Scores close to 100 are near perfect.

The tool runs a site test and gives you a list of high-, medium-, and low-priority areas to improve. You can send these to your developer to speed up your site, or, if you’re a tech-head, try to fix them yourself.

2. Google – Test My Site

In June 2017, Google added benchmarking reports to its mobile load speed test tool, Test My Site. The tool is easy to use, invaluable for finding simple-win load speed improvements for mobile users, and lets you compare your website against other websites.

This tool can be a shock at first. Many site owners discover they are losing 30%-50% of their traffic to slow loading times on mobile 3G connections. Not a good outlook.

The tool offers free reports and recommendations you can use to increase your load speed, with a strong emphasis on mobile users. Because load speed is a key ranking factor in search results, following these recommendations can help you rank above your competitors.

3. Pingdom Tools – Website Speed Test

The Pingdom Tools Website Speed Test is the best load speed tool of the bunch. It provides detailed information about the files and resources slowing down your site, including file sizes, load times, and more. Because it is more detailed than the other tools, it is best suited to web developers or people with at least basic experience building websites.

After the test is complete, scroll down to see the list of files every visitor must download when they visit your website. Large images are the easiest load speed win: images over 200KB can usually be shrunk in Photoshop while maintaining quality. Make a list of the large files and send it to your web developer.
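If you keep a local copy of your site’s files, a short script can build that list for you. The Python sketch below is an illustration: the 200KB threshold matches the advice above, and the folder you pass in is whatever local directory holds your site.

```python
import os

# Files larger than this (200 KB) are worth compressing or resizing.
THRESHOLD = 200 * 1024

def find_large_files(root, threshold=THRESHOLD):
    """Return (path, size_in_bytes) pairs for files above `threshold`, largest first."""
    large = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.isfile(path) and os.path.getsize(path) > threshold:
                large.append((path, os.path.getsize(path)))
    return sorted(large, key=lambda pair: pair[1], reverse=True)
```

Running `find_large_files("public_html")` (a hypothetical folder name) prints nothing by itself; it returns the oversized files sorted largest-first, ready to paste into an email to your developer.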

4. Lighthouse – Tools for Web Developers – Google

Google built Lighthouse for advanced developers working on complex projects and sites. It runs as a Chrome extension (and as a Node module) and reports on website performance, accessibility, adherence to programming best practices, SEO, and other areas, along with actionable steps for improving each one. Basic websites won’t receive many additional insights, but its recommendations can be used in conjunction with the other tools. Programming ninjas looking to sharpen their performance skills will find Google Lighthouse the Swiss Army knife of site performance analysis.
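Lighthouse can also be run from the command line via its Node package; the commands below are a sketch, with a placeholder URL:

```shell
# Install the Lighthouse CLI globally (requires Node.js).
npm install -g lighthouse

# Audit a page and write an HTML report to ./report.html.
lighthouse https://www.example.com/ --output html --output-path ./report.html
```

The generated report contains the same performance, accessibility, and SEO audits as the DevTools version, in a form you can archive or share.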

The usual suspects–sitemap.xml and robots.txt.


Sitemap.xml is a file that search engines look for on every site. It is essential for helping search engines find the pages on your website: a sitemap is basically a map of all the pages on your site. Creating this file and getting it onto your website is easy.
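For reference, a minimal sitemap.xml looks like the sketch below; the URLs are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/socket-wrenches/</loc>
  </url>
</urlset>
```

Each url entry simply lists a page’s address; optional child elements such as lastmod can also tell search engines when a page last changed.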

Many CMS platforms generate a sitemap file automatically; WordPress, Magento, and Shopify all do. Otherwise, you may need to install a plugin, or use the free XML Sitemaps Generator tool, which creates the sitemap.xml file for you.

XML Sitemaps Generator

The next step is to ask your web designer or web developer to upload the file to your site’s main directory, or, if you have FTP access, do it yourself. The file should be publicly available at an address such as https://www.yoursite.com/sitemap.xml.

After you’ve done that, submit your sitemap to Google Search Console.

The following Google article provides simple instructions for web developers and web designers setting up a Google Search Console account:

Search Console Help – Add website property

Log in to your account, click on your site, open the “Sitemaps” section under “Site Configuration,” and submit your sitemap.


A robots.txt is another important file that every site should have. It lives in the same location as your sitemap.xml file, at an address like https://www.yoursite.com/robots.txt.

Robots.txt is a file that lets you tell Google which areas of your site you don’t want to appear in search engine results.

While having a robots.txt file brings no ranking benefit by itself, it is important to make sure you don’t have a robots.txt blocking areas of your site that search engines should be able to find.

Robots.txt is a simple text file. A basic one looks like this:

# robots.txt good example

User-agent: *
Disallow:

The following example shows how a robots.txt can tell search engines not to crawl your site at all. You don’t want your entire website blocked, so to be safe, it is always a good idea to double-check that your site is not set up this way.

# robots.txt: example of blocking the entire site

User-agent: *
Disallow: /

This example tells search engines that their crawlers should not visit the home directory, or anything inside it.

To create your robots.txt, make a plain text file with Notepad on Windows or TextEdit on macOS, using the good example above as a template for how the file should look. List every directory you don’t want search engines to view, such as admin areas, the CMS back-end, and internal folders.
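For instance, a robots.txt blocking a few typical private areas might look like this; the directory names are placeholders, so swap in your site’s actual paths:

```
User-agent: *
Disallow: /admin/
Disallow: /cms/
Disallow: /internal/
```

Each Disallow line covers the named directory and everything beneath it.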

If there aren’t any areas you want to block, you can even delete your robots.txt file entirely. If you do keep one, make sure it is valid.

Duplicate content–the canonical tag and other fun.

Later chapters will discuss how Google’s Panda update penalizes websites with duplicate content. Unfortunately, many content management systems can create multiple versions of the same page.

Take, for example, a page about socket wrenches on your website. Because of the system you use, this one page might be accessible from several URLs in different parts of your site, for example /socket-wrenches/ and /tools/socket-wrenches?id=42.

This confuses search engines, and the multiple pages are treated as duplicate content.

This is why you need to ensure that a special tag, known as the canonical tag, is placed on each page of your website.

The canonical tag helps search engines identify the original version of a page. By adding the tag to a page, you tell Google which URL you believe is the “true” version, giving you control over which pages appear in search results.

Choose the URL that is easiest for users to understand, the one that looks most like plain English.

If you use the tag below, Google will treat that URL as the definitive version of the page (the address here is a placeholder for your preferred URL):

<link rel="canonical" href="https://www.example.com/socket-wrenches/" />

Include this tag on every page of your site, just before the closing </head> tag.