Much is made of website usability (UI or UX) as a ranking factor. There are tons of opinions online about how vital it is, and even claims that UX is the new SEO. Whatever that means.
Although experts can talk endlessly about the finer points of UX design, what matters most is that visitors find your site easy to use. Visitors won't be happy to find themselves lost in a maze of pages. They don't want poor content or spammy design, and they don't want to wade through ten pages of personal-information forms before they can check out. If you force that kind of experience on them, they will leave.
Google's view of website usability is less rigid than some fearmongers would have you believe. Google can observe how users navigate to and from websites: for example, do they click on your site and then return to the search results two seconds later? Google can also analyze websites with machine learning algorithms, and it can examine a site's link patterns and authority to judge its quality and usability. No one links to a bad website.
We've seen how Google looks at links and measures searcher behaviour, so let's now look at machine learning. In particular, Google's algorithm uses machine learning to determine the quality of your website. First, a little SEO history.
Google's Panda algorithm, introduced in 2011, was a seismic change in the spam world. Google announced it as follows:
This update will reduce the rankings of low-quality websites – sites that have low value add for users, copy content from other sites, or are just not very useful. It will also improve rankings for high-quality websites – sites with original content and information like research, in-depth reporting, thoughtful analysis, and so forth.
This announcement contains clues, such as the vague phrases "just not very useful" and "and so forth". How does an algorithm determine which sites are "just not very useful"? It doesn't. It couldn't.
To build one that could, Google hired human testers and asked them to answer 23 questions about websites shown on their screens. Questions included "Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?" and "Would you be comfortable giving your credit card information to this site?". A machine learning algorithm was then trained on the testers' answers, learning to distinguish good websites from bad ones. When the Panda algorithm was unleashed into the wild, it caused a lot of chaos: the quality of Google's search results improved dramatically, and businesses that relied on low-quality content to rank were all but destroyed overnight.
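To make the idea concrete, here is a minimal sketch of what "learning quality from human raters" looks like in principle. This is purely hypothetical and is not Google's actual system: the signals, numbers, and model (a tiny logistic regression trained with gradient descent, standard library only) are all invented for illustration.

```python
import math

# Each example: (signals, human label). The signals are invented stand-ins
# for the kinds of features a quality model might use:
# [original_content_ratio, ad_density, spelling_error_rate].
# Labels come from human raters: 1 = "good site", 0 = "bad site".
examples = [
    ([0.9, 0.1, 0.01], 1),
    ([0.8, 0.2, 0.02], 1),
    ([0.2, 0.7, 0.15], 0),
    ([0.1, 0.9, 0.20], 0),
]

def predict(weights, bias, signals):
    # Sigmoid of a weighted sum -> probability that the site is "good".
    z = bias + sum(w * x for w, x in zip(weights, signals))
    return 1 / (1 + math.exp(-z))

def train(examples, steps=5000, lr=0.5):
    # Plain stochastic gradient descent on the logistic loss.
    weights, bias = [0.0] * len(examples[0][0]), 0.0
    for _ in range(steps):
        for signals, label in examples:
            error = predict(weights, bias, signals) - label
            bias -= lr * error
            weights = [w - lr * error * x for w, x in zip(weights, signals)]
    return weights, bias

weights, bias = train(examples)

# An unseen site with mostly original content and few ads should score
# near 1; a thin, ad-heavy site should score near 0.
print(predict(weights, bias, [0.85, 0.15, 0.01]))
print(predict(weights, bias, [0.15, 0.80, 0.18]))
```

The point is not the model itself but the workflow: humans supply the judgments, the algorithm learns which measurable signals predict those judgments, and it can then apply that judgment at the scale of the whole web.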
You might be wondering exactly what Google looks for when it analyzes how good your website is. The honest answer is that probably no single person, even at Google, can say definitively. Which raises the next question: if nobody knows, how do we make something Google likes?
Remember that the algorithm's goal is to approximate human perception of quality as measured by the 23 questions. So if we build a website that those human testers would answer favourably about, the algorithm will probably like it too.
Below are the questions Google asked its human testers to assess website quality. Use them to gauge how your own website would fare under the same scrutiny, and ask yourself what you could do to improve it. Make those changes and Google's quality algorithms should perceive your site more favourably, and your rankings should rise with it.
1. Would you trust the information presented in this article?
2. Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
3. Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
4. Would you be comfortable giving your credit card information to this site?
5. Does this article have spelling, stylistic, or factual errors?
6. Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
7. Does the article provide original content or information, original reporting, original research, or original analysis?
8. Does the page provide substantial value when compared to other pages in search results?
9. How much quality control is done on content?
10. Does the article describe both sides of a story?
11. Is the site a recognized authority on its topic?
12. Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
13. Was the article edited well, or does it appear sloppy or hastily produced?
14. For a health related query, would you trust information from this site?
15. Would you recognize this site as an authoritative source when mentioned by name?
16. Does this article provide a complete or comprehensive description of the topic?
17. Does this article contain insightful analysis or interesting information that is beyond obvious?
18. Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
19. Does this article have an excessive amount of ads that distract from or interfere with the main content?
20. Would you expect to see this article in a printed magazine, encyclopedia or book?
21. Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
22. Are the pages produced with great care and attention to detail vs. less attention to detail?
23. Would users complain when they see pages from this site?
Some of these questions are more objective than others. "Does this article have spelling, stylistic, or factual errors?" can be answered fairly mechanically. A subjective question like "Would you trust the information presented in this article?" is harder to interpret, and answering it algorithmically likely involves more machine learning.
If you are unsure how your website stacks up against these questions, or you know it is definitely lacking in at least one area, take a few minutes to look at the websites you spend most of your time on each week. How do they compare?
This will help you calibrate your own machine learning algorithm (i.e., your brain) to spot the key signals of quality content. Once calibrated, revisit your own site; you may be shocked by what you discover.
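If you want to make the comparison systematic, the self-audit described above can be sketched as a simple scoring script. This is a hypothetical helper, not a Google tool; the shortened question list and yes/no scoring are simplifications (a few of the 23 questions are negatively phrased, so in practice their answers would need inverting).

```python
# Score a site against a checklist of quality questions, answered yes/no,
# and compare your site with a site you trust. Only a few positively
# phrased questions are included here; fill in the rest from the list above.
QUESTIONS = [
    "Would you trust the information presented in this article?",
    "Is the article written by an expert who knows the topic well?",
    "Would you be comfortable giving this site your credit card information?",
]

def audit(answers):
    """answers: dict mapping question -> True (yes) / False (no).
    Returns the share of questions answered favourably."""
    favourable = sum(1 for q in QUESTIONS if answers.get(q))
    return favourable / len(QUESTIONS)

# Example: your site vs. a benchmark site you already trust.
my_site = {QUESTIONS[0]: True, QUESTIONS[1]: False, QUESTIONS[2]: True}
trusted_site = {q: True for q in QUESTIONS}
print(f"my site: {audit(my_site):.0%}, benchmark: {audit(trusted_site):.0%}")
```

The gap between the two scores tells you where to focus: every "no" on your own audit is a concrete item to fix before worrying about anything more exotic.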