
Guide to Great SEO: A Checklist for Digital Marketing Managers

Search Engine Optimization (SEO) is continuously evolving, which makes it challenging to keep track of what works and what needs to be done to succeed in organic search.

The checklist below covers the SEO basics marketers need to know, along with key questions to ask yourself, your agency, or your internal marketing and IT teams to ensure your organization is getting the most from its SEO efforts.

1. Getting Access to Search Analytics Data

Arguably, the first step to take after registering your domain name, before your site even launches or re-launches, is to set up Google Search Console (GSC) and Bing Webmaster Tools (BWT).

Why?

GSC and BWT each provide a suite of tools that allow website owners and administrators to:

  • Submit and manage their sites
  • Remove domains, subfolders, and specific URLs from search
  • Monitor and be alerted about issues with websites such as penalties (a.k.a. manual actions)
  • Get Search Analytics (a.k.a. organic performance) data, such as clicks, impressions, and average position for your site and keywords
  • And much more…

Best of all, these tools are provided free of cost!
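Both services require you to verify that you own the site. One common method (among several, including DNS records and file uploads) is to place a verification meta tag in the <head> of your home page. A minimal sketch, with placeholder tokens:

  <!-- Google Search Console verification (placeholder token) -->
  <meta name="google-site-verification" content="YOUR_GSC_TOKEN" />
  <!-- Bing Webmaster Tools verification (placeholder token) -->
  <meta name="msvalidate.01" content="YOUR_BWT_TOKEN" />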

Google and Bing dominate search engine usage in North America, Western Europe, and Australia. Other search engines have significant market share in other markets. These include Baidu (China), Yandex (Russia and Eastern Europe), and Naver (South Korea). If your website operates in multiple regions or languages, ensure your site is registered in the equivalent search console or webmaster tools service for the dominant search engine for those markets.

2. Website Analytics

We would be remiss if we did not mention setting up your website analytics platform. While the free version of Google Analytics is the most common website analytics platform, enterprise platforms such as Adobe Analytics and Google Analytics 360 offer more robust solutions for collecting data about website visitors.

Why?

Web analytics software can help businesses understand how users arrive at their website, how they engage with the content, and what actions they take. It's critical for diagnosing issues such as cart abandonment on e-commerce sites.

For SEO, analytics helps content writers understand which content is converting visitors into customers and highlights abnormal bounce and exit rates that can suggest poor content, user experience, and other issues. Most importantly, it gives you a chance to fix issues that wouldn’t be discovered without the data. You can connect your Google Search Console to Google Analytics and gain additional insights about how your search visitors interact with your content.
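For the free version of Google Analytics, setup amounts to adding a small JavaScript snippet to every page of your site. A minimal sketch using the standard gtag.js tag, where G-XXXXXXXXXX is a placeholder for your own measurement ID:

  <!-- Google Analytics global site tag (gtag.js); replace the placeholder ID -->
  <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
  <script>
    window.dataLayer = window.dataLayer || [];
    function gtag(){dataLayer.push(arguments);}
    gtag('js', new Date());
    gtag('config', 'G-XXXXXXXXXX');
  </script>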

3. Technical SEO Issues

With access to search analytics and web analytics data, we can start identifying and fixing SEO issues, as well as uncovering growth opportunities.

Generally, we recommend starting with a technical review of the site. This analysis focuses on crawlability and highlights any potential indexation issues. Crawlability refers to how easily web crawlers, the programs that find content by following links from one URL to another, can access your website. If search engines cannot crawl and index your content, it won't appear in any search results.

3.1 – Does your website have crawl errors?

Google Search Console provides a Coverage Report, which details the errors and warnings Google has found while crawling your website. Some errors are critical issues, for example:

  • Submitted URL not found (404): These URLs are likely included in the sitemap but return a 404 (not found) error when visited.
  • Submitted URL has a crawl issue: This non-specific error message can result from many different problems; for example, a redirect loop can be listed as a "crawl issue". These errors require further investigation to determine their cause.

3.2 – Does your website have crawl warnings?

Warnings in the Coverage Report are for URLs that Google may or may not index due to issues related to conflicting content, HTML tags, or code on the site.

For example, "Submitted URL blocked by robots.txt" flags URLs that are included in a sitemap but that Google is not allowed to crawl. Note that these URLs may still be indexed and visible in search results.

3.3 – Are you blocking URLs that shouldn't be blocked?

Sites that block JavaScript and CSS files in the robots.txt file should ensure that those files are not critical for displaying content on the site. Google and Bing will follow the disallow rules, and blocked content may not be included in search results.
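As an illustration, rules like the following in robots.txt (the paths are hypothetical) would prevent crawlers from fetching a site's scripts and stylesheets, and should be removed if those files are needed to render the page:

  User-agent: *
  Disallow: /assets/js/
  Disallow: /assets/css/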

On the other hand, a common misconception among web developers is that disallowing a page in the robots.txt file will prevent it from being indexed by search engines, or will remove it if it is already indexed. This is not true.

A robots.txt file only tells crawlers which URLs may not be accessed. If a URL has already been indexed, it will not be removed from search results, and if there are links or canonical tags pointing to that URL, it may still be indexed. As a result, search engines may show the URL, but with a generic title and/or without a description.
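If the goal is to keep a page out of search results entirely, the more reliable approach is to let crawlers access the page and mark it with a robots meta tag (or the equivalent X-Robots-Tag HTTP header). A minimal sketch:

  <!-- Placed in the <head> of a page that should not appear in search results -->
  <meta name="robots" content="noindex" />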

3.4 – Are your URL redirects effective and efficient? 

There comes a time for every site when old content and its URLs are retired. At that point, instead of showing users old or invalid content, websites typically redirect users to a new landing page or URL. But sometimes, things can go wrong.

Long redirect chains, redirect loops in which users bounce back and forth between two or more URLs, and redirects to broken pages are all common issues. All of them result in a poor user experience and can cause search engines to drop the affected pages from their index.

Ideally, all redirects should be a single “hop” to the final valid URL. 
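As a sketch, assuming an Apache server and hypothetical paths, a single-hop permanent redirect in .htaccess looks like this:

  # Send the retired URL straight to its final destination,
  # not through an intermediate URL
  Redirect 301 /old-page https://www.example.com/new-page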

3.5 – Are search engines able to render my website?

With the rise of JavaScript frameworks and runtimes such as React, Angular, and Node.js, web developers have been creating cutting-edge websites that are fast and look amazing on desktop and mobile devices.

At the same time, these frameworks gained notoriety as a black hole for SEO: content rendered only through JavaScript often never surfaced in organic search.

While Google and Bing are both working toward crawling and indexing content on sites built with popular JavaScript frameworks, the results are not perfect. To help developers, Google provides documentation and testing tools, such as the Mobile-Friendly Test, so that developers can see how Google renders a page.

If your site is built on a JavaScript framework, you can test how well your content is indexed by taking a unique snippet of text from a URL (at least 6-8 words long) and searching for it as an exact-match, quoted query. If the URL is not included in the search results, there may be technical issues hurting your site's performance in organic search.

3.6 – Are my webpages built with best practices for site speed and performance?

The range of devices capable of accessing the internet has grown significantly in the last decade. Best practices for web development have evolved to include website speed and performance, enabling users on mobile and tablet devices, and users with limited connectivity or on congested networks, to access the content.

To help website owners and developers, Google provides the PageSpeed Insights tool, which assesses a web page's implementation of best practices and provides a score for both mobile and desktop devices.

The PageSpeed Insights score is not a measure of the actual speed of the site. For that, Google created the "Test My Site" tool on Think with Google. There, website owners can check their site's loading speed, benchmark it against competitors, and estimate the effect speed improvements would have on revenue (based on Google's own research).

4. Content and Metadata Issues

With a technical review completed, we should have a lean, well-functioning website ready for content. Here are some considerations when optimizing your content for search engines.

4.1 Does each page target a single keyword/topic?

Organic search is driven by keyword relevance: users type specific keywords to find information, and search engines attempt to find the most relevant content to satisfy those users' wants and needs.

When creating content for your website, it's helpful to focus on specific topics and sub-topics for each page. Consider which keywords your audience might use to find information about those topics, and include them in important areas of your copy (illustrated in the sketch after this list), such as:

  • Title and meta description for the page
  • Headings
  • Internal links to your content
  • The alt text for images, if applicable
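A minimal sketch of these placements for a hypothetical page targeting the keyword "waterproof hiking boots" (all names and URLs below are made up for illustration):

  <head>
    <title>Waterproof Hiking Boots | Example Outfitters</title>
    <meta name="description" content="Compare waterproof hiking boots built for wet trails, with sizing and care tips." />
  </head>
  <body>
    <h1>Waterproof Hiking Boots</h1>
    <img src="/img/boots.jpg" alt="Pair of waterproof hiking boots on a muddy trail" />
    <!-- Internal link with descriptive anchor text -->
    <a href="/guides/boot-care">How to care for waterproof hiking boots</a>
  </body>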

4.2 Does each page have enough quality content?

Many website owners measure content quality by the number of words on each page. But simply putting more words on the page does not make the content better.

Instead, consider once again the keywords that the audience used to arrive at your webpage. Does the content on your page answer the primary questions they have about the topic? Are you providing the solution that they are most likely looking for? 

One way to think about content quality is to review the top search results that your content is competing against. Enter your primary keyword(s) into the search engine and look at the type of content that ranks. Do a side-by-side comparison of that content and ask:

  • Are we talking about the same topic(s)?
  • Are we answering the same questions? With the same depth and clarity?
  • Is the content on our site in the same format? For example, if the majority of the top search results are product landing pages, you should also produce a product landing page for that keyword.
  • Is our content at least as good as, if not better than, the competition?

4.3 Are my page headings optimized for readability and navigation?

The headings on a page are often stylized in a larger or bolder font, but they also serve a purpose beyond making the text more visually appealing: the HTML tags for headings (h1, h2, h3, etc.) are used as navigation aids by site visitors who rely on screen-reading software and devices for accessibility.

HTML heading tags are "ranked" in a hierarchy that conveys different levels of importance. Ensure that the primary heading uses an <h1> tag and describes the main topic and content of the page. <h2> tags should be used for secondary headings and describe the sub-topic(s) under the primary heading. Similarly, <h3> tags and beyond should mark sub-topics one level below their parent heading tag.

Conceptually, it may help to break topics down into a tree: a single primary (H1) heading at the top, H2 headings for the main sub-topics beneath it, and H3 and H4 headings nested below those.

In practice, the content on the page will follow this structure:

  • Primary (H1) Heading
      • H2 Heading #1
          • H3 Heading
              • H4 Heading
          • H3 Heading
              • H4 Heading
          • H3 Heading
          • H3 Heading
      • H2 Heading #2
          • H3 Heading
              • H4 Heading
          • H3 Heading
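In HTML, the top of that outline might be marked up along these lines (the heading text is placeholder):

  <h1>Primary Heading</h1>
    <h2>Heading #1</h2>
      <h3>Sub-topic heading</h3>
        <h4>Detail heading</h4>
      <h3>Sub-topic heading</h3>
    <h2>Heading #2</h2>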

Structuring the content on a page like this has additional benefits. For SEO, these headings provide more opportunities to use relevant organic search terms and create natural divisions in the content that can help you capture multiple featured snippets. Headings also allow users to quickly scan the content to find the information they need, which may improve engagement metrics.

4.4 Are the page titles and meta descriptions for each URL optimized?

With all the effort placed on creating high-quality content, we need to devote some time to the two HTML tags that help your web pages make a good impression in the search results: the title tag and the meta description tag.

Search engines use page titles and meta descriptions to create the snippets of text that appear in the search results. While there is no technical limit on how much content you can put in these tags, Google and other search engines will show only approximately 60 characters of the title and 160 characters of the description in their search results. Adhering to these display limits gives you maximum control over your snippets.

As mentioned earlier, these tags are another opportunity to include the target keywords for your page. Generally, we don’t have multiple pages targeting an identical set of keywords, so the page titles and meta descriptions should be unique to each page on your site.

In cases where this cannot be avoided…

4.5 Have we defined the correct canonical URL for each page?

When we have multiple pages that are very closely related in their topic, or we have a set of pages that form a single piece of content, there is a strong need for canonical tags. 

Canonicalization in SEO means picking the URL that represents the desired landing page when we have duplicate content or several pages that would otherwise compete for the same topic.

For example, on each of the duplicate pages and on the canonical URL itself, we would place the following link tag:

<link rel="canonical" href="https://www.example.com/the-canonical-page" />

By defining the canonical URL, not only are we ensuring that users see and land on the page we want, strengthening our presence in organic search results, but we are also avoiding the work that would otherwise be required to combine analytics data spread across multiple URLs, along with other obstacles to analyzing performance.

Next Steps

Asking the questions and following through on the items in the checklist above should give you a solid start to understanding how well your site is currently optimized. 

Did we cover everything we need to know about SEO? No, actually. We’ve just scratched the surface of technical and on-page SEO. And, we haven’t mentioned local and international SEO or off-site factors such as backlinks. 

We will cover these and other topics in future posts. While a checklist doesn’t match the business value of a comprehensive SEO audit, I hope you take away some valuable insights that help you to move the needle on your SEO goals.  

Contact me if you have questions! info@cardinalpath.com 
