Search Engine Optimization

SEO Tools – A classic tool to help you learn about your website.

New, increasingly advanced tools are pushed into the marketplace each week by online marketing companies. SEO practitioners, agencies, and website owners are always on the lookout for the latest tool that will give them more insight into their website in hopes of gaining an advantage over the competition. While I use many of these products, there is one tool I keep going back to that has been around for a long time. It’s still one of the best…

The tool I’m talking about is Xenu Link Sleuth. I’m sure most of you have heard of it, but if you haven’t, it basically works in a similar manner to Googlebot: it spiders through the pages of your website and gives you a detailed list of information that can help you learn more about your site. There are tons of ways this tool can be used, including to address the following issues:

Broken Links

The main purpose of Xenu is to find broken links. Run the search on your site and any broken links will be displayed in red. The ‘Status’ listing will let you know whether the page is simply ‘not found’ or has ‘timed out’. You can right-click a URL to view its properties and find out which page contains the broken link so it can be fixed easily. I would suggest running Xenu on a regular basis to look for broken links, as you always want to make sure that your users have a quality experience and aren’t running into pages that no longer exist.
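For readers who prefer to script this kind of check, here is a minimal sketch of the same idea in Python (not Xenu itself). It assumes the requests and beautifulsoup4 packages are installed, and the start URL is a placeholder: it spiders internal pages, requests every link it finds, and records broken or timed-out links together with the page that contains them.

```python
from urllib.parse import urljoin, urlparse
from collections import deque

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"   # placeholder: replace with the site to audit
MAX_PAGES = 200                          # keep the crawl small for a sketch


def find_broken_links(start_url, max_pages=MAX_PAGES):
    domain = urlparse(start_url).netloc
    seen, queue, broken = set(), deque([start_url]), []
    while queue and len(seen) < max_pages:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        try:
            resp = requests.get(page, timeout=10)
        except requests.Timeout:
            broken.append((page, "timed out", None))
            continue
        except requests.RequestException as exc:
            broken.append((page, f"error: {exc}", None))
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(page, a["href"]).split("#")[0]
            if urlparse(link).scheme not in ("http", "https"):
                continue
            try:
                status = requests.head(link, timeout=10, allow_redirects=True).status_code
            except requests.Timeout:
                broken.append((link, "timed out", page))
                continue
            except requests.RequestException:
                broken.append((link, "error", page))
                continue
            if status >= 400:
                broken.append((link, status, page))      # record the page holding the link
            elif urlparse(link).netloc == domain:
                queue.append(link)                       # keep spidering internal pages only
    return broken


if __name__ == "__main__":
    for url, status, source in find_broken_links(START_URL):
        print(f"{status}: {url} (linked from {source})")
```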

Duplicate Content

Duplicate content within the same site is an issue that always seems to pop up, especially when it comes to large ecommerce websites. How often do we see big sites with the exact same pages indexed multiple times under different URL strings?

If you are curious whether your site has this problem, try running Xenu. It will spider your pages and list a number of details, including ‘URL’ and ‘Title’. You can then sort the data by either of these columns to look for similarities that may indicate possible duplicate pages. You can’t rely completely on this tool, as you will still need to do a little on-page investigation, such as taking a look at the robots.txt file or checking for canonical tags. Once you determine that these pages are indeed duplicates, you can right-click the listing to view its properties and find out which pages are linking to that URL. You should then be able to determine why these issues are happening and set in motion a process to fix them.
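As a rough illustration of that sorting step, the sketch below groups crawled (URL, title) pairs by title and flags any title shared by more than one URL. The sample data is hypothetical, and the output is only a hint that still needs the on-page checks mentioned above.

```python
from collections import defaultdict


def possible_duplicates(pages):
    """pages: iterable of (url, title) pairs, e.g. gathered during a crawl."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    # Only titles attached to more than one URL are interesting.
    return {title: urls for title, urls in by_title.items() if title and len(urls) > 1}


# Hypothetical example: the same product page indexed under two URL strings.
sample = [
    ("https://www.example.com/widgets?id=1", "Blue Widget"),
    ("https://www.example.com/widgets/blue-widget", "Blue Widget"),
    ("https://www.example.com/about", "About Us"),
]
for title, urls in possible_duplicates(sample).items():
    print(title, "->", urls)
```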

Title Tags

Use this tool to get a quick list of the Title Tags being used on the website. Sort the data by ‘Title’ to find out if you have Title Tags that are exactly the same. If so, you may need to think about making some revisions to ensure that each Title Tag is properly targeted towards the content on its page. You can also export this data to a spreadsheet, which makes the process of rewriting them easier.
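A small sketch of that export step, assuming you already have (URL, title) pairs from a crawl like the one above; the filename and sample rows are placeholders. It sorts by title so identical Title Tags sit next to each other, then writes a CSV that opens directly in a spreadsheet.

```python
import csv


def export_titles(pages, path="titles.csv"):
    """pages: iterable of (url, title) pairs; writes them to a CSV sorted by title."""
    rows = sorted(pages, key=lambda p: p[1].lower())   # sort by title, as in the tool
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["URL", "Title"])
        writer.writerows(rows)


# Hypothetical rows: two pages sharing a title would be flagged for rewriting.
export_titles([
    ("https://www.example.com/", "Example Home"),
    ("https://www.example.com/contact", "Example Home"),
])
```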

Alt Tags

Find out if all of the images on your website are using Alt Tags. The ‘Title’ shown for each image represents what is used in its Alt Tag. Learn which images are missing Alt Tags and then move forward with the appropriate changes.
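If you want to spot-check a single page by hand, a sketch like the following can list the images whose alt attribute is missing or empty; the URL is a placeholder, and requests/beautifulsoup4 are assumed to be installed.

```python
import requests
from bs4 import BeautifulSoup


def images_missing_alt(page_url):
    """Return the src of every <img> on the page with a missing or empty alt attribute."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [img.get("src", "") for img in soup.find_all("img")
            if not img.get("alt", "").strip()]


for src in images_missing_alt("https://www.example.com/"):
    print("missing alt text:", src)
```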

External Links

Find out who you’re linking out to. Sometimes it’s worth a review to see whether the sites you’re linking to are still the quality sources of information they once were.
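A quick sketch of pulling that list for one page: it collects every link whose domain differs from the page’s own domain so the destinations can be reviewed. The URL is a placeholder.

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def external_links(page_url):
    """Return a sorted list of links on the page that point to a different domain."""
    domain = urlparse(page_url).netloc
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = {urljoin(page_url, a["href"]) for a in soup.find_all("a", href=True)}
    return sorted(link for link in links
                  if urlparse(link).scheme in ("http", "https")
                  and urlparse(link).netloc not in ("", domain))


for link in external_links("https://www.example.com/"):
    print(link)
```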

