
How to Use the Site Audit Tools

Give your sites a tune-up
Written by Girithara Prakash
Updated 2 months ago

A site audit is like a car inspection. You check the brakes, inspect the headlights, and make sure the engine starts. Likewise, a site audit confirms that your site's basic components work properly so it can publish content and rank in organic search.

To help you evaluate your site, WPBlazer offers three auditing tools. They are:

  • Site Map Generator
  • Robot Generator
  • Site Audit

Opening the audit pages

All three tools are available from the same menu in the site dashboard.

From the WPBlazer app dashboard, select a site and then click the Dashboard icon. A new page displays.   


On the new page, hover the mouse pointer over the menu icon that sits second from the top.

On the popup panel, click SEO Tools and then select the tool you want to open. A new page displays.  

Using the site map generator

This tool crawls your website and makes a list of the content, including posts and pages. The output is an XML file that search engines like Google read. Every website needs a sitemap if the aim is to rank for keywords.
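If you're curious what the generated file contains, a sitemap is simply a list of URLs with optional metadata. A minimal sketch of the standard format is shown below; the URL and date are placeholders, not output from the tool:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per post or page on the site -->
  <url>
    <loc>https://example.com/sample-post/</loc>
    <!-- Optional: when the content was last modified -->
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Search engines read this file to discover your content without having to follow every internal link.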

If you produce new content on a weekly or monthly basis, it's a good idea to generate a new sitemap every month. If you produce new content daily, explore options to automate sitemap updates. 

The Site Map Generator displays four content tabs: posts, pages, categories, and tags. Click any tab to view its content.

To create a new sitemap, click the Generate SiteMap button.

After the system creates the sitemap, a green bar displays at the top of the page. The new sitemap is live on your website, and no further work is required. The green bar also includes an optional link to download the sitemap.

Using the robots text generator

This is a simple but powerful tool. Search engines send out crawlers to look at websites and make a list of the content. That helps websites rank for keywords so people can find your site.

As you grow your site, you might decide that you want crawlers to stay away from some parts of your website, such as the admin folder. That's the purpose of a robots.txt file: it tells crawlers to avoid certain areas of the site.

To create a file, enter your rules and then click the Submit button. Make sure you follow the standard robots.txt file format.
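As a sketch of that standard format, the rules below block all crawlers from a typical WordPress admin folder while leaving the rest of the site open. The paths shown are common WordPress defaults for illustration, not values the tool requires:

```txt
# Apply the rules below to all crawlers
User-agent: *

# Keep crawlers out of the admin area
Disallow: /wp-admin/

# Optionally tell crawlers where your sitemap lives
Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules, and each `Disallow` line lists a path crawlers in that group should skip.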

Using the site audit tools

The Site Audit page works like a toolbox. It contains the tools you need to check your site and make repairs so that it runs smoothly and is visible to search engines.

To access a tool, click a link. You can check coding, analyze speeds, and more. All tools are free, but some limit the number of times you can use the service.
