Review Your Website And Reap The Rewards


This article outlines the free webmaster tools you can use to review and analyse your Web site for optimum performance.

Do you keep an eye on your Web site’s performance? Are you aware of your Web site’s main traffic sources? If the answer is no, you could be missing out on some very insightful trends and statistics specific to your Web site.

Tracking and analysing your Web site will help you make informed design, marketing and advertising decisions. How, you ask? Well, your Web site statistics are far more than just a load of numbers and data to be disregarded. They are a powerful reporting tool that can be used to dramatically improve your business’ online strategy. They will highlight weaknesses on your site, so you can take action to achieve improvements.

Below, we outline tools to track and measure how your Web site is performing. They all generate results that are easy to interpret and best of all, these tools won’t cost you a cent.

Review With Google Analytics

Google Analytics is a great in-depth reporting tool that provides you with comprehensive insights into your Web site traffic. Whilst most standard reporting packages provide a basic overview of site visitors and popular pages, Google’s Web analysis tool offers an advanced level of statistical analysis.

Here’s what we especially like about Google Analytics:

  • Setting up conversion goals will give you a benchmark for identifying and improving the weak areas of your site. If design, text or optimisation updates improve these conversion ratios, you’re on the right track. If they get worse…well, you get the picture.
  • You’ll be able to determine exactly where your traffic is coming from and what sources actually generate the enquiries or sales you are seeking.
  • You can isolate and analyse particular segments of your Web site traffic.
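
If you haven’t yet installed tracking, Google Analytics works by adding a small JavaScript snippet to every page you want to measure. Here is a minimal sketch of the classic asynchronous snippet, pasted just before the closing </head> tag; UA-XXXXX-X is a placeholder for your own Analytics account ID:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-X']); // replace with your own account ID
      _gaq.push(['_trackPageview']);
      (function() {
        // Load ga.js asynchronously so it doesn't block page rendering
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>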

Review With Google Webmaster Central

Google Webmaster Central is a useful tool that will help you see how your site is indexed by the Google search engine. It’s a one-stop shop for monitoring and managing your Web site and will empower you with information to improve your site’s visibility in Google search results.

Some impressive Google Webmaster features include:

  • Diagnose potential problems with your site. Identify crawl errors so you can see which URLs can’t be accessed.
  • See how many sites are linking to you.
  • Identify the top search queries for your site.
  • Submit a sitemap so that Google knows about all the pages on your site and how often they are updated.
  • Identify the most significant keywords Google found when crawling your site. If your preferred keywords are AWOL from the results, then you might have a problem.

Review With Bing Webmaster Centre

Bing Webmaster Centre is part of Microsoft’s Bing search engine and allows webmasters to add their Web sites to the Bing index crawler. The service also offers tools to troubleshoot the crawling and indexing of your Web site, along with sitemap creation, submission and ping tools, Web site statistics, consolidated content submission, and new content and community resources.

Bing Webmaster Centre contains tools and features that help you access data about your Web site and manage its presence on Bing:

  • Crawl issues: discover potential problems with your site, such as File Not Found (404) errors, pages blocked by REP (the Robots Exclusion Protocol), long dynamic URLs and unsupported content types.
  • Backlink data: access data about the links referring to your site.
  • Advanced filtering: quickly scope the results in your Web site reports to zoom in on the data you need.
  • Sitemap submission and ping tools (see the example below this list).
  • Outbound link data.
  • Guidelines for successful indexing.
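
As a quick example of the submission and ping tools mentioned above, Bing accepts a simple HTTP request to tell it a sitemap is new or updated; the example.com address below is a placeholder for your own sitemap URL (note the sitemap address is percent-encoded):

    http://www.bing.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml

Fetching that address in a browser (or from a script) prompts Bing to re-read your sitemap.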

Review Your Sitemap And Robots.txt Setup

Make sure search engines can find and easily index your site content with sitemaps and robots.txt.

A sitemap is a list of the URLs on your Web site; it lets you notify search engines which pages are available for crawling.

Three reasons why you should have a sitemap:

  • Handy for those hidden pages that aren’t linked from your main content sections and may otherwise be missed by a search engine’s normal crawling process.
  • Search engines use sitemaps to check out your site’s structure and can improve how they crawl your site in the future.
  • Great for SEO. They’ll let search engines know about new or updated pages on your Web site.

Once you’ve created or reviewed your sitemap, make sure you submit it (or resubmit it, as the case may be) to Google Webmaster Central and check whether it reports any problems. A standard content page on your site (such as sitemap.html) is useful for visitors, but the sitemap.xml format is the one recommended by Google Webmaster Central and supported by most web crawlers.
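
For reference, a minimal sitemap.xml follows the sitemaps.org protocol; the example.com URLs below are placeholders for your own pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2010-01-15</lastmod>   <!-- date the page last changed -->
        <changefreq>weekly</changefreq> <!-- how often it tends to change -->
        <priority>1.0</priority>        <!-- relative importance within your site -->
      </url>
      <url>
        <loc>http://www.example.com/about.html</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>

Only the <loc> tag is required for each URL; the others are optional hints to crawlers.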

Using robots.txt, on the other hand, is about preventing web crawlers from accessing your Web site, or particular pages on it, that you don’t want indexed. A robots.txt file added to your site asks crawlers to ignore particular files; the primary web crawlers respect it, but it’s a convention rather than a guarantee, so don’t rely on it to hide genuinely sensitive content.
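
A simple robots.txt sits in the root of your site (e.g. http://www.example.com/robots.txt); the directory names below are placeholders:

    # These rules apply to all crawlers
    User-agent: *
    # Ask crawlers to stay out of these areas
    Disallow: /private/
    Disallow: /temp/
    # Optionally, point crawlers at your sitemap
    Sitemap: http://www.example.com/sitemap.xml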

It’s a good idea to review and update your sitemap and robots.txt setup regularly so that search engines know exactly what’s on your site, making it easier for your target markets to find you.

Reaping The Rewards

The obvious reality is that analysing and tracking your site’s performance alone will not improve your traffic or conversion rates (if only it were that easy!). You need to take action and convert your well-informed online observations into positive enhancements for your site, then keep measuring your online performance. Good luck!

Article written by Mark Rocket of Avatar – New Zealand Web Design