When it comes to understanding how your website is performing from an SEO perspective, Google Analytics is an excellent quick-and-dirty tool.
It’s also all too often used incorrectly.
Important SEO analytics tip #1:
Trends are more important than volumes. And they’re easier to spot.
Important SEO analytics tip #2:
Short-term fluctuations don’t matter.
I can’t tell you how many panic calls I’ve had with people distraught at how their traffic has plummeted, only to see it return a few days later.
The big problem with this raw data:
Many of the search terms that drive traffic to your website are completely irrelevant.
For example, we get a large number of visitors to our website looking for “website review”. Digging into these shows us that many are from universities, and so are presumably looking for information rather than our services. The high bounce and exit rates confirm this.
This isn’t a failing or an SEO issue as such; it’s just a fact of life.
But we do need to do something about it, or these terms can distort both the volumes and the trends of traffic to your site.
In other words we need to find a way to exclude as much noise as possible.
Important SEO analytics tip #3:
The obvious way to clean up the data is to introduce some filters in an advanced segment.
Be careful, though. I’ve seen a number of filters set up to do this by excluding all organic visits that last less than 10 seconds, for example, or that view only one page before leaving.
This is a clumsy approach. If parts of your website contain information that people are searching for, and those pages rank well in the search engines, you’ll get visitors who find exactly what they were looking for. Once they have it, they’ll move on.
Even though these people may not appear interested in what you sell, there’s presumably a reason you created those pages, right? So there’s still a chance of converting some of them into customers.
We need a better way.
The good news is that the fix is simple.
The bad news is that it’s slow and fiddly work.
More good news though: you’ll probably only have to do this very occasionally. Once or twice a year or so is usually enough.
The idea is to identify as many of the off-target keywords as possible.
But before you shudder at the thought of trawling through tens of thousands of search terms, relax: you don’t have to identify every single off-target keyword.
The aim is to cover as much of your traffic as possible with a manageable number of terms.
For example, one of our websites sees around 3,000 keywords bringing people to the site each month, for a total of over 40,000 visits.
Identifying the irrelevant keywords in a list of 3,000 would be brain-stupefying, but by looking at just the top 100 keywords I’ve already covered almost 90% of those 40,000 visits. Finally, there’s a benefit to (not provided).
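If you want to check how much of your traffic your top keywords actually cover before you start, a few lines of script against an exported keyword report will tell you. Here’s a minimal sketch in Python; the file name, the column names and the cut-off of 100 are all placeholders for whatever your own export and traffic look like:

    # A minimal sketch: how much of the total traffic do the top 100
    # keywords account for? Assumes a CSV export of the keyword report
    # with "keyword" and "visits" columns (placeholder names).
    import csv

    with open("keywords.csv", newline="") as f:
        rows = sorted(csv.DictReader(f),
                      key=lambda r: int(r["visits"]), reverse=True)

    total = sum(int(r["visits"]) for r in rows)
    top = rows[:100]  # review only the highest-volume keywords
    covered = sum(int(r["visits"]) for r in top)

    print(f"{len(rows):,} keywords, {total:,} visits in total")
    print(f"Top {len(top)} cover {covered / total:.0%} of all visits")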
Once you have your list of crappy keywords, it’s a simple matter of setting up an advanced segment. (I didn’t choose the name.)
The segment has two parts. The first makes sure that only organic traffic is included, via a condition that the medium matches organic.
The second part excludes the keywords that you don’t want.
There are two ways to do this. The first is to add the terms one at a time, but be warned: this is painful beyond words.
An easier approach is to use a basic regular expression. Simply add keyword 1|keyword 2|keyword 3|keyword 4|keyword 5 (the pipe character means “or”) to a single exclude condition.
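If you’re building that pattern from a longer list, it’s worth escaping the terms and testing the result before pasting it into the segment. Here’s a rough sketch, again in Python; the keyword list is purely illustrative, and the filter at the end just mimics what the two-part segment does (organic traffic in, unwanted keywords out):

    import re

    # Illustrative off-target keywords; replace with your own list.
    off_target = ["website review", "free templates", "what is seo"]

    # Escape each term so regex metacharacters are treated literally,
    # then join with | (regex alternation, i.e. "or").
    pattern = re.compile("|".join(re.escape(k) for k in off_target),
                         re.IGNORECASE)

    # Mimic the two-part segment: keep organic visits whose keyword
    # doesn't match any excluded term.
    visits = [
        {"medium": "organic", "keyword": "website review"},
        {"medium": "organic", "keyword": "seo consultancy"},
        {"medium": "referral", "keyword": ""},
    ]
    segment = [v for v in visits
               if v["medium"] == "organic"
               and not pattern.search(v["keyword"])]
    print(segment)  # only the relevant organic visit remains

One thing to check before you paste the result in: Google Analytics caps the length of a regex field, so a very long list may need splitting across several exclude conditions.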
That’s it – from there you can quickly and easily identify the trends by looking at longer periods of time.
The data you’re looking at will be cleaner and more accurate.
Important SEO analytics tip #4:
One final note of caution: be very careful when adding the keywords to exclude. A heavy-handed approach will simply paint a different version of the inaccurate picture you’re trying to avoid in the first place.
Happy hunting.