
SEO User Experience – How Do You Measure What Google Knows?


I am going to be speaking on SEO metrics at SMX Advanced on Wednesday. Specifically, I'll be talking about the daunting task of identifying metrics that measure user experience, and about gaining insight into the data Google accumulates about our sites as it pertains to user experience. This is a quick overview of what I plan to discuss.

As SEOs, we have huge blind spots when it comes to the way Google measures our sites. If you have any doubt, just look at the aftermath of Panda. The best minds in the SEO community came together and arrived at a fairly unified response (noindex thin content, get rid of tag and search-result pages, get great links… yada yada yada). And yet here we are, 30 months down the line, and there have been few documented recoveries.

Few people are actually measuring improvements beyond looking at out-of-the-box Google Analytics metrics. These metrics are a poor reflection of the true user search experience. We need to do better.

I like the quote from Duane Forrester:

“When your visitors are happy with your site, search engines notice”

Of course they will – and the corollary, of course:

“When your visitors are NOT happy with your site, search engines notice”

So how do we measure?  How do they measure it?

How are Google and Bing measuring user experience? (If I quote Forrester, I can't forget Bing – but from now on, whenever I say Google, I mean Bing too!) Just like us with our own sites, Google must have its own metrics that tell it whether or not it is presenting good search results. Likely oft-discussed metrics include:

  • Dwell time – time actually spent on the search result before returning to Google. The strongest positive signal is when a user never returns to Google to re-engage with the search results (in effect, this tells Google that the query was completely satisfied by the site the user clicked on).
  • Short clicks – very short visits (as measured by the user returning to Google to continue the search).
  • Long clicks – very long visits. Again, the best click is the infinite click – the click that never returns.
  • Pogosticking – behavior upon return to Google. This is the most telling. Pogosticking is really telling Google one of three things:
    1. Lack of pogo (the user never comes back to Google from that result) – Yay, they found their answer. Gold star for the target site.
    2. Pogo and selection of an alternate result – Nope, the answer you gave me did not work. Demerit for that site.
    3. Pogo and refinement of the initial query – again, the answer did not work (probably a negative statement about the entire SERP not matching intent, as well as about the specific site(s) clicked).

That's what they have. What do we have? The metrics we get out of the box from Google Analytics:

  • Bounce rate – flawed and noisy – imperfect, but still an indication of user experience – subject to interpretation.
  • Visit duration – flawed for many of the same reasons – but generally longer is better, and it should equate to long clicks.
  • Pageviews per visit – flawed as well. I have seen sites severely hit by Google with great pageviews-per-visit numbers (for example, as soon as users think they won't get what they want, they back-back-back-button to Google – increasing pageviews).

For bounce rate, I'll discuss four ways to improve the metric:

  1. Track it better – track all interaction events (see the sketch after this list).
  2. Don't break it – make sure non-interaction events are flagged properly so they don't suppress bounces.
  3. Understand it – understand how page changes and technology choices (such as Ajax and tabs) can impact bounce.
  4. Segment it – so we understand that traffic shifts may move the overall bounce rate while segment bounce rates remain consistent.
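
To make points 1 and 2 concrete, here is a minimal sketch using the classic Google Analytics `_gaq` event API (the category, action, and label strings are hypothetical placeholders, not from my actual setup). The detail that matters is the optional non-interaction flag at the end of `_trackEvent`: an event without it counts as an interaction and rescues the visit from being a bounce, so passive, auto-firing events need the flag set to true or they will quietly break your bounce rate.

```typescript
// Classic Google Analytics event tracking and the non-interaction flag.
// All category/action/label strings below are hypothetical examples.
declare const _gaq: any[][];

// A real engagement event: counts as an interaction by default, so a
// visit that fires it is no longer reported as a bounce (point 1).
_gaq.push(['_trackEvent', 'engagement', 'scrolled-to-bottom', location.pathname]);

// A passive, auto-firing event: the last argument (opt_noninteraction)
// is set to true so it does NOT suppress bounces (point 2).
_gaq.push(['_trackEvent', 'display', 'banner-impression', location.pathname, 0, true]);
```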

Improving bounce rate tracking makes the metric less vague – but that's not enough.

I really want to understand the short click – the quick return to Google, the “I have to GTFOOH” syndrome that is a clear indication to Google that a user did not like your site. For that, I have started using event tracking on my sites, firing an event at each of these time thresholds (a sketch follows the list):

  • 10 seconds
  • 20 seconds
  • 30 seconds
  • 60 seconds
  • 120 seconds
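
Here is a sketch of how those timed events might be wired up, again against the classic `_gaq` API (the category name is a hypothetical placeholder). I leave the first threshold as an interaction event so it also rescues the visit from being counted as a bounce, and flag the later ones non-interaction so a single visit doesn't pile up extra interactions:

```typescript
// Fire one event per time threshold to reconstruct short vs. long clicks.
declare const _gaq: any[][];

const thresholdsInSeconds = [10, 20, 30, 60, 120];

thresholdsInSeconds.forEach((seconds, i) => {
  window.setTimeout(() => {
    _gaq.push([
      '_trackEvent',
      'time-on-page',       // category (hypothetical name)
      `${seconds} seconds`, // action: which threshold was reached
      location.pathname,    // label: the page the user stayed on
      0,                    // value (unused)
      i > 0,                // non-interaction for all but the first event
    ]);
  }, seconds * 1000);
});
```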

Now I have my short clicks, I have cleaned up my long clicks, and I have a better bounce rate as well. It's great to have a true, accurate count for each of those numbers, and they give really clear insight into the impact of site changes.

Still, the pogosticking is a problem. How do I know users aren't leaving dissatisfied? All these numbers are an indication, but longer does not necessarily mean better! So how do I get better information?

Why not ask my users?   

In fact, that is what Google did with the famous set of questions they asked a group of people to help identify low-quality sites. (They then compared the results of the Panda algorithm to that set as a validation mechanism.) So this is what I have started doing.

How am I Doing?

I have been using Qualaroo to put up quick surveys on my sites. (I have used SurveyMonkey in the past – it's cheaper, but a lot more work to get the behavior I get out of the box with Qualaroo.) I like Qualaroo because all I have to do is put the same snippet on the entire site, then trigger the survey based on a URL filter (as well as only surveying users who arrive from search). I manage quite a few sites, and a couple of them have a Panda event in their history. I am starting with a very simple question:

“Do you think this site has what you were looking for?” (and I ask them why if they answer no).

Good results so far… it seems to take just about 100 survey answers for the percentages to flatten out. I have put it up on three sites now, and it can give me a good indication of the impact of site changes or of changes to specific sections of my sites. It's important to get an overall score for your site across all traffic, and then of course to get feedback on individual page types with subsequent surveys. Again, the pages that generate the most traffic generate the most metrics.
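
That ~100-answer figure squares with basic sampling math. As a rough sanity check (my own back-of-the-envelope numbers, not part of the talk), the error on a yes/no percentage shrinks with the square root of the sample size, and at 100 responses it is already down to roughly ±10 points in the worst case:

```typescript
// 95% margin of error for a yes/no proportion (normal approximation).
function marginOfError(p: number, n: number): number {
  return 1.96 * Math.sqrt((p * (1 - p)) / n);
}

console.log(marginOfError(0.5, 100).toFixed(3)); // 0.098 → about ±10 points
console.log(marginOfError(0.5, 400).toFixed(3)); // 0.049 → about ±5 points
```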

UX Report Card

In the end, I take all these numbers (bounce rate, the 10-second visit increments, user survey results, visit duration… add in pageviews and conversion rates) and put them into a report card that tells the story of the search user experience! Of course, the metrics won't always point the same way – we need to interpret these results.
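
As a purely illustrative sketch of what such a rollup could look like (the metric names, weights, and targets below are hypothetical, not my actual scoring method), each metric can be scored against a target and weight-averaged into a single grade:

```typescript
// A hypothetical UX report card: score each metric against a target,
// then combine with weights. All numbers below are made up.
interface Metric {
  name: string;
  value: number;          // observed value
  target: number;         // value we would consider "good"
  higherIsBetter: boolean;
  weight: number;
}

function grade(metrics: Metric[]): number {
  const totalWeight = metrics.reduce((sum, m) => sum + m.weight, 0);
  return metrics.reduce((sum, m) => {
    const ratio = m.higherIsBetter ? m.value / m.target : m.target / m.value;
    return sum + (Math.min(ratio, 1) * m.weight) / totalWeight;
  }, 0);
}

const scorecard: Metric[] = [
  { name: 'bounce rate',        value: 0.62,  target: 0.5,  higherIsBetter: false, weight: 2 },
  { name: 'visits over 60s',    value: 0.35,  target: 0.4,  higherIsBetter: true,  weight: 2 },
  { name: 'survey "yes" share', value: 0.71,  target: 0.8,  higherIsBetter: true,  weight: 3 },
  { name: 'conversion rate',    value: 0.021, target: 0.03, higherIsBetter: true,  weight: 1 },
];

console.log(`UX score: ${(grade(scorecard) * 100).toFixed(0)}/100`);
```

A single number like this hides the cases where the metrics disagree, which is exactly why the interpretation step still matters.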

That’s a quick overview of what my talk will be about at SMX.
