As an in-house SEO for large dynamic sites like Trulia, I spent way too much time chasing problems and reverting code that never should have been pushed in the first place. Let’s face it: life would be much more pleasant if we could catch those problems before they go live, and in an automated way, so that human error is removed from the equation. As an SEO, I should never be required to manually review code to check meta robots tags and everything else that can go wrong; that is waaaaaay too risky.
Look what happened to one of SEORadar’s customers prior to using our service. They accidentally placed a noindex tag on their key pages with this nasty line of code:
And the aftermath was not pretty!
Yikes! They identified the problem after five days, and it took almost a full week to push the fix due to internal processes. The really sad thing is that they never fully recovered and ended up about 15% lower in overall traffic. Recovery from disasters is often extremely painful, so of course the best thing is to prevent the disaster in the first place.
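The culprit in outages like this is usually a single robots meta tag. As a quick pre-push sanity check, a short script along these lines (a minimal sketch of the idea, not SEORadar’s code) can flag a stray noindex in a page’s HTML:

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Flags <meta name="robots"> tags whose content includes 'noindex'."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        content = (a.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html: str) -> bool:
    """Return True if the page carries a robots noindex directive."""
    checker = RobotsMetaChecker()
    checker.feed(html)
    return checker.noindex

# The kind of line that causes this sort of outage (illustrative example):
page = '<head><meta name="robots" content="noindex,follow"></head>'
print(has_noindex(page))  # True
```

Wire a check like this into your deploy pipeline against a handful of key staging URLs and a stray noindex fails the build instead of tanking your traffic.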
Testing is hard and things slip through the cracks (which is why we built SEORadar in the first place). SEORadar gives you great peace of mind by monitoring your site and letting you know when something nasty happens. But let’s face it: Google is fast now and will crawl and index those unplanned changes in a jiffy. So sometimes you are working against the clock while the rest of your company responds with their feet in quicksand! Quick, delete that meta robots noindex before Google picks it up and causes grief for you and your company.
While we will always continue to monitor your sites, we want to catch those problem changes before they get pushed live. I’ll show you how to do it with SEORadar.
How it works
The audit reports work just like our standard reports. Instead of comparing the live site to the previous archive, you are comparing the live site to staging.
For example, maybe an important keyword has been deleted from a title. You will get an alert like this:
Or even worse, maybe something like this happened!
Of course, now that you have caught this on staging, you can halt the site push and get it fixed.
You can also run various diff reports that will compare full HTML, content or just SEO elements. Your QA team will love this!
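Conceptually, those diff reports boil down to extracting the SEO elements from each page and comparing live against staging field by field. A toy sketch (the field names and values here are hypothetical examples; the real reports are far richer):

```python
# Extracted SEO elements for one URL, live vs. staging (hypothetical data).
live = {
    "title": "Blue Widgets for Sale | Example.com",
    "meta_robots": "index,follow",
    "h1": "Blue Widgets",
}
staging = {
    "title": "Widgets for Sale | Example.com",  # keyword "Blue" dropped
    "meta_robots": "noindex,follow",            # accidental noindex
    "h1": "Blue Widgets",
}

# Any field that differs between the two environments raises an alert.
alerts = [
    f"{field}: {live[field]!r} -> {staging[field]!r}"
    for field in live
    if live[field] != staging[field]
]
for alert in alerts:
    print("ALERT", alert)
```

Here the comparison would surface both the lost title keyword and the accidental noindex before the push, which is exactly the kind of thing QA wants in front of them.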
No VPN Access Required
When we first built this feature, we hit obstacles and friction because we required VPN access or a login/password for staging environments. We recognize that companies need to protect everything behind their firewall, so we came up with a solution: using our Chrome extension, your browser retrieves the test pages and sends them out to our servers to run the comparisons. No need to open up your VPN to us! Simply install the Chrome extension, which you can find here.
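In essence, the extension acts as a relay: it fetches each staging page from inside your network, then forwards the raw HTML outbound to our servers for comparison, so no inbound access is ever needed. A rough sketch of that flow (the endpoint, field names, and URLs are all hypothetical; the real work happens inside the extension):

```python
import json
import urllib.request

# Hypothetical endpoint; the real service URL is configured by the extension.
AUDIT_ENDPOINT = "https://api.example-audit.com/compare"

def build_upload(staging_url: str, html: str, api_key: str) -> urllib.request.Request:
    """Package staging HTML (fetched inside the firewall) for the audit server."""
    body = json.dumps({"url": staging_url, "html": html, "key": api_key})
    return urllib.request.Request(
        AUDIT_ENDPOINT,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# The browser fetches the page locally; only the outbound upload leaves the VPN.
req = build_upload("https://staging.internal.example/page",
                   "<title>Hi</title>", "YOUR-API-KEY")
print(req.full_url)  # https://api.example-audit.com/compare
```

The key design point is that all traffic is outbound from your side, which is why nothing on your firewall needs to change.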
Step 1: Install the Chrome extension from here
Step 2: Add the staging domain from the manage->domains screen.
Step 3: Associate staging domain with primary domain
You can label the staging domain. This is useful if you access it via an IP address or if you have multiple staging environments.
Important: If you want us to test robots.txt on staging, check the box below. That makes sense if the robots.txt on staging is a replica of the live file and part of your site update. (We respect robots.txt, so if you tick this box, it is possible the staging file could block us.)
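For context, staging environments are often deliberately blocked with a catch-all robots.txt like the one below (a typical example, not your file). If that is what staging serves, checking the box would lock us out, which is why it only makes sense when staging mirrors the live file:

```
User-agent: *
Disallow: /
```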
Step 4: Authorize the extension with your authorization/API key.
You can fetch it by logging in and going to the user->settings screen.
Step 5: Kick off the crawl from the extension
That’s it. We will test and compare all the URLs that are configured for the live domain. And now you will really be able to breathe easier when that major update goes live. Of course, things can still break so you still need SEORadar auditing the live site, but this makes things much, much safer.
This feature is available in our enterprise edition. Contact me for a demo of this feature or if you are an existing customer who would like to upgrade.