Enterprise SEO Scorecards
“Don’t look at the scoreboard.”
John Wooden meant that players and teams should play in the moment, focus on the game plan, do all the small things, and be good teammates. This is the mindset he instilled in his teams in their pursuit of excellence. Winning games and championships was never the goal; those were simply side effects of excellence paying off.
I love this advice for almost any challenge in life, but not for e-commerce SEO, where you should meticulously study the scorecards and reports. Excelling in enterprise SEO requires earning trust with many colleagues across the company, and the best way to do that is to share wins. Wins require not only strong performance in the numbers, but also high confidence in their accuracy.
My Own Experience
When I started at Oriental Trading in October 2015, the numbers looked great. Halloween traffic and sales were significantly higher year-over-year. Then, during the first week of November, almost overnight, our overall sales dropped by nearly 80%.
On most sites, we can attribute steep dips or jumps to a change we made, or a change Google made. We spot-check our reporting tools and can usually reach a quick conclusion.
In the enterprise, there are too many moving pieces for the answer to be so simple. Maybe all your rankings dropped. Maybe a change was made in IT or QA processes. Maybe something changed in your attribution or on the Analytics side. Maybe you got hit with a Penalty or an algorithm update. Maybe a competitor went viral. Maybe something broke in the site’s code or databases. Maybe your customer behavior changed. Maybe…
Any one of these factors, or a combination of them, can lead you to tell the wrong story, especially when the numbers look good. And if your boss – the Director or VP – doesn’t challenge your assumptions, you might go on for months or years working with bad data and reporting that you can’t retroactively fix.
After two months of asking lots and lots of questions, digging up data from every accessible SEO data source, placing dozens of test orders from outside our network, and setting up countless meetings to find the root of the issue, we figured it out. A change had been made to improve security and to block bots and junk traffic from scraping our site; unintentionally, every visit from Google by a user with an Orientaltrading.com cookie in their browser had its visits, orders, and sales credited to Direct Load, not SEO.
After fixing the issue, the numbers looked good again. One week later, a new issue surfaced that over-reported Brand (SEO clicks to the home page) traffic and sales. This one took months of investigation, and because of limited IT resources, we didn’t implement the fix until the end of August. For over seven months, SEO Brand was getting credit every time a customer searched for ‘Oriental Trading’ and clicked on our home page (good), AND when they clicked a Paid Search Brand link, clicked an Organic Sitelink, or performed an Organic Site Search in the Sitelinks Search box on Google. Every one of these instances credited the Organic home page because a Chrome glitch pre-rendered the Coremetrics tags in Google when a branded search was queried.
Now, as a general rule of thumb, when the numbers look better or worse by 10% or more year-over-year, I’ve learned to ask why and how. I compare my primary data with secondary sources such as Webmaster Tools, keyword tracking tools, and other clickstream data tools for validation.
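That rule of thumb is easy to automate. A minimal sketch in Python, where the figures, column meanings, and the 10% threshold are illustrative assumptions (in practice the totals would come from your analytics suite and a secondary source such as Webmaster Tools):

```python
# Flag YoY swings of 10%+ in primary data, and flag disagreement between
# the primary and a secondary source about the direction of the change.
# All figures below are illustrative.

THRESHOLD = 0.10  # a 10%+ year-over-year change triggers an investigation

def yoy_change(current, prior):
    """Fractional year-over-year change, e.g. 0.25 means +25%."""
    return (current - prior) / prior

def needs_investigation(primary_yoy, secondary_yoy, threshold=THRESHOLD):
    """True when primary data swings past the threshold, or when the
    primary and secondary sources disagree on the direction of change."""
    big_swing = abs(primary_yoy) >= threshold
    sources_disagree = (primary_yoy > 0) != (secondary_yoy > 0)
    return big_swing or sources_disagree

# Example: analytics shows -80% while the secondary source is nearly flat.
primary = yoy_change(20_000, 100_000)    # -0.80
secondary = yoy_change(98_000, 100_000)  # -0.02
print(needs_investigation(primary, secondary))  # prints True (big swing)
```

The direction check matters as much as the threshold: in the Direct Load misattribution story above, the primary data cratered while rankings and secondary sources stayed flat, which is exactly the signature of a reporting problem rather than a real loss.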
It’s a tedious task, especially when you’re hunting for the root cause of an issue and its solution, but you learn a whole new forensic skill from the experience, and as the subject matter expert, you do right by your company.
When you spot a significant change in your traffic and sales, up or down, ask a whole lot of questions that start with ‘why’ and ‘how’, like an investigative reporter would.
Questions Worth Asking
Your traffic and sales are lower (or higher) than expected. Why? How do you know whether it’s a genuine or a fake decrease (or increase)?
A genuine decrease (losing rankings and real sales) may be attributable to one or more of these commonly experienced events in enterprise SEO:
- an unfavorable algorithm update
- a server, code or database defect / issue that impacted recent search engine crawls of your site
- negative SEO
- an intended code change elsewhere in the business, made without SEO input
- business seasonality
How should you decipher the root cause(s), and take (quick) action on solving the problem? What backup data sources should you rely on to support your hypothesis?
A fake decrease (no rankings decline; sales attributed to another traffic source, such as Paid Search or Direct Load) may be attributable to one or more of these commonly experienced events in enterprise SEO:
- Bad data: tagging, tracking, reporting, and attribution issues
- business seasonality
- an intended IT change to improve security, performance, or processes
How should you trouble-shoot the root cause(s) to prove your rankings are as expected, and that real traffic/sales aren’t lost? What backup data sources should you rely on to support your hypothesis?
The sheer volume of enterprise SEO data increases the chances that you’re working with bad data. Contamination can occur at the root back-end level (data warehouse) or surface at the front end (reporting suites, dashboards), and may be caused by integration issues, code changes, attribution methodology, or human error, so it’s important for SEO teams to connect deeply with Business Intelligence, Analytics, and IT teams.
What processes should you put in place to quality check your reports, scorecards, and dashboards?
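One process worth putting in place is an automated reconciliation between what your dashboards report and what the data warehouse holds. A minimal sketch in Python, where the totals, the 2% tolerance, and the function name are illustrative assumptions rather than a real reporting pipeline:

```python
# Daily reconciliation check: compare a dashboard's SEO sales total against
# the data-warehouse total and flag divergence beyond a tolerance.
# The tolerance and the dollar figures below are illustrative.

TOLERANCE = 0.02  # flag if the two sources disagree by more than 2%

def totals_reconcile(frontend_total, warehouse_total, tolerance=TOLERANCE):
    """Return True when the front-end and back-end totals agree
    within the given fractional tolerance."""
    if warehouse_total == 0:
        return frontend_total == 0
    drift = abs(frontend_total - warehouse_total) / warehouse_total
    return drift <= tolerance

# Example: the dashboard reports $95,000 but the warehouse shows $100,000.
print(totals_reconcile(95_000, 100_000))  # prints False: 5% drift > 2%
```

Run daily and alerted on, a check like this would have surfaced both misattribution incidents described above in days rather than months.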
How should you explain and report the risks or instances of negative SEO to business partners?
Take responsibility for reporting your primary data. Ensure it’s as accurate as it can be. Break SEO traffic out by home page, category page, product page, and static page traffic, and place test orders to validate correct attribution. Good execs will hold you accountable for strong performance. Great leaders will also hold you accountable for data integrity. And so, a great SEO manager gets really granular with the numbers, strongly grasps back-end processes and attribution models, and uses third-party tools like Webmaster Tools as secondary data sources to validate accuracy and integrity. Only with accurate data can we take pride in our work and make sound business decisions.
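Breaking traffic out by page type usually starts with classifying landing URLs into buckets before aggregating. A minimal sketch, where the domain and path prefixes are illustrative assumptions (a real site’s URL scheme would determine the actual rules):

```python
from collections import Counter
from urllib.parse import urlparse

# Classify SEO landing URLs into page-type buckets so traffic and sales
# can be reported per bucket. Path prefixes below are illustrative.

def page_type(url):
    path = urlparse(url).path.rstrip("/")
    if path == "":
        return "home"
    if path.startswith("/category"):
        return "category"
    if path.startswith("/product"):
        return "product"
    return "static"

landing_urls = [
    "https://www.example.com/",
    "https://www.example.com/category/halloween",
    "https://www.example.com/product/pumpkin-lights",
    "https://www.example.com/about-us",
]

print(dict(Counter(page_type(u) for u in landing_urls)))
# prints {'home': 1, 'category': 1, 'product': 1, 'static': 1}
```

With traffic bucketed this way, an anomaly confined to one bucket, such as the home page Brand traffic in the Coremetrics story above, stands out immediately instead of hiding inside a site-wide total.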