Are you looking to give your website an SEO boost but think you've exhausted most optimizations, or are you simply looking for a quick win?
Consider working on improving the organic click-through rate (CTR) of your search listings. One of the ways I like to tackle this task is to start by examining the true outliers.
Here I’ll outline my process to accomplish this automatically across our SEO clients.
How do we identify outliers?
You should always try to improve the CTR of your site’s organic search results, but where do you start? I start by identifying the true outliers — the page and query combinations that truly fall outside the norm.
A number of statistical methods can be used to identify outliers, so let's look at the few I rely on for detection.
Z-score
According to Khan Academy, “a z-score measures exactly how many standard deviations above or below the mean a data point is.” So, if the CTR for a query and page combination falls below a z-score of -3 (thresholds of -2.5 or -2.68 are also sometimes used) for a given organic position, it is an outlier.
z-score = (individual CTR – mean CTR at a given position) / standard deviation of CTR at a given position
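As a rough illustration, here's how that check might look in Python with pandas. This is a sketch under my own assumptions: a DataFrame with a "position" column (rounded average position) and a "ctr" column, and a hypothetical helper name.

```python
import pandas as pd

def flag_zscore_outliers(df: pd.DataFrame, threshold: float = -3.0) -> pd.DataFrame:
    """Flag rows whose CTR z-score at their (rounded) position falls below the threshold."""
    stats = df.groupby("position")["ctr"].agg(["mean", "std"])
    df = df.join(stats, on="position")
    # Positions with only one row have std = NaN, so they are never flagged here.
    df["z_score"] = (df["ctr"] - df["mean"]) / df["std"]
    df["z_outlier"] = df["z_score"] < threshold
    return df.drop(columns=["mean", "std"])
```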
Interquartile range (IQR)
Khan Academy defines IQR as “the amount of spread in the middle 50 percent of a dataset. In other words, it is the distance between the first quartile (Q1) and the third quartile (Q3).”
To find outliers using this method, you look for a CTR that falls more than 1.5 times the interquartile range (IQR) below the first quartile.
So, if an individual CTR is less than:
Quartile 1 of CTR at a given position – (IQR of CTR at a given position * 1.5)
… then it may be considered an outlier.
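Here's a minimal sketch of that check in Python with pandas, again assuming my own "position" and "ctr" column names and a hypothetical helper name:

```python
import pandas as pd

def flag_iqr_outliers(df: pd.DataFrame, multiplier: float = 1.5) -> pd.DataFrame:
    """Flag rows whose CTR falls more than 1.5 * IQR below Q1 for their position."""
    grouped = df.groupby("position")["ctr"]
    q1 = grouped.quantile(0.25).rename("q1")
    q3 = grouped.quantile(0.75).rename("q3")
    df = df.join(q1, on="position").join(q3, on="position")
    df["iqr_outlier"] = df["ctr"] < df["q1"] - multiplier * (df["q3"] - df["q1"])
    return df.drop(columns=["q1", "q3"])
```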
Modified z-score
Z-scores don’t work as well for smaller data sets, so for smaller websites with fewer CTR data points at a given position, we can leverage a modified z-score for detecting outliers.
According to IBM:
The standard z score is calculated by dividing the difference from the mean by the standard deviation. The modified z score is calculated from the mean absolute deviation (MeanAD) or median absolute deviation (MAD). These values must be multiplied by a constant to approximate the standard deviation.
Applied to our organic search listing CTR, we’re looking for a modified z-score of less than -3.5 to be considered an outlier.
Modified z-score = (constant of 0.6745 * (individual CTR – median CTR of a given position)) / median absolute deviation for a CTR at a given position
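And a sketch of the modified z-score calculation, again with my assumed column names and a hypothetical helper name:

```python
import pandas as pd

def flag_modified_zscore_outliers(df: pd.DataFrame, threshold: float = -3.5) -> pd.DataFrame:
    """Flag rows whose MAD-based modified z-score at their position falls below the threshold."""
    grouped = df.groupby("position")["ctr"]
    median = grouped.transform("median")
    mad = grouped.transform(lambda s: (s - s.median()).abs().median())
    # If every CTR at a position is identical, MAD is 0 and the score is NaN, so nothing is flagged.
    df["modified_z"] = 0.6745 * (df["ctr"] - median) / mad
    df["mz_outlier"] = df["modified_z"] < threshold
    return df
```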
Accessing our data and setting up notifications
Now that we know a little bit about the statistics used to identify our outliers, how do we apply this to our search results?
Enter the Google Search Console Search Analytics API. I’ve written about it before and even made a publicly available Google Search Console Python script for backing up your organic query data to a SQL database every month. That serves well for data mining and other analytics functions, but there’s a whole lot that we can do with the API, such as automating this process of identifying poorly performing click-through rates.
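To illustrate what the data-collection step can look like (this is a sketch, not the script mentioned above), a request to the Search Analytics endpoint with the google-api-python-client library might be written like this, assuming OAuth credentials are already set up and the function name is my own:

```python
from googleapiclient.discovery import build

def fetch_search_analytics(credentials, site_url, start_date, end_date):
    """Pull query/page rows (clicks, impressions, CTR, position) from Google Search Console."""
    service = build("webmasters", "v3", credentials=credentials)
    body = {
        "startDate": start_date,   # e.g., "2017-08-01"
        "endDate": end_date,       # e.g., "2017-08-31"
        "dimensions": ["query", "page"],
        "rowLimit": 5000,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    # Each row carries "keys" ([query, page]), "clicks", "impressions", "ctr" and "position".
    return response.get("rows", [])
```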
- Data collection: We collect data on query, page and associated metrics via the Google Search Console Search Analytics API. You can use a Python script (or whatever language you prefer) to accomplish this; the sketch above shows one way to make the request.
- Round average position: I round average position to one decimal place (e.g., 1.19 becomes 1.2); otherwise, the sample size at each position is too small. That way positions 1.1 through 1.9, for example, each have their own CTR summary statistics.
- Math: We identify outliers using a combination of the statistical methods described above. This is also automated via the same script that downloads the Search Analytics data.
- Email: If any negative outliers are identified for a keyword query and page combination at an average position, an email containing that data is sent to each SEO assigned to the account for investigation.
- Scheduling: Set your script to run on a recurring basis. For frequency, I’ve chosen a monthly time frame, which works well with our clients in the agency framework, but it can be more or less frequent depending on your needs. Scheduling is accomplished using cron. A combined sketch of the rounding, flagging and email steps follows this list.
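Pulling these steps together, a simplified version of the rounding and notification logic might look like the sketch below. The flag_* helpers are the ones sketched earlier, and the function names, sender address and SMTP host are placeholders of my own.

```python
import smtplib
from email.message import EmailMessage

import pandas as pd

def build_dataframe(rows):
    """Turn Search Analytics rows into a DataFrame, rounding position to one decimal place."""
    return pd.DataFrame(
        [
            {
                "query": row["keys"][0],
                "page": row["keys"][1],
                "clicks": row["clicks"],
                "impressions": row["impressions"],
                "ctr": row["ctr"],
                "position": round(row["position"], 1),  # e.g., 1.19 -> 1.2
            }
            for row in rows
        ]
    )

def email_outliers(df, recipients, smtp_host="localhost"):
    """Email flagged query/page combinations to the SEOs assigned to the account."""
    outliers = df[df["z_outlier"] | df["iqr_outlier"] | df["mz_outlier"]]
    if outliers.empty:
        return
    msg = EmailMessage()
    msg["Subject"] = "Alert! Abnormal organic CTR detected"
    msg["From"] = "reports@example.com"   # placeholder sender
    msg["To"] = ", ".join(recipients)
    msg.set_content(outliers.to_string(index=False))
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```

For the scheduling step, a crontab entry along the lines of `0 6 1 * * /usr/bin/python3 /path/to/ctr_outliers.py` (path and time are placeholders) would run the script at 6 a.m. on the first of each month.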
Acting on data and winning
The basic idea of what we want to do is identify the search query and landing page combinations that aren’t being clicked as much as they should for their given rank.
If they aren’t performing as well as you’d expect, there can be a number of reasons why. For example, there may be a SERP feature (such as a featured snippet) or paid ads that are taking click share. There’s nothing you can really do about those.
However, there are elements of the SERP that are within your control: rich snippets and meta data (your title tags and meta descriptions). If you determine that it’s likely your meta data that’s causing a less-than-desirable CTR, that’s a simple enough fix — just start playing with different title tag and meta description combinations. Ideally, this is done via a system of A/B testing.
Just a quick note regarding the statistical methods used: none of these outlier-detection methods is perfect. Treat them as tools for evaluating combinations on an individual basis; it will come down to the SEO on the account to dig a little deeper.
Even once you’ve narrowed down poor CTR performance to copy within the meta data, it is still possible that there are false negatives: query and landing page combinations that aren’t in line with your targets. At the end of the day, we need our SEO practitioners to think logically and act when it makes sense and not otherwise.
Next steps
Once you’ve improved the CTR of your sub-par organic listings, you can start thinking about the rest of your listings and taking them from typical to exceptional click-through rates. Look at industry benchmarks, as well as other client data, and try to be better. At the end of the day, a lot of the power is in your hands, and good copy can go a long way.