Analysis has always been an important part of my work in SEO…
In the past, it even kept me a step ahead of my competitors, which let me outrank big players with quite large budgets. With a bit of extra effort, I could climb position by position over time until I hit the first result.
Life was good, but it required hard work…
These days, thanks to tools like Surfer, it's no longer a secret, and it's not so difficult either. The whole process has become less time-consuming, and having everything in one place is very convenient.
What hasn't changed is my approach: I still do this the same way I always have, and that may be slightly different from how some people use these tools.
In this article I want to show you how I use Surfer with the same process I first started using a few years ago… Let me show you how I use progressive optimization to guide decision making, focusing on the 80/20 factors that move the needle fast.
An important difference between the progressive optimization approach and other approaches is that you spend most of your time optimizing your site and content after publishing.
This means you'll keep returning to your content, making small changes here and there, then coming back later to make more.
Depending on what type of optimization you are doing, you will need a variety of tools.
When it comes to progressively optimizing your content, you’ll want to choose where to start.
Progressive optimization is not suited to making many changes at the same time… because we are trying to isolate the effect of each change in the data.
If you made 5 changes on a page and your ranking improved by 6 positions, which change mattered most?
What if the 4th change had a positive effect, but the 5th change hurt you?
What if you would have improved 8 positions without the 5th change?
That's the way we think about it, and it's why we need to narrow the focus of any given round of changes.
It's called progressive because we make quick, incremental changes to work toward the best possible result over time.
So, when you choose things to optimize for the first time, what are you going to focus on?
As a rule, it's the factors that have the greatest impact on your rankings.
In true Pareto fashion, these factors often make up the 20% of the work that gives you 80% of your results.
Let’s call them the Power Factors.
Surfer gives you excellent data on most of these. The factors I focus on in this article are the ones I regularly use for any page I optimize.
Other Tools You Need
You don't need many tools, but the ones below are very useful and will make your work a lot easier.
As I've already told you, one of the things that has changed over the years is access to tools like Surfer that make my life a lot easier. But some parts of my process haven't changed over the past few years…
Since we leave time between changes, it can be easy to forget what you did and when.
It's also important to know what you've done so you can undo changes if they don't work out. Use backups and version control, since version 3 of your page may rank higher than version 1 but not better than version 2, and so on.
At Syndiket, we believe four types of SEO exist – and we have an acronym to represent those 4 types of SEO. The acronym is T.R.A.P.
“T” stands for Technical, “R” stands for Relevancy, “A” stands for Authority, and “P” stands for popularity. Search engine optimization has many smaller divisions within the 4 types, but all of them can be placed into one of these 4 buckets.
Generally, technical SEO for local businesses carries the least importance for ranking. Technical SEO has a bare minimum that is required, and this usually includes things like site speed, indexation issues, crawlability, and schema. Once the core technical parts are done, minimal upkeep is required.
Relevancy is one of the trivium elements of SEO, carrying equal importance with popularity signals and authority signals. Relevancy signals are based on algorithmic learning principles. Bots crawl the internet constantly, and each search query is given a relevancy score against the URLs that appear for it. The higher the relevancy score you attain, the greater your aggregated rating becomes in Google's eyes. Digital marketing is a strange thing in 2020, and ranking a website requires the website to be relevant on many fronts.
Google's co-founder, Larry Page, had a unique idea in 1998 which has led to the modern-day Google Empire. "PageRank", named after Larry Page himself, was the algorithm that established Google as a search engine giant. The algorithm ranked websites by authority.
Every page of a website has authority and the sum of all pages has another authority metric. The authority metric is largely determined by how many people link to them (backlinks). The aggregate score of all pages pointing to a domain creates the domain score, which is what Syndiket calls “Domain Rating”, per Ahrefs metrics. The more a site is referenced, the more authority it has. But, the real improvement to the algorithm came when Google began to classify authority weight.
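To make the idea concrete, here is a toy sketch of the original PageRank power-iteration idea. This is my own illustration of the published 1998 algorithm, not Google's actual implementation, and the three-page site graph is entirely hypothetical:

```python
# Toy PageRank sketch: pages are nodes, links are edges.
# Illustrates the general idea only, not Google's real algorithm.

links = {                 # page -> pages it links out to (hypothetical graph)
    "home":     ["services", "blog"],
    "services": ["home"],
    "blog":     ["home", "services"],
}

pages = list(links)
damping = 0.85                          # classic damping factor from the 1998 paper
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):                     # power iteration until scores settle
    new_rank = {}
    for p in pages:
        # Each page that links to p passes on a share of its own rank.
        inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

# "home" ends up with the highest score: it earns the most inbound link equity.
print(sorted(rank, key=rank.get, reverse=True))
```

The intuition matches the article: the more (and the stronger) the pages referencing you, the more authority flows your way.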
If Tony Hawk endorsed Syndiket for skateboarding, it would carry a lot more authority than 5 random high school kids endorsing Syndiket. This differentiation in authority happened in 2012 with the Penguin update. Authority SEO is complicated but VERY important.
Popularity signals are especially strong for GMB or local SEO, but popularity and engagement are used for all rankings. The goal of this signal is for Google to verify its own algorithm. You can check off all the boxes, but if your content is something real people hate, Google has ways to measure that. Syndiket has proprietary methods of controlling CTR (click-through rate) but we also infuse CRO methods into our work to make sure people actually like the content. Social shares and likes are also included in this bucket.
The first step is to select your target page.
If you have an informational site, such as an affiliate site, you might pick a page that is ranking between positions 8 and 14 for your primary target keyword.
After defining your target page, it’s time to identify your main target keywords.
Choose the three main target keywords.
Ideally, these should be permutations or close alternatives, or at least share the same intent and meaning.
Note: If you don't have target keywords yet, you can always use Surfer's keyword research features to help you identify some good ones.
I think that getting the best overall results requires an analysis of some of the other related keywords that you’re targeting on your page.
Given the amount of data available to you in Surfer, it's even more important to apply the 80/20 rule and stick to the "power factors" when analyzing multiple keywords per page.
A very useful feature of Surfer is the ability to analyze search results for either desktop or mobile devices.
Although most sites are now on the mobile-first index, this doesn't mean that desktop doesn't matter.
First, John Mueller, a Google employee, confirmed that even mobile-first indexed sites are still crawled with desktop user-agents at roughly a 2:1 mobile-to-desktop ratio:
Yes, we have pages with both desktop & mobile user-agents. Mostly it is something like a 2-3:1 split, so when we transfer to mobile first indexing for a domain, it is 2/3rd mobile, 1/3rd desktop (the numbers are not fix, it is just what I usually see)
More importantly, mobile search results and desktop search results are different!
It's important to understand which device your audience performs most of its searches from, so you can optimize for the device that will bring you the most traffic.
Now, plug in your keywords and give the search time to run.
Once the report is ready, you can move on to your results.
And now, thanks to a really good filtering tool, you can use the visibility icon (in the image below) to remove irrelevant results from your analysis.
Do this carefully: those results are there because Google thinks they're relevant. But equally, sometimes there are some real clangers in the SERPs that simply can't be compared with your own site.
You can turn off averages if you need to isolate the exact numbers for specific results.
This can be helpful if you have already optimized against the averages and are now trying to improve results further by optimizing "up or down".
I would recommend doing this only after you have already optimized your article for the averages.
Since it may be several weeks before you return to the SERP to optimize another factor, it's best to record the results for your current focus factor.
That way, when you come back later to optimize any other factor, you can refresh the analysis and compare it against the data you recorded.
This makes it easy to pick up where you left off and means you don't have to constantly switch between views in Surfer. It's not something you have to do, but it helps me.
Here are some quick shortcuts for some of the “power factors” I’ve recommended analyzing and optimizing with:
If you use Surfer regularly, you probably already know where to find most of them, but I admit I had a hidden motive here.
I wanted you to notice a pattern…
If you spotted it, you may be asking why I pay so much attention to "exact keywords." That's a great question, and I want to tell you why I do it.
Methods like TF-IDF, Density and True Density all use Term Frequency or “Keyword Frequency” as basic parts of their calculations.
I find it easier to use this basic data layer to optimize my sites.
No matter which methods come and go, the basic question of how many times a word was used will always be vital in some way.
You can use whatever you want, but I'd suggest following my tutorial step by step.
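To show why raw term frequency sits underneath all of these methods, here is a minimal sketch of how an exact keyword count feeds both a classic density figure and a TF-IDF-style score. The page copy and the competitor corpus are hypothetical, and this is my own illustration, not Surfer's implementation:

```python
import math
import re

def term_frequency(text, term):
    """Count how many times an exact term appears, plus the total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    return words.count(term.lower()), len(words)

# Hypothetical page copy and a small competitor corpus for comparison.
page = "Surf lessons in Malibu. Book surf lessons with certified surf coaches."
corpus = [
    "Best beaches for a family holiday.",
    "Surf report and tide times for Malibu.",
    "Yoga retreats on the coast.",
]

count, total = term_frequency(page, "surf")
density = count / total                     # classic keyword density

# IDF: terms that are rarer across the corpus carry more weight.
docs_with_term = sum(1 for d in corpus if "surf" in d.lower())
idf = math.log(len(corpus) / (1 + docs_with_term))
tf_idf = density * idf

print(count, round(density, 3))             # the raw count drives everything else
```

Notice that density and TF-IDF are both just transformations of the same underlying count, which is exactly why "how many times was this word used" never stops mattering.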
This step does not require anything special…
A simple spreadsheet with dates, keywords and status will do.
You don't have to do anything special here; you can log changes daily, weekly, or every 10 days…
Whatever works for you and your goals is fine. Logging data at "some" interval is better than not logging at all.
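If a spreadsheet feels like too much friction, even a few lines of code will keep the same log in CSV form. This is just a sketch; the column names and the example entry are my own suggestions, not a prescribed format:

```python
import csv
from datetime import date
from pathlib import Path

LOG = Path("changelog.csv")
FIELDS = ["date", "page", "keyword", "change", "status"]

def log_change(page, keyword, change, status="pending"):
    """Append one optimization change to the changelog CSV."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()        # write the header row only once
        writer.writerow({
            "date": date.today().isoformat(),
            "page": page,
            "keyword": keyword,
            "change": change,
            "status": status,
        })

# Example: record one round of changes, then update "status" when you
# come back in a few weeks to check the ranking movement.
log_change("/surf-lessons", "surf lessons malibu",
           "raised exact keyword count from 4 to 7")
```

The point is the habit, not the tooling: dates, keywords, and status, logged every time you touch the page.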
Calendars are mainly used in our industry for meetings, social media, and other events.
It would be prudent to have a calendar for your optimization work too.
For progressive optimization, an optimization calendar is essential.
It makes you more efficient and effective.
It lets you log changes in your campaign tracker, make new changes on your page, and run the latest analysis in Surfer, all in the right order and at the right time. It can be a lot to keep on top of if you are doing this for multiple pages and sites.
The one thing you absolutely must have in place is your changelog.
A changelog keeps track of what you are actually doing, and it matters for many of the same reasons as a calendar and a campaign tracker.
The best results, with progressive optimization, require a degree of organization with your workflow and processes regardless of the tools you are using.
You can find ways around using the other files. What you can't do is find a way around using a changelog.
To make this simple I am sharing a very basic changelog template for those of you who want to use it.
[FREE] Google Sheets Changelog Template: Click to make a copy.
At this point there is nothing more to say other than go out there and get optimizing!
You have everything you need to get started =]
There are many other features that I find really useful when using Surfer for optimization.
You can use them as part of the procedures described above or separately. No matter what you choose, if you are new to this tool, you should definitely be aware of these things.
I covered these as one of the "power factors". While some find it controversial, I, along with many others and the folks at Surfer, have had pretty good results from including the terms that are expected to appear in a piece of content.
Surfer helps you to find these easily by providing you access to common words and common phrases…
“Surfer Audit” is one of my favorite features.
You can perform an audit to see how you stack up, either by isolating your URL in the results or by adding it manually.
This can be very useful for collecting baseline data early, for quick checks, or for seeing how your site shapes up after a progressive optimization campaign.
It gives you some actionable term frequency recommendations.
It will also point out any obvious issues where they arise (usually when you're well outside of the average).
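That "well outside of the average" check is simple enough to sketch yourself. Here is a toy version of the idea: flag any term whose count on your page sits far from the competitor mean. The counts, the page, and the ±50% tolerance are all hypothetical, and this is my illustration of the concept, not Surfer's audit logic:

```python
# Toy audit-style check: flag exact-keyword counts that fall well outside
# the competitor average (hypothetical data; not Surfer's actual logic).

competitor_counts = {                  # exact-keyword counts from top-ranking pages
    "surf lessons": [12, 9, 14, 11, 10],
    "malibu": [5, 7, 6, 4, 8],
}
my_counts = {"surf lessons": 3, "malibu": 6}

def flag_outliers(mine, competitors, tolerance=0.5):
    """Flag terms whose count is more than +/-50% away from the competitor mean."""
    issues = []
    for term, counts in competitors.items():
        avg = sum(counts) / len(counts)
        if abs(mine.get(term, 0) - avg) > tolerance * avg:
            issues.append((term, mine.get(term, 0), round(avg, 1)))
    return issues

# "surf lessons" is used far less than competitors average, so it gets flagged;
# "malibu" sits right on the average and passes.
print(flag_outliers(my_counts, competitor_counts))
```

In practice you'd let the audit surface these gaps for you, but seeing the arithmetic makes it obvious what "outside the average" actually means.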
There is no single way to use tools like Surfer. Although many prefer to use it to optimize content before publishing, or even to change many things at once after publication, I hope I've shown that you can use whatever approach you want, as long as your system is well thought out.