As Loren Baker, founder of Search Engine Journal (SEJ), once said:
"There’s no point at which you can sit back and relax, content that your site is at the top of search engine results pages (SERPs) once and for all. Sure, you might have reached the pinnacle today, but an SEO pro’s work is never done."
Recovering a lost or declining keyword can mean a number of different things, from fixing technical SEO issues to revamping your content from top to bottom. In this article, we'll look at some of the most common reasons why your rankings may have dropped, and how to recover them.
Let's dive in!
You know you're not the only one with eyes on the prize. Competitors have also done their keyword research and are willing to use their SEO stack to rank for the exact same term.
Once a competitor publishes better E-E-A-T-focused articles and earns stronger backlinks, it's just a matter of time until your ranking drops. Unless, of course, you keep up with the changes and do your best to turn things around.
Maybe your content no longer meets the search intent as well as it once did. Maybe you published it way back in 2014, and it's no longer relevant.
Unless the keyword you're targeting only gets a smattering of monthly searches, you shouldn't expect an article with no freshness or timely relevance to keep ranking.
And it’s no news that Google likes its content helpful and fresh. As a user, you know how frustrating it can be to read an article and then realize it was written in 2014. How relevant is your information almost a decade later?
On-page SEO is like your website’s internal tour guide. If it could speak, it would tell search engines all of the reasons why the content on your website is relevant to its users and their intent.
On-page SEO includes:
- On-page content (such as blogs, articles, and landing pages)
- Title tags
- Meta descriptions
- Text-to-HTML ratio
- Image ALT text
- OpenGraph tags
All of the above can encompass minor tweaks or huge overhauls, and can give your rankings a boost when taken seriously.
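One item on that list, text-to-HTML ratio, is easy to estimate yourself. Here's a rough sketch using only Python's standard library (with a simplified notion of "visible text"; the sample page is made up for illustration) that strips tags and compares text length to total markup length:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.parts.append(data)

def text_to_html_ratio(html: str) -> float:
    """Return visible-text length divided by total markup length."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / len(html) if html else 0.0

# Hypothetical page: 11 characters of visible text inside 75 characters of HTML.
page = "<html><head><style>p{}</style></head><body><p>Hello world</p></body></html>"
print(round(text_to_html_ratio(page), 2))  # → 0.15
```

There's no official threshold, but an unusually low ratio can hint that a page is mostly markup and scripts with little actual content.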
Without the right technical SEO practices, your website wouldn't be indexed by Google in the first place. That's how important the technical aspects of your website are.
However, the tech side of things often runs quietly in the background. Everything may look good on the surface, but check under the hood and you just might find the clogs reducing your website's efficiency: crawl errors, indexation problems, broken redirects, slow page speeds, and more.
Again, troubleshooting and correcting any technical SEO issues can help your website’s indexation, crawlability, and rankings.
A manual action is the fastest way to get your website deranked or de-indexed from Google. A manual action, commonly known as a "penalty," is issued against websites found to violate Google's Search Essentials (previously known as Webmaster Guidelines). Common violations include:
- Third-party spam
- Keyword stuffing
- Issues with structured data
- Thin content with little to no added value
- Sneaky redirects
If Google finds any issues that could cause problems in search results, your search engine visibility will suffer until you take action.
The number of backlinks your website has – when combined with the relevance of those backlinks, social signals, and the quality of your content – calls the shots on your website's domain authority. That’s what makes them such a crucial ranking factor.
Unfortunately, your backlinks also stand on thin ice, as you could lose them at any given moment. And along with them, your website could also lose a chunk of its authority and overall ranking power.
Google makes thousands of changes to its search algorithm every year. The reason we don't hear about all of them is that 1) we'd go wild, and 2) they're often minor updates that only affect a few search queries.
When it comes to Core Updates, that's where things get more serious. They can result in some websites experiencing a significant increase or decrease in traffic.
While it's true that these updates are meant to improve the relevance and accuracy of search results, not keeping up with major algorithm changes can knock down the rankings of even the most reputable websites.
You shouldn't instruct Google to index every page on your website. For example, you wouldn't want to rank for private pages or pages with deliberately thin content.
That's why you should use robots.txt disallow rules (or noindex tags) on those pages: to tell web crawlers not to crawl or index them.
However, you need to make sure robots.txt (or any noindex tag) isn't accidentally blocking pages you do want to rank.
If that happens, search engines won’t have access to the content associated with an important keyword, and your page won’t rank for the desired term.
In this case, you'll typically see a warning in Google Search Console that says, "Indexed, though blocked by robots.txt."
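To make the distinction concrete, here's what a typical robots.txt might look like when blocking private areas (the /private/ and /drafts/ paths are hypothetical placeholders):

```
User-agent: *
Disallow: /private/
Disallow: /drafts/
```

Keep in mind that robots.txt only controls crawling. To keep a page out of the index entirely, use a noindex meta tag, and make sure that page isn't also blocked by robots.txt; otherwise, crawlers will never see the tag.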
You've got the potential reasons why you may have lost keywords. Now, let's jump right into the solutions on how to recover them!
First, you need to discover which terms are dropping and why. To diagnose the problem, you can use Google Search Console in combination with Google Analytics, but that would mean switching between tools. And context-switching destroys productivity.
Since losing keywords is hard enough as it is, let's smooth things out by using a single tool, SiteGuru, to compare periods and identify which keywords dropped.
In your SiteGuru dashboard, navigate to Insights > Content requiring attention.
This report shows you the pages that used to perform well in the selected period but, for some reason, got fewer clicks. The example below shows a page that got 50% fewer clicks during a 30-day period:
As for the reasons why this page dropped in the SERPs, you'll have to do some digging. Ask yourself the following questions:
- Does a competitor now rank for this keyword?
- Is the content outdated? Has it been a while since the last update?
- Is the on-page SEO missing any crucial factors? Can you do something to increase this page's click-through rate?
- Is there any seasonality at play?
On the right-hand side of the same report, you'll see a Page Score.
Click on it, and you'll get a list of on-page SEO checks and suggestions to improve the content on your page. This report can be useful if you're looking for fixes you can implement immediately.
If you'd like a single view of all of the keywords you're still ranking for but could be doing better, navigate to Insights > Keywords and click on the "Declined" field.
Or, if you'd like a complete list of the keywords you've lost, stay on the Keyword report and click on the "Lost keywords" button.
You'll see all of your keywords that aren't ranking anymore.
If the quality of your content is the problem, you'll undoubtedly need to improve it. But how?
Start by analyzing the content that's outranked yours: What does it do well? What does it do badly? And how can you make it better?
Study your competitors' tactics. What keywords are they ranking for? Are there any angles they haven't covered? You can use tools like Semrush to grab insights from their content plan and analyze their backlinks in combination with their current rankings.
Of course, that alone isn't enough. A helpful piece of content still needs to:
- Thoroughly answer your audience's questions while matching their search intent
- Get backlinks from trustworthy websites
- Include valuable personal perspectives
- Demonstrate all the signs of Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T)
- Include shareable elements, such as engaging visuals
Following the above tips will prevent you from creating content that's merely reproduced from other sources.
On-page SEO can be tricky to get right. Like a jigsaw puzzle, it has so many tiny pieces that it's easy to lose track of them along the way.
You'd have to answer questions like:
- Are your meta tags (titles and descriptions) too long? Too short? Non-existent? For a higher shot at ranking, they need to be just right.
- Is there only one H1 on each page? Multiple H1s can make it harder for search engines to identify what the content is about.
- Do you have any problems with thin content? That's a red flag, and Google could strike your website with a manual action.
- Do most of the images on your site include ALT tags? If not, you could be making it harder for both visually impaired visitors and search engines to understand the context of your images.
You can easily answer all of those questions by running a full content audit on your website. If you're using SiteGuru, check out this step-by-step article on how to audit your content.
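To illustrate, the first two checks above can even be scripted. Here's a minimal Python sketch (standard library only; the thresholds and sample page are illustrative assumptions, not official limits) that flags missing or overlong titles and multiple H1s:

```python
from html.parser import HTMLParser

class OnPageAuditor(HTMLParser):
    """Collects the <title> text and counts <h1> tags for basic on-page checks."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1
        elif tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html: str) -> list:
    """Return a list of warnings; thresholds are rough rules of thumb."""
    auditor = OnPageAuditor()
    auditor.feed(html)
    warnings = []
    if not auditor.title.strip():
        warnings.append("missing title tag")
    elif len(auditor.title) > 60:
        warnings.append("title may be too long (over ~60 characters)")
    if auditor.h1_count == 0:
        warnings.append("no H1 found")
    elif auditor.h1_count > 1:
        warnings.append(f"{auditor.h1_count} H1 tags found; consider using just one")
    return warnings

# Hypothetical page with a duplicate H1.
page = "<html><head><title>SEO Guide</title></head><body><h1>A</h1><h1>B</h1></body></html>"
print(audit(page))  # warns about the duplicate H1s
```

A real audit tool checks far more than this, of course, but the same idea scales: parse each page once, then run every check against the parsed result.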
You won't build high-quality links overnight. That's a given. But you can speed up the process by following better practices (and avoiding shady techniques).
Here are a couple of don'ts:
- Don't pay for backlinks.
- Avoid linking to pages with thin, irrelevant content. The same way you only want great links coming in, you want great links going out.
- Cold pitching rarely gets you a "yes." If you're going down that route, make sure to build connections with whoever you'll be pitching your content to.
And here are a couple of dos:
- Share original research. Yes, it takes work. But people in your area of expertise are more than willing to link to fresh perspectives, especially if they're data-driven.
- Create a "hub section" on your site (like SiteGuru's SEO Academy) to show your expertise on an overarching topic. This is great if you're trying to build authority with topic clusters.
- If you're guest posting, only guest post on reputable sites.
Take note. You might need to come back to these later!
As SEO specialist and digital marketing campaign planner Zulqarnain Jabbar shared on LinkedIn:
"SEO is a constantly evolving field, and search engines like Google are continuously improving their algorithms to deliver more relevant and accurate search results. As a result, keeping up with the latest algorithm updates is critical for any website owner or SEO professional who wants to maintain or improve their website's search engine ranking."
If you don't currently follow Google's official blog, The Keyword, I'd recommend you do. If there are any core algorithm updates, you can get them straight from the source.
Catching up on any algorithm changes can prevent unwanted surprises, plus it helps you adjust your strategy accordingly.
Did your website get a strike? That happens to the best of us.
Manual actions will only become a problem in two cases:
- You're deliberately trying to trick your way to the top of the SERPs.
- You fail to solve the issue that got your website penalized.
Other than that, the worst-case scenario will be a notification in the Manual Actions report and the Search Console message center.
Fixing the issue will depend on...the issue in question. To find out what happened, all you have to do is expand the manual action description panel. You'll find which issues were detected and which pages were affected.
Then, just click "Learn more" for a step-by-step description of how to fix the issue.
Once you're done fixing the problem, request a review. In your request, make sure to explain the exact issue on your website, plus the steps you've taken to solve it (with documented proof of the outcome).
If all pages are affected, fix the issues on all pages. Importantly, remember that Google must be able to access and crawl the pages in order to review the issue.
Because reconsideration reviews can take several days or even weeks, the sooner you request one, the better. You can check out Google's step-by-step tutorial on Manual Actions.
Which areas of your technical SEO leave a lot to be desired? A proper technical SEO audit will tell you everything.
Again, you can't solve a single problem without asking some questions first. For technical SEO, those would include:
- Have you exhausted your crawl budget?
- Are there any indexation errors or redirects you aren't aware of?
- Are any pages showing 404 or 5xx errors?
- Is your website optimized for mobile search?
- Are your website structure, hierarchy, and categorization clear and intuitive?
- Are there any duplicate URLs?
- Are there any broken links you may have overlooked?
While that list may sound exhausting, checking it off will be easier than you think with this Definitive Technical SEO Audit Checklist.
Your rankings may have plummeted because of indexation or crawling issues. Fortunately, finding any blockages on your site is quick, and there are various tools you can use for this purpose.
To find out whether one or multiple pages have been blocked by robots.txt, meta robots, x-robots-tag, or canonical tags, use the URL inspection tool in Google Search Console or the Robots Exclusion Checker extension for Chrome.
You can also use SiteGuru's free no-index checker. Simply add your URL, and in seconds, you'll see if your pages are indexable.
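If you prefer to check robots.txt rules programmatically, Python's standard-library urllib.robotparser can tell you whether a given URL is blocked. The rules and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; in practice you'd fetch them
# from https://example.com/robots.txt.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://example.com/blog/seo-tips"))  # True (crawlable)
print(parser.can_fetch("*", "https://example.com/private/notes"))  # False (blocked)
```

Running this check over your sitemap URLs is a quick way to catch pages that robots.txt is accidentally keeping crawlers away from.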
If your page is marked by any noindex meta tags, the fix is also easy. Here's what to do:
In the HTML source code of your page, look for the <meta> tag that includes the "noindex" directive. It typically looks like this:
<meta name="robots" content="noindex">
If you want to allow indexing, change the content attribute like this:
<meta name="robots" content="index">
Or, depending on your case, simply remove the “noindex” directive to make it indexable by default.
Save and update the page, then test it to see if everything is working as expected.
To see if a page is indexable, use our Noindex Check tool.
If you're in SEO, you should expect to lose rankings. It's just part of playing the game.
But if a keyword is really important to you (as in, it brings in a lot of traffic and/or revenue), you should take small, steady steps toward recovering it. Remember: it's a marathon, not a sprint.
If you want a more low-touch experience with SEO and only step in when something requires your attention, try SiteGuru.
SiteGuru will automatically crawl your website every single week, giving you a weekly prioritized SEO to-do list. So you won't have to spend more than 5 minutes a day on SEO – not if you don't want to.