In most cases, cloaking is a black-hat SEO technique to avoid. And in this article, I'll tell you why you should steer clear of it (and how to double-check that you're not accidentally cloaking your URLs).
Let's dive in!
Cloaking is an unethical SEO tactic where a website delivers one version to a user's browser while delivering a different version to search engine crawlers. (In some cases, users may even be redirected to completely unrelated URLs.)
Often done with malicious intent, cloaking is meant to trick search engines into thinking a website is more rank-worthy than it actually is.
For example: On a given website, crawlers may see a keyword-optimized, Google-approved page...while users may see a visually appealing page with little to no valuable content.
Why would people do that?
Because black-hat SEO practitioners know that Google values content-rich websites. If the site they really want to show is deceptive or thin on user-friendly content, they'll exploit loopholes like cloaking to rank anyway and hope to dodge the heavy penalties.
With that in mind, let's look at the most common types of cloaking you'll find out there. Based on their specific goals and level of expertise, practitioners implement different tactics, such as…
In IP-based cloaking, a website detects a user's IP address and displays different content to different users based on their address. As a result, visitors to that website might come across inconsistent information between the SERPs and the website. Such inconsistencies can also include product pricing and offers.
Now, when an IP address associated with a web crawler visits the site, the server delivers a version of the page optimized for higher rankings.
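To make the mechanism concrete, here's a minimal sketch (not something to deploy!) of the lookup an IP-cloaking server performs. The CIDR block below is one published Googlebot range, used purely for illustration; real cloakers maintain long lists of crawler IP blocks.

```python
import ipaddress

# Illustrative crawler range: one known Googlebot IPv4 block.
CRAWLER_RANGES = [ipaddress.ip_network("66.249.64.0/19")]

def select_page(visitor_ip):
    """Return which page variant an IP-cloaking server would serve."""
    addr = ipaddress.ip_address(visitor_ip)
    if any(addr in net for net in CRAWLER_RANGES):
        return "keyword-optimized page"   # shown to crawlers
    return "thin, ad-heavy page"          # shown to human visitors
```

This is exactly why the tactic produces the SERP-vs-site inconsistencies described above: two requests for the same URL get two different pages depending solely on where they come from.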
User-agent information tells a website which browser, device, and operating system a visitor is using, so the site can adjust and present its content accordingly.
Shady practitioners harness that information for misleading purposes: a user accessing the same web page from different devices or browsers might see completely different pages.
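Here's a hypothetical sketch of the check behind user-agent cloaking. Matching substrings like "googlebot" is the naive test shady sites rely on (note that Google itself recommends verifying crawlers via reverse DNS, precisely because the User-Agent string is trivial to spoof):

```python
# Tokens a cloaking script might look for in the User-Agent header.
CRAWLER_TOKENS = ("googlebot", "bingbot")

def select_page(user_agent):
    """Serve one variant to suspected crawlers, another to everyone else."""
    ua = user_agent.lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        return "crawler-facing page"
    return "visitor-facing page"
```

The spoofability cuts both ways: it's what makes the tactic fragile, and also what lets you audit a site by sending a crawler-like User-Agent yourself.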
Depending on where a visitor came from, they may see a different rendition of a website. For example, if they click a link on the SERP to reach a site, that link is the source of their request. This source is also known as a "referrer."
Cloaking websites will check the user's HTTP_REFERER header to redirect different users to different pages. Whether those will be the cloaked or "uncloaked" version depends on the referrer.
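As a sketch of that referrer check (the domain list is illustrative, and real scripts match far more patterns): crawler fetches typically carry no referrer at all, while humans clicking through from a SERP do, which is the signal the cloaker keys on.

```python
# Hypothetical referrer-based cloaking logic.
SEARCH_REFERRERS = ("google.", "bing.", "duckduckgo.")

def select_page(referer):
    """Decide which variant to serve from the HTTP Referer header.
    `referer` is None when the request carries no referrer header."""
    if referer and any(s in referer for s in SEARCH_REFERRERS):
        return "low-value page"    # a human arriving from a SERP
    return "optimized page"        # direct visits and crawler fetches
```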
Ever clicked on an enticing SERP link, only to discover the website was in a completely different language? If adjusting your language settings doesn't do the trick, that may be a case of cloaking.
HTTP Accept-Language header cloaking happens when a website displays different content according to the language settings of a user's browser. This one can be particularly frustrating for users who depend on language-specific information to navigate a website.
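The Accept-Language header looks like "en-US,en;q=0.9,de;q=0.8", where the first tag is the browser's preferred language. A minimal sketch of how a site might branch on it (the variant names are placeholders):

```python
def select_page(accept_language):
    """Pick a content variant from the browser's Accept-Language header.
    Takes the first (highest-priority) tag, e.g. "en-US,en;q=0.9" -> "en-us"."""
    primary = accept_language.split(",")[0].split(";")[0].strip().lower()
    if primary.startswith("en"):
        return "English version"
    return "alternate-language version"
```

Branching like this is legitimate when it powers honest localization; it crosses into cloaking when the variants differ in substance, not just language.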
There are plenty of signs to watch out for when looking for a cloaking website. These include:
- Uncommon and irrelevant redirects to low-quality websites
- Keyword stuffing, naturally
- Content that's inconsistent with the SERP's title tags and meta descriptions
- Not seeing the bolded snippet from the SERPs anywhere on the page
- Poorly written content that lacks relevance
- Excessive pop-ups and sketchy advertising
Again, those are signs – not proof! Avoid drawing conclusions from a single event.
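If you want to audit your own site, one approach is to fetch the same URL twice, once with your normal browser User-Agent and once spoofing a crawler's (for example, with curl's -A flag), and diff the two responses. Here's a minimal sketch of the comparison step, assuming you've already saved both response bodies as strings:

```python
import difflib

def cloaking_diff(browser_html, crawler_html):
    """Return unified-diff lines between two fetched versions of a page.
    An empty result means both requests received identical content."""
    return list(difflib.unified_diff(
        browser_html.splitlines(),
        crawler_html.splitlines(),
        fromfile="browser-ua",
        tofile="crawler-ua",
        lineterm="",
    ))
```

Keep in mind that dynamic sites legitimately vary between requests (timestamps, session tokens, rotating ads), so a non-empty diff is a prompt to look closer, not proof of cloaking.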
Cloaking falls into black-hat territory (with a few exceptions, which I'll mention in a moment). That alone should be a reason not to do it. But if you're prone to dismissing red flags until it's too late, here are just a few ways cloaking could drag your website down:
Users who bump into mismatched pages, broken links, and a poor browsing experience aren't very happy campers. In short, cloaked websites create an inconsistent user experience and weaken visitor trust.
In Google's 2023 SEO Office Hours, a website owner asked Google Analyst Gary Illyes whether "giving Googlebot (a crawler) a different HTTP status code from the one served to human visitors would be acceptable."
Illyes said that delivering different HTTP status codes would be classified as cloaking. According to Search Engine Journal,
"In response to the website owner’s question, Illyes strongly advised against cloaking status codes, stating it’s risky. He explained that multiple serving conditions could lead to potential issues, such as the site getting de-indexed from Google."
A high E-E-A-T score? Not possible when users and search engines spot ill-intentioned cloaking tactics.
Cloaking means breaking the golden rule: to provide users with people-first, transparent, and trustworthy content. Sooner or later, search engines will spot and penalize a cloaking site. Hard.
You'll find mixed opinions if you google anything similar to the above question. We've heard Illyes's take. Rand Fishkin of Moz, on the other hand, argues that cloaking can be an effective tactic and even lists websites that have implemented it successfully.
Granted, his list features names like Amazon and Yelp, along with other major players. Fishkin's point is that big names get more leeway, even when they overstep a few boundaries.
So, if you're not a big name, if you don't "cloak" exactly as search engines expect you to, and if your technique doesn't fall into a pearly white category of SEO tactics...don't do it.
If you're a news publisher using a paywall, you're allowed to show Google a different version of your site: crawlers see the full article without a paywall, while your visitors see the paywall. The catch is that you need to flag the gated content with paywall structured data, so Google knows the difference is legitimate rather than deceptive.
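Google's documentation on paywalled content asks publishers to mark the gated section with JSON-LD structured data, roughly like this (the `.paywall` selector is a placeholder for whatever CSS class wraps your gated content):

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywall"
  }
}
```

With this markup in place, serving Googlebot the full text while visitors hit the paywall is sanctioned behavior rather than cloaking.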
Even as a potentially white-hat SEO tactic (in some ultra-specific cases), cloaking is by and large considered a black-hat tactic. And however "effective" it might be for a while, shady practices aren't worth a severe penalty.
In conclusion, do your best to keep the contents of your website transparent, regardless of who's seeing it or crawling it. If you want to gain an advantage, make it your own!