
Top 11 Euphemisms for Cloaking

Online marketing information can change quickly. This article is 14 years and 215 days old, and the facts and opinions contained in it may be out of date.

Euphemisms are used in many areas of politics. The definition of cloaking is marginally different to an engineer than to an SEO. Cloaking has been vilified by search engines when users and bots are served different content. Engineers believe bots are pretty smart (they normally are), and SEOs believe bots should be led around by the nose only to appropriate areas. “Cloaking” often implies intent and extent that conflict with search engine terms of service, but there are many very grey areas as far as what is acceptable and what isn’t. By definition, cloaking is NEVER acceptable, so be sure you are using the proper terminology. Of course this is a bit tongue-in-cheek, but the point is that there are certainly valid reasons for selectively delivering content, and that “cloaking” is mainly defined by intent. I’m pretty glad I’m not the guy at the search engines who has to determine the intent of redirects.

  1. IP delivery
  2. Geo-targeting
  3. Flash Detection
  4. Server speed analysis
  5. Duplicate content detection and reduction
  6. Member experience discovery
  7. User agent detection
  8. Browser extension
  9. Spider detection
  10. User experience maximization
  11. Selective demographic delivery
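
Several of the euphemisms above (user agent detection, spider detection) boil down to checking who is asking for the page before deciding what to serve. A minimal sketch of that check in Python; the function name and the token list are illustrative assumptions, not any engine's official list:

```python
# Minimal "spider detection" sketch: match the User-Agent header
# against a short list of crawler tokens. The token list here is
# illustrative, not exhaustive.
KNOWN_BOT_TOKENS = ("googlebot", "slurp", "bingbot", "baiduspider")

def is_search_bot(user_agent: str) -> bool:
    """Return True when the User-Agent string contains a known crawler token."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)
```

User-Agent strings are trivially spoofed, which is why this is usually paired with IP delivery, i.e. verifying that the request actually comes from the engine's published addresses.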

What are the best reasons for “selective delivery” that you’ve heard? Do you think search engines would frown on that type of delivery if detected?

Thanks to Dan, Marshall, Brad, Neil, and Cameron for their contributions to the conversation that spawned this.

More information about Todd Malicoat aka stuntdubl.

  • scoreboard

    You forgot “Adsense Optimization”…

  • David Temple

    Here’s a reason voiced by Stephan Spencer to the search engines at SES Chicago last year.

    What is your current official position on simplifying the URLs selectively for bots like Googlebot, Yahoo Slurp, etc. by user-agent detection in order to drop session IDs and other superfluous parameters from the URL? Do you consider it cloaking? And if so, is it good cloaking or bad cloaking?

They asked: will the same page content display to the user if that user types into their browser the URL that was given to the bot? I responded with a “Yes,” and then all four search engines confirmed individually:

    No problem.

    Then Charles Martin from Google jumped in again with:

    Please do that!

  • httpwebwitch

    So easy to justify: paid advertisements from an external feed or source should be cloaked from bots. By removing the feed you are protecting your advertisers from inflated hit counts by non-human visitors. Non-humans don’t convert. You also don’t want those links getting indexed as part of your cached page.

    Likewise any content that is paid for by impression (CPM) should not be shown to non-human visitors, because it’ll get indexed, cached, crawled and abused.

  • Mike Papageorge

    Cloaking can be a great way to preview a new design/website in situ for stakeholders while delivering the current version to the general public.

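
Two of the selective-delivery patterns raised in the comments, dropping session IDs from bot-facing URLs and suppressing paid-ad markup for crawlers, could be sketched as follows. This is only an illustration of the ideas: the parameter names, bot tokens, and function names are my assumptions, not anyone's production code.

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Illustrative lists only; a real site would tailor both.
BOT_TOKENS = ("googlebot", "slurp", "bingbot")
SUPERFLUOUS_PARAMS = {"sessionid", "sid", "phpsessid"}

def is_crawler(user_agent: str) -> bool:
    """Crude User-Agent check for known crawler tokens."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in BOT_TOKENS)

def simplify_url_for_bots(url: str) -> str:
    """Drop session-ID style query parameters, keeping everything else."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in SUPERFLUOUS_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

def render_ad_slot(user_agent: str, ad_html: str) -> str:
    """Serve paid-ad markup to humans; give crawlers an empty placeholder so
    impressions aren't inflated and paid links aren't indexed or cached."""
    if is_crawler(user_agent):
        return "<!-- ad slot suppressed for crawler -->"
    return ad_html
```

The key point from the SES exchange above is that the simplified URL must return the same content to a human who types it into a browser; that is what keeps session-ID stripping on the acceptable side of the line.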