
Under Manual Review: Editorial Control and the Search Engines

I am constantly baffled by Google’s insistence that there is such a low level of human discretion and editorial control in determining their organic search results. Isn’t the decision to leave editorial control up to a silicon-based form of intelligence SOME level of editorial control to begin with? Deciding which forms of user data from the toolbar and other sources to factor in must certainly qualify as a certain level of editorial control based on human bias. There is nothing inherently wrong with human bias, and it becomes less of a bias when the views and decisions behind it are directly expressed. I think Yahoo is a bit more open about their level of editorial control, but not by much. I really hope that the transparency of these decisions improves with both engines over time. Insight into this process would help marketers MEET the quality guidelines and stay within the less ambiguous confines of the terms of service.

I am constantly questioned by clients or prospects about the “ethics” of search engine marketing. Three questions tend to recur:

  • Is it okay to create a page for every state?
  • Can we create two sites to rank for the same terms?
  • Can we use a domain for every one of our twenty niches?

I think these are all fairly valid questions from a business owner who is hungry for market share. I have seen instances where each of these three has worked, and I have seen instances where they have all caused problems. My answers generally lean toward the same principles of creating quality content and focusing on the user, but the subject itself leads me on a philosophical tangent into the ethics of editorial control and the search engines.

It only matters if someone sees it (no one complains until it ranks)

The number one spam fighter (other than Matt Cutts) is competitors who can’t compete. If you ever get the chance to listen to people talk to Matt at a conference, you will understand this. EVERYONE whines about their competitors’ unethical tactics (among other things), and then goes and tries to get them banned. Leaving the irony of this aside, it is a testament to the fact that you are not a target until you are on the radar. No one cares if you have 8 sites in the top 10 for “fuzzy and furry alpha widgets in podunk”. They will care, however, if you have 3 of the top 5 sites for 10 different variations on “home loan” terms or other phrases with many competitors all trying to rank in the same SERPs. I’m sure Matt (and the rest of the engineering team) have become quite adept at discerning the intentions of a site, but the rest of the evaluation team may not have the same level of insight, and may create “hand ban collateral damage”.

This is still a working theory. Since the search engines will generally not even confess to DOING hand edits, it is pretty uncharted territory, and most of this is pure speculation. I must also state that I have personally never had a client site banned. Any client that wants to push the envelope with tactics is made aware of the risks ahead of time. That being said, I have SEEN plenty of banned websites. They are often prospects shopping around for an SEO who they think may be able to get them out of the doghouse.

With all three of the above-mentioned questions, here are my usual answers:

Is it okay to create a page for every state?
Yes, provided that you create at least SOME unique content for each page. You cannot just change the state name and expect it to work. Even if you DO manage to get past the duplicate content filters, you will have to face quality control review if you start to rank for all of the “statename + keyword” variations. If you have only minimal content, your pseudo-directory will not last long even if you do attain top rankings.
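As a purely illustrative aside (my own toy example, not any engine’s actual filter): the sketch below uses word shingles and Jaccard similarity to show why two pages that differ only by the state name look nearly identical to even a crude near-duplicate check.

```python
# Toy near-duplicate check: word shingles + Jaccard similarity.
# Real duplicate content filters are far more sophisticated; this only
# illustrates why swapping a single word barely changes a page.

def shingles(text, size=3):
    """Return the set of overlapping word n-grams ("shingles") in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (1.0 means identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical boilerplate page with only the state name swapped in.
TEMPLATE = ("Looking for widget repair in {state}? Our certified technicians "
            "serve every city in {state} with same-day service and a full "
            "money-back guarantee on all widget repairs.")

page_ohio = TEMPLATE.format(state="Ohio")
page_texas = TEMPLATE.format(state="Texas")

similarity = jaccard(shingles(page_ohio), shingles(page_texas))
print(f"Similarity: {similarity:.2f}")  # ~0.60; unrelated pages score near 0.0
```

A page that adds genuinely unique, state-specific content scores far lower against its sibling pages and is much less likely to be collapsed as a duplicate.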

Can we create two sites to rank for the same terms?
Yes, but it will be twice as much work and double the risk. I would strongly recommend putting all the effort into ONE site to start with. If you do wish to go the multiple-site route, the two sites should remain completely separate, and association between them should be kept to an absolute minimum. Retail site owners often have multiple websites selling the same products under different names. Obviously competitors don’t like this, and are willing to report them for “spamming”. You can see where this would be a very fine line full of ambiguity for a quality control rep. THIS is a judgment call. There will NEVER be hard and fast guidelines for this type of activity, and if there are, we most likely will never know about them.

Can we use a domain for every one of our twenty niches?
Yes. You can also just use subdomains or subdirectories. If you interlink your sites, the search engines will associate them with each other anyhow. If you interlink your sites and rank too well, you may have problems upon manual review. I would stick to one site, a nice subdirectory layout, and a strategy to drive deep links into each of the twenty areas.

So as you can see, the answer is generally “yes, but…”. This conveys the level of risk involved, and helps open the eyes of the prospect or client to the fact that it really is about the user. Whether that user is a search engine quality control rep or a prospective customer, the site should reflect what it advertises and should have a unique value proposition, or in the case of retail, at LEAST be honest and true to how it is portrayed. Savvy internet users can smell something foul from a mile away.

Hand Ban Collateral Damage
After several discussions with Jim on prospect sites that were banned, I’ve come to the conclusion that some of the quality control reps are not as educated as others. There is not much consistency in the hand bannings that I have seen. The best insight we’ve had into the mind of a quality control rep was the information on eval.google.com released by searchbistro.

Hand bannings are a tough pill to swallow. Several people a month contact me asking about problems with their rankings, and there are often sites whose drops I would potentially attribute to a hand penalty – some of which I would argue are not worthy of completely losing their rankings (though I certainly may be missing something). This is a much tougher adversary than the algorithmic filters, and the penalties for failure last MUCH longer. It hurts to NEVER have a site rank well, or to have it drop briefly from glory during an algorithm update, but it REALLY hurts when you get a site to rank well only to have it get whacked after a short period of time, with the knowledge that it will probably NEVER return to its former status.

Inside the mind of a quality control rep
This post is not designed to psychoanalyze Matt Cutts or Tim Mayer. It is designed to better understand the “little cuttlings” all over the world who rate sites with only a fraction of Matt or Tim’s knowledge and understanding of the internet. They are basing their conclusions on a limited amount of understanding (compared to most SEOs and engineers). Imagine teaching your brother, sister, or cousin how to evaluate a “quality” website. I’m not sure of the education requirements necessary to be a quality rater, but I’m guessing it’s not engineer level. This gives a substantial amount of power to the “average user” (which is probably quite logical, but a bit unnerving at the same time).
Open Questions to the Search Engines on Behalf of SEOs

So the question becomes: what type of quality control guidelines are these raters given? The documents posted at search bistro gave a bit of insight into this process. I’m a bit reluctant to repost some of the documentation here, as I am not a huge fan of controversy, even though the links are nice. My personal question is: wouldn’t it make sense to make this information publicly available? Wouldn’t it aid search engine marketers in making sites that are more quality oriented? Perhaps more examples of what constitutes “quality value add”, as well as more extensive examples of sites that would fall into the “penalized / not penalized” categories in ambiguous areas, would be of benefit. I think there has been too much emphasis placed on fighting spam versus encouraging quality. As with most things, encouragement is a more positive long-term approach, in my humble opinion.

I think Matt tries to do this on his blog on occasion, and I’m sure these types of discussions will be brought up on Tim and Jeremy Z’s new show on Webmasterradio.fm. While this post may sound negative, it is in fact just the opposite. I certainly appreciate the valiant efforts of the SE reps to appease the bloodthirsty lynch mob of crazed webmasters and SEOs. The purpose is to encourage their efforts and voice concern in an area with very little communication. There are certainly attempts being made, and I hope this post will contribute to encouraging discussion in pursuit of those efforts.

I really hope that both search engines (and perhaps even MSN one day) will embrace the webmasters who help provide them with their mountains of content, rather than just catering to the media moguls with the advertising dollars. There are still glimmers of hope that the SEs are really here to HELP webmasters and SEOs, but there are still lots of examples (at least in my mind) of opportunities that they have missed. Helping us to better understand the manual review process will help the SEs introduce the concept to the rest of the world that they are afraid of exposing it to. We are your link between the Mountain View ivory towers and the “general public”. We understand your need to keep the results “untainted”, but just how long do you think you can maintain the appearance that there isn’t editorial control in the organic search results? Isn’t the decision to keep the editorial decisions as algorithmic as possible an editorial decision in and of itself?

If you have examples or questions about any site that you think may have incurred a manual penalty, feel free to e-mail it to me for research purposes (it will stay completely anonymous). You may also want to give the anatomy of a successful reinclusion request, or the guide to filters, penalties, and bannings, a look.


    • http://www.wolf-howl.com graywolf

      The silver lining of a hand edit is that it means you’ve found a hole they can’t currently plug in the algo. Get a new domain, lather, rinse, repeat. Of course, if the site that got banned was a “brandable” one, well, you’re in deep doo-doo now, Jar-Jar.

    • General Public

      When the top search engine of the day has to rely on hand edits by “Surf Monkeys”, there is scope for a new search engine.

      This was the exact situation for AltaVista when Google made its entry with a better algorithm.

      Now is the time for someone to step in, come up with a better algorithm, and let history repeat itself.

    • http://www.technologyevangelist.com Ed Kohler

      Great post. I’m sure all search engines do some hand editing, but they also realize that hand editing is basically a losing battle. Identifying loopholes and abuses, then closing them programmatically, is really the only way to manage the large volumes of data they deal with.

      Sure, that involves humans reviewing and making changes to SERPs, but not at the scale that hand editing generally implies.

    • Pingback: Blackhat SEO: Who Else is thinking about this? - Mymotech
