SEM – Search Engine Marketing? Shouldn’t we really call it “Search Engine Manipulation”?

Google knows about this problem. That, of course, is why they rolled out the much-talked-about Panda update late last year.

I talk about search engine optimization (SEO) and search engine marketing (SEM) on this blog and elsewhere. It’s important to note that I approach those topics with a different mindset than many SEO “experts” out there. First, I’m always talking about so-called “white hat” techniques, of course. But more importantly, I’m talking about SEO/SEM with the assumption that there is an underlying website or service of value, one providing something people actually want, and not simply a site collecting revenue from clicks or ads. In other words, SEO/SEM isn’t the end, but only the means.

Google’s Panda update seems a bit like Yosemite Sam killing a fly with a shotgun – there’s a lot of collateral damage, and the flies are right back in the house again before long. I’m not sure whether the net result of Panda is positive or negative. They killed a certain class of what Google considered “bad guys,” but they may have killed just as many legitimate, useful sites along the way.

But the fact is, search results are still polluted with bogus sites and pages. You know what I’m talking about. You click a link in the top search results, only to arrive at a page that offers no content even loosely related to your search. Even more common is arriving at a page that was clearly “planted” through SEO/SEM tactics: it contains content, sort of, but it clearly exists for some secondary purpose besides informing you, usually to sell something else or to drive you to other associated properties, products, or services. It’s perhaps not full-on spam, but it’s not exactly legitimate either.

For many search terms, especially in Google’s case, the top few results appear almost hand-picked, with Google clearly preferring Wikipedia pages. But after that, the next 50 results are often more than 50% junk. This sucks. The tricksters are out-tricking those with legitimate sites and legitimate content who are also applying SEO and SEM; for whatever reason, the tricksters still fool Google.

It’s easy to fall into thinking that all SEO and SEM is “fooling” Google, but every site has to do what it can to get reasonable placement in search results. What’s sad is that a good site with legitimate, highly relevant content, combined with a reasonable amount of SEO/SEM, is not enough to beat the tricksters and their less useful, less legitimate sites. For the bogus sites, SEO/SEM is the business – for the rest of us, it’s just something we have to do to support the business.

It’s a huge business. There appears to be far more money spent promoting all these bogus sites than there is spent promoting the perhaps small, but legitimately useful, sites. People spend thousands of dollars on software and services that automate dastardly SEO/SEM tactics: building back links and link wheels, auto-posting to forums, blogs, etc. (what we would call spamming), automatically rewriting scraped text and re-posting it (“spinning”), and all sorts of other ugliness. It boggles the mind.

For a given topic or category, there are some number of legitimate sites and services, and what I’d like to see in the search results is a reasonable, fair competition among those legitimate sites. Say there are 1,000 legitimate sites that belong in the top 1,000 search results, ranked and ordered fairly by the search engine. What happens in real life is that the bogus “search engine manipulated” sites inflate that number by some significant factor, so now there are maybe 10,000 sites being ranked, and those 1,000 legitimate sites are hiding somewhere in the 10,000 results. To make matters worse, the 9,000 bogus sites are often better at applying “search engine manipulation,” so they get ranked higher than our original 1,000 “good” sites.

The Panda update shows that it is not easy for Google to separate the 1,000 “good” sites from the 9,000 “bad” sites through automation (and whatever other techniques they may be applying). Most attempts to fix it tend to throw the baby out with the bathwater. The “bad” sites are better at “cheating” because most of their energy is spent on SEO/SEM, whereas most of the energy of the legitimate sites is spent operating their service, improving the customer experience, and supporting their customers.

Getting your site seen among 1,000 competitors is hard enough. But when you also have to compete with 9,000 bogus sites, it’s absolutely mind-numbingly frustrating. Instead of devoting your energy to your customers, you end up having to devote a lot of energy to trying to beat the tricksters. This sucks, both as a user and as an operator of websites and online services. As a user, I’d rather see a site that is poor at SEO/SEM but has related content than a bogus site that is better at “search engine manipulation.” And as a site operator and service provider, I’d rather spend all my energy improving the service and supporting customers instead of campaigning on the SEO/SEM battlefield. But that’s not how it works in real life.