Are You Guilty of Old-School SEO Web Spam? Part 1: Page Content

December 29, 2014

Digital Marketing | By Tansy OBryant

I just hung up the phone with the third site owner in recent memory who gleefully showed me their new whiz-bang way of making on-page content invisible to the reader: Adding white text on a white background.

“Hidden text” is actually an old-school web spam tactic. It’s one of the first “black hat” SEO tactics (those that go against search engine guidelines and disregard a human audience) that initiated Google penalties nearly 12 years ago.

After 20 years of working in the search industry, I’m discovering a new wave of search marketers who, although aware of current spam tactics, are unfamiliar with the ancient SEO tactics that still trigger penalties.

So let’s step into our time machine and revisit some of the oldest black hat SEO tactics you may be guilty of — without realizing they’re SEO web spam.

Hidden Content

To continue our white-text, white-background conversation:

Making a page’s text the same or similar color as the background is considered spam in Google’s eyes. Google will diminish the rank of pages containing hidden content and even those that link to pages containing hidden content.

When hidden text exists, the content is present in the HTML of the page — therefore visible to search engines but invisible to the reader. This breaks a fundamental rule of Google optimization: A user must be shown the same content that’s shown to Google.
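At its simplest, the trick is text styled to match its background. The toy checker below flags inline styles where the two colors are identical. It’s a sketch only — the function name and regex are my own illustration, and real pages usually set colors in external CSS, so a production audit would need a full style resolver:

```python
import re

# Matches inline styles like: color:#ffffff; background-color:#ffffff
STYLE_RE = re.compile(
    r'color:\s*(#?\w+)\s*;\s*background(?:-color)?:\s*(#?\w+)', re.I)

def has_hidden_text(html: str) -> bool:
    """Flag elements whose inline text color equals the background color --
    the classic white-on-white trick described above."""
    return any(color.lower() == bg.lower()
               for color, bg in STYLE_RE.findall(html))

# The keywords are in the HTML, so search engines see them -- readers don't.
spam = '<p style="color:#fff; background-color:#fff">discount sheets</p>'
clean = '<p style="color:#000; background-color:#fff">discount sheets</p>'
```

Run against the two snippets, only the first is flagged: the keywords are present in the markup but invisible to the reader.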

Content Amount

You’ll get penalized by Google for having content that’s too heavy or too light. You want the content on your webpage to be just the right amount.

Online retailers frequently have category pages containing ten or more rows of products, with only a single paragraph of content near the footer of the page. Regardless of how informative that paragraph may be, it’s likely to have low value to the search engines.

Once content flows past the 600-pixel mark, its value to Google diminishes dramatically. Pages longer than 1,800 to 2,400 pixels aren’t preferred. Aim for 300 to 1,200 characters of content above the 600-pixel mark and as close to the upper left-hand corner of the page as possible. Minimize images and ads at the top of your pages, and when possible, don’t embed text in your images.

Online retailers using “infinite scrolling” solutions (like Polyvore) may not need to worry about this issue. Most infinite scrolling solutions create a dynamic user experience, while the search engine sees standard pagination when the site is indexed.

Hopelessly Unoriginal Content

For the online retailer, unoriginal content is often “duplicate content,” “too similar content” or “plagiarized content.” Let’s explore:

Duplicating the Manufacturer

Google insists that content must exist on a webpage for it to participate in organic search. But not just any content will do; it must be original. This is a high bar for online retailers that carry thousands of products.

Say, for example, a retailer uses the product description directly from the feed provided by the manufacturer. This process is repeated by hundreds of online retailers, so that the product description is duplicated hundreds of times across the internet. The manufacturer’s content quickly becomes unoriginal as it’s used and reused. This issue alone drives many online retailers to rely solely on paid search to appear in Google’s search results.

To help combat this problem, write unique product descriptions for your top 20 products. Adding customer reviews to your page can also help, but make sure they render in your HTML code.

Duplicating Yourself

Many online retailers are the original manufacturers of their products, and they write their own original product description copy. But in the interest of time, copywriters often duplicate snippets of content.

For example, if a 50/50 poly-cotton sheet comes in both a 150-thread-count and a 300-thread-count version, a copywriter may use the same product description for both and change only the thread count number. Google will then select just one of the two products for an organic listing, which isn’t ideal. Simply changing the thread count from 150 to 300 isn’t enough to make the content appear original.
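To see how little the copy actually changes, you can measure two such descriptions with Python’s difflib (the sheet copy below is invented for illustration):

```python
import difflib

desc_150 = ("Our 50/50 poly-cotton sheet set features a 150-thread-count "
            "weave for a soft, breathable night's sleep.")
desc_300 = ("Our 50/50 poly-cotton sheet set features a 300-thread-count "
            "weave for a soft, breathable night's sleep.")

# SequenceMatcher.ratio() returns 1.0 for identical strings; swapping one
# number leaves these two descriptions more than 95% identical.
ratio = difflib.SequenceMatcher(None, desc_150, desc_300).ratio()
print(f"{ratio:.2f}")
```

From a duplicate-detection standpoint, these two pages are effectively the same document.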

Online retailers will frequently duplicate themselves with boilerplate copy. Repeating a paragraph over and over on press releases, product pages or category pages will likely attract a duplicate content penalty.

In marketing, there’s an adage that says, “Tell them, tell them and then tell them again.” Google, on the other hand, would prefer that you say it just once. Avoid the boilerplate and reserve that type of information for your About Us page.

Affiliates, News Sites and Blogs That Duplicate You

Online retailers that write their own original product content can still discover that search engines aren’t giving them credit for their efforts. The culprit is usually their data feed: though the product description is unique, it’s being sent through the feed to multiple channels, such as Amazon, eBay, affiliate networks and blogs.

The best solution is not to give all of your content away. Break your product descriptions into a minimum of two fields in your web database: short descriptions and long descriptions.

Syndicate the short description in your feed to marketplaces and affiliates, and keep the long description for your site only. Instruct the affiliate to use the data feed you provide. If bloggers or affiliates scrape your site and use the long description, let them know that this is against your affiliate guidelines, and they’ll usually cease the practice. If they don’t, you can always discontinue their affiliate ID.
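In practice, the split is just two fields on your product record. Here’s a minimal sketch; the field names and `feed_record` helper are hypothetical, not from any particular feed standard:

```python
# Hypothetical product record; field names are illustrative only.
product = {
    "sku": "SHEET-150",
    "title": "50/50 Poly-Cotton Sheet Set, 150 Thread Count",
    "short_description": "Soft poly-cotton sheets in a 150 thread count.",
    "long_description": ("The full, original marketing copy -- kept "
                         "exclusively on your own product page."),
}

def feed_record(p: dict) -> dict:
    """Build the record syndicated to marketplaces and affiliates:
    only the short description ships; the long one stays on-site."""
    return {
        "sku": p["sku"],
        "title": p["title"],
        "description": p["short_description"],
    }
```

Exporting `feed_record(product)` to marketplaces and affiliates keeps the long description unique to your own site.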

Doorway Pages and Cloaking

Doorway pages and cloaked pages are content-heavy pages stuffed with keywords, programmed to load for only a second before redirecting to another page on the site. These pages are frequently built from automatically generated content that reads as gibberish. While that does make the content “original,” it runs afoul of semantic search engine algorithms that grade the readability of copy.

Google, Bing and Yahoo easily sniff these pages out and will remove entire sites with these pages from their index.

Google penalties for all content spam are generally doled out by the Google Panda algorithm. If your site is suffering from a Panda penalty, you’ll likely notice an abrupt fall-off in traffic.

The solution?

Simply blocking the offending, spammy content with a robots.txt file will not solve the problem. The bad content must be removed before traffic can be restored. The Panda algorithm updates regularly, and the penalty can usually be resolved within 90 days.

The best solution is to hire a product marketing copywriter to write unique and compelling content for your shoppers.

 

The second and final part of this series discusses old-school spammy SEO tactics surrounding hyperlinks.

 

Blog post by Tansy OBryant, ChannelAdvisor SEO strategist


Need a crash course on SEO? Download this free eBook to ensure you’re doing everything necessary to be seen by customers looking for your products.