A Brief History Of Link Building Strategies

In the early days of the Internet, link building was easy. There were no rules, and you could do anything you wanted to try to gain visibility. That has all changed, thanks largely to Google’s war on subversive link building practices and bad site content, but let’s take a look at what life was like way back then, in the frontier days of the Wild, Wild Web:

Link building strategies in the 1990s

Back in the mid-1990s, unscrupulous site owners quickly discovered that all they had to do to get high search engine rankings, and therefore high visibility, was to have lots of links to their sites all over the Internet. Although many honest folks stuck with relevant link swaps, there were no rules, so the “bad guys” generally had a leg up on the good guys.

Google’s impact on link building strategies beginning in 1998

Search engines began to develop more complex algorithms that couldn’t be manipulated as easily, but it wasn’t until Larry Page and Sergey Brin developed BackRub, Google’s precursor, and then launched Google in 1998 that algorithms truly began to put a damper on attempts to artificially manipulate site rankings through dishonest link building strategies like bulk link swapping.

Link building strategies in the early 2000s

Link building strategies really began to take shape in the early 2000s. The following once-popular practices have fallen on hard times thanks to Google’s Panda and Penguin (see below), but they were great ways to build links until site owners’ abuses knocked them out of the running:

Directory submissions

Directory submissions were once considered a great way to provide quality links to owners’ sites, but as with previous link building practices, they fell out of favor thanks to site owners’ abuse and Google’s subsequent crackdowns.

Article directories

Article directories accepted what were supposed to be good quality articles from site owners in exchange for links back to the authors’ sites. Although the directories began as a great link building strategy for authors, quality dropped as unscrupulous “authors” realized they could simply submit low-quality content with an author bio box linking back to their own sites. In their heyday, free and paid directories sprang up everywhere, but as they filled with junk instead of useful content, they lost value with users. Too many site owners were using them just to improve rankings (exactly what Google wanted to avoid), and as a result, article directories are now largely deemed “low quality” sites by Google (even previously vetted sites like wiseGEEK). Links from article directories have largely been devalued since Panda’s introduction, making them useless for link building.

Reciprocal links

Reciprocal linking was historically the practice by which two website owners would simply exchange links with each other in hopes that each would benefit from increased visibility. Today, although non-organic reciprocal link schemes are penalized, reciprocal links still occur naturally, as when bloggers link to one another’s resources.

Blogroll links

When blogging came into favor, bloggers boosted visibility by posting “blogrolls” of links to other blogs that they endorsed or read. Again, this is not a problem when it’s done out of genuine interest, but as with reciprocal linking, it’s a practice that can easily be abused, and Google generally frowns on it.

Forum links

Forum links remain popular: a poster places a keyword-optimized anchor link in his or her forum signature, so the more that user posts on a given forum, the more links appear on that domain. Historically this worked well, and natural links within post text are still valued, but anchor-optimized signature links will likely be deemed artificial and eventually penalized.
Enter Google Panda and Google Penguin

Google Panda and Penguin continue to tighten the rules of link building for site owners:

Google Panda

Introduced in February 2011, Google Panda was designed to weed out so-called “content farms” (sites with artificially high rankings but low quality or “thin” content) so that high quality sites could regain the rankings those farms had been crowding out. The original rollout was said to affect nearly 12% of search queries, while the last publicly confirmed update in 2012 affected only approximately 0.7%. (As of March 2013, Google has stated that it will simply roll Panda into its general algorithm updates and will no longer confirm them individually.)

Google Penguin

First announced in April 2012, Google Penguin is an algorithm update meant to catch and thwart sites using “black hat” SEO techniques such as keyword stuffing, buying links, using invisible text, and deliberately duplicating content. Notably, sites that remedy their spamming practices may be able to regain their rankings rather than being permanently blacklisted by Google.

Chris Countey is Senior Digital Marketing Specialist for Delphic Digital, a Sitecore certified partner in Philadelphia. Chris covers a wide range of inbound marketing topics.