Google is famous for tweaking its algorithm on a whim, and every time a significant change is made (think of the Panda and Penguin updates), some percentage of websites experience precipitous drops in rankings, or get wiped out of the search results altogether.
This, of course, leads to seemingly endless questions from webmasters in forums and discussion boards about what can be done to stay in Google's good graces. Google is also famously vague about the specific steps needed to keep your website from becoming a casualty of an algorithmic adjustment, so most of the time we're left to guesswork.
Don’t Take It Personally
One of the essential facts that you have to keep in mind is that many of these algorithm adjustments are totally automatic, impersonal, and performed on a broad scale; we’re talking millions of websites here.
These types of “tweaks” are based on certain pre-programmed criteria that work much like a checklist: if your site meets these specifics, you’re placed on the “hit list” automatically. At other times, however, a manual review (i.e., human eyes) may be necessary to determine whether your site is legitimate or falls into the “spam” category in Google’s eyes.
Google has a large team of manual reviewers tasked with inspecting websites that seem to straddle the fence in this area. Your site is usually not flagged for a manual review unless the automatic process deems it necessary. You could even view this as a positive: at least you weren’t disqualified automatically.
What Can Trigger a Manual Review?
There is plenty of speculation as to what can trigger a manual review, but over the years, many experienced webmasters have noticed certain trends in website construction that seem to do the trick. One such activity that can flag your site for a manual review is adding too many pages to your site too quickly at the onset.
Since Google’s engineers are well aware of the onslaught of automated “black hat” sites that can replicate pages on a large scale in mere minutes, it is not advisable that you perform any type of activity that resembles automated page building. Google is ultimately looking for human-built and human-edited content that appeals to other humans, as this will usually provide the best results for the end user.
A Little Patience Goes a Long Way
Given these factors, it will do you much more good in the long run to simply be patient and add pages to your website gradually. As for what counts as an acceptable number of pages per hour or per day, that may vary based on the type of website you’re building. If it’s a database-driven site such as a business directory, Google’s algorithms may find rapid page construction more understandable, since content is being pulled from a data source.
But for a blog or something similar, it may look too “fishy” if you’re adding 100 pages a day. As with most things in life, patience will be your friend in this regard. Truth be told, the fundamentals of building a quality and Google-friendly website haven’t really changed too much over the years. If you’re aware of what Google looks for when they’re analyzing websites and stay away from “black hat” activities, you’ll greatly decrease the likelihood of being manually reviewed.
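The “drip-feed” idea above can be sketched as a simple publishing scheduler. Everything here is illustrative: the daily cap of 5 pages is an arbitrary number chosen for the example, not a threshold published by Google.

```python
from datetime import date, timedelta

def publishing_schedule(num_pages, pages_per_day, start=date(2024, 1, 1)):
    """Assign each page a publish date so that no more than
    `pages_per_day` pages go live on any single day.

    Note: `pages_per_day` is a made-up illustrative cap, not a
    documented Google limit."""
    schedule = {}
    for i in range(num_pages):
        # Integer division groups pages into daily batches.
        day = start + timedelta(days=i // pages_per_day)
        schedule.setdefault(day, []).append(i)
    return schedule

# Drip-feeding 100 pages at 5 per day spreads them over 20 days.
sched = publishing_schedule(100, 5)
print(len(sched))                            # number of publishing days
print(max(len(v) for v in sched.values()))   # busiest day's page count
```

The same logic works whether publishing is automated through a CMS queue or done by hand with a calendar; the point is simply to cap the daily rate rather than dump everything at launch.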