
Do old domains have any advantage over newer sites?

1959 Microwave (Photo credit: SportSuburban)

People often talk about new domains and how soon Google will index them…but what about older websites, their pages, and their previous rankings? How do Google's algorithm updates affect those older domains?

If you have not updated your website since before Y2K, you probably have some work to do.

Matt Cutts talks about Old Domains

The fact is, most older sites tend to put the page creation process on "cruise control" at times. The same vigor that got you to the top of the rankings in "the old days" with your seasoned, older domain has just as much value today…sometimes even more.

"The advice that I can give you as the owner of a site that has been around for 14 years is to take a fresh look at your site. A lot of times, if you land on a random website from the search results, even if they have been in business for 14-15 years, sometimes they haven't updated their template or their page layout or anything in years, and it looks like, frankly, a stale older site, and that's the sort of thing where users might not be as happy."
Matt Cutts – Google

So basically: fight to stay number one, or at least in the top ten. Be as hungry as the people launching new sites. You have to continue to push out new content, look at social sharing, and find new ways to do things; essentially, you need to keep your old domain current. An old domain doesn't mean you have to do "old" things with it.

If the site you are running is a WordPress site, there are tons of new, fresh themes out there to give your old site a new look. If you're not on WordPress, many of the other platforms offer the same.

Are links from article directories with relevant content OK?

Not that long ago, the main way to get more traffic to your website was links…they were (and still are) the commodity of the web. Now the question is whether you can still get and use links from relevant content in article directories without penalty, and whether Google sees that as good or bad.

Matt Cutts talking about links from article directories:

Matt basically says the quality of article directories has been on the decline. I think Google's ability to detect them has just gotten better; many of these have been garbage for a long time.

"An article directory is basically where you write several hundred words of content and then you include a little bio or some information about you at the bottom of the article, and you might have, say, three links with keyword rich anchor text at the bottom of that article. Then, you submit that to a bunch of what are known as 'Article Directories,' which anybody can download, or pay to download, and use on their own website."
Matt Cutts

These were a precursor to guest blogging: people would search for keyworded content and, if they found it, many would use it or link to it on their site. As with all good things, it started out with good intentions and was essentially spammed away.

Google's Server Error page (Photo credit: Wikipedia)

Most of the time, article directories are just a poor man's PR tool, where the same copy and content are spread widely over the internet. Sometimes this was multiplied by auto-posting bots, which could literally post an article to thousands of blogs all over the world.

It appears Google has become wise to the auto-posters, and it will be almost impossible to get SEO traction with ideas like these. So, in conclusion, from a white hat perspective…probably a bad idea.


How do I determine my SEO rank and location?

Rank and File (Photo credit: forestpurnell)

The SEO rank of a website or web page refers to where it appears on a search engine results page (SERP). To determine your SEO rank and location, you just need to type the exact keywords you have optimized your site for into the Google search box (or any other search engine) and hit enter.

Now, the search results are going to give you a fair idea about your rankings (for your geo location), but those results can never be considered the absolute measure of your SEO endeavors.
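If you would rather script this check than eyeball the results page, here is a minimal sketch. It assumes Google's public results page and a simple link pattern, both of which change often, and automated querying can be blocked (and may violate Google's terms), so treat it as illustrative only; the query and "example.com" are placeholders.

    import re
    import urllib.parse
    import urllib.request

    def serp_position(query, domain, num_results=100):
        # Build a plain results-page URL for the query.
        url = ("https://www.google.com/search?"
               + urllib.parse.urlencode({"q": query, "num": num_results}))
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        html = urllib.request.urlopen(req).read().decode("utf-8", errors="ignore")
        # Walk outbound links in page order; the first hit approximates your
        # rank (approximate, because the page also contains non-result links).
        for position, link in enumerate(
                re.findall(r'href="(https?://[^"]+)"', html), start=1):
            if domain in urllib.parse.urlparse(link).netloc:
                return position
        return None  # not found in the results fetched

    print(serp_position("your exact keywords here", "example.com"))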

Why worry about SEO Rank?

Unlike PR (PageRank, which is a measure of global link popularity), the actual SERP rank of a website depends on three different factors:

  • Search engine data centers
  • Geo location of the user
  • Browsing history

So, the Google rank of a website is anything but absolute (unlike PageRank), and one or all of the above factors can have a tremendous impact on actual rankings no matter how solid your SEO strategy is. That being said, it is possible to remain relatively unscathed: as long as you understand how these factors impact rankings, you don't have to surrender to the vagaries of the search algorithms!

So, How Do These Factors Affect SEO Rankings?

  1. Data Centers:
    Every time a user hits the search button, a search query string hits a particular data center that returns results based on its own binaries (read data crunching abilities) and algorithms. The fact that Google literally has a boatload of these data centers across the globe makes things really difficult for webmasters.

    In a recent interview, Google engineer Matt Cutts said that different types of data are always being tested in different data centers and every time a new filter is added or an algorithm is changed, search results might get affected.

    How To Bypass Those New Filters and Algos:
    The short answer: you cannot bypass them. But you can definitely stay glued to your search engine position if you follow the rule of thumb of "ideal" SEO:

    Content – Create content for humans, not search engines. Perfect content would have primary keywords and LSI keywords, but it should, first and foremost, appeal to your targeted readers. Any attempt to trick the search bots by stuffing your pages with keywords could have detrimental effects.

    Organic Links – Google simply hates paid links. Even the slightest hint of “unnatural linking” is going to set the Pandas and Penguins free and it’s only a matter of time before those beasts start nibbling away at your SEO ranking!

    Social Media Marketing – If your content is likable, it's going to be liked! The search bots are more inclined toward sites that have a strong social media presence.

    Bottom Line: If you try to trick Google, you'd end up in a big puddle of filth! Don't do it unless you have the right shoes for it!

  2. Geo location of the user: If a user from Atlanta searches for a service on Google, his search results are going to be a bit biased towards businesses in Atlanta offering that particular service. Google does this all the time in the name of “enhanced user experience” and unless your site is optimized to show up in location based search queries, the local businesses are always going to win hands down.

    How To Fix The Problem: If you want to achieve location specific rankings, you'll have to treat your business as one belonging to that place, even if it doesn't in reality. Content targeting local users will certainly help. Backlinks from related websites targeting users in the same location will also help you secure a position in location specific listings.

    But, a single static page would never rank in the same position across the globe, unless of course it’s an authority site on the topic and there’s feeble competition.

  3. Browsing History: Even though a bunch of SEO pundits lambast the "browsing patterns" contingent, it is actually a good practice as far as connecting the right products with the right customers is concerned.

    Google closely takes note of the browsing patterns and history of its users, and the search results it presents to a user are always influenced by that pattern. For example, if someone frequents a particular site offering information on acne, the chances of them seeing that site high on the results page for a related term (e.g., "get rid of acne scars") are very high, even if there are better websites on the topic.

    Google strives to present "what a user wants to see" rather than "what the webmasters want to present." This is a good practice, because your website can easily reach people with an inclination for similar information or products.

    But such results can be deceptive for webmasters.

    If you are looking to determine your SEO rank and location and are also signed into your Google account, the search results would most likely rank your site higher than it actually is. That’s attributed to your numerous visits to your own site (in the course of updating information or something else) in the recent past.

    How To Determine Your Actual Rankings: If you want to see the real results, you’ll have to log out of your Google account and check your rankings for a keyword on the search engine. Clearing your browser’s cache would also help.
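    One more trick for a depersonalized reading: request the results from a fresh, cookie-free session and append pws=0 to the query string. This is an unofficial parameter that has historically told Google to skip personalization; treat that as an assumption, since Google does not guarantee the behavior. A tiny sketch of building such a URL:

        import urllib.parse

        # pws=0 is an unofficial "no personalized search" flag; fetch the
        # resulting URL from a cookie-free session (or an incognito window).
        params = {"q": "get rid of acne scars", "pws": "0"}
        print("https://www.google.com/search?" + urllib.parse.urlencode(params))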

Some Common Myths Busted

Hosting server location can impact search rankings: That's a myth. Your site could be hosted anywhere, and as long as it's with a decent hosting provider, your rankings are not going to be affected. In some rare cases, websites on shared hosting alongside sites that exist for the sole purpose of spamming, phishing, scamming, etc. might be affected, but the problem can easily be solved by switching to a reputable host.

Domain name extensions can impact rankings: That's a myth. Extensions have nothing to do with rankings; they are more about user preferences. However, .info domains are usually a bit more difficult to rank because that particular extension is commonly used by black hatters and spammers to trick users and search engines. But once trust has been established and a site has started ranking, the domain name extension completely ceases to be a ranking factor.

Recovering from Spammy and Bad Links

Humboldt Penguin at Whipsnade Zoo (Photo credit: Wikipedia)

Pandas and penguins used to conjure up images of nice, interesting animals…but for SEO-minded people, those names are now code names for two of Google's algorithm updates, which dramatically changed the answer to the question: how do I rank number one in Google?

When and How to fix Bad Links?

The Penguin update in particular (rolled out in 2012) hit sites that presented an "unnatural" linking profile: paid links, links from low-quality blogs, meaningless directories, blog networks, unknown forums, and rambling blog comments, the vast majority sharing exact-match keyword anchor text. Google took this whole catalog of low-quality backlinking techniques and made it useless in one fell swoop.

The result was a drop in rankings for many sites, with the inevitable consequences: a huge loss of traffic and, in turn, a huge loss of income.

What to do?

If you are one of those webmasters who saw their sites hit by Penguin or one of its updates, the first thing not to do is panic. First, make sure everything else is up to par; take our free SEO check if you have questions about your on-page SEO.

Once you have identified the spammy links, the first tool you need is patience. It will take time to get things "back to normal." Just as it took you a long time to get into the upper ranks, you will not rebuild a solid link profile in a day; recovering from spammy links will take weeks or months.

Even if you manage to clean up most of the low-quality links, it will be hard to climb back to the top, but at least you will start with a clean slate.

Phase 2: Link Building

I know, right: that's what got you into this mess. Kind of…we're talking real link building this time, white hat links. This is something you should do anyway, and considering you might never get rid of all the bad links, the good must outnumber the bad.

Link Undoing

Gather all the bad links and put them in a spreadsheet. You want to ask for these links to be removed, so send an email to each webmaster asking them to remove the link. You will succeed with some, but you will also be ignored, and even asked to pay money to have a link removed; try to go through the whole list before you move on to the next step. Even a simple form letter is better than nothing.
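If the spreadsheet is long, a small script can at least draft the form letters for you. This is a sketch only: "bad_links.csv" and its column names are assumptions, and it prints drafts for review rather than sending anything, so you can personalize each one before it goes out.

    import csv

    TEMPLATE = """Subject: Link removal request

    Hello,

    Your site links to ours at {link_url}. We are cleaning up our backlink
    profile and would appreciate it if you could remove that link.

    Thank you,
    {your_name}
    """

    with open("bad_links.csv", newline="") as f:
        # Expects columns: link_url, contact_email
        for row in csv.DictReader(f):
            print("To:", row["contact_email"])
            print(TEMPLATE.format(link_url=row["link_url"], your_name="Your Name"))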

I hereby Disavow You

For all the links that you can't get removed, we need to disavow them, using the tool recently introduced by Google. With this tool, you are essentially asking the search engine to treat those backlinks as nofollow. A word of caution, though: Google doesn't like shortcuts, and this is no exception.

Google has a strong warning – and this is very important to note when disavowing links:

This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you only disavow backlinks if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you.

On the page of the tool it says:

If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site. You should still make every effort to clean up unnatural links pointing to your site. Simply disavowing them isn’t enough.

This is why, back up in "Link Undoing," we ask you to make every effort to get the links removed manually; it is just not always possible. Google's words are chosen carefully: you cannot just enter the bad links in the tool and consider it done. Google wants them removed from the internet, and they will be aggressively following up on the sites you identify, so be careful.
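For reference, the disavow tool takes a plain text file: one entry per line, a full URL to disavow a single page, "domain:" to disavow every link from a site, and "#" for comment lines. A minimal example (the domains here are invented):

    # Removal requests sent to both webmasters, no response
    domain:spam-directory-example.com
    http://article-farm-example.net/keyword-rich-post.html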

Conclusion

If you have a huge list of links you want removed, the email outreach route will be a long, probably mind-numbing task. Yet, it’s necessary to make sure that the next close encounters you will have with a penguin will only be at the zoo.  Good luck and let us know how it goes – what works and what doesn’t.


Adding Too Many Pages Too Quickly May Prompt A Manual Review

Google is famous for tweaking its algorithm on a whim, and every time a significant change has been made (think of pandas and penguins), some percentage of websites experience precipitous drops in rankings, or just get wiped out of the search results altogether.

This, of course, leads to seemingly endless questions from webmasters in forums and discussion boards regarding what can be done to stay in Google’s good graces. Google is also famously vague about giving the specific steps that need to be taken to keep your website from being a casualty of an algorithmic adjustment, so most of the time we’re left up to guesswork.

Don't Take It Personally

One of the essential facts that you have to keep in mind is that many of these algorithm adjustments are totally automatic, impersonal, and performed on a broad scale; we’re talking millions of websites here.

These types of "tweaks" are based on certain pre-programmed criteria that are basically like a checklist: if your site meets these specifics, you're placed on the "hit list" automatically. At other times, however, a manual review (i.e., human eyes) may be necessary in order to determine if your site is legitimate or if it falls into the "spam" category in Google's eyes.

Google has a large team of manual reviewers that have been tasked with the assignment of inspecting websites that seem to straddle the fence in this area. Your site is usually not flagged for a manual review unless the automatic process deems it necessary. You could possibly think of this as a constructive thing; at least you didn’t get disqualified automatically.

What Can Trigger a Manual Review?

There is plenty of speculation as to what can trigger a manual review, but over the years, many experienced webmasters have noticed certain patterns in website construction that seem to do the trick. One such activity that can flag your site for a manual review is adding too many pages to your site too quickly at the onset.

Since Google’s engineers are well aware of the onslaught of automated “black hat” sites that can replicate pages on a large scale in mere minutes, it is not advisable that you perform any type of activity that resembles automated page building. Google is ultimately looking for human-built and human-edited content that appeals to other humans, as this will usually provide the best results for the end user.

A Little Patience Goes a Long Way

Given these factors, it will do you much more in the long run to simply be patient and add pages to your website gradually. As to what would be an acceptable amount of pages per hour or per day, that may vary based on what type of website you’re building. If it’s a database-driven site such as a business directory, it may be more understandable to Google’s algorithms if pages are constructed more rapidly due to the fact that content is being pulled from a data source.

But for a blog or something similar, it may look too “fishy” if you’re adding 100 pages a day. As with most things in life, patience will be your friend in this regard. Truth be told, the fundamentals of building a quality and Google-friendly website haven’t really changed too much over the years. If you’re aware of what Google looks for when they’re analyzing websites and stay away from “black hat” activities, you’ll greatly decrease the likelihood of being manually reviewed.
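If you already have a backlog of pages written, one way to keep yourself patient is to queue them over future dates instead of publishing everything at once. A minimal sketch, assuming a list of draft slugs and an arbitrary pace of three pages per day; feed the output into whatever scheduling mechanism your CMS provides:

    import datetime

    drafts = ["page-%03d" % i for i in range(1, 31)]  # placeholder slugs
    PER_DAY = 3  # illustrative pace, not a Google-sanctioned number
    start = datetime.date.today()

    # Assign each draft a publish date, PER_DAY drafts per calendar day.
    for i, slug in enumerate(drafts):
        publish_on = start + datetime.timedelta(days=i // PER_DAY)
        print(publish_on.isoformat(), slug)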


Belosic is the CEO of Nyseoimperio, a self-service custom app design tool used to create apps for Facebook Pages. Follow him at www.zinzz.com or www.twitter.com.