Why does my site need SEO? (Part 1)

And how is that going to help me get better rankings, more visits and more money?  And who needs SEO anyway?

If you want to be number one in Google for a given term, you need to start from a couple of things that SEO cannot give you:

  • Get a mission
  • Build a brand
  • Produce amazing content
  • Share it with the rest of the world.

Oh, and don’t forget to let Google know. That’s where SEO can help: if relevance and importance are what you need to be at the top, you don’t want to shoot yourself in the foot by blocking all that from Google’s view.

In this two-post series, you will learn about the possible SEO pitfalls that may limit your ranking in the search engines, and how to fix them.

SEO Issue: You have too many non-text elements

Flash, JavaScript, AJAX – they make your site look fancy, and your designer sure did a good job, but they have a problem: they’re not text, and text is what search engines crawl to understand what a page is about. So unless you are a very well-established brand like Armani or KFC (in which case visitors will use your name to get to your site), remove those elements, or at least keep them to a bare minimum.
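Where you can’t drop a Flash or script element entirely, at least give crawlers a text equivalent. A minimal, hypothetical HTML sketch (the filenames and copy are invented for illustration):

```html
<!-- Always pair non-text elements with crawlable text -->
<img src="collection.jpg" alt="Spring collection of handmade silk scarves">

<object data="showreel.swf" type="application/x-shockwave-flash">
  <!-- Fallback content: crawlers (and browsers without Flash)
       read this text instead of the movie -->
  <p>Our spring collection: handmade silk scarves and linen jackets,
  available in our online store.</p>
</object>
```

The fallback inside `<object>` is standard HTML behaviour, so nothing is being served specially to search engines.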

SEO Issue: You use the same title tags and meta description in every page

Every page needs to have its own piece of content with its own unique title. Google already knows the name of your site from the home page, so don’t use it every time. The same applies to the meta descriptions (the snippet that appears under the link in the search results): make them unique and compelling to click on.
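As a hypothetical illustration (the business and page names are invented), each page gets its own title and description:

```html
<!-- Home page -->
<title>Rossi Shoes | Handmade Italian Footwear</title>
<meta name="description" content="Handmade Italian shoes since 1962. Browse the collection or visit our Milan workshop.">

<!-- Product page: its own unique title and description -->
<title>Men's Oxford Brogues | Rossi Shoes</title>
<meta name="description" content="Full-grain leather Oxford brogues, hand-stitched and available in sizes 39 to 47.">
```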

SEO Issue: Your URLs are dynamic

OK, we’re getting a little technical here, but bear with me. A URL like http://www.example.com/about-us is static (meaning that the page doesn’t change unless you edit it), and will give the search engine a pretty clear idea of the content of the page. On the other hand, when the content is pulled from a database and is the result of different parameters, you will get a dynamic URL with those parameters in it. There are two issues with that: first, the parameters are non-descriptive (they read something like item, sectionid, option and so on), and this won’t give Google any information about the content. Second, when different parameters give the same result, there can be two or more URLs for the same content, leaving the search engine wondering which one should be indexed.
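One common fix is to rewrite the dynamic URL into a descriptive one at the server. A sketch for Apache’s mod_rewrite, using parameter names like the ones above (`option`, `item`) purely for illustration:

```apache
# Serve the database-driven page under a descriptive, parameter-free URL:
# /products/blue-widget is rewritten internally to the dynamic form.
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ index.php?option=com_content&item=$1 [L,QSA]
```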

SEO Issue: You have one home page, but different URLs

Even when your site is not database-driven, your home page might still have different URLs pointing at it. For example: http://example.com – http://www.example.com – http://example.com/index.php – http://www.example.com/index.php

The result of this division is that the PageRank for the home page will be split four ways, diluting your efforts to build authority for your site. The solution to dynamic and multiple URLs is an HTTP status code, 301 (permanent redirect), which points all the different versions to the canonical one, making things clearer for the search engine.
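On Apache, for example, the 301 redirect to a single canonical home page can be sketched in `.htaccess` like this (assuming `www.example.com` is the version you want to keep):

```apache
RewriteEngine On
# Redirect the www-less host to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
# Redirect direct requests for /index.php to the root
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
RewriteRule ^index\.php$ http://www.example.com/ [R=301,L]
```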

The list is not complete – stay tuned for the second part…


Recovering from Spammy and Bad Links

Humboldt Penguin at Whipsnade Zoo. (Photo credit: Wikipedia)

Pandas and penguins used to conjure up images of nice, interesting animals. For the SEO-minded, though, those names are now code names for two of Google’s algorithm updates, which dramatically changed the answer to the question: how do I rank number one in Google?

When and How to fix Bad Links?

The Penguin update in particular (rolled out in 2012) hit sites that presented an “unnatural” linking profile: paid links, links from blogs with barely any posts, meaningless directories, blog networks, obscure forums and rambling blog comments – the vast majority sharing an exact-match keyword anchor text. Google took this whole catalog of low-quality backlinking techniques and made it useless in one fell swoop.

The result was a drop in rankings for many sites, with the inevitable consequences of a huge loss of traffic and, in turn, a huge loss of income.

What to do?

If you are one of those webmasters who saw their sites hit by Penguin or one of its updates, the first thing not to do is panic. First, make sure everything else is up to par – take our free SEO check if you have questions about on-page SEO.

Once you have identified the spammy links, the first tool you need is patience. It will take time to get things back to normal. Just as it took you a long time to climb into the upper ranks, you will not rebuild a solid link profile in a day – recovering from spammy links will take weeks or months.

Even if you manage to clean up most of the low-quality links, it will be hard to climb back to the top, but at least you will start with a clean slate.

Phase 2:  Link Building

I know, right – that’s what got you into this mess. Kind of. We’re talking real link building here – white-hat links. This is something you should be doing anyway, and since you might never get rid of all the bad links, the good must outnumber the bad.

Link Undoing

Gather all the bad links and put them in a spreadsheet. You want to ask for these links to be removed: send an email to each webmaster, asking them to take the link down. You will succeed with some, but you will also be ignored, and occasionally even asked to pay to have a link removed. Try to go through the whole list before you move on to the next step. Even a simple form letter is better than nothing.
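The spreadsheet step can be scripted. A minimal Python sketch (the CSV columns and the letter text are made up for illustration):

```python
# Read a CSV of bad links and produce one form letter per linking site.
import csv
import io

FORM_LETTER = (
    "Hello,\n\n"
    "Your site links to ours at {url} with the anchor text \"{anchor}\".\n"
    "We are cleaning up our backlink profile and would be grateful if\n"
    "you could remove this link.\n\n"
    "Thank you."
)

def outreach_emails(csv_text):
    """Yield (contact, letter) pairs, one per row of the bad-links CSV."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        yield row["contact"], FORM_LETTER.format(url=row["url"], anchor=row["anchor"])

data = """contact,url,anchor
webmaster@spamblog.example,http://spamblog.example/p1,cheap widgets
admin@linkfarm.example,http://linkfarm.example/dir,best widgets
"""

for contact, letter in outreach_emails(data):
    print(contact)  # send `letter` to this address
```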

I hereby Disavow You

For all the links that you can’t get removed, you need to disavow them, using the tool recently introduced by Google. With this tool, you are effectively asking the search engine to treat those backlinks as nofollow. A word of caution, though: Google doesn’t like shortcuts, and this is no exception.

Google has a strong warning – and this is very important to note when disavowing links:

This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you only disavow backlinks if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you.

On the page of the tool it says:

If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site. You should still make every effort to clean up unnatural links pointing to your site. Simply disavowing them isn’t enough.
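For reference, the file you upload to the tool is plain text: one URL or `domain:` entry per line, with `#` for comments (the hostnames here are placeholders):

```
# Links we asked to have removed, no response received
http://spamblog.example/page-with-link.html
# Disavow every link from this directory site
domain:linkfarm.example
```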

This is why, back up in “Link Undoing”, we ask you to make every effort to get the links removed manually – it is just not always possible. Google’s words are chosen carefully: you cannot just enter the bad links in the tool and consider it done, because Google wants them removed from the web. They may well follow up on the sites you identify, so be careful.


If you have a huge list of links you want removed, the email outreach route will be a long, probably mind-numbing task. Yet, it’s necessary to make sure that the next close encounters you will have with a penguin will only be at the zoo.  Good luck and let us know how it goes – what works and what doesn’t.


Three Things You Need to Know about SEO

In order for businesses to be successful, it is crucial that they develop an effective online presence. This presence must maintain a good reputation, include professional-looking website content and much more. To ensure this content is as visible as possible, meaning it receives good rankings on a search engine’s results page, it must incorporate a variety of search engine optimization (SEO) tactics.

Businesses often have more than just a company website on the Internet. They may also have blogs, social media profiles and more. All of these online formats need to include SEO techniques, because the biggest advantage gained in doing so is that potential customers will be directed to a particular website, helping to convert them into paying and returning customers.

When it comes to implementing SEO tactics, there are several techniques that can lead to negative consequences. First of all, SEO tactics should always be carried out by a professional who has experience with this type of marketing strategy. Secondly, “black hat” SEO methods should never be implemented; they can lead to the loss of a great deal of money as well as a ruined reputation for a business.

Vroom Digital says that before implementing SEO methods, there are three very important things that all businesses need to know:

  1. Poor Quality Leads to Negative Consequences

    There have been many businesses that have spent thousands of dollars on SEO and never obtained any kind of positive ROI; this is mostly because the SEO content they invested in was of poor quality.

    It is important to keep in mind that the implementation of SEO tactics does not necessarily guarantee a number one position on Google’s search results. In fact, poor quality can lead to the wrong types of traffic being directed to a particular website as well as to a site being banned from major search engines. All of this will lead not only to a loss on the amount of money invested, but also a loss on the amount of money that could have potentially been gained.

  2. Importance of Off Page SEO

    Optimizing the content of a company’s website (on-page SEO) is of course very important, but it is equally important to optimize content found on the company’s blogs and social media profiles (off-page SEO). In doing so, these online outlets will include keywords and links that direct traffic back to the company’s website.

  3. Importance of Keywords

    Effective SEO tactics involve the correct placement of particular keywords in web content. The keywords must be properly placed in titles, headlines, articles, introductory statements, closing statements and more. There is also a rough percentage of the text that the keywords should account for; the appropriate figure depends on the total number of words in the content being displayed. The best way to ensure keywords are placed in the appropriate spots, and used in a natural manner, is to hire an SEO company with extensive experience developing SEO content.
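The “percentage” idea above is keyword density: the share of a page’s words taken up by a phrase. A quick way to sanity-check it, sketched in Python (the sample text is invented, and any target percentage is an SEO rule of thumb, not a figure published by Google):

```python
# Rough keyword-density check: what share of the words on a page
# consists of a given keyword phrase.
import re

def keyword_density(text, keyword):
    """Return the keyword phrase's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    # Each occurrence of the phrase contributes n words to the total count.
    return 100.0 * hits * n / len(words)

sample = ("Fresh coffee beans: we roast coffee beans daily, "
          "so your coffee is always fresh.")
print(round(keyword_density(sample, "coffee beans"), 1))  # prints 28.6
```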


Getting To Grips With Google Webmaster Tools

Whether you have only just started your own website, are an experienced webmaster or are an SEO professional, Google Webmaster Tools can be your friend. However, as the old saying goes, with great power comes great responsibility. If you’re serious about pursuing an SEO strategy and developing your site so that you can reach page one for your chosen keywords, then Webmaster Tools will be key to that quest. Misuse them, however, and they can be as bad as the most horrific black-hat search strategy you could ever dream up.

Rather than being a ‘warts and all’ guide to all of the ins and outs of Google Webmaster Tools, this article looks at some of the more important features, why you should pay attention to them, and how they will impact your search success in both the short and long term.

For easy reference with the webmaster tools interface, so you can click across and check them for yourself as you read this, the points we explore will start at the top of the dashboard interface and work their way down.

Check Your Messages

We’d suggest doing this on a daily basis. You don’t need to check anything that you deem to be low priority, but don’t fall into the dangerous trap of “I never receive anything of value.”

The reason?

Chances are that the day you take that outlook is the day Google themselves have sent you a message warning of potential spam, or that unnatural links are popping up across the internet pointing right at your site.

This could be nothing to worry about, or it could mean that you’re about to be hit with a heavy penalty that could damage your site and business. Either way, it is something you need to know about.

Tailor Your Settings Accordingly

Altering your settings is one of the few opportunities you get to tell Google what you want, rather than taking action to fit in with what they’re looking for.

The settings function of webmaster tools gives you three options.

  • Geographic location – If you have a generic top-level domain, such as .com or .org, this can have a detrimental effect if you’re targeting a specific location, such as the UK. Selecting the location you want your site to target will mean it is marked as relevant for that area, ensuring you appear in the results you want to, as well as those that mean the most to the businesses or consumers doing the searching.
  • Preferred domain – How do you want your website name to appear in search engines? The choice is simple: either have the www before your site name and top-level domain, or have it without. You might have to verify – and check! – that you own both, but choose the way you would like Google to index your site.
  • Crawl rate – It is usually best just to leave this setting at Google’s recommended rate. However, if you have bandwidth issues, crawling can cause your site to become significantly slower, so you may need to limit the rate. We recommend solving any underlying website issues first, however, because if Google crawls your site less frequently, you could miss out on ranking improvements.


Sitelinks

Sitelinks, if you weren’t already aware, are the links that appear beneath your domain name in search results.

Sitelinks are small or large, and as such appear in two ways.

  • Small sitelinks are probably the ones you are most familiar with. This is when a series of links appears in one line, almost like a breadcrumb trail, in blue after the page or site description.
  • Large sitelinks are when you see the title page of the website, the domain, and then a selection of six pages, and their descriptions, beneath the homepage listing.

Unfortunately, you cannot control whether you have small or large sitelinks, or any at all, in your search listing. It depends on the level of authority and credibility your site has in relation to the search term. For example, if you are a professional SEO and you acquire large sitelinks for “professional SEO services,” you can be happy that your content and search strategies are working.

So where do Webmaster Tools come in? For sitelinks, their purpose is to let you demote a particular link. For example, if Google is showing the following pages in your sitelinks,

  • About Us
  • Products
  • Services
  • Testimonials

but you don’t want one of them to appear, you can ask Google to demote it. However, think carefully before doing so. What if the content on that particular page is what won you the sitelink in the first place? Losing sitelinks can happen and, although it is not a traditional Google penalty, your site can still suffer as a result.

URL Parameters

You can use this tool if there are certain pages on your site you do or do not wish for Google to index and appear in search results. However, for a novice using webmaster tools for the first time, it is easier than you’d think to make a mistake and find that suddenly 70% of your website is off the radar.

If in doubt, don’t use it; instead, make use of the easier-to-implement noindex/nofollow meta tags or robots.txt to keep the pages you don’t want the search engines to see out of the index. If you’re still developing certain pages, use your content management system to set them up as ‘hidden.’
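A `robots.txt` file in the site root is the simplest of those options. A sketch (the paths are placeholders):

```
User-agent: *
Disallow: /drafts/
Disallow: /print/
```

For individual pages, the equivalent is a `<meta name="robots" content="noindex, nofollow">` tag in the page’s `<head>`.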

Change of Address

When you started your business or launched your website, you might not have been able to get the domain you wanted. Whether you have since won an auction at a site like GoDaddy or managed to secure a backorder for a particular domain, use this tool to tell Google about the move.

This way, all of your existing SEO efforts will remain relevant and ensure you maintain your existing search position.

If you fail to tell Google, well, you’ve probably worked out what happens.

Crawl Errors

If we weren’t working our way down Google’s dashboard, we would probably feature this somewhere nearer to the top of the list.

This page details any problems with your site that Google has identified that you need to deal with. Don’t be fooled into thinking that Google are just picking up on problems for the sake of it; this is something that should be near the top of your agenda on a daily basis.

Ultimately, you don’t know if any of these errors are potentially affecting the user experience of your site. Why risk losing a customer because of a broken link or page when you can probably fix it in minutes?

Crawl Stats

Use this feature to analyse how many pages Google crawls each day, as well as how much data is downloaded and the time taken to do it.

You cannot do much in the way of direct action with this report, but you should use the following as indicators:

  • If you are seeing a lot of pages crawled, that is positive, meaning Google is taking an interest in your content
  • If lots of content is being downloaded, this is another positive sign. Be sure to check that these downloads are in areas where you have submitted fresh content recently – why would Google be spending time downloading an ‘About Us’ page that hasn’t changed in months?
  • Download time is the key indicator: it shows how quickly your pages load, something Google uses to directly influence your search ranking.

Which URLs are Blocked?

We find it strange this page doesn’t follow on from the URL Parameters page, but we’re not about to start questioning Google’s might. Use this to see the pages that Google aren’t crawling and indexing.

If you find that you are blocking a page filled with valuable, fresh new content, get it unblocked so it can begin to be indexed immediately!

Fetch As Google

If you ever wanted to put yourself in Google’s shoes, then this is your chance. All it does is show you whether pages on your site are accessible. It doesn’t sound like much, but if you have a large site running to hundreds of pages, it can be a great way to identify errors quickly.

Indexed Status

This is a great report to use in conjunction with the two previously mentioned and the URL Parameters tool. It shows you how many pages of your website are indexed against how many there actually are in total.

If these numbers are far apart, what could it show?

  • You have several pages of duplicate content
  • Some pages are inaccessible entirely or are too slow to download
  • You haven’t used the Blocked URLs report correctly or taken action
  • You need to hire a professional webmaster to take a look at your site

Okay, so we said the last thing in jest, but it should underline just how serious a problem you could have.


Malware

Nothing flashy about what goes on here, but again it is something you need to check regularly.

Obviously, you’re hoping not to find any issues, but if you do, you can usually deal with them quickly.

Search Queries

This screen doesn’t do all that much in terms of value. Rather than paying any great attention to it in terms of your search ranking, or who is searching from where, just look out for peaks and troughs in search queries and investigate why they might have happened. If you have an issue, however, it is likely to be visible in several other webmaster tools.

External Links

You will already know about the power of link building and the impact that both good and bad links can have on a website. This tool is great for discovering the bad links that are likely to be undermining your search strategy, so you can then head off and potentially do something about it.

Here, you can see a list of all the sites linking to you, and the anchor text that they use.

How you deal with incoming links you don’t want is your choice. If you have a shedload of them and want the quickest solution, showing visitors who arrive from those sites a 404 page, or redirecting them elsewhere, might be the best course of action.
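On Apache, that “quickest solution” could be sketched as a referrer check in `.htaccess` (the domain is a placeholder for whichever site you want to cut off):

```apache
RewriteEngine On
# Send visitors arriving from the unwanted site a 404 response
RewriteCond %{HTTP_REFERER} spamsite\.example [NC]
RewriteRule .* - [R=404,L]
```

Note this only affects human visitors following the link; it does not remove the link itself or its effect on your backlink profile.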

Webmaster tools does contain a disavow links tool, which you’ll find at the bottom of the list.

Internal Links

One huge mistake that many novice SEOs make is underestimating the power of internal links. Another is going too far and flooding their sites with internal links, an action that then sees them land a penalty or simply not be indexed at all.

Use internal links as much as you can but only where, and when, it is relevant to do so. If you’re going to flood all of your pages with links, you might as well go submitting your site to a link farm, as it’ll have the same impact.

This page will show you your internal linking pattern, and give you an idea of whether you’ve gone overboard or not.


Sitemaps

Google love a sitemap, and having a good one is a great way to pick up a quick win in terms of SEO. Ensure you submit at least one – the XML sitemap – to Google, and check this page regularly to look out for any errors that need resolving.
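An XML sitemap follows the sitemaps.org protocol; a minimal example (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
  </url>
</urlset>
```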

Removed URLs

While it might seem that this should be with the other URL related tools, we love the fact that this is kept out of the way. This feature allows you to remove pages of your website from Google’s index.

Like URL Parameters, we’d recommend novices stay away from it, but even if you’re experienced, mistakes are so easy to make that it might be worth avoiding altogether.

HTML Improvements

Are you being penalized for duplicate content but have no idea where it all is? This might sound outlandish, but on a large website, it can happen easily.

This tool is your friend, however, as it will point you towards any duplicate content on your site, as well as title tags and meta descriptions that are too long or too short, and any other problems with page titles.

Content Keywords

You probably already know the keywords you are using on your site, but this function will allow you to see your most common keywords, enabling you to come up with a plan for using other relevant words that will make your overall site content clearer in the eyes of Google.

Structured Data

Some industry analysts are predicting that structured data is set to become a huge part of SEO algorithms through 2013 and beyond.

This tool is a great way to begin getting to grips with it. It will tell you how many pages feature structured data, as well as the types of structured data they contain.

Other Resources

This small section of Webmaster Tools contains three features that you could find useful:

  • Rich Snippets is a testing tool for checking that Google can understand any markup, such as Schema.org, you have used on your site.
  • Google Places – now called Google+ Local, but shockingly Google haven’t updated it yet! – can be used to input your location if you haven’t yet started your local optimization activities.
  • Google Merchant Center is for e-commerce sites to upload their product data. This is now integrated with AdWords, so ensure you open an AdWords account, or connect them if you already have one, to rank well on the ‘Shopping’ pages of search results.
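The Rich Snippets tester above checks markup such as schema.org microdata. A hypothetical example for a local business (the name, address and phone number are invented):

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Rossi Shoes</span>,
  <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">12 Via Roma</span>,
    <span itemprop="addressLocality">Milan</span>
  </span>
  Phone: <span itemprop="telephone">+39 02 0000 0000</span>
</div>
```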

Author Stats

Your Google+ account is probably the best place to gauge your authoring successes, but this will show how much your articles are being read if your author account is attached to your website.

Custom Search

If you have a Google search bar on your site, this is where you can see the most popular search terms. This might sound needless, but it could give you some idea as to whether your landing pages convey what your site is about well enough.

Instant Previews

This is similar to Fetch as Google, but can show you what your site will look like on a mobile device, which could be useful as web design becomes more mobile-orientated over the coming months.

Site Performance

We’ve only listed this because it is still in the webmaster tools, but Google don’t actually use it anymore.

Instead, use the Analytics ‘Site Speed’ tool or the PageSpeed Insights tool.

Disavow Links

We’d only suggest you use this tool if you’re an experienced SEO and know the real differences between a good and bad link. If you’re unsure, then stay away from it. You don’t want to disavow a link thinking it is coming from a bad site when in fact it is good!


7 Reasons You Should Design Your Website In WordPress

WordPress has quickly become a world-renowned name for those looking to build an affordable and dynamic website, but many are still intimidated by creating their own site. It often seems easier to simply go with the first web designer one finds, but WordPress offers a number of features that allow anyone to carve out their own piece of the internet with an eye-catching, useful and profitable site. For those still unsure about using WordPress, here are seven of the top reasons business owners, private parties and seasoned web developers may wish to make the change.

1. Complete Freedom with Open-Source Code
One of the biggest advantages of using WordPress is the fact that almost all the code involved in downloading, creating and refining one’s website is completely open source. This means that all users can not only see the code at any time, but change it to fit their own needs and purposes. For those taking on web design for the first time, this feature will only be used lightly at first. As one’s knowledge grows, so will the ability to completely customize a website and manipulate the code in any manner one wishes.
2. Advanced Security Plugins
A major concern for those that are creating a business website is the security of their own company, their employees, and their customers. Even a single breach of one’s security could harm a company for months or years on end. While negligent website owners will always have an issue with security, there are a number of free and paid plugins that can protect one’s hard work and can be downloaded directly through the user interface.
3. Simplified Content Management
Over time, every website will begin to gather huge amounts of data, including written documents, user information and media files. With WordPress, these files are organized in an easy-to-use manner with unique links to all new files that are uploaded. Those working on the website can even create various categories and sub-folders based on keywords, subject or unique folders.
4. Affordable Themes
While the experienced web designer can develop each page from scratch or reuse previous code, thousands of themes are also offered around the internet. These themes come with a wide array of options, and users can search for themes according to the kind of website they would like to create. Some basic themes are free to use, while others can be purchased for a single website, or the user can buy all rights to the code.
5. Complete Support
WordPress is continually refined and improved by thousands of developers from around the world. Due to the sheer amount of support, users will be able to find assistance for any problems that they may encounter any time of the day. In addition to an integrated support and help system, there are a number of WordPress-created and third-party forums and support groups for those that have questions about their own website.
6. Control Over Search Engine Optimization
No matter how well-built a website is, it will never be able to keep up with the competition without SEO, or search engine optimization. Website optimization has become a unique and complex industry in and of itself, but WordPress has taken much of the guesswork out of this field. Users can quickly copy and paste bits of code into their website to improve their standing within search engines as well as utilize tools such as Google Analytics.
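One such copy-and-paste snippet is the classic asynchronous Google Analytics tracker of this era, which goes in the theme’s header template; `UA-XXXXX-X` is the standard placeholder for your own property ID:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);

  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```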
7. Flexible Server and Hosting Options
WordPress does offer limited hosting options and domain names at absolutely no cost, and this is a great way for private parties to begin the process of creating their own personal website. For businesses that would like a bit more freedom, they have the options to purchase their own domain names and then utilize their own servers or purchased server space.


Kyle Sanders is a WordPress enthusiast and founder of Complete Web Resources, a professional web strategy firm based out of Denver, Colorado. When he’s not writing for the web and designing WordPress plugins, he enjoys the outdoors and craft beers.
