Top 10 Obstacles to SEO Success
The following list discusses the top 10 obstacles faced by today's interactive marketers when pursuing search engine optimization success. These problems result from a lack of strategic or technical optimization; the list does not cover the so-called "black hat" or unethical optimization practices still being utilized by many SEO practitioners, such as link farming, paid link exchanges, and hidden keywords.
This article discusses the issues associated with each of these organic search engine optimization obstacles, as well as how to take corrective measures and implement solutions effectively.
1. Dominant Competition
In many cases, a few industry leaders will dominate a web space. If you are starting a new website, or simply trying to topple a giant, you need to be aware of what the competition is doing well, and what it is not. Often the highest-traffic keywords will have a group of 5 to 10 major competitors vying for them. In many industries, the large players will either have full-time SEO staff or will have SEO agencies on retainer performing intensive organic search optimization. This can make it a time-intensive and costly proposition to try to "dethrone" one or more of the leaders by getting a top 5 ranking in Google for a highly competitive keyword. To circumvent this problem, it's often a good idea to start with a ranking analysis of your competitors. Determine which keywords they rank well for, and which ones they do not. The latter indicate areas of great opportunity for organic optimization. From there, spend time optimizing for the keywords your competitors are not ranking well for. This will give you the advantage in niche areas where your website can be a leader.
2. Excessive Optimization
Excessive optimization, often called "keyword stuffing," is just as detrimental in 2010 as it was in 2009, 2008, 2007 and 2006. People often ask me what percentage of keywords, or "keyword density," a page can have before it should be considered "stuffed." While we utilize a simple formula at Lucid Agency to give a general barometer, it's just as easy to follow this simple rule of thumb: if you can read a page and not tell that it was written for SEO purposes, and it sounds informative and readable to humans, then chances are it is not over-optimized. For example, did you realize that this section is optimized for "excessive optimization"? Probably not, because it was written to be informative and just happens to contain that keyword twice. Organic search rankings are intended to list the most useful and credible websites at the top, so write with that objective in mind.
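To make the "barometer" idea concrete, here is a minimal sketch of a keyword-density check in Python. The formula and sample text are illustrative assumptions, not Lucid Agency's actual formula:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Rough barometer: words consumed by the keyword phrase / total words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # Count phrase occurrences with a sliding window over the word list.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return hits * n / len(words)

sample = ("Excessive optimization hurts readability. Write for people first, "
          "and excessive optimization problems rarely appear.")
print(f"{keyword_density(sample, 'excessive optimization'):.0%}")  # roughly 29%
```

Any such number is only a sanity check; the read-it-aloud test above remains the better guide.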
3. Not Utilizing Social Media in SEO Strategy
Social media is a great way to garner both traffic and inbound links, both of which serve the core objective of SEO: getting more qualified traffic to your website. While many search engine optimization strategies focus on the on-page and off-page elements of organic optimization, they often neglect social media. There are simple ways to utilize social media in search. A simple way to start is with a blog and a Twitter account. Write a few good blog posts on topics relevant to your industry of expertise, providing useful information. These posts could be articles, picture posts, videos, interviews, or white papers; really, anything your intended reader would find useful. Submit these blog posts to the search engines via an XML sitemap. Then build a following on Twitter of people interested in what you have to say, and post teasers to your blog posts. If you do this well, with any luck, some of your followers will link to your blog from their own websites and blogs, gaining you both direct traffic from their readers and inbound links from their web properties.
4. Lack of Original Content
As they say in the world of search engine optimization, “content is king”. This has been the rule for many years, and is likely to prevail as one of the foremost guidelines in SEO. Search engines like content that is unique and new. And they like a lot of it. So just create a lot of good content and you’ll end up working wonders for your SEO campaign. If possible, lightly optimize this content and let the rest take care of itself.
5. Lack of Quality Links
Nothing new here: links are the foundation of the Google algorithm and what originally separated Google from the other search engines. If you consider Google a great "democracy" of sorts, with each link from a credible website a "vote" for your website, then it's easy to see that the site with many votes and a clear, in-depth message will be the winner. Recently, other search engines have followed suit, so it's no longer just Google valuing these links. In addition, it's important to make sure you have links from many websites and that those websites are credible (not link farms). Further, there is some value in the outbound links on your own website: search engines like to see a few links to valuable and related websites.
6. Slow Page Load Times
By many accounts, the new algorithm Google released last year, fondly named "Caffeine," places a new importance on page load times. This makes sense. If Google's mission is to organize the world's information and provide the best results to searchers looking for that information, it wouldn't be doing a great job if the top-ranking websites loaded very slowly, or didn't load at all, thus providing a terrible experience to searchers. To remedy slow load times, try cleaning up code and removing extraneous Flash or third-party loading galleries. If this doesn't help, try a "fresh rebuild," i.e. having a developer go through and rebuild the website in clean new code.
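As a rough way to quantify "slow," a simple fetch timer can baseline a page's load time before and after cleanup. A sketch using only Python's standard library; the URL is a placeholder, and this measures a single full-page fetch, not the full browser rendering time:

```python
import time
import urllib.request

def page_load_seconds(url: str, timeout: float = 10.0) -> float:
    """Time one full fetch of a URL; a crude proxy for server response speed."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # pull the whole document, not just the headers
    return time.perf_counter() - start

# Example (assumes network access):
# print(f"{page_load_seconds('https://www.example.com'):.2f}s")
```

Run it a few times and average, since a single measurement is noisy; real page speed also depends on images, scripts, and rendering, which this sketch does not capture.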
7. Website Structure Issues
If your website has a complicated or unintuitive directory structure, you may want to think about redoing it. The directory structures that work best for top organic rankings are simple and intuitive, as this helps search engines and website visitors identify where in the website they are. Search engines use a directory structure to try to determine the relationship between different categorical items and pages on your website. If your structure makes no sense, it can only be detrimental to your rankings, and often to the user as well. Fortunately, this is easy to fix up front, and if your website runs on a popular CMS platform such as WordPress or ExpressionEngine, there are many free plugins you can use to quickly and easily adjust the directory structure of your website. If you have a website that sells plumbing and construction supplies, you might have a directory structure like www.plumbing-co.com/supplies/welding-supplies.html, where you would list all of your welding supplies; this is clear to both search engines and users alike.
8. Inaccurate or Duplicate Page Titles
Search engines give significant weight to website pages that are clear in their content focus, and titles are one of the best ways to indicate the intended focus of a page. It is a best practice to give every page a unique, descriptive, and lightly optimized title. Don't utilize keywords that are not relevant to the content of the page; just write something useful and accurate, and if it can include the keyword, perfect. Remember, titles are shown in search engine listings, so you want yours to be informative and compelling. After all, you are not only trying to get the top listing in Google; you want people to click on your listing as well.
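A quick way to spot duplicate titles across a site is to count them from a crawl or CMS export. A minimal sketch, with hypothetical pages and titles:

```python
from collections import Counter

# Hypothetical page-to-title mapping, as a crawl or CMS export might produce it.
page_titles = {
    "/supplies/welding-supplies.html": "Welding Supplies | Plumbing Co",
    "/supplies/pipe-fittings.html": "Pipe Fittings | Plumbing Co",
    "/about.html": "Welding Supplies | Plumbing Co",  # accidental duplicate
}

counts = Counter(page_titles.values())
duplicates = sorted(title for title, n in counts.items() if n > 1)
print(duplicates)  # titles shared by more than one page
```

Any title that appears for more than one page is a candidate for a rewrite, since those pages are competing with each other for the same focus.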
9. Missing XML sitemaps
XML sitemaps are a very simple way to make sure websites with many pages and frequently changing content are thoroughly and accurately indexed by search engines. Since search engines love fresh content, it is advantageous to make sure your website utilizes an XML sitemap to ensure your "fresh" content is found and indexed quickly. XML sitemaps are simple to make, and most web developers can implement them quickly.
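For reference, a sitemap is just a urlset of url entries, each with a location and (optionally) a last-modified date. A minimal sketch that generates one with Python's standard library; the URLs and dates are placeholders:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal XML sitemap: a <urlset> of <url> entries."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("https://www.example.com/", "2010-05-01"),
    ("https://www.example.com/blog/", "2010-05-15"),
])
print(sitemap_xml)
```

The resulting file is typically saved as sitemap.xml at the site root and submitted to the search engines' webmaster tools.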
10. New Websites
If you work in the SEO field, you've most likely had more than one potential client come to you with little more than a checkbook and a domain name. Often this potential client will want to "rank number 1 in Google" for their keyword du jour. While most agencies will gladly develop a plan to accomplish this, one of the largest barriers is something that cannot be fixed with any amount of money. Google and Bing (MSN, and soon-to-be Yahoo) value the length of time a domain has been in existence, something often referred to in SEO slang as "the sandbox." According to many industry experts, Google puts new websites in something of a sandbox for a while, in an attempt to thwart large-scale link efforts for new websites. While this is more theory than fact, Google engineer Matt Cutts has said that there are elements of the Google algorithm that may have an effect like the one described as the sandbox. What we can tell you is that we've noticed it certainly takes a little longer to get rankings for a new website than for an older one. This is partially because of the generally lower number of existing links pointing to new websites, as well as the variety of other issues usually present with many new websites. To best counter this problem, get a new website up quickly, and slowly build up links to the website after the initial launch.
In closing, many of the issues hindering successful optimization aren't particularly difficult to solve, nor do they require a PhD in mathematics. Follow some simple, tried-and-true optimization techniques, develop great content, get the word out, track your results, and the rankings will take care of themselves.