By Arnie Kuenn published December 18, 2014

10 Most Common SEO Pitfalls

Just over 20 years ago, internet marketing was not on the radar for most businesses. Nowadays, every company has a website; believe it or not, 571 new websites are created every single minute. With that, most industry markets have become increasingly competitive, compelling companies to provide the best online experience for their audience while ensuring their content can be found by search engines.

However, many businesses weren’t aware of SEO best practices when they first published their website, and since then have received a great deal of misinformation regarding SEO tips and tactics. Because of this, they experience the same SEO pitfalls over and over, costing them search-engine rankings and traffic. Here are the top 10 most common SEO mistakes we see, and how to correct them:

1. Unintentional duplicate content

Most of us know that duplicate content online is a big no-no. But did you know that most duplicate content is unintentional? Unfortunately, search engines don’t take intent into consideration when ranking sites, so identifying and editing duplicate content is necessary.

Unintentional duplicate content can happen for a number of reasons – secure HTTPS pages, URL parameters, and CMS templates, among others. Fortunately, many tools can detect duplicate content, including Screaming Frog, Link Sleuth, and Moz Crawl Test. Once it is discovered, you have to decide how best to tell search engines not to index certain duplicate pages, whether through a noindex or nofollow directive or a rel=canonical tag.
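To make the detection step concrete, here is a minimal Python sketch of how a crawler might flag duplicate-content candidates by fingerprinting each page's visible text. The URLs and page bodies below are hypothetical; in practice, the input would come from your own site crawl:

```python
import hashlib
import re

def content_fingerprint(html: str) -> str:
    """Strip markup and collapse whitespace, then hash what remains.
    Pages that share a fingerprint are duplicate-content candidates."""
    text = re.sub(r"<[^>]+>", " ", html)          # drop tags
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict) -> list:
    """Group URLs whose visible text is identical."""
    groups = {}
    for url, html in pages.items():
        groups.setdefault(content_fingerprint(html), []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

# Hypothetical crawl output: the HTTPS and parameterized URLs serve the same body.
pages = {
    "http://example.com/page":         "<p>Widget specs and pricing.</p>",
    "https://example.com/page":        "<p>Widget specs and pricing.</p>",
    "http://example.com/page?ref=nav": "<p>Widget specs and pricing.</p>",
    "http://example.com/about":        "<p>Contact us for a quote.</p>",
}
dupes = find_duplicates(pages)  # one group of three duplicate URLs
```

Once a group like this surfaces, you would pick one canonical URL and point the others at it with rel=canonical, or keep them out of the index with noindex.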

2. Bad backlinks

Though link building is still an important part of SEO, not all links are good links. Bad backlinks, such as links from irrelevant pages, link directories, and spammy websites, can actually hurt your search-engine rankings.

To fix bad backlinks, gather your backlink data from Open Site Explorer or MajesticSEO and remove links that look unnatural. This isn’t easy: it means a very manual outreach process, asking webmasters to remove the links from their sites. But it is imperative for upholding your search-engine rankings and credibility in Google’s eyes. Also, use Google’s link disavow tool to tell Google not to consider certain links when ranking your site.
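For reference, Google documents the disavow file as a plain UTF-8 text file with one entry per line, where a `domain:` line disavows every link from that site. The domains below are placeholders:

```text
# disavow.txt – uploaded through Google's disavow links tool
# Lines starting with # are comments.

# Disavow a single spammy page:
http://spammy-directory.example/links.html

# Disavow every link from an entire domain:
domain:link-farm.example
```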

3. Over-optimization and cannibalization of target keywords

Though putting your target keywords on every page may seem like a great idea, it can be damaging to your SEO strategy. Think about it: Search engines need to display the most relevant page based on a search query. If you have multiple pages optimized for the same target keyword, you’re leaving it up to the search engine to decide which page to show users.

Over-optimization and cannibalization can be remedied through a robust canonical SEO strategy, which clearly assigns a keyword to a sole canonical page. This sends a clear message to search engines as to which page is the most relevant for a given keyword and is a better experience for the user, as the most pertinent page will be displayed.
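A cannibalization audit can start from something as simple as a map of each page to its target keyword (pulled from your titles and H1s, for instance). This Python sketch, with hypothetical pages and keywords, flags any keyword claimed by more than one page:

```python
from collections import defaultdict

# Hypothetical map of page -> target keyword, e.g. pulled from titles and H1s.
page_keywords = {
    "/services/seo":    "seo services",
    "/blog/seo-basics": "seo services",   # competes with /services/seo
    "/services/ppc":    "ppc management",
}

def find_cannibalization(page_keywords: dict) -> dict:
    """Return keywords that more than one page is optimized for."""
    by_keyword = defaultdict(list)
    for page, keyword in page_keywords.items():
        by_keyword[keyword].append(page)
    return {kw: pages for kw, pages in by_keyword.items() if len(pages) > 1}

conflicts = find_cannibalization(page_keywords)
```

Each conflict is a decision point: pick the canonical page for that keyword, then consolidate or re-target the others.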

4. Titles and title tags

Web page titles and title tags are important for several reasons: Title tags can help with page rankings, the title shows up in the search engine results page (SERP) itself, and the title is shown when the page is shared on social sites like Twitter and Facebook.

There are many common mistakes associated with titles and title tags, including:

  • Too long: Search engines only display the first 60 or so characters of a web page title tag, so a title tag that’s too long will be cut off.
  • Too short: Though you don’t want the tag to be cut off, you also want to make sure to maximize the available text space so you don’t give up prime SEO real estate.
  • Irrelevant/wasted space: Including words like “home” or your domain name can be a waste of space. With a finite number of characters, your title tag needs to include the most important information – the relevant keyword/term for that specific page.
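The three checks above are easy to automate. Here is a rough Python sketch using naive regex parsing (a real audit would use a proper HTML parser); the 60-character limit is an approximation, since Google actually truncates by pixel width:

```python
import re

MAX_TITLE = 60  # rough display cutoff; the real limit is pixel width, not characters

def audit_title(html: str):
    """Return (title, issues) for one page's HTML."""
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if not match:
        return None, ["missing title tag"]
    title = match.group(1).strip()
    issues = []
    if len(title) > MAX_TITLE:
        issues.append(f"too long ({len(title)} chars); will be truncated in the SERP")
    elif len(title) < 20:
        issues.append(f"too short ({len(title)} chars); wasted space")
    if "home" in title.lower():  # naive filler-word check
        issues.append('contains filler word "home"')
    return title, issues
```

Run against every page in a crawl, this surfaces the missing, truncated, and wasted-space titles in one pass.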

5. Meta descriptions

Similar to a title tag, the page’s meta description is displayed on the query’s results page. However, all too often pages have poorly written meta descriptions that are full of keywords, or worse, no meta description at all.

A good meta description is useful to the reader, and should explain what the page is about in a short sentence. It should include the primary keyword or phrase, but still make sense to the person reading it.
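As with titles, a quick script can flag the worst offenders. This sketch uses naive regex parsing (it assumes the `name` attribute comes before `content`) and a 160-character cutoff, which is an approximation of the snippet limit:

```python
import re

def audit_meta_description(html: str) -> list:
    """Flag a missing, empty, or overlong meta description."""
    match = re.search(r'<meta\s+name="description"\s+content="([^"]*)"',
                      html, re.IGNORECASE)
    if not match:
        return ["missing meta description"]
    description = match.group(1).strip()
    if not description:
        return ["empty meta description"]
    if len(description) > 160:  # rough SERP snippet cutoff
        return [f"too long ({len(description)} chars); likely truncated"]
    return []
```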

6. Image optimization

Image optimization is also a significant part of an SEO strategy. Unlike text, images can’t be read by search engines, so they aren’t easily indexable. Best practices include optimizing image metadata with relevant keywords and phrases:

  • Alt tags serve as the text shown when an image cannot be displayed because of a slow internet connection or text-reading software. The alt tag explains what the image is.
  • Title attributes are the words that appear when the user hovers over an image, providing additional contextual clues.
  • File names go one step further in providing context – more specifically, how the image relates to the other content on the page.
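Missing alt text is the easiest of these to catch automatically. A short Python sketch (naive regex parsing, hypothetical page fragment) that lists images with a missing or empty alt attribute:

```python
import re

def images_missing_alt(html: str) -> list:
    """Return src values of <img> tags with a missing or empty alt attribute."""
    flagged = []
    for tag in re.findall(r"<img\b[^>]*>", html, re.IGNORECASE):
        src = re.search(r'src="([^"]*)"', tag)
        alt = re.search(r'alt="([^"]*)"', tag)
        if alt is None or not alt.group(1).strip():
            flagged.append(src.group(1) if src else "(no src)")
    return flagged

# Hypothetical page fragment: only the first image is properly described.
page = '''
<img src="widget-diagram.png" alt="Exploded diagram of the widget">
<img src="IMG_0042.jpg">
<img src="banner.png" alt="">
'''
missing = images_missing_alt(page)  # ["IMG_0042.jpg", "banner.png"]
```

Note that the second flagged image also fails the file-name test: "IMG_0042.jpg" tells search engines nothing about the content.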

7. Page-load time

Page-load time is one of the 200 ranking factors that Google uses. However, page-load time is also a huge factor in usability; if a page takes too long to load, users won’t stick around.

There are many ways to optimize page-load time, including using appropriate image sizes and formats, avoiding unnecessary plug-ins, CSS, and HTML, and reducing redirects.
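Redirect chains are a load-time cost that's easy to quantify once you have crawl data. This sketch counts the hops a URL takes before resolving, given a url-to-target map you would build from your crawler's 301/302 responses; the chain below is hypothetical:

```python
def redirect_hops(redirect_map: dict, url: str, limit: int = 10):
    """Count redirect hops before a URL resolves, given a url -> target map.
    Returns -1 if a redirect loop is detected."""
    hops = 0
    seen = set()
    while url in redirect_map and hops < limit:
        if url in seen:
            return -1  # redirect loop
        seen.add(url)
        url = redirect_map[url]
        hops += 1
    return hops

# Hypothetical chain: http -> https -> www (two hops where one would do).
redirect_map = {
    "http://example.com/":  "https://example.com/",
    "https://example.com/": "https://www.example.com/",
}
```

Every hop adds a round trip before the page even starts loading, so chains longer than one hop are worth collapsing into a single redirect.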

8. Poor content

Thin content and/or disregard for the user are among the most common mistakes we see. Write more. Answer your audience’s questions through useful data, tips, and visuals. Since the first Panda update, Google has been doing its best to disregard results that aren’t useful to users. This can be easily combated by always keeping the audience in mind when creating and publishing content.

9. Keyword misfocus

Keyword misfocus is when the primary keyword in the title tag doesn’t match the H1 heading, page URL, body content, or images. It can be corrected through strategic internal anchor-text links, content consolidation, and proper rel=”canonical” tag implementation, all of which strengthen the primary page for its intended keyword.
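A misfocus check boils down to asking, for each page, which elements actually carry the target keyword. A rough Python sketch (naive regex parsing; the page, URL, and keyword are hypothetical):

```python
import re

def keyword_alignment(html: str, url: str, keyword: str) -> dict:
    """Report which on-page elements contain the target keyword."""
    kw = keyword.lower()
    title = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.IGNORECASE | re.DOTALL)
    return {
        "title": bool(title and kw in title.group(1).lower()),
        "h1":    bool(h1 and kw in h1.group(1).lower()),
        "url":   kw.replace(" ", "-") in url.lower(),
    }

# Hypothetical page: the H1 drifts away from the keyword the title targets.
report = keyword_alignment(
    '<title>SEO Services | Acme</title><h1>Our PPC Offering</h1>',
    "https://acme.example/services/seo-services",
    "seo services",
)
```

Any False in the report is a misfocus candidate: the title, heading, and URL should be telling search engines the same story.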

10. Indexability issues

It’s simple: If a page can’t be indexed, it won’t be included in the SERPs. Everything from blocked and missing pages to broken links and redirects can cause indexability issues, making it crucial for you to consistently monitor each page. If a search engine cannot get to your page, neither can your users. As a result, the page won’t be indexed or viewed, which can cause an increase in bounce rate and a decrease in traffic.
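The two most basic indexability gates are the HTTP status code and the robots meta directive. This Python sketch checks both; a full audit would also need robots.txt rules and the X-Robots-Tag HTTP header, and it uses naive regex parsing for illustration:

```python
import re

def is_indexable(status_code: int, html: str) -> bool:
    """A page can be indexed only if it returns 200 and carries
    no noindex directive in its robots meta tag."""
    if status_code != 200:
        return False  # 404s, 5xx errors, and redirects won't be indexed as-is
    robots = re.search(r'<meta\s+name="robots"\s+content="([^"]*)"',
                       html, re.IGNORECASE)
    return not (robots and "noindex" in robots.group(1).lower())
```

Running a check like this across a crawl catches the page that was accidentally left noindexed after a site launch – a surprisingly common cause of "why isn't this page ranking?"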

Each of these 10 SEO pitfalls can result in lower website rankings, as well as a poor user experience. From unintentional duplicate content and bad backlinks to slow page-load time and low-quality content, all are fixable. Without these mistakes holding you back, your website will be on its way to the top of the SERPs.

What is the most common SEO pitfall you see?

Want more expert advice on addressing SEO and other content marketing challenges? Check out all the fantastic CMW sessions that are available through our Video on Demand portal.

Image courtesy of Joseph Kalinowski/Content Marketing Institute

Author: Arnie Kuenn

Arnie Kuenn is the CEO of Vertical Measures, a content marketing agency with an SEO foundation, focused on helping their clients get more traffic, more leads, and more business. Arnie has held executive positions in the world of new technologies and marketing for more than 25 years. He is a frequent speaker and author of Content Marketing Works. In 2014, Arnie was honored as the Interactive Person of the Year in Arizona. You can find Arnie on Twitter, Facebook, Google Plus and LinkedIn.

Other posts by Arnie Kuenn


  • Jason Lax

    I would add not validating with Google Webmaster Tools (or Bing’s service). There’s great FREE insight and nice features like specifying how the bots should interpret URL parameters, 404 errors, penalties, search queries with how many impressions and clicks are generated + rank, etc.

    • Arnie Kuenn

      Thanks Jason – good add. The list above was what we saw most in 2014, not sure how often our team saw companies not validating with GWT tools though.

  • Melodee May Dinh

Which tools can identify the 404 error pages on my site?

    • Arnie Kuenn

      Google webmaster tools is probably the best tool to use for this.

      • Melodee May Dinh

        Thanks Arnie Kuenn

    • Shai Geoola

      If you’re using WP CMS, there is also a plugin that provides you that information on your dashboard.

  • Ahmad Imran

    As a new blogger with increasing content, the most I am worried about is the duplicate content. Can anyone guide me to a good source of information or plugin to deal with the duplicate content ?
    Great article by the way and thanks for sharing Arnie,

    • Arnie Kuenn

      Assuming your blog is on WordPress, I would recommend the Yoast SEO plugin which has great built-in settings to NOINDEX the majority of duplicate content sources such as tag and archive pages.

      Yoast also provides great SEO guides for WordPress and goes into depth with handling dupe content issues.

      Also recommend that you implement Google Webmaster Tools and watch the HTML improvements section which will report duplicate titles and descriptions which will lead to specific sources as they arise.

      • Ahmad Imran

        Brilliant, I have actually installed the YOAST plugin and really like it. I think it is time now to dig deeper and use it more religiously. Thanks for your advice. Appreciated.

      • AJ

        Hello Arnie, are you saying that it’s best to have tag and other archive pages in WordPress not indexed by Google? What about category pages? Thanks.

  • Title Master

    Having title tags that are “too long” is not necessarily a bad thing. Google may cut off part of the title tag in the search results, but it still reads the entire thing. That doesn’t mean you should jam it with 100 keywords or anything ridiculous like that, but you don’t have to be really uptight about 60 characters.

    • Arnie Kuenn

      User experience is important too, so if you want it to display correctly in search results the length of the title tag does matter.

  • Ray Cameron

    Thanks Arnie! At a time when people are wondering what’s still relevant, what isn’t and what’s new, this is a great list that I think nails it!

    • Arnie Kuenn

      Thanks Ray – appreciate it.

  • Sami

Duplication of content happens often, and it makes most SEO efforts useless. Therefore, creating fresh, informative content is important. Duplication can be avoided by adopting a different style of writing: do some research before you write, read some related articles, and then write with those other posts in mind, trying a different presentation for your content.

    http://gatelogix.com/blog/marketing/seo/seo-tips-not-to-ignore-while-blogging/

  • Robert Gibb

    As a member of the web performance industry, I really know the importance of page load time. For instance, I know that about 50% of people expect a website to load in 2 seconds or less, and that 80% of that 50% abandon a page if it takes more than 3 seconds to load.

    Website visitors hate slow sites, and therefore Google hates them. After all, if the page doesn’t load, those little AdWords ads don’t get seen or clicked on. And if they don’t get clicked on, Google goes broke!

    Hence the reason Google is ranking fast-loading websites higher. (They also really do care about the user, too. No doubt about that.)

    Anyways, until I started working for a content delivery network (CDN), I only vaguely knew about the importance of web performance (and the impact it had on Google rankings). Now I know a faster website is a must. A CDN is a must.

    In case you don’t know what a CDN is, check this video out: https://www.youtube.com/watch?v=nle1q0qSYmA. And if ya have any questions, hit me up.

    Solid post, Arnie!

  • Andy Kuiper – SEO Analyst

    Pretty basic tips here… however the fundamentals often get overlooked, so it’s practical to ensure things are done right – thanks for sharing Arnie :-)

  • Hanshika Kk

    Thanks for providing detailed information on this “all important” topic.
    I personally have noticed that during some of the major updates,
    clients lose ranks for certain key terms.
    Thanks once more for such a great posting.