If your website traffic dropped dramatically in late September, you may have been bitten by Panda 4.1, Google’s latest algorithm update, which began rolling out on September 23. This update specifically downgraded sites with thin or no content, duplicate content, and keyword stuffing. Check your analytics and see if you are safe.
A couple of people I know got hammered. One man came to me yesterday to redo his site after he experienced a dramatic drop in business on his very profitable e-commerce site. The problem is that he has very few words on his site, which was created quite a number of years ago in Microsoft FrontPage, a program that has since been completely discontinued. Because of that, he can’t easily go in and update his content.
We are going to recreate his site in WordPress so he can go in and add descriptions to all of his content. Hopefully he can recover and get back into Google’s good graces.
Another client has filled out her All in One SEO fields with lots of keywords and tags for each post and page. Google, however, sees this as spammy. They want people to write content that is about one particular thing. Therefore, we should use the one relevant keyword phrase that best describes what the post or page is about. When people load up multiple keywords that are not even mentioned in the content, it looks as though they are trying to rank for topics they are not experts in.
If you have been doing this, go in and remove all but one keyword phrase from your posts or pages, keeping the one that truly describes what the post is about. Make sure that your keyword phrase is actually used a couple of times in the content.
Sites with duplicate content were also targeted. WordPress sites, out of the box, can be flagged for duplicate content. Just think about all of the places a particular piece of content can appear on a site. For example, you enter a post once, but then it appears on the single post view as well as the category archive page and the author archive page, to name just a few. You can easily fix this potential issue by going to your SEO plugin and setting it to tell Google to index only the single post or page view of your content.
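Under the hood, SEO plugins generally implement this setting by adding a robots meta tag to the head of the archive pages you exclude. The exact markup varies from plugin to plugin, but as a rough sketch (not any one plugin’s exact output), it looks something like this:

```html
<!-- Emitted in the <head> of category, author, and other archive pages -->
<!-- noindex: keep this page out of Google's index; follow: still follow its links -->
<meta name="robots" content="noindex, follow">
```

The single post view keeps its normal indexing behavior, so Google sees each piece of content in exactly one place on your site.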
Duplicate content appearing on sites other than your own is considered an even greater no-no. Google assumes this means someone has tried to create multiple sites with the same content or is writing content that is placed on multiple sites. They will surely smack you down for this.
The same client who had the huge list of keywords actually has great content. It was so good that Forbes asked her to write for them. We talked about the dangers of duplicate content at the time. She told me that she posts the articles on Forbes and then later adds them to her site. I didn’t think that was going to fly; duplicate content is still duplicate content a week later. What to do about this? Go to the post’s settings in your on-page SEO plugin and check the box for Robots Meta NOINDEX. This tells Google not to add that post or page to its index, so it is not counted as duplicate content, but your readers can still find and read it on your site.
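If you want to confirm the setting took effect, view the source of the cross-posted article in your browser and look in the head for the robots meta tag. Plugins differ slightly in the exact wording, but you should see something along these lines:

```html
<!-- In the <head> of the article that also appears on Forbes -->
<meta name="robots" content="noindex, follow">
```

If that tag is missing, the plugin setting didn’t save and Google will still index the page as duplicate content.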
If you did get caught by this algorithm update and your website traffic dropped dramatically, it is going to take some time to fix the issues and then have Google re-index the pages and posts on your site. It is sad to see your traffic plummet. I wish you patience and perseverance to get it back up.