This is a follow-up post to my previous posts about my friend’s Google ranking drop. As you may remember, his ranking was restored a few weeks after he blocked the proxy website from copying his entire site and submitted a Google reinclusion request. As you may have guessed, he was quite thrilled to see his SERP ranking shoot up again.
Well, as luck would have it, I received a phone call last night from my friend telling me that his website was bombing again. I Googled his favorite keywords and they seemed to rank fine on my end, but he explained that his traffic stats from Google had nosedived a day or two ago and were now flat. I chalked up the rankings I was seeing to Google still adjusting its results.
This new twist got me thinking. What in the world could be making this website’s ranking bounce around like this? Looking back, the proxy website may not have been 100% at fault. There has to be something else.
I began doing a little research and learned a few things about duplicate content. The reason I looked at that particular area is that there is absolutely nothing else I can find wrong with this website, and duplicate content seems to be a rather popular culprit.
I came across a pretty well laid out website called “Google Rankings Diagnostics” that describes a whole heck of a lot of issues you might be having with your website. This website validated what I pretty much already knew…that if you have multiple URLs (on a domain) with the same exact content, Google has trouble figuring out which page is the original and may throw all of them out.
I took a very close look at my friend’s website. Again, I took a unique line of text from his homepage and searched for it in Google (inside quotes). A funny thing happened. I saw the homepage result, but there were a few extra results as well, all on his domain. There were about 5 extra pages in total.
Now, some of these extra results have been there for years, so I don’t attribute the issue to those pages being duplicate content. What struck me was one of the extra pages.
A few months ago, my friend moved one of his pages. He put a 301 redirect in his .htaccess file, which was the correct thing to do, so requests for the old directory where the page lived were forwarded to the new page. It looked something like this:
Redirect 301 /olddirectory/ http://www.hiswebsite.com/newpage.php
The redirect worked fine, but here is what that extra page in the search results looked like:
http://www.hiswebsite.com/newpage.phpoldpage.php
Guess what page was showing at that URL…yup, the homepage. The dynamic nature of his website sends unknown page results like this to the homepage. This was a fluke. My friend forgot that there were pages inside the old directory he redirected to the new page. Every old page in that old directory was tacked on to the new page, like you see above. To make matters worse, there were a bunch of links from other websites pointing to the old pages in the old directory.
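The mechanism here is worth spelling out: Apache’s plain Redirect directive is prefix-based, so whatever follows the matched path is appended onto the target URL. That is why a request for /olddirectory/oldpage.php ends up at newpage.phpoldpage.php. If the goal had been to send everything in the old directory to the single new page, a regex-based RedirectMatch rule would have avoided the tacked-on leftovers. A sketch, using the same placeholder paths as above:

```apache
# Redirect is prefix-based: the remainder of the path gets appended
# to the target, producing URLs like /newpage.phpoldpage.php.
# RedirectMatch with an anchored pattern and no backreference
# discards the leftover path instead of appending it:
RedirectMatch 301 ^/olddirectory/ http://www.hiswebsite.com/newpage.php
```

With this rule, /olddirectory/oldpage.php and every other page in that directory would all land cleanly on /newpage.php, so the old inbound links would still pass through to a single real page.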
I am not sure if this would cause the ranking drops that he is experiencing, but the timing certainly lines up with when the issue began. It is also certainly considered duplicate content.
So, here is what I did to deal with the issue this time. I deleted the redirects in the .htaccess file and blocked the URLs of all those extra results in the robots.txt file. Hopefully, this will tell Google not to spider or index those pages, and also that those inbound links are dead.
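For reference, the robots.txt blocking looked something like this. The first path is the actual mangled URL from the search results above; the second is a made-up example standing in for the other leftover pages:

```
User-agent: *
Disallow: /newpage.phpoldpage.php
Disallow: /newpage.phpanotherpage.php
```

Each Disallow line matches URLs by prefix, so every stray URL that showed up in the search results gets its own line.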
Now, we have to wait. I am not going to submit another reinclusion request to Google because I want to see if the ranking returns naturally. If it does, this was the problem for sure.