A friend of mine contacted me asking my opinion on why Google isn’t loving Celebrity Cowboy, a celebrity blog that should be ranking well for a variety of terms but is, for some reason, continually under-performing in its niche.
I told him that I would take a look at it, and while my speciality isn’t really search engines, I did notice a few things right off the bat.
Code
Positioning
One of the first things I noticed about the XHTML generated by the theme used at Celebrity Cowboy is that the blogroll sits near the top of the page, with more than twenty items linking out to other sites. While this is only on the front page of the site now, it wasn’t always like this and could have led to a black mark for the site.
Then there is the content, and then the list of internal links to each of the more than two dozen categories. Could Google be penalizing the site for having so many outbound links at the top of the code, and so many links near the bottom? Could they see this as an attempt to affect search engine rankings by stuffing links into a site?
Things like this have happened before, and Google has always been harsh on them. The flip side, though, is that all of these links are relevant. Google doesn’t penalize for relevant links, does it?
With Google’s war against paid links, I wouldn’t be surprised if a few sites got caught in the crossfire, and with these links being site-wide, Google may have mistaken them for paid links.
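I haven’t dug through the theme’s templates, but the usual fix for this sort of thing is to put the content first in the source and let CSS float the sidebar into place. A minimal sketch, with made-up IDs and widths since I don’t know the theme’s actual markup:

```html
<!-- Content-first source order: crawlers reach the posts before the blogroll.
     The IDs and widths here are illustrative, not the theme's own. -->
<div id="content" style="float: left; width: 70%;">
  <!-- posts go here -->
</div>
<div id="sidebar" style="float: right; width: 25%;">
  <ul class="blogroll">
    <li><a href="http://example.com/">A blogroll link</a></li>
    <!-- twenty-odd more links -->
  </ul>
</div>
```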
No doubt they would like sites to make sure to nofollow their blogrolls and other external links that aren’t part of the normal daily content, despite those links being relevant.
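Adding the attribute is a one-word change per link; a blogroll entry (with a placeholder URL) would simply become:

```html
<li><a href="http://example.com/" rel="nofollow">A blogroll link</a></li>
```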
Validation
The theme that Celebrity Cowboy is using doesn’t validate. Google has shown time and time again that if you don’t work at keeping your code valid, you can drop in the rankings, and sometimes even be marked as a “bad” site.
Sometimes sites get listed on stopbadware.org just because their JavaScript doesn’t work correctly, or their advertising doesn’t load properly. I have seen this happen to more than a few sites.
Fixing as many validation issues as possible could help remove the penalty placed on the site, as Google’s indexing bots might then be able to index the content more efficiently and without error.
One of the things I first noticed was an ID used more than once, something that probably doesn’t affect the Google search bots, but something that is not correct in XHTML. Classes should be used for repeating items, not IDs.
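I won’t reproduce the theme’s exact markup here, but the pattern, and its fix, look something like this:

```html
<!-- Invalid XHTML: an id must be unique within the document -->
<div id="post">First post</div>
<div id="post">Second post</div>

<!-- Valid: a class may repeat as often as needed -->
<div class="post">First post</div>
<div class="post">Second post</div>
```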
Correcting such things should also improve how various browsers render the site, which could have the side effect of increasing traffic, page views, and even links to the blog.
Just Plain Strange
There was one more thing about the coding of the site that really had me scratching my head. The header image is displayed via CSS, so rather than showing an image wrapped in a proper hyperlink, the coder chose to use JavaScript to turn the div that carries the header into a clickable item, using location.href to bring the visitor back to the index page.
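Reconstructed from memory, so the names are guesses, the theme does something like the first snippet below, when the second would do the same job with a plain, crawlable link:

```html
<!-- What the theme appears to do: a bare div made clickable with JavaScript -->
<div id="header" onclick="location.href = '/';"></div>

<!-- The simpler alternative: a real anchor styled as a block, so the same
     CSS background image still shows and search bots can follow the link -->
<a id="header" href="/" style="display: block;">Celebrity Cowboy</a>
```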
To me this seems like a very bad way to achieve the effect, and probably not one that Google looks kindly on.