The Good and the Bad of SEO

Written By Rob Sullivan, SEO Specialist and Internet Marketing Consultant
October 10, 2005

The Good and the Bad of SEO, from Google's Mouth!

In this article, I highlight some of the points made during a recent conference call with Google, so you know what Google thinks.

You know it's bad when you take time from your holidays to come into work to attend a conference call. But that's what I did a few weeks ago.

You see, I had to, because I was going to have the opportunity to ask some Google employees specific questions on things I'd been pretty sure about, but wanted to hear right from the horse's mouth.

The call lasted less than an hour, but in that time I confirmed that many of the things I had figured were indeed true. So let's start with the most obvious:

Is PageRank still important?

The short answer is – yes! PageRank has always been important to Google. Naturally, they couldn’t go into details, but it is as I suspected.

Google still uses the algorithm to help determine rankings. Where it falls in the overall mix, though, is up for speculation.

My feeling, however, is that they’ve only moved where the PageRank value is applied in the grand scheme of things.

If you want to know what I think, be sure to read this article.

Are dynamic URLs bad?

Google says that a dynamic URL with two parameters should get indexed.

When we pressed a bit on the issue, we also found that URLs themselves don't contribute much to the overall ranking algorithms.

In other words, a page named Page1.asp will likely perform as well as Keyword.asp.

The whole variable thing shouldn't come as a surprise.

It is true that Google will indeed index dynamic URLs, and I've seen sites with as many as four variables get indexed.

The difference, however, is that in almost all cases I've seen, the static URLs outrank the dynamic URLs, especially in highly competitive or even moderately competitive keyword spaces.

Is URL rewriting OK in Google's eyes?

Again, the answer is yes, provided the URLs aren’t too long.

While the length of the URL isn't necessarily an issue, extremely long URLs can cause problems.

In my experience, long rewritten URLs perform just fine.  The important thing is the content on the page.
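To illustrate the idea, here is a minimal sketch (in Python, with made-up parameter names) of the kind of mapping a rewrite engine such as Apache's mod_rewrite performs on the server: a dynamic URL with two parameters becomes a short, static-looking path.

```python
from urllib.parse import urlparse, parse_qs

def rewrite_url(dynamic_url):
    """Map a dynamic URL like /products.asp?cat=widgets&color=blue
    to a static-looking path like /widgets/blue.html.

    A toy illustration of what server-side URL rewriting achieves;
    the parameter names and defaults are hypothetical."""
    parsed = urlparse(dynamic_url)
    params = parse_qs(parsed.query)
    cat = params.get("cat", ["misc"])[0]
    color = params.get("color", ["all"])[0]
    return "/%s/%s.html" % (cat, color)

print(rewrite_url("/products.asp?cat=widgets&color=blue"))
# /widgets/blue.html
```

The crawler only ever sees the clean path; the server translates it back to the dynamic query behind the scenes.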


That was a common theme throughout the call – content is king.

Sure, optimized meta tags, effective interlinking, and externalizing JavaScript all help, but in the end, if the content isn't there, the site won't do well.

Do you need to use the Google Sitemap tool?

If your site is already getting crawled efficiently by Google, you do not need to use the Google sitemap submission tool.

The sitemap submission tool was created by Google to provide a way for sites that are not crawled efficiently to get indexed by Google.

My feeling here is that if you MUST use the Google sitemap to get your site indexed, then you have some serious architectural issues to solve.

In other words, just because your pages get indexed via the sitemap doesn't mean they will rank. In fact, I'd bet that they won't rank, because of those technical issues I mentioned above.

Here I'd recommend downloading a free tool like Xenu and spidering your site yourself.

If Xenu has problems crawling your site, then you can be almost assured that Googlebot will have crawling problems too.

The beautiful thing with Xenu is that it can help you find those problems, such as broken links, so that you can fix them.

Once your site becomes fully crawlable by Xenu, I can almost guarantee you that it will be crawlable and indexable by the major search engine spiders.
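To get a feel for what a spider like Xenu (or Googlebot) sees, here is a toy link extractor built on Python's standard library. This is a stand-in sketch, not how either tool is actually implemented; a real checker would then request each extracted URL and flag the ones that return errors as broken links.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page -- the first step
    any crawler performs when spidering a site."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<p><a href="/about.html">About</a> <a href="/contact.html">Contact</a></p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)
# ['/about.html', '/contact.html']
```

Note that only plain `<a href>` links are found this way, which is exactly why the JavaScript and Flash navigation discussed later in this article causes crawling trouble.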

Does clean code make that much of a difference?

Again, the answer is yes. By externalizing any code you can and cleaning up things like tables, you can significantly improve your site's ranking.

First, externalizing JavaScript and CSS helps reduce code bloat, which makes the visible text more valuable. Your keyword density goes up, which makes the page more authoritative.

Similarly, minimizing the use of tables also helps reduce the HTML to text ratio, making the text that much more important.

Also, as a tip, your visible text should appear as close to the top of your HTML code as possible.

Sometimes this is difficult, however, as elements like top and left navigation tend to appear first in the HTML.

If this is the case, consider using CSS to reposition the text and those items appropriately.

Do keywords in the domain name harm or help you?

The short answer is neither. However, too many keywords in a domain can set off flags for review. In other words, blue-widgets.com won't hurt you, but discount-and-cheap-blue-and-red-widgets.com will likely raise flags and trigger a review.

Page naming follows similar rules: while you can use keywords as page names, it doesn't necessarily help (as I mentioned above). Further, long names can trigger reviews, which will delay indexing.

How many links should you have on your sitemap?

Google recommends a maximum of 100 links per page.

While I've seen pages with more links get indexed, it appears to take much longer. In other words, the first 100 links will get indexed right away, but it can take a few more months for Google to identify and follow the links beyond the first 100.

If your site is larger than 100 pages (as many are today), consider splitting your sitemap into multiple pages that interlink with each other, or create a directory structure within your sitemap.

This way you can have multiple sitemaps that are logically organized and will allow for complete indexing of your site.
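A simple way to follow that guideline is to chunk your URL list into sitemap pages of at most 100 links each. The helper below is a hypothetical sketch of the idea; each generated page would then link to the others so the whole set stays interlinked.

```python
def split_sitemap(urls, max_links=100):
    """Split a flat URL list into sitemap pages holding at most
    max_links entries each (per Google's 100-links-per-page guideline)."""
    return [urls[i:i + max_links] for i in range(0, len(urls), max_links)]

urls = ["/page%d.html" % n for n in range(1, 251)]   # a 250-page site
pages = split_sitemap(urls)
print(len(pages))                       # 3 sitemap pages
print(len(pages[0]), len(pages[-1]))    # 100 50
```

A 250-page site ends up with three sitemap pages, each safely under the limit, so every link gets crawled in the first pass rather than months later.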

Can Googlebot follow links in Flash or JavaScript?

While Googlebot can identify links in JavaScript, it cannot follow those links.  Nor can it follow links in Flash.

Therefore I recommend having your links elsewhere on the page. It is OK to have links in Flash or JavaScript, but you need to account for crawlers not finding them.

Therefore the use of a sitemap can help get those links found and crawled.

As an alternative, there are menus that use CSS to produce navigation that looks very similar to common JavaScript navigation, yet uses static hyperlinks that crawlers can follow.

So do a little research, and you should be able to find a spiderable alternative to whatever type of navigation your site currently uses.

Overall, while I didn't learn anything earth-shattering, it was good to get validation straight from the horse's mouth, so to speak.

I guess it just goes to show you that there is enough information out there on the forums and blogs.

The question becomes determining which of that information is valid and which isn't.

But that, I'm afraid, usually comes with time and experience.
