Saturday, February 2, 2008

A Very Unusual PageRank Update

Last week, many webmasters noticed that Google’s public PageRank score (a rough measure of each site’s importance in the search engine) was displaying higher-than-expected rankings for some sites.

Many virtually unlinked, unheard-of sites rose significantly on the scale, without any official explanation from the search giant. Two of my sites that were affected were handsetreviews.com and carriermix.com. Both were ranked 0 out of 10 before the change, but rose to PR6 afterwards. Other sites have been known to rise as high as PR8 as a result of this update.
The strange thing is that this was not a normal update: it only affected newly created PR0 websites, and didn’t touch most internal pages (at least in my experience). Reaction from other webmasters can be found at this SitePoint forum thread.
Many webmasters believe that this unusual update could further devalue the meaning of PR (which is already close to meaningless for SEO purposes), but I’m currently mulling a theory that Google may see things differently.

I find it somewhat unlikely that the world’s biggest search engine would deliberately destroy the entire PR system. Rough and inaccurate as it is, the PR scale is the only way to get a simple reading of a site’s backlink weight, directly from Google.
Instead, my theory is that Google is turning PR itself into an elaborate experiment. They could be attempting to make it a more accurate measure of a site’s value, updated on a frequent, or even constant basis. Such a figure can’t be expected to be totally accurate right away, but if Google is trying to introduce such an improvement, I commend them for it.
Now, I must warn you, this is just a theory, and I make no guarantees that it is correct, but it’s a plausible explanation in my opinion. Whether Google succeeds in implementing it is another matter altogether.
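As background on what that raw backlink weight actually measures: the original PageRank formula is public, and a minimal sketch of the iteration looks like the code below. The four-page link graph is invented purely for illustration, and the toolbar’s 0–10 score is widely believed to be a rough logarithmic bucketing of this raw value, though Google has never confirmed the exact mapping.

```python
# Minimal PageRank power iteration over a hypothetical link graph.
# Each page's score is split among the pages it links to, with a
# damping factor modelling the chance a surfer follows a link.

DAMPING = 0.85

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform score
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                    # dangling page: spread evenly
                for p in pages:
                    new[p] += DAMPING * rank[page] / n
            else:
                for target in outlinks:
                    new[target] += DAMPING * rank[page] / len(outlinks)
        rank = new
    return rank

# Invented example: page "c" gets the most inbound links, so it ends
# up with the highest raw score.
graph = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
ranks = pagerank(graph)
```

Note how "d", which nothing links to, keeps only the baseline score; that matches the intuition that an unlinked site should sit at PR0, which is what made last week’s jumps so surprising.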

Google Strives to Further Improve Search Functionality

Despite Google’s dominant position in the search industry, the internet giant’s decision makers insist that search is not a “solved problem,” and that there is still much room for improvement.
“Our position is that search is a very hard problem. We have still a lot of work to do,” commented Google engineer Douglas Merrill, noting that 70% of Google’s efforts still go into improving search, as opposed to developing other services.

“It is not enough to have the information, the information should be right,” Merrill went on to say. “Sometimes the problem is figuring out what the users mean, not what the user said.”
At this point, some of Google’s main projects include improving mobile web search, personalized search, and language translation features, as well as finding new ways to combat SEO spam.
By keeping its focus on core search functionality, the internet giant is demonstrating its belief that no search algorithm can be “too good,” while recognizing the continual progress of competitors. This goes to show that even the mighty Google must work hard to maintain the upper hand against rivals like Yahoo and Microsoft.

Facts About Title Keyword Density

If you’ve done much SEO work for your website, I’m sure you’ve realized just how important it is to include the right text in the <title> tag of each page.
As discussed in this article, it is a good idea to build each of your pages around its own primary keyphrase, and somehow incorporate that keyphrase into your <title> tag. The question is, of course, what’s the best way to integrate it? The problem is that each search engine has its own unique answer.
MSN (aka Live.com) is generally thought to reward very high keyword density, and often grants top-five rankings to pages with 100% density in the title (that is, pages where the primary keyphrase is the only thing in the title bar).
Google, on the other hand, seems to make a point of devaluing pages on keywords that exactly match their <title> tag. This measure was most likely introduced as a way to fight search engine spammers who over-optimize for a single phrase by placing it excessively in their content, headings, and title.

Overall, you need to make an informed decision about which optimization route you want to take for each of your sites. As mentioned in the algorithm summaries, MSN is a good choice for driving short-term traffic and revenue, while Google has a lot more potential for long-term sustainable content websites.

If you want to optimize for Google, my advice would be to go for a title keyword density of around 50%, and no greater than 75%. For example, if your primary keyphrase is three words long, you may wish to add another three-word phrase to your title, consisting of secondary keywords.
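To make that density figure concrete, here’s a small sketch of one way to compute title keyword density: the fraction of the title’s words taken up by occurrences of the keyphrase. The example title and keyphrase are invented, and this counting method is my own illustration, not any search engine’s published formula.

```python
def title_keyword_density(title, keyphrase):
    """Fraction of the title's words covered by the keyphrase."""
    title_words = title.lower().split()
    phrase_words = keyphrase.lower().split()
    if not title_words:
        return 0.0
    # Count non-overlapping occurrences of the whole keyphrase.
    hits = 0
    i = 0
    while i <= len(title_words) - len(phrase_words):
        if title_words[i:i + len(phrase_words)] == phrase_words:
            hits += 1
            i += len(phrase_words)
        else:
            i += 1
    return hits * len(phrase_words) / len(title_words)

# A 3-word keyphrase in a 6-word title gives 50% density,
# right in the range suggested above for Google.
density = title_keyword_density(
    "Cheap Mobile Phones and Handset Reviews", "cheap mobile phones")
```

By the same measure, a title that consists of nothing but the keyphrase scores 100%, which is the kind of exact match Google appears to devalue.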