Old 07-06-2005   #1
AussieWebmaster
Forums Editor, SearchEngineWatch
 
Join Date: Jun 2004
Location: NYC
Posts: 8,154
Visitor Numbers Impact Algorithm, SERPs

There is mention over at WPW (WebProWorld) that the algorithm is impacted by the number of visitors to a site and the amount of time they stay.
http://www.webproworld.com/viewtopic.php?t=48136

I find the time-on-site measurement all but unattainable for Google, though the number of visitors may be another matter.

Has anyone else seen or heard anything about this?
Old 07-06-2005   #2
GuyFromChicago
Member
 
Join Date: Jun 2005
Location: Chicago, IL
Posts: 125
If traffic/visitors to a site were factored into how a site placed in the SERPs, wouldn't that create a perpetual loop of sorts? I.e., the top 10 sites in the organic listings will always receive more traffic than sites that appear on page 50. You need to get to page 1 or 2 to get decent traffic, but you can't get there until you get more traffic.

There's some specific mention of AdWords in that thread... my experience has shown that AdWords has zero impact on the organic SERPs. Google has stated the same, if I'm not mistaken.
Old 07-06-2005   #3
AussieWebmaster
Forums Editor, SearchEngineWatch
 
Join Date: Jun 2004
Location: NYC
Posts: 8,154
I agree. If the visitor-number claims are made with the same degree of accuracy as the comments about AdWords and organic search, then it is all wrong.
Old 07-06-2005   #4
dannysullivan
Editor, SearchEngineLand.com (Info, Great Columns & Daily Recap Of Search News!)
 
Join Date: May 2004
Location: Search Engine Land
Posts: 2,085
Measuring time has typically been done via clickthrough. You click on something, don't like it, click back to the results, click on something else, spend a little time, go back, then click on a third thing and don't go back again. Measure the time between click 1 and click 2: that's the estimated time on site 1. Click 2 to click 3 gives the time on site 2. As for site 3, there's no click 4, so you assign it some overall estimate.

The conversation over there is really about AdWords. I've never heard that Google is trying to use time-on-site estimates to influence AdWords rankings. As for influencing the regular results, it might, but Google has never fessed up to that.
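For illustration, here is a toy sketch of that clickthrough bookkeeping. The data layout, the function name, and the 120-second default for the final click are assumptions made up for the example, not anything Google has described:

Code:
# Estimate time-on-site from the gaps between a user's clicks on one results page.
from datetime import datetime

DEFAULT_DWELL_SECONDS = 120  # assumed fallback for the last click (nothing later to measure against)

def estimate_dwell(clicks):
    """clicks: list of (timestamp, result_url) for a single user and a single results page."""
    clicks = sorted(clicks)
    dwell = {}
    # Gap between click N and click N+1 = estimated time spent on result N.
    for (t1, url), (t2, _next_url) in zip(clicks, clicks[1:]):
        dwell[url] = (t2 - t1).total_seconds()
    if clicks:
        dwell[clicks[-1][1]] = DEFAULT_DWELL_SECONDS  # the user never came back after the last click
    return dwell

clicks = [
    (datetime(2005, 7, 6, 10, 0, 0), "site-1.example"),
    (datetime(2005, 7, 6, 10, 0, 20), "site-2.example"),   # 20 seconds on site 1
    (datetime(2005, 7, 6, 10, 4, 20), "site-3.example"),   # 4 minutes on site 2
]
print(estimate_dwell(clicks))  # {'site-1.example': 20.0, 'site-2.example': 240.0, 'site-3.example': 120}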
Old 07-07-2005   #5
Robert_Charlton
Member
 
Join Date: Jun 2004
Location: Oakland, CA
Posts: 743
Quote:
Originally Posted by AussieWebmaster
I find the time-on-site measurement all but unattainable for Google....
I've seen murmurs on various forums that this might be done via the Toolbar, but I don't know whether it's capable of tracking this kind of information. If it can track page visits, conceivably it might consider the interval between two page visits to be duration on the first page, but that would be a sloppy measurement at best.
Old 07-07-2005   #6
dannysullivan
Editor, SearchEngineLand.com (Info, Great Columns & Daily Recap Of Search News!)
 
Join Date: May 2004
Location: Search Engine Land
Posts: 2,085
Measuring via the Toolbar would be easy -- better than clickthrough. Anyone who has the PageRank meter enabled makes a call to Google for each page they view. Just watch the time between calls, and you know the time spent viewing a page.

You'd end up with some skews, however. Lots of SEOs will have the PR meter enabled; are they spending too much time on their own sites? And what about deliberate attempts to manipulate it?

There are things you could do to adjust for this stuff, as well, of course. Overall, it comes back to content. If you've got good content, you should have good viewing times. If that's being measured in some way, then great.
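Here is a rough sketch of that toolbar arithmetic, under assumptions of my own: the 10-minute cap is an invented way of adjusting for browsers left open or deliberate manipulation, not a documented Google rule.

Code:
# Treat the gap between consecutive toolbar calls as the viewing time of the earlier page,
# and cap implausibly long gaps before averaging.
from statistics import mean

MAX_PLAUSIBLE_SECONDS = 600  # assumption: anything over 10 minutes is idle time, not reading time

def page_view_times(ping_times):
    """ping_times: sorted timestamps (in seconds) of PageRank-meter calls from one user."""
    gaps = [later - earlier for earlier, later in zip(ping_times, ping_times[1:])]
    return [min(gap, MAX_PLAUSIBLE_SECONDS) for gap in gaps]

session = [0, 35, 95, 7300, 7310]        # the 7205-second gap looks like a browser left open
print(page_view_times(session))          # [35, 60, 600, 10]
print(mean(page_view_times(session)))    # 176.25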
Old 07-07-2005   #7
Wakaloo
Member
 
Join Date: Jul 2005
Posts: 5
I was reading part of that conversation at the other forum, where some posters seem to believe the URLs in AdWords ads are static text links pointing directly to the advertiser's site, giving each AdWords site a boost, so to speak. The links actually look like this:

Code:
<a id=aw1 href=/pagead/iclk?adurl=http://www.clicktracks.com%3Fsource%3Dgoogle%26campaign%3D18%26group%3D1%26creative%3D1&sa=l&ai=BLwxFaRjNQvjgJMrmYM6JiOQOhd24Co30g8gB08yZBYD70QEQARgBKAg4AED0D0iFOVCTv7sLmAHSS6ABwdm5_wPIAQE&num=1
So that rules that one out.
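To make the point concrete, here is a small sketch; the href below is trimmed from the one above, and the parsing code is mine, not Google's. It shows the ad link pointing at Google's /pagead/iclk redirect, with the advertiser's address only appearing as a percent-encoded adurl parameter:

Code:
# Show that the AdWords href is a tracking redirect, not a direct static link to the site.
from urllib.parse import urlparse, parse_qs

href = ("/pagead/iclk?adurl=http://www.clicktracks.com%3Fsource%3Dgoogle"
        "%26campaign%3D18%26group%3D1%26creative%3D1&sa=l&num=1")

parsed = urlparse(href)
params = parse_qs(parsed.query)
print(parsed.path)          # /pagead/iclk -> the link goes to Google's redirect endpoint
print(params["adurl"][0])   # http://www.clicktracks.com?source=google&campaign=18&group=1&creative=1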

Whilst Google could very well measure page times through the toolbar, doesn't that then become something more like Amazon's toolbar, which ends up being manipulated to gain a better rank more than it is used for its intended purpose?

Wouldn't that sort of statistical data be more inaccurate than accurate for the overall ranking of web pages? Just curious. I can't see how it could be even remotely accurate, or go untainted by abuse. We would simply leave our browsers open with web pages loaded and click through them at intervals during the day... or even write a program to do it for us... with the Google toolbar turned on, of course.
Old 07-07-2005   #8
AussieWebmaster
Forums Editor, SearchEngineWatch
 
Join Date: Jun 2004
Location: NYC
Posts: 8,154
The reason I started this thread was that I did not believe it was possible, and I figured that if someone knew anything about it they would comment. From what I have seen so far, at best we are offering ways this could be done very inaccurately. We all know that the engines occasionally use elements in the algo that can be attacked, and they usually change them quickly... if this is one of those elements, we will all be leaving our browsers open until they fix it.
Old 07-09-2005   #9
Mel
Just the facts ma'm
 
Join Date: Jun 2004
Location: Malaysia
Posts: 793
I find it rather difficult to believe that Google is tracking in detail all of the millions of searches done daily and somehow incorporating that information into the ranking algorithm.

Seems like way too much work for too little gain, especially when you consider that the seasonal traffic a doll site gets in December may be twenty times its normal traffic and should not affect its rankings during other periods.
Old 07-09-2005   #10
sully
Member
 
Join Date: Nov 2004
Posts: 50
Quote:
Originally Posted by AussieWebmaster
There is mention over at WPW (WebProWorld) that the algorithm is impacted by the number of visitors to a site and the amount of time they stay.
Length of time...yes...and what actions are taken during those visits. This is my interpretation of certain parts of the G patent published online in March.

As was mentioned
Quote:
If you've got good content, you should have good viewing times.
Sounds like a logical way to determine the quality of a site.

Quote:
If that's being measured in some way, then great.
Absolutely great.
Old 07-09-2005   #11
Chris Boggs
 
Join Date: Aug 2004
Location: Near Cleveland, OH
Posts: 1,722
Surprised that no one has mentioned Urchin in this conversation yet. I get the funny picture of evil Google scientists taking apart Urchin right now to make it divulge all the secrets of any site running Urchin stats... but I guess that's only because we have about 150-200 of them.
Old 07-09-2005   #12
AussieWebmaster
Forums Editor, SearchEngineWatch
 
Join Date: Jun 2004
Location: NYC
Posts: 8,154
Quote:
Originally Posted by Chris Boggs
Surprised that no one has mentioned Urchin in this conversation yet. I get the funny picture of evil Google scientists taking apart Urchin right now to make it divulge all the secrets of any site running Urchin stats... but I guess that's only because we have about 150-200 of them.
That is why analytics need to be independent... but even Yahoo sees this kind of aggregation of information as the way forward for the engines.
Old 07-09-2005   #13
sully
Member
 
Join Date: Nov 2004
Posts: 50
Quote:
Originally Posted by Chris Boggs
Surprised that no one has mentioned Urchin in this conversation yet.
I was surprised too, and didn't want to be the first one to say it.

I brought up this possibility (on another forum) when the Urchin acquisition was announced, particularly after having read the patent. Everyone thought it was just too far-fetched.

Old 07-10-2005   #14
Mel
Just the facts ma'm
 
Join Date: Jun 2004
Location: Malaysia
Posts: 793
But the problem with such a ranking scheme is that it creates an automatic feedback mechanism: the better a site is ranked, the more traffic it gets, which helps the rankings, which increases the traffic, which helps the rankings....

I am sure the folks at Google must be more than simply aware of this, and if they are in fact implementing such a scheme (is there any real evidence that it is being implemented?), it must have a minimal impact on rankings, otherwise new sites will never stand a chance.
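As a purely illustrative toy (the numbers and the click model below are invented, not measured), a few lines of simulation show how such a loop would entrench whoever starts on top:

Code:
# Simulate a ranking where clicks feed back into the score: the initial leaders
# attract the most clicks, which lifts their scores, which keeps them on top.
def simulate(rounds=5, n_sites=5, weight=0.1):
    scores = [1.0] * n_sites                         # everyone starts with the same baseline score
    for _ in range(rounds):
        order = sorted(range(n_sites), key=lambda i: -scores[i])
        for rank, site in enumerate(order):
            clicks = 100 // (rank + 1)               # higher positions get far more clicks
            scores[site] += weight * clicks          # ...and those clicks feed back into the score
    return scores

print([round(s, 1) for s in simulate()])  # [51.0, 26.0, 17.5, 13.5, 11.0] -> the early leader runs away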
Old 07-10-2005   #15
Wakaloo
Member
 
Join Date: Jul 2005
Posts: 5
Quote:
Originally Posted by Mel
otherwise new sites will never stand a chance.
Exactly... unless they leave their browsers open 24/7 and click around internally every now and then... but then again, everyone else would be doing the same thing! Lose-lose...
Old 07-10-2005   #16
Jill Whalen
SEO Consulting
 
Join Date: Jul 2004
Posts: 650
Quote:
I am sure the folks at Google must be more than simply aware of this, and if they are in fact implementing such a scheme (is there any real evidence that it is being implemented?), it must have a minimal impact on rankings, otherwise new sites will never stand a chance.
Well, one of those latest patent applications did have a lot of info on time spent on a site, so I'm quite sure it's something they are doing, or moving towards doing. I'm sure they will try to measure it by all the means they can: clickthroughs from AdWords and organic results, plus the length of time visited before coming back to the SERP, their toolbar spyware, and whatever info they will be getting from Urchin.

But like everything they do, it will be just one factor out of hundreds. I doubt constantly clicking on your site in the SERPs and never going back to Google will have any impact, but if you have a great site that satisfies lots of searchers' queries because they find exactly what they need, it may certainly help your rankings in the long run. And even if not, you've got nothing to lose!
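As a toy illustration of the "one factor out of hundreds" point (the signal names and weights below are invented for the example), a small engagement weight means even a big jump in time-on-site barely moves the combined score:

Code:
# Combine several ranking signals with fixed weights and see how little a
# small-weight engagement signal moves the total.
def combined_score(signals, weights):
    return sum(signals[name] * weights[name] for name in weights)

weights = {"links": 0.50, "on_page_relevance": 0.45, "time_on_site": 0.05}   # illustrative weights only
page = {"links": 0.7, "on_page_relevance": 0.8, "time_on_site": 0.4}

print(round(combined_score(page, weights), 3))   # 0.73
page["time_on_site"] = 0.8                       # engagement doubles...
print(round(combined_score(page, weights), 3))   # 0.75 -> the combined score barely moves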
Old 07-10-2005   #17
sully
Member
 
Join Date: Nov 2004
Posts: 50
Quote:
Originally Posted by Mel
But the problem with such a ranking scheme is that it creates an automatic feedback mechanism: the better a site is ranked, the more traffic it gets, which helps the rankings, which increases the traffic, which helps the rankings....

I am sure the folks at Google must be more than simply aware of this, and if they are in fact implementing such a scheme (is there any real evidence that it is being implemented?), it must have a minimal impact on rankings, otherwise new sites will never stand a chance.
But you are assuming that G will continue to display results as it has in the past. There is an indication that it may give a new document a chance by keeping score of how often it is selected when displayed with a set of results.
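Purely as a speculative sketch of what "giving a new document a chance" could look like (the mechanism, rate, and names below are my own invention, loosely inspired by the patent discussion, not a known Google behaviour): occasionally slot a history-less document into the visible results so its selection rate can be measured at all.

Code:
# Occasionally place a brand-new document in the visible results so it can build
# its own click/selection history instead of being buried behind established pages.
import random

def blend_results(established, new_docs, slots=10, explore_rate=0.2):
    """established: documents ranked by historical score; new_docs: documents with no history yet."""
    results = list(established[:slots])
    if new_docs and random.random() < explore_rate:
        position = random.randrange(slots)       # pick a visible position for the newcomer
        results[position] = new_docs[0]          # ...so its selection rate can start being scored
    return results

established = [f"old-{i}" for i in range(50)]
print(blend_results(established, ["brand-new-page"], explore_rate=1.0))  # force the swap for the demo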
Old 07-10-2005   #18
Mel
Just the facts ma'm
 
Join Date: Jun 2004
Location: Malaysia
Posts: 793
Well, let's hope that they continue to display results the same way they did before (is there any indication that they are not going to?), since in the past they have displayed results by perceived relevancy.

The problem with the idea that new pages will get some sort of special chance is that, in order for a new document to even be seen in the results, it has to rank above the old documents, which rank highly based partly on a history of clicks. The ultimate result is that the only people who will see it at all are those willing to wade through fifty pages of results to find it.
Old 07-10-2005   #19
Mel
Just the facts ma'm
 
Join Date: Jun 2004
Location: Malaysia
Posts: 793
Quote:
Originally Posted by Jill Whalen
Well, one of those latest patent applications did have a lot of info on time spent on a site, so I'm quite sure it's something they are doing, or moving towards doing. I'm sure they will try to measure it by all the means they can: clickthroughs from AdWords and organic results, plus the length of time visited before coming back to the SERP, their toolbar spyware, and whatever info they will be getting from Urchin.

But like everything they do, it will be just one factor out of hundreds. I doubt constantly clicking on your site in the SERPs and never going back to Google will have any impact, but if you have a great site that satisfies lots of searchers' queries because they find exactly what they need, it may certainly help your rankings in the long run. And even if not, you've got nothing to lose!

Yes Jill, that and about a hundred other things were mentioned in that same patent, and I would guess that you don't imagine they are going to implement them all.

IMO the fact that Google moots something in a patent application certainly may be indicative of how some people are thinking, but it does not mean that it is part of the algorithm or is even being considered for that role.
Old 07-10-2005   #20
sully
Member
 
Join Date: Nov 2004
Posts: 50
Quote:
Originally Posted by Mel
Well, let's hope that they continue to display results the same way they did before (is there any indication that they are not going to?), since in the past they have displayed results by perceived relevancy.
It hasn't worked so great historically. Sure, SEs want to display the most relevant results, but they also want to display quality results to their users. A logical way to determine this (at least partially) is to score how often a document is chosen and the length of time visitors spend on that site.


Quote:
Originally Posted by Mel
The problem with the idea that new pages will get some sort of special chance is that, in order for a new document to even be seen in the results, it has to rank above the old documents, which rank highly based partly on a history of clicks. The ultimate result is that the only people who will see it at all are those willing to wade through fifty pages of results to find it.
There is no indication that it won't (or doesn't) pull up a new document to display with older documents for scoring/testing purposes. Ever had a new site displayed at the top of the serps for 3-4 weeks and then tank?
