Old 09-17-2005   #1
shion
Curious.
 
Join Date: Sep 2005
Location: Melbourne
Posts: 2
shion is on a distinguished road
Question

Pardon me if I've ungraciously bumbled into your digerati party, but there is something disturbing that's been happening with Google over the years, and this latest action highlights it. (And I want to get it off my chest.)

First read http://www.google.com.au/intl/en/profile.html

OK, got it: Google is supposed to be a search engine. A search engine dedicated to finding and indexing information on the web for easy retrieval.

But because of its own success it does not search or index passively. Google no longer just indexes the web, it AFFECTS it.

For example: SEO.

Search engine optimization is not a good thing. It is bad because when you write you are supposed to target an audience, but instead you target search engines, so the intended audience receives lower-quality or less relevant information. SEO reduces quality and relevance (the information itself is affected).

Why bother with standards such as the W3C's, or newcomers such as Dublin Core, when Google has decided that it will be the setter of standards, the big cheese, head honcho, BIG BROTHER?

By emailing websites to tell people they are banned because they do not kowtow to Google's standards (and basically the whole idea of the Communications Initiative), Google has made a statement: we are no longer interested in accurately indexing the web, we are only interested in indexing those parts of the web that we agree with. We shall now only provide you with biased and censored results.

I don't really think there is much sinister about Google; it's just getting fat and lazy. That is, instead of using its renowned research and software engineering to overcome problems, it's easier to just play the heavy and hope those problems go away.

I think it comes down to relevance. A good search engine should 'assist you' to find relevant information; a bad search engine will 'tell you' what is relevant. Is Google dumbing down the internet?
shion is offline   Reply With Quote
Old 09-17-2005   #2
projectphp
What The World, Needs Now, Is Love, Sweet Love
 
Join Date: Jun 2004
Location: Sydney, Australia
Posts: 449
projectphp is a splendid one to behold
Quote:
But because of its own success it does not search or index passively. Google no longer just indexes the web, it AFFECTS it.

For example: SEO.
Isn't that life? Arthouse cinema went mainstream when there was a buck in it; the WWW went mainstream. Things exist, people react. Web developers wouldn't have jobs if the WWW didn't exist; is that a bad thing?

Quote:
By emailing websites to tell people they are banned because they do not kowtow to Google's standards
See, now which side are you on? You just pooh-poohed SEO, but then when Google attempts to approach the subject of dodgy SEO, you say that is bad. Google does not, AFAIK, make any site kowtow to anything. This is an attempt to stop sites doing stuff just for search engines, which I thought, from the first quote, you agreed with.

IMHO, the problem with this argument is that it is contradictory.
projectphp is offline   Reply With Quote
Old 09-19-2005   #3
dannysullivan
Editor, SearchEngineLand.com (Info, Great Columns & Daily Recap Of Search News!)
 
Join Date: May 2004
Location: Search Engine Land
Posts: 2,085
dannysullivan has much to be proud of
Shion, I split your post and the one response it gained from the Google Testing Ban Notification -- Could New Webmaster Tools Come? thread because really, it's going into a separate issue. That is, should search engines have standards about what they'll crawl at all.

You're not really "crashing" any "party" here, but as I'd tell anyone, you'll probably find people are more receptive to what you have to say if you don't immediately assume they are all of the same mind. They aren't, here or in many other places.

We've had plenty of debates over whether SEO is a good or bad thing here, and if you spend some time exploring threads (try the Top Threads link), you'll find plenty of discussion that might be worth reading if you want to discuss further here.

You might also check out my Worthless Shady Criminals: A Defense Of SEO article, which explains why the search engines have their own natural flaws and why SEO does not necessarily deliver "lower quality" content.

Here's a good example. The Red Cross recently created a new site to help people find those missing in the wake of Hurricane Katrina. Unfortunately, they built it in a way that was not friendly to search engines. The spiders couldn't get in to crawl it. So there is good, crucial information that people need -- but they can't find it, because some basics of SEO weren't followed. That's what a lot of SEO is -- cleaning up the messes that the imperfect search engines create, as well as those left by designers who often don't consider search engines at all.
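To make "crawlable" a bit more concrete, here's a rough illustration of one of those basics (a minimal Python sketch using only the standard library; the URL and user-agent are invented for the example and have nothing to do with the actual Red Cross site): before anything else, a spider checks robots.txt, and if that file shuts it out, nothing on the site gets indexed, however crucial the content is.

Code:
from urllib import robotparser
from urllib.parse import urlsplit

def is_crawlable(url: str, user_agent: str = "Googlebot") -> bool:
    """Return True if the site's robots.txt lets user_agent fetch url."""
    parts = urlsplit(url)
    rp = robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()  # downloads and parses the live robots.txt
    return rp.can_fetch(user_agent, url)

# Made-up URL, purely for illustration.
print(is_crawlable("http://www.example.com/missing-persons/search.html"))

Of course robots.txt is only one of the basics; forms, session IDs and script-only navigation can keep a spider out just as effectively.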

Further to your point, Google is not the first search engine to have guidelines about what it does and doesn't consider to be acceptable content. Search engines long before this have set their own standards over what they consider "spam." You single out Google, but it's just doing what search engines have long done.

As for Google suddenly making a "statement" with the emails, heck, it's had these standards on its site for years. It made a statement with them back then. In fact, it was a bigger statement when it finally added these guidelines years ago, considering that one of its founders famously quipped at the beginning that Google didn't feel there was such a thing as "spam".

Whether any search engine should have such guidelines and what those should be is open to debate. And we've debated it. White Hat - Gray Hat - Black Hat is a big, fat list of past discussions I'd really encourage you to read, so that you are able to advance any conversation on this topic here, rather than rehashing past points.

In the end, this point:

Quote:
I don't really think there is much sinister about Google; it's just getting fat and lazy. That is, instead of using its renowned research and software engineering to overcome problems, it's easier to just play the heavy and hope those problems go away.

I think it comes down to relevance. A good search engine should 'assist you' to find relevant information; a bad search engine will 'tell you' what is relevant. Is Google dumbing down the internet?
Suggests that you don't understand some of the arms-race mentality that has long occupied the search space. Search engines do indeed try to find technological ways to find the best pages. Link analysis is one example of that. But over time, technologies become dated, especially as people reverse engineer ways to beat the system. That can be an aggressive black hat SEO person who finds an effective hole, or just a group of bloggers who decide to embark on a linking campaign to help a site rank well for a term in response to a political action, rather than because the site is "relevant" for that term. The search engines, as long as they remain effectively open systems taking content from anyone, have to keep refining their own technology.
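For anyone who hasn't looked at link analysis up close, here's a bare-bones sketch of the idea (a simplified PageRank-style calculation in Python over an invented four-page graph; real engines obviously do far more than this): rank flows along links, so the pages everyone points at float to the top, which is exactly why a coordinated linking campaign can move a page up whether or not it's relevant.

Code:
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages  # dangling page: spread its rank evenly
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Invented graph: A, B and C all "campaign" for D, so D ends up on top
# regardless of whether D is actually the best page for anything.
graph = {"A": ["B", "D"], "B": ["D"], "C": ["D"], "D": ["A"]}
for page, score in sorted(pagerank(graph).items(), key=lambda x: -x[1]):
    print(page, round(score, 3))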
dannysullivan is offline   Reply With Quote
Old 09-19-2005   #4
xan
Member
 
Join Date: Feb 2005
Posts: 238
xan has a spectacular aura about
Hi guys,

First, the Dublin Core initiative has been around since October 1994. It's the semantic building block of Web metadata.
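For anyone who hasn't seen it in the wild, Dublin Core mostly shows up as ordinary <meta> elements in a page's head. Here's a rough sketch of reading them (standard-library Python only, with an invented sample page; this is just to show the shape of the markup, not how any engine actually parses it):

Code:
from html.parser import HTMLParser

class DublinCoreParser(HTMLParser):
    """Collects <meta name="DC.*" content="..."> elements from a page."""
    def __init__(self):
        super().__init__()
        self.dc = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr = dict(attrs)
        name = attr.get("name") or ""
        if name.lower().startswith("dc."):
            self.dc[name] = attr.get("content") or ""

# Invented sample page, just to show what the markup looks like.
sample = """<html><head>
  <meta name="DC.title" content="Should Search Engines Get To Set Standards?">
  <meta name="DC.creator" content="shion">
  <meta name="DC.date" content="2005-09-17">
</head><body></body></html>"""

parser = DublinCoreParser()
parser.feed(sample)
print(parser.dc)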

Google is a search engine. Its job is to retrieve information that is relevant to a query. That, as far as I can tell, is what Google has always strived to do.

You have to appreciate it from a technical point of view rather than a consumer point of view. It is very hard to achieve good results in IR. It has been researched for a long, long time, and still we haven't achieved perfection, or rather what some researchers would call "even an acceptable level of performance". New solutions are necessary. The semantic web and other such ideas could be very helpful. We'll have to see.

It's not a case of excluding sites that don't follow SEO; it's just that the way the algorithm works means that, unfortunately, sites that don't make it past those filters won't be returned. The idea is not to stop sites getting into the results, but to get the relevant ones out.

It's like strawberry picking, some good ones are hidden under a leaf or something so they don't get picked.

Consumers have very high expectations of what technology, not only search, should perform to. Voice recognition is deemed a failure if it doesn't catch everything, despite the fact that it has seen an enormous improvement, which has been very, very hard to achieve. The same goes for search and other technologies too, such as your in-car navigation system.

We'll get there in the end, but it's not going to be perfect straight away.
xan is offline   Reply With Quote
Old 09-19-2005   #5
PerformanceSEO
"Be Accountable"
 
Join Date: Aug 2005
Location: Los Angeles, CA
Posts: 78
PerformanceSEO will become famous soon enough
More to Danny's question:

I believe search engines can set standards until the cows come home. The only people who would be the slightest bit concerned about those standards are the people who are trying to manipulate their site to achieve top organic rankings.

So why would a search engine whose paychecks depend on PPC want to help a website achieve organic rankings?

And if they ARE going to set standards, why tell people what they are? Why would a search engine care who is following their guidelines or who isn't?

In my opinion, setting standards is admitting your technology is imperfect and vulnerable. If an engine were 100% confident that it could crawl the ENTIRE web and show only what was relevant, there would be no need to set standards.

It appears as if setting standards is basically ASKING for help disguised as OFFERING help. They know their technology isn't grown up enough yet to be able to crawl every website on the internet. (And if you think about it, isn't that what the job of the engine is? To crawl the ENTIRE web?) But they can't...yet. So they need help.

The engines should really get back to basics and continue to PERFECT their technology instead of attempting to get the entire Web to comply with their standards...which could really be considered "reverse engineering".
PerformanceSEO is offline   Reply With Quote
Old 09-19-2005   #6
xan
Member
 
Join Date: Feb 2005
Posts: 238
xan has a spectacular aura about
Hey,

it's not about any search engine needing help; it's the methods we use that are being further investigated. The idea of marking up websites in a particular way is to try to improve their performance. RDF, or any other type of markup, would be in the interest of all of us if it actually helps to achieve better results.

They never said they had a perfect search engine either. Nobody has one. Everyone is, to an extent, working together on this. Standards are set by bodies like the W3C and the IEEE.


I wrote something in my blog about a recent paper released by Google and MIT which touches on encouraging the semantic web by getting users to mark up their RSS feeds. It explains why this is a sound idea, and doesn't mean that Google are taking over the world or anything like that.

"So why would a search engine whose paychecks depend on PPC want to help a website acheive Organic rankings?"
-If you don't return meaningful results, nobody is going to use your engine anymore and therefore you PPC efforts are swept away.

We don't even know exactly how big the web is. How can we know we've crawled all of it?
xan is offline   Reply With Quote
Old 09-19-2005   #7
PerformanceSEO
"Be Accountable"
 
Join Date: Aug 2005
Location: Los Angeles, CA
Posts: 78
PerformanceSEO will become famous soon enough
Quote:
Originally Posted by xan
"So why would a search engine whose paychecks depend on PPC want to help a website acheive Organic rankings?"
-If you don't return meaningful results, nobody is going to use your engine anymore and therefore you PPC efforts are swept away.
That's all very well, and obvious, Xan, but I think you misunderstood the question.

I'll rephrase it:
In what way(s) would a search engine benefit by setting guidelines and helping people to follow those guidelines (including displaying more relevant results)?
PerformanceSEO is offline   Reply With Quote
Old 09-19-2005   #8
xan
Member
 
Join Date: Feb 2005
Posts: 238
xan has a spectacular aura about
I'm sorry about that. I might be a bit slow today but I don't get the question. What do you mean? Do you mean advice on how to search, or what site owners would do?
xan is offline   Reply With Quote
Old 09-19-2005   #9
PerformanceSEO
"Be Accountable"
 
Join Date: Aug 2005
Location: Los Angeles, CA
Posts: 78
PerformanceSEO will become famous soon enough
A simpler question:

What would be the motive behind a search engine setting guidelines?

Once you can answer that, then ponder:

What do they have to gain by doing so?

Why would people following (or not following) those guidelines have anything to do with the relevancy of that engine's results?


Sorry if I'm not making myself clear.
PerformanceSEO is offline   Reply With Quote
Old 09-20-2005   #10
xan
Member
 
Join Date: Feb 2005
Posts: 238
xan has a spectacular aura about
I think that they already do issue guidelines:

http://www.google.com/webmasters/guidelines.html

If people follow them, they are able to make their sites more attractive to the spiders and maybe also rank a little better by making certain changes. These changes do not alter the underlying relevance, but they make it easier for a page to be ranked or found.

I think people would appreciate being told this by a search engine engineer, and would probably choose to mark up their site in a particular way because "TAGSearchEngine" works using tags and its developers explain that it can only list websites marked up in a certain way.

I don't see what guidelines you mean otherwise.
xan is offline   Reply With Quote
Old 09-20-2005   #11
PerformanceSEO
"Be Accountable"
 
Join Date: Aug 2005
Location: Los Angeles, CA
Posts: 78
PerformanceSEO will become famous soon enough
{SIGH}

Anyone else?

PerformanceSEO is offline   Reply With Quote
Old 09-20-2005   #12
mcanerin
 
mcanerin's Avatar
 
Join Date: Jun 2004
Location: Calgary, Alberta, Canada
Posts: 1,564
mcanerin has a reputation beyond repute
Sometimes I think that "standards" is too strong a word for search engine guidelines. And "guidelines" too weak.

I prefer to think of it as "compatible" with a specific type of software - in this case a combination of a bot and an indexing system.

If you are compatible, there will be few problems spidering and ranking. If you are not, then you will probably have issues.

If the ranking system looks at certain things in a certain way, then you need to be compatible with those metrics in order to rank well. If you don't care about being ranked, then you don't need to be compatible, and there is nothing really wrong with that.

Ian
__________________
International SEO
mcanerin is offline   Reply With Quote
Old 09-21-2005   #13
xan
Member
 
Join Date: Feb 2005
Posts: 238
xan has a spectacular aura aboutxan has a spectacular aura about
Thanks Ian,

I totally agree.
xan is offline   Reply With Quote
Old 09-21-2005   #14
PerformanceSEO
"Be Accountable"
 
Join Date: Aug 2005
Location: Los Angeles, CA
Posts: 78
PerformanceSEO will become famous soon enoughPerformanceSEO will become famous soon enough
Maybe I'm from another planet, but I'm looking for someone to answer this question:

"What do the search engines themselves have to gain by setting standards (or guidelines)"?

If they are in control of their own algorithm, and are confident they are delivering the most relevant results, what would be the point of setting standards and helping people to follow them (i.e. Google's new notification system)? What does Google have to gain by doing this?

But if I'm a complete idiot for asking, I'd be happy to hear it.
PerformanceSEO is offline   Reply With Quote
Old 09-21-2005   #15
Alan Perkins
Member
 
Join Date: Jun 2004
Location: UK
Posts: 155
Alan Perkins will become famous soon enough
Quote:
Originally Posted by PerformanceSEO
"What do the search engines themselves have to gain by setting standards (or guidelines)"?
Search engines have standards. They use automated and manual means to exclude certain pages or sites from their indexes. They use other automated and manual techniques to prevent those pages that ARE in their index from ranking where they should not rank for certain queries entered by certain classes of searcher.

As a rule, search engines do not publish those standards.

Once upon a time, they did not publish guidelines either. When there were no guidelines, a lot of people suffered by designing or following bad practices - i.e. practices designed to deceive search engines, in the opinion of those search engines. Some people may have been doing this deliberately, and were prepared to accept the consequences (banning or some lesser penalty). Others had no clue that what they were doing was wrong, and felt pretty put out that the search engines didn't put any effort into preventing them from making mistakes. So search engine guidelines were born. Now at least the search engines were making an effort to keep people on the right path.

Now the guidelines exist, but people still don't read them. They still (deliberately or accidentally) do things that search engines don't really want. This initiative is designed to help the people who do these things accidentally.

This is not the search engine telling the Webmaster how to design a Web site. This is just the search engine saying what's wanted and unwanted in its own index. If you don't want to be in that index, then don't follow the advice.

The advice is given not because a search engine wants to create the Web in its own image, but for precisely the opposite reason. i.e. the advice is an attempt to stop you doing things solely because that search engine exists.
Alan Perkins is offline   Reply With Quote
Old 09-30-2005   #16
claus
It is not necessary to change. Survival is not mandatory.
 
Join Date: Dec 2004
Location: Copenhagen, Denmark
Posts: 62
claus will become famous soon enough
A classic

Quote:
Originally Posted by Alan Perkins
This is not the search engine telling the Webmaster how to design a Web site. This is just the search engine saying what's wanted and unwanted in its own index.
Which, in turn, is an index of the whole web, or so they say. So, on the one hand they try very hard to index the whole web, or at least as much of it as they can possibly get away with, and on the other hand they tell you that your specific page (out of the many billions out there) is not welcome unless you do so-and-so.

The original poster had a question or two: (Why) have they gone from indexing the web to influencing it? And should they do that?

It's a classic in methodology classes, wherever they teach that stuff (do they still teach that? Far too often it seems like they don't). I'll try to explain in plain language, but it is a little bit complicated until you grasp it; then it becomes quite intuitive:

By measuring some object you may unintentionally influence it so that it changes, and because of that change your measurements will become wrong. For that reason researchers sometimes go to extremes to make sure that their research is "neutral", i.e. that it influences the object of research as little as possible.

So what we have is a search engine. That's basically an automated research mechanism operating on the research object "internet pages". If the maintainers of the engine want high-quality output, they will want to register things as they happen naturally, and figure out ways to deal with the many different kinds of observations they will encounter. Those observations may include, e.g.: new sites, old sites, large sites, small sites, graphical sites, text-heavy sites, keyword-poor sites, keyword-rich sites, well-linked sites, poorly linked sites, commercial sites, non-commercial sites, etc. etc., ad nauseam.

As long as they remain neutral they can set up filters, hooks, traps, rules, mechanisms, algorithms, and whatever they like to call it, to identify the kind of sites that they prefer. These "rules" will be reasonably stable over time, as only the natural development in the underlying system will cause changes.

However, as soon as they influence the system themselves it becomes harder. Now some part of the system will adapt to their observation, and arrange itself to be more easily observed than the rest of the system. This means that they will have to make their measurements "suspicious" - i.e. they have to ignore those parts of the positive factors that they have introduced into the system themselves.

And here the chain usually breaks, because you don't influence the whole system. You influence some part of the system, but other parts still behave exactly as they would otherwise. So it's like having two traffic lights that are both green. One is green because, well, it's green. The other is green because it assumes that you like green lights. So, if you didn't know that, you could easily end up in an accident. But do you know? How certain is your knowledge? And how will you sort out which one is really the one you want?

In this simple example there's a 50/50 chance of making a mistake, but with the web you have billions and billions of pages, and it's not just a case of "green or not": instead there are many shades of grey.

I'm not sure if I expressed myself clearly, but the essence is that by influencing the system you make it a whole lot harder for yourself to retrieve the information that you really want.
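To put some rough numbers on the traffic light problem, here's a toy simulation (entirely my own sketch in Python; every number in it is made up, and no real engine works this way): pages have a hidden quality, the engine ranks them by an observed signal that normally tracks quality, and as soon as some fraction of pages inflate the signal because they know it is being measured, the top of the ranking fills up with pages of essentially random quality.

Code:
import random

random.seed(1)

def make_pages(n, adapting_fraction):
    """Each page gets a hidden quality and an observed signal (say, link count)."""
    pages = []
    for _ in range(n):
        quality = random.random()                  # the "true" relevance we want
        signal = quality + random.gauss(0, 0.1)    # naturally tracks quality...
        if random.random() < adapting_fraction:
            signal = 2.0 + random.gauss(0, 0.1)    # ...unless the page inflates it
        pages.append((quality, signal))
    return pages

def precision_at_k(pages, k=100):
    """Share of the top k (ranked by signal) that are genuinely good pages."""
    top = sorted(pages, key=lambda p: p[1], reverse=True)[:k]
    return sum(1 for quality, _ in top if quality > 0.8) / k

for frac in (0.0, 0.1, 0.5):
    print(f"{frac:.0%} of pages adapting -> precision@100 = "
          f"{precision_at_k(make_pages(10000, frac)):.2f}")

The exact numbers are beside the point; what matters is that the more the observed system adapts to the observer, the less the measurement tells you about what you actually wanted to find.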
claus is offline   Reply With Quote
Old 10-03-2005   #17
shion
Curious.
 
Join Date: Sep 2005
Location: Melbourne
Posts: 2
shion is on a distinguished road
Should Search Engines Get To Set Standards?

Pardon the bluntness of my original post; I was just a bit annoyed that the thread seemed a bit monotone or something (I immediately assumed they were all of the same mind). I now realise I saw it as paradoxical: dodgy SEO and non-dodgy SEO are both SEO. I'm quite heartened to see some of the issues I was thinking about articulated in some stunningly intelligent and thoughtful posts in both threads.

I will preface this by saying that my interest in search engines (and this site) is about 'equitable access to information for all'. So when I see any question, I ask myself who the stakeholders are and how the issue will affect them all.

So, to the question: should search engines get to set standards?

Well... yes. Why not? They have the resources, both tangible and intangible, that could generate terrific ideas for standards.

BUT. Already in this thread the question has been expanded to: "What do the search engines themselves have to gain by setting standards?"

If I may, I'll flip that question to read "What does everybody have to gain by allowing search engines to set standards?" As I see it, the caveat is what all the stakeholders have to gain. If the answer is 'something' and not 'nothing', then the answer to the original question is yes.

I was almost going to say that it is irrelevant what the search engines have to gain, but that would be false; they need to gain something. When you get down to it, profit in dollars or altruism points are the main things they could gain, and I dare to venture that the major search engines, now listed on the stock exchange, may perhaps lean towards dollars.

Let me wail on about stakeholders a bit more with some examples:

A search engine comes up with a terrific idea for a standard.
- The conscientious content producers gain more kudos or $ by following the rules.
- The unscrupulous content producers gain more kudos or $ without needing to be so annoying.
- The Ignorant content producers gain more kudos or $ and think gee whiz life is good.
- The Content consumers think gee whiz this search engine is good.
- The Search engine gains more kudos or $ because the consumers think they are good.
- The compelling and innovative environment that is the WWW is enriched for the benefit of all.


Here's another example.
A search engine comes up with a dumb idea like banning sites and threatening them via email.
- The conscientious content producers lose kudos and $ by being restricted by annoying rules.
- The unscrupulous content producers gain more kudos or $ by manipulating the rules.
- The Ignorant content producers lose kudos and $ because they got banned and they don't even know it.
- The Content consumers think hmm what happened to the results. This is rubbish.
- The Search engine loses kudos or $ because the consumers think they are rubbish.
- The compelling and innovative environment that is the WWW is stifled (dumbed down), a loss for us all.
You could also read into those examples how each party affects the others in light of the new standard.

Should search engines get to set standards?
Yes, for whatever reason, but it must be done in consultation with, and with the acceptance of, the whole community. If it is imposed thoughtlessly or bombastically it will ultimately fail. C'mon guys. Group hug.

Quote:
Originally Posted by claus
I'm not sure if I expressed myself clearly, but the essence is that by influencing the system you make it a whole lot harder for yourself to retrieve the information that you really want.
I'm still reading this stuff and trying to learn myself something, but I think the paradox I mentioned relates to this. They (the search engines) don't just influence the system; they are a part of it. Their very existence influences it. If you are in a system, taking yourself out and then trying to influence it is an error.
shion is offline   Reply With Quote
Old 10-03-2005   #18
mcanerin
 
mcanerin's Avatar
 
Join Date: Jun 2004
Location: Calgary, Alberta, Canada
Posts: 1,564
mcanerin has a reputation beyond repute
Quote:
If you are in a system, taking yourself out and then trying to influence it is an error.
Agreed. At the point where you stop saying "hey, WE should do this" and start thinking "hey, YOU should do this", you've lost focus. This goes not only for search engines, but also for webmasters.

Every time a search engine does something, the people affected by it react. Every time the people react, a search engine has a different paradigm to work with. If you want to win a chess game, you don't only think one move ahead - you think several. Whoever thinks the furthest ahead wins, usually.

I'm in the process of writing a book called "SEM and the Art of War" (based on Sun Tzu's classic) and the section I'm on is related to this. You cannot win anything if you are following along behind the pack. If a search engine comes out with a new way to do things, you can assume that there will be thousands of people sitting there trying to figure out how to take advantage of it shortly thereafter.

The only way to consistently win in that scenario is to not only figure out what people are going to start to do, but to figure out what the search engine is going to do in response. Then prepare for that.

Otherwise, the only way to come out ahead is to react so quickly that the SE doesn't have time to react before you are set up and ready to go. Then you are in downtime waiting for the reaction so that you can come up with a response to deal with that. Pretty stressful, and because you are focusing on the short term and (always) without the full story in front of you, it's easy to make a mistake. If you are two steps ahead, then you have the time and luxury to adapt your plan as new information becomes available.

The bottom line is, yes, search engines can set standards, preferences and rules for their business, just like anyone else can for their own. But the fact is that all those standards, preferences and rules are based on webmaster behavior, and therefore webmasters can and do influence these rules indirectly. And vice versa.

Finally, let's not forget the 500-pound gorilla with the "veto" power in the corner - the users.

If at some point this back and forth between webmasters and search engines loses focus on the users, they will learn about that mistake quickly, probably in a very unpleasant and expensive manner. It's the users that stop this process from (barely) spinning out of control, IMO.

Ian
__________________
International SEO
mcanerin is offline   Reply With Quote