Search Engine Watch
SEO News

Go Back   Search Engine Watch Forums > General Search Issues > SEM Related Organizations & Events

Old 08-09-2005   #1
rustybrick
 
Session Two - Day Two; Fun with Dynamic Sites

Moderator Detlev Johnson.

Mikkel Svendsen was up first, and I was a bit late, sorry Mikkel. What is not a problem with dynamic sites? Storing content in a database is not a problem. Question-marks are not a problem in itself, but it is an identifier to the search engines to spot a dynamic site that is template driven. SSI are not a problem in itself. Extension names are also not a problem (i.e. php, asp, cfm, .mikkel). Indexing Barriers; long and ugly URLs, duplicate content (session IDs, click IDs, time stamped URLs), spider traps (infinite loops), server downtime or slow responses. Some indirect related issues include; required support of cookies, javascripts, flash and so on. GEO-targeting and personalization is an indirectly related indexing barrier as well. Form based (post method) based navigation. Issues not related at all with dynamic sites, robot.txt issues, frames issues, password protected pages and so on all have nothing to do with dynamic site issues. Solutions that Work; many solutions are available, there are always more then one solution out there. The best strategy is to first see if you can fix your current system, that is the best. If that is not possible then move into a "bridge layer" like rewrite, if all fails, then replicate the content. Mikkel's favorite fix is the one-parameter web site. Normal dynamic URls contain all the necessary information in variables, you can just use one parameter and shows how the id matches up with the database. Tips and Tricks for Dynamic Sites; Automated Titles and META-tags based on existing database fields, Content related cross linking using a category or subject database, Dynamic headlines based on referring queries (works well for pages that rank well for many different terms), RSS feeds, Site Maps (google xml sitemaps), and "Spider Identification" (cloaking, personalization and geo-targeting).
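Mikkel's one-parameter site plus his automated-titles tip can be sketched roughly like this. A minimal Python illustration, not his actual code; the product table and field names are invented for the example:

```python
# Hypothetical product table standing in for the real database.
PRODUCTS = {
    101: {"name": "Blue Widget", "category": "Widgets"},
    102: {"name": "Red Gadget", "category": "Gadgets"},
}

def title_and_meta(page_id):
    """A one-parameter URL like /page?id=101 carries everything the
    template needs in a single ID; the title and META description
    are then generated from existing database fields."""
    row = PRODUCTS.get(page_id)
    if row is None:
        return None  # unknown ID: the page should 404, not loop
    title = f"{row['name']} - {row['category']}"
    meta = f"Buy {row['name']} in our {row['category']} section."
    return title, meta

print(title_and_meta(101))
# → ('Blue Widget - Widgets', 'Buy Blue Widget in our Widgets section.')
```

The point is that everything except the single ID lives in the database, so the URL stays short and static-looking.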

Laura Thieme from Bizresearch is now up on the panel. She will cover site optimization, data feeds, and paid inclusion programs. Are question marks OK to have in a URL? Yes, question marks are OK, ampersands are OK, and multiple variables are OK now.

How do you optimize a dynamic site? Provide benchmarks in a ranking report, then do some traditional home page optimization, work the dynamic category, sub-category, and product levels, generate meta tags dynamically, consider rewriting URLs if there are more than three variables, review log file spider reports and update robots.txt files as needed, and watch rankings improve. Expanding on these: to provide a benchmark, review the URL structure, review the Google and AOL index, and review the Yahoo, MSN, and Ask index. For the Google index they use site:www.site.com and site.com; review the number of pages indexed now, and it is amazing how this can change positively or negatively. She noted the index count for dynamic sites increases and decreases with Google at any point (very true). She then brought up a slide comparing some large sites and the number of pages indexed at Google vs. Yahoo vs. MSN.

URL rewrites: no rewrite tools were required, all custom rewrite programs. Problem: URLs had 6 or more parameters. Solution: remove unnecessary and unused parameters from the URL, avoid passing redundant parameters, remove parameters that do not directly affect the page content, and create custom URL rewrites. One of the best things you can do is optimize the home page by creating text links, text dynamically generated from categories, categories the client administrates from a web-based admin tool, and so on. Dynamically generate meta tags: heading tags, description tags, titles, metatags and so on. Miva Merchant indexing problems? Not getting indexed? Talk to her. Optimize category pages: market multiple items on one page, optimize the pages, add text links the search engines can follow, and dynamically create metatags using the products. Then optimize down to the product level; same deal as on the category pages.
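Laura's "remove unnecessary and unused parameters" fix can be sketched in a few lines of Python. This is an illustration, not her tool; the redundant parameter names are examples:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters that don't change the page content (example names only:
# session IDs, click IDs, time stamps, referrer tags).
REDUNDANT = {"sessionid", "clickid", "ts", "ref"}

def clean_url(url):
    """Drop parameters that don't affect the content, keeping the
    rest in their original order, so spiders see one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in REDUNDANT]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(clean_url("http://example.com/cat.asp?cat=12&sessionid=abc&ts=99"))
# → http://example.com/cat.asp?cat=12
```

Getting a six-parameter URL down to one or two content-bearing parameters like this is often enough on its own, before any rewrite layer is added.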
Spider log file analysis: review log files for spiders; she likes the ClickTracks Web analytics software for this. Creating data feeds (retailers: shopping.com, Froogle, etc.): she has a big data feed matrix which I can't type here. Data feed observations: every feed is unique, so don't assume spider programs will get and understand it. Editing data feeds: you can't expect to create a data feed, automate it, upload it and be done; ongoing data feed maintenance is still required. Optimize the title and description of the data feed and of the source (database). Paid inclusion: it's very easy to do with a dynamic site.
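The spider log review Laura describes can be done with a tool like ClickTracks, or scripted. A rough sketch of the idea in Python, counting hits per crawler from combined-format access log lines (the user-agent substrings are assumptions about how the 2005-era bots identified themselves):

```python
import re
from collections import Counter

# User-agent substrings for the major crawlers (assumed names).
SPIDERS = ("Googlebot", "Slurp", "msnbot", "Teoma")

# Request path, status, bytes, referrer, then user-agent in quotes.
LOG_LINE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

def spider_hits(lines):
    """Count requests per spider in combined-format log lines."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if not m:
            continue
        for bot in SPIDERS:
            if bot in m.group("ua"):
                counts[bot] += 1
    return counts

sample = [
    '1.2.3.4 - - [09/Aug/2005:10:00:00 -0500] "GET /page?id=101 HTTP/1.1"'
    ' 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '5.6.7.8 - - [09/Aug/2005:10:00:05 -0500] "GET / HTTP/1.0"'
    ' 200 1024 "-" "Mozilla/5.0 (compatible; Yahoo! Slurp)"',
]
print(spider_hits(sample))
```

Comparing these counts over time shows whether a rewrite or parameter cleanup actually changed how deep the spiders crawl.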

Jake Baillie from TrueLocal is up to discuss rewrite rules. A rewrite rule is a regular-expression-based statement that tells a web server to do something. The most common use is mapping virtual URLs to a physical resource. Essentially, rewrites provide a fast and consistent way to manage URLs. Be careful with duplicate content issues and infinite loops. He then goes over "Tasty Recipes" examples: moving a folder from an old location to a new one with a 301 redirect; serving content transparently; shortening URLs for search engines; denying access to an IP (prevents serving to a particular IP); serving different content to a UA/IP (he shows how you cloak); serving an image based on the referrer (preventing image theft); and serving different content based on the time of day.

Dirty lies: dropdowns can be a downer. Search engines can read but not parse JavaScript, search engines will not post forms, and search engines can't deal with any sort of dynamic interactivity in interface elements. If you want to use dropdowns, go with CSS solutions. Remember the basics: most search engines are actually good at indexing, and most errors are silly. If all else fails: validate your code, check robots.txt, check that links are well formed, and check for disallowed characters.

Nosey competitors and unnatural traffic: guess what, people who type "allinanchor:" into Google are not your target visitors, people who type "link:" into Google are not target visitors, and people who come through the cache are not normal users. Someone coming from the same search 20 times in 2 minutes is not a target visitor, people who come in from whois.sc are not target visitors, and so on. Track these people's referrers; you can cookie-track across sites and you can track with graphics (it won't be 100% accurate).
Serve these guys a 403 Access Forbidden, a different page than everyone else, fun pictures to get them in trouble at work, your own web sites; annoy them with MIDIs and WAV files or infinite JavaScript popups; or serve them the same thing as everyone else and just silently track them. Resources: WMW Forum 92, httpd.apache.org (the official documentation), and the URL Rewriting Guide.
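The virtual-to-physical mapping Jake describes lives in mod_rewrite on Apache, but the idea is just a regex matched against the URL. A toy sketch in Python, with invented URL patterns, to show what a rule does:

```python
import re

# Each rule: (pattern on the virtual URL, physical target with backrefs),
# tried in order like RewriteRule lines. Patterns here are made up.
RULES = [
    # Map a friendly recipe URL onto the real dynamic script.
    (re.compile(r"^/recipes/(\w+)\.html$"), r"/recipe.cgi?name=\1"),
    # A folder moved to a new location (served as a 301 in practice).
    (re.compile(r"^/old/(.*)$"), r"/new/\1"),
]

def rewrite(url):
    """Return the first matching physical URL, or the URL unchanged.
    Taking only the first match is what avoids the infinite loops
    Jake warns about."""
    for pattern, target in RULES:
        if pattern.match(url):
            return pattern.sub(target, url)
    return url

print(rewrite("/recipes/tasty_pie.html"))  # → /recipe.cgi?name=tasty_pie
print(rewrite("/old/cakes/index.html"))    # → /new/cakes/index.html
```

The real thing adds flags (301 vs. transparent serve, conditions on UA/IP/referrer/time of day for the other tricks he showed), but every example in the talk reduces to this match-and-substitute step.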

Other sessions at www.seroundtable.com to be discussed here...
---- RSS, Blogs & Search Marketing (Ben Pfeiffer)
---- Ad Management: Do Humans Matter? (Chris Boggs)