
An Intro to SEO - Chapter 8 - Opportunities with dynamic pages

It's Alive !!

Pages with dynamic, database-driven content are an increasing trend, and they present a real problem for search engines and for webmasters trying to promote them. Search engine 'bots' can't normally get at the contents of your database directly, so indexing it is tricky at best.

The problem is further compounded when users need to log in. Search engine 'bots' can't log in, but with a few sneaky tricks you can arrange a feast of content for them when they come to index your site.

There are a few ways to create dynamic pages, such as Perl / CGI scripts and ASP (Active Server Pages), but my favourite is PHP because it embeds in standard HTML and can be optimised in the same way - all the methods described in previous chapters can be applied here.

One advantage of dynamic pages is that they allow you to have one page, e.g. book-show.php, that can display thousands of book details from a database: book-show.php?book=1, book-show.php?book=2, book-show.php?book=3 and so on. Most search engines would see those last three as separate pages (because of the query string ?book=x) even though the file name is the same (book-show.php).

If your database contains thousands of records you have a wonderful opportunity to get your site seen as a massive (and popular) resource. Here is an example of how RareList does it (a PHP sketch follows the list) :-

  1. If a user clicks on the link book-show.php?book=1

  2. The PHP script sees book=1 and gets the first record from the book table.

  3. It starts to write out the HTML header information (<HTML><HEAD>).

  4. When it comes to the title tag it uses the book title and author.

  5. When it comes to the description tag it uses the book description.

  6. It finishes the header (</HEAD><BODY>) and writes the top menu.

  7. It then writes one complete page about that book using information from that book's record.

  8. It finishes by writing the end tags (</BODY></HTML>).
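Here is roughly what those eight steps might look like in PHP. This is only a sketch, not RareList's actual code - the database, table and column names (bookshop, book, title, author, description) are assumptions made for illustration :-

<?php
// Sketch of book-show.php, following the eight steps above.
mysql_connect('localhost', 'dbuser', 'dbpass');   // hypothetical credentials
mysql_select_db('bookshop');

$id = (int) $_GET['book'];                        // steps 1-2: read ?book=x ...
$result = mysql_query("SELECT title, author, description FROM book WHERE id = $id");
$row = mysql_fetch_assoc($result);                // ... and fetch that record

echo "<HTML><HEAD>\n";                            // step 3: start the header
echo "<TITLE>" . htmlspecialchars($row['title'] . " by " . $row['author']) . "</TITLE>\n";       // step 4: title tag from the record
echo "<META NAME=\"description\" CONTENT=\"" . htmlspecialchars($row['description']) . "\">\n"; // step 5: description tag from the record
echo "</HEAD><BODY>\n";                           // step 6: finish the header
// (the top menu would be written here)
echo "<H1>" . htmlspecialchars($row['title']) . "</H1>\n";              // step 7: one complete
echo "<P>By " . htmlspecialchars($row['author']) . "</P>\n";            // page built from that
echo "<P>" . htmlspecialchars($row['description']) . "</P>\n";          // book's record
echo "</BODY></HTML>\n";                          // step 8: end tags
?>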

The site also creates a page called deeplink.html every night that links to every book record, like this :-

<A HREF="book-show.php?book=1">A call to Arms</A><BR>
<A HREF="book-show.php?book=3">A cruel courtship</A><BR>
....
<A HREF="book-show.php?book=10000">Abinger Harvest</A><BR>

It does that for every book record and therefore creates the impression of a site with thousands of unique pages - just what search engines love.
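A nightly script to build deeplink.html could look something like this sketch, using the same hypothetical book table as before :-

<?php
// Sketch of a nightly deeplink.html generator.
mysql_connect('localhost', 'dbuser', 'dbpass');
mysql_select_db('bookshop');

$fp = fopen('deeplink.html', 'w');
fwrite($fp, "<HTML><HEAD><TITLE>All our books</TITLE></HEAD><BODY>\n");

$result = mysql_query("SELECT id, title FROM book ORDER BY title");
while ($row = mysql_fetch_assoc($result)) {
    // one plain <A HREF> link per record, in the form shown above
    fwrite($fp, "<A HREF=\"book-show.php?book=" . $row['id'] . "\">"
              . htmlspecialchars($row['title']) . "</A><BR>\n");
}

fwrite($fp, "</BODY></HTML>\n");
fclose($fp);
?>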

Google now has a service called "Sitemaps". This allows you to submit a file that lists all the above URLs so every entry can be found easily. The file can be a simple text file with one URL per line, or an XML file. More details, including the XML file format, can be found on Google's Sitemaps pages.
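For example, the text form is just one URL per line (the domain here is a stand-in for your own) :-

http://www.example.com/book-show.php?book=1
http://www.example.com/book-show.php?book=2
http://www.example.com/book-show.php?book=3

The XML form wraps each URL in a <url> element. This sketch uses the sitemaps.org 0.9 schema; check Google's own pages for the exact format they expect :-

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/book-show.php?book=1</loc></url>
  <url><loc>http://www.example.com/book-show.php?book=2</loc></url>
</urlset>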

If you are hosting on a UNIX-based system like Linux, BSD or Solaris, you can set up a cron job to write this file automatically for you every night.
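For example, a crontab entry like this would run the generator script at 3 a.m. every night (the script name and paths are illustrative) :-

# minute hour day month weekday  command
0 3 * * * /usr/bin/php /home/rarelist/scripts/make-deeplinks.php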

