Archive of UserLand's first discussion group, started October 5, 1998.

Re: Junk in URLs and link-rot

Author: Jorn Barger
Posted: 3/9/1999; 7:45:12 AM
Topic: Unique vs. Generic URLs
Msg #: 3834 (In response to 3826)
Prev/Next: 3833 / 3835

There's a really elegant solution to these problems that doesn't require beating up any webmasters: create parsing browsers that can consult a web-based database where every website's peculiar URL syntax is decoded. The browser can then figure out where to look for something that's moved, or how to turn a bad bookmark into a good one (at the time of bookmarking). This should work even with infinitely idiotic sites.

The same database would know how to format search patterns, and where to look for archives, for new content, etc etc etc.
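A minimal sketch of what one piece of this might look like, assuming the database boils down to per-site rewrite rules the browser can apply to a bookmark; the host names, patterns, and the canonicalize helper here are purely hypothetical illustrations, not any real site's syntax:

    import re
    from urllib.parse import urlparse

    # Hypothetical per-site URL-grammar database: for each host, a list of
    # (pattern, canonical template) rules decoding that site's URL syntax.
    URL_GRAMMARS = {
        "example-news.com": [
            # old CGI-style article links map onto stable archive paths
            (re.compile(r"/cgi-bin/story\.cgi\?id=(\d+)"), r"/archive/story/\1"),
        ],
        "example-forum.com": [
            # session junk in the path is dropped; the message number is what matters
            (re.compile(r"/session/[^/]+/msg/(\d+)"), r"/msg/\1"),
        ],
    }

    def canonicalize(url: str) -> str:
        """Turn a bad bookmark into a stable one, if the site's grammar is known."""
        parts = urlparse(url)
        target = parts.path + ("?" + parts.query if parts.query else "")
        for pattern, template in URL_GRAMMARS.get(parts.netloc, []):
            match = pattern.search(target)
            if match:
                return f"{parts.scheme}://{parts.netloc}{match.expand(template)}"
        return url  # unknown site or no matching rule: leave the URL alone

    if __name__ == "__main__":
        print(canonicalize("http://example-news.com/cgi-bin/story.cgi?id=3834"))
        # -> http://example-news.com/archive/story/3834

The point of keeping the rules in a shared, web-based database rather than in the browser itself is that one person decoding an idiotic site's URL scheme fixes it for everybody.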

(This is Bucky Fuller's theory of design-- don't change the user, change the environment. His example was an endlessly useful trick for getting bees or wasps out of your living room-- don't run around swinging things at them; instead, close all the shades except where a window or door is open to the outside. The light attracts them, and voilà...)

I also have some web design awards in particular categories that don't yet include URL format, but do mention that problem in passing. The worst would be netsyn.com URLs, which have to be seen to be believed.

jorn




This page was archived on 6/13/2001; 4:48:34 PM.

© Copyright 1998-2001 UserLand Software, Inc.